binary: notes on binary encoding
parent b1595f8cb3
commit 654e6184f9
1 changed file with 1 addition and 1 deletion
@@ -13,7 +13,7 @@ _Encoding_ is the process of establishing a correspondence between sets of binar
> An encoding system maps each symbol to a unique sequence of bits. A computer then interprets that sequence of bits and displays the appropriate symbol to the user.
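To make that mapping concrete, here is a minimal sketch in Python (an addition to these notes, not part of the committed file); the symbols and the 3-bit codes are arbitrary, hypothetical choices used only to show encoding and decoding in both directions.

```
# Toy encoding: each symbol gets a unique, fixed-width bit sequence.
# The symbols and 3-bit codes below are arbitrary, hypothetical choices.
ENCODE = {"A": "000", "B": "001", "C": "010", "D": "011"}
DECODE = {bits: symbol for symbol, bits in ENCODE.items()}

def encode(text: str) -> str:
    """Replace each symbol with its bit sequence."""
    return "".join(ENCODE[ch] for ch in text)

def decode(bits: str, width: int = 3) -> str:
    """Split the bit string into fixed-width chunks and map them back to symbols."""
    chunks = (bits[i:i + width] for i in range(0, len(bits), width))
    return "".join(DECODE[chunk] for chunk in chunks)

print(encode("BAD"))            # 001000011
print(decode(encode("BAD")))    # BAD
```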
-The length of the binary number (bytes, 16-bit, 32-bit etc) that is used to represent a given data set is determined by the number of variations that you require to capture the entire range of the dataset. For example, say we know that there are 18 levels to a computer game. To encode a reference for each level we would need a binary number that is capable of at least 18 total variations. In this instance a 32-bit ($2^{5}$) number would be best because the next smallest (16-bit) would not be sufficient. Our levels would have representations as follows:
+The length of the binary number (sometimes called the _word length_, corresponding to bytes, 16-bit, 32-bit etc.) that is used to represent a given data set is determined by the number of variations that you require to capture the entire range of the dataset. For example, say we know that there are 18 levels to a computer game. To encode a reference for each level we would need a binary number that is capable of at least 18 total variations. In this instance a 5-bit number ($2^{5} = 32$ variations) would be sufficient, because the next smallest width (4 bits, $2^{4} = 16$ variations) would not be. Our levels would have representations as follows:
```
00001 (1)
00010 (2)
...
10010 (18)
```
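As a quick check of the word-length reasoning above, here is a small Python sketch added to these notes (not part of the committed file); the function name `min_bits` and the 18-level example are illustrative assumptions.

```
# Smallest bit width w such that 2**w >= variations.
def min_bits(variations: int) -> int:
    # bit_length of (variations - 1) is exact and avoids floating-point log2.
    return max(1, (variations - 1).bit_length())

levels = 18
width = min_bits(levels)              # -> 5, since 2**4 = 16 < 18 <= 2**5 = 32
print(width, 2 ** width)              # 5 32
print(format(1, f"0{width}b"))        # 00001, matching the listing above
```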