Just like spiders, bits are everywhere. Even though we don’t necessarily work with bits directly, you’d be surprised where they show up.
Let’s talk about 01100010 01101001 01110100 01110011, baby
Well, what is a bit? A bit is the basic unit of information in computing, commonly represented as a 0 or 1. We usually interpret 0 and 1 as true/false, on/off, or +/-. We can group bits together to form nibbles (4 bits) or bytes (8 bits), and with bytes we can encode characters.
01001000 01100101 01101100 01101100 01101111 00101100 00100000 01010111 01101111 01110010 01101100 01100100
The bytes above are equivalent to…
72 101 108 108 111 44 32 87 111 114 108 100 (in decimal form)
48 65 6C 6C 6F 2C 20 57 6F 72 6C 64 (in hexadecimal)
Hello, World (in ASCII)
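The conversions above can be sketched in a few lines of Python, using each character’s ASCII code point:

```python
text = "Hello, World"

# Each character maps to one byte via its ASCII code point.
binary = " ".join(format(ord(c), "08b") for c in text)   # 8-bit binary
decimal = " ".join(str(ord(c)) for c in text)            # decimal
hexa = " ".join(format(ord(c), "02X") for c in text)     # hexadecimal

print(binary)   # 01001000 01100101 01101100 ...
print(decimal)  # 72 101 108 108 111 44 32 87 111 114 108 100
print(hexa)     # 48 65 6C 6C 6F 2C 20 57 6F 72 6C 64
```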
The largest value a single byte can represent is 255, since a byte has 256 possible values and 0000 0000 represents zero. But where else have we seen the number 255?
My favorite color is #000000
That’s right. We can represent digital color in bytes. Each pair of digits in a hexadecimal color code is one byte, giving three values that represent the levels of red, green, and blue.
AD D8 E6 = 173 216 230
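Splitting a hex color into its three byte values is a one-liner. A minimal sketch (`hex_to_rgb` is just an illustrative name):

```python
def hex_to_rgb(color: str) -> tuple[int, int, int]:
    """Split a hex color like '#ADD8E6' into its red, green, and blue bytes."""
    color = color.lstrip("#")
    # Take each pair of hex digits and parse it as a base-16 integer.
    return tuple(int(color[i:i + 2], 16) for i in range(0, 6, 2))

print(hex_to_rgb("#ADD8E6"))  # (173, 216, 230) -- light blue
```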
But who decided 2^8 values would be the benchmark for color? Well, the human eye can distinguish roughly 150 gradations of a single hue, so 2^7 (128) gradations would not be enough, while 2^8 (256) is comfortably more than the eye can tell apart.
Process this, bits.
If all we need to represent a color is 3 bytes, or 24 bits, imagine the possibilities with 32 or even 64 bits. A processor handles the instructions of a computer program, performing basic arithmetic, logical, and control operations.
Every computer comes with a processor. When downloading an application, you might have noticed a requirement for the type of processor needed (i.e. 32-bit or 64-bit). The number of bits in a processor refers to the size of the datatypes it can handle, the size of the values its registers can store, and the amount of memory it can address. Some applications require only 32 bits to run, whereas others require 64. A 64-bit processor can address about 4 billion times as much memory as a 32-bit one.
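The arithmetic behind that "4 billion times" figure is quick to check: an n-bit processor can address 2^n distinct memory locations.

```python
# Address space sizes for 32-bit vs 64-bit pointers.
addresses_32 = 2 ** 32  # about 4.3 billion addressable locations
addresses_64 = 2 ** 64

print(addresses_32)                   # 4294967296
print(addresses_64 // addresses_32)   # 4294967296, i.e. ~4 billion times as much
```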
Many of the operations we perform on our devices today seem simple, but under the hood a lot is happening. A single bit can represent only the simplest of values, but as time and technology have moved forward, millions and billions of bits working together can handle memory-intensive tasks.