
Bit

Bit, short for binary digit, is the smallest unit of information in a computer. A bit takes one of two values, 1 or 0, which correspond to the states on and off, true and false, or yes and no.

Bits are the building blocks for all information processing in digital electronics and computers. In hardware, a bit corresponds to the state of a transistor in a computer's logic circuits. The value 1 (meaning on, yes, or true) represents a transistor with current flowing through it, essentially a closed switch. The value 0 (meaning off, no, or false) represents a transistor with no current flowing through it, an open switch. All computer information processing can be understood as vast arrays of transistors (about 3.1 million on the original Pentium chip) switching on and off according to the bit values they have been assigned.
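
As a rough software analogy to this on/off switching (a minimal sketch in Python, chosen only for illustration; the names flags and SWITCH_2 are made up for the example), individual bits of an integer can be switched on and off with bitwise operators:

    # Treat the bits of an integer as on/off switches.
    flags = 0b0000                   # all switches off

    SWITCH_2 = 1 << 2                # mask selecting bit 2

    flags |= SWITCH_2                # switch bit 2 on  -> 0b0100
    is_on = bool(flags & SWITCH_2)   # True: "current flowing"

    flags &= ~SWITCH_2               # switch bit 2 off -> 0b0000
    is_on = bool(flags & SWITCH_2)   # False: "open switch"

    print(bin(flags), is_on)         # 0b0 False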

Bits are usually combined into larger units called bytes. A byte is composed of eight bits, so its value ranges from 00000000 (0 in decimal notation) to 11111111 (255 in decimal notation). This means a byte can represent 2⁸ (2 raised to the eighth power), or 256, possible states (0-255). Bytes are in turn combined into groups of one to eight bytes called words. The size of the words used by a computer's central processing unit (CPU) depends on the bit-processing ability of the CPU. A 32-bit processor, for example, can use words that are up to four bytes long (32 bits).
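
To make the arithmetic concrete, here is a short sketch (again Python, purely for illustration) confirming that eight bits yield 2⁸ = 256 possible values and that a 32-bit word holds four bytes:

    BITS_PER_BYTE = 8
    print(2 ** BITS_PER_BYTE)                      # 256 possible byte values

    # The smallest and largest byte values, written in binary.
    print(int("00000000", 2), int("11111111", 2))  # 0 255

    WORD_BITS = 32
    print(WORD_BITS // BITS_PER_BYTE)              # 4 bytes per 32-bit word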

Computers are often classified by the number of bits they can process at one time, as well as by the number of bits used to represent addresses in their main memory (RAM). Computer graphics are described by the number of bits used to represent pixels (short for picture elements), the smallest identifiable parts of an image. In monochrome images, each pixel is made up of one bit. In 256-color and gray-scale images, each pixel is made up of one byte (eight bits). In true color images, each pixel is made up of at least 24 bits.
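
The sketch below (Python, illustrative only; the sample color values are made up) relates bits per pixel to the number of representable values, and packs one 24-bit true-color pixel from its red, green, and blue bytes:

    # Values representable at common bit depths.
    for bits_per_pixel in (1, 8, 24):
        print(bits_per_pixel, "bits ->", 2 ** bits_per_pixel, "values")
    # 1 -> 2 (monochrome), 8 -> 256 (256-color/gray-scale), 24 -> 16,777,216 (true color)

    # Packing an RGB pixel into 24 bits, 8 bits per channel.
    r, g, b = 200, 120, 30
    pixel = (r << 16) | (g << 8) | b
    print(hex(pixel))   # 0xc8781e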

The term bit was introduced by John Tukey, an American statistician and early computer scientist, who first used it in 1946 as a shortened form of binary digit.


