Search Result for "byte": 

WordNet (r) 3.0 (2006):

byte n 1: a sequence of 8 bits (enough to represent one character of alphanumeric data) processed as a single unit of information
The Jargon File (version 4.4.7, 29 Dec 2003):

byte /bi:t/, n. [techspeak] A unit of memory or data equal to the amount used to represent one character; on modern architectures this is invariably 8 bits. Some older architectures used byte for quantities of 6, 7, or (especially) 9 bits, and the PDP-10 supported bytes that were actually bitfields of 1 to 36 bits! These usages are now obsolete, killed off by universal adoption of power-of-2 word sizes.

Historical note: The term was coined by Werner Buchholz in 1956 during the early design phase for the IBM Stretch computer; originally it was described as 1 to 6 bits (typical I/O equipment of the period used 6-bit chunks of information). The move to an 8-bit byte happened in late 1956, and this size was later adopted and promulgated as a standard by the System/360. The word was coined by mutating the word "bite" so it would not be accidentally misspelled as bit. See also nybble.
The Free On-line Dictionary of Computing (30 December 2018):

Byte A popular computing magazine. (http://byte.com). (1997-03-27)
The Free On-line Dictionary of Computing (30 December 2018):

byte bite /bi:t/ (B) A component in the machine data hierarchy larger than a bit and usually smaller than a word; now nearly always eight bits and the smallest addressable unit of storage. A byte typically holds one character.

A byte may be 9 bits on 36-bit computers. Some older architectures used "byte" for quantities of 6 or 7 bits, and the PDP-10 and IBM 7030 supported "bytes" that were actually bit-fields of 1 to 36 (or 64) bits! These usages are now obsolete, and even 9-bit bytes have become rare in the general trend toward power-of-2 word sizes.

The term was coined by Werner Buchholz in 1956 during the early design phase for the IBM Stretch computer. It was a mutation of the word "bite" intended to avoid confusion with "bit". In 1962 he described it as "a group of bits used to encode a character, or the number of bits transmitted in parallel to and from input-output units". The move to an 8-bit byte happened in late 1956, and this size was later adopted and promulgated as a standard by the IBM System/360 (announced April 1964).

James S. Jones adds: I am sure I read in a mid-1970s brochure by IBM that outlined the history of computers that BYTE was an acronym that stood for "Bit asYnchronous Transmission E..?", which related to the width of the bus between the Stretch CPU and its CRT-memory (prior to Core). Terry Carr says: In the early days IBM taught that a series of bits transferred together (like so many yoked oxen) formed a Binary Yoked Transfer Element (BYTE). [True origin? First 8-bit byte architecture?]

See also nibble, octet. [Jargon File] (2003-09-21)
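
As a practical illustration of the definition above (a sketch only, assuming a standard C environment; nothing in it comes from the dictionary entries themselves): C exposes the number of bits per byte as CHAR_BIT in <limits.h>, defines sizeof(char) as exactly one byte, and counts sizes and address offsets in bytes, reflecting the byte's role as the smallest addressable unit of storage.

    /* Sketch: print this platform's byte width and show that sizes and
       address arithmetic are counted in bytes. On virtually all modern
       machines CHAR_BIT is 8. */
    #include <limits.h>   /* CHAR_BIT: bits in one byte */
    #include <stdio.h>

    int main(void)
    {
        printf("bits per byte (CHAR_BIT): %d\n", CHAR_BIT);
        printf("sizeof(char):             %zu byte\n", sizeof(char)); /* 1 by definition */

        unsigned char c = 'A';   /* one byte is enough to hold one character */
        printf("'%c' in one byte:         0x%02X\n", c, c);

        unsigned char buf[2];    /* adjacent bytes differ by exactly one address unit */
        printf("address step:             %td\n", &buf[1] - &buf[0]);
        return 0;
    }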