[wellylug] [OT] Internet & Citylink

Pete Black pete at marchingcubes.com
Tue Mar 1 10:21:12 NZDT 2005


As far as I know, the byte corresponds directly to the size of a char on 
a given architecture.

If a machine uses a different number of bits to represent a char, that is 
its byte length. Hence my musing on the shift to a 16-bit byte length 
as Unicode becomes the standard.
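
For what it's worth, C pins this down with CHAR_BIT in <limits.h> - a 
quick sketch like this will report the byte size the compiler sees on a 
given box:

    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        /* CHAR_BIT is the number of bits in a char, i.e. the C notion
           of a byte */
        printf("bits per char (byte): %d\n", CHAR_BIT);

        /* sizeof is measured in chars, so this is 1 by definition */
        printf("sizeof(char): %lu\n", (unsigned long) sizeof(char));
        return 0;
    }

On common hardware this prints 8, but the standard only requires 
CHAR_BIT to be at least 8.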

Also, my understanding is that a word corresponds to the width of the 
CPU's bus - in that 64-bit machines use a 64-bit word, 32-bit machines 
use a 32-bit word, 16-bit machines use a 16-bit word, etc.
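
A rough way to see that from userland: on most Unix-ish systems the 
pointer and long sizes track the native word (4 bytes on 32-bit, 8 on 
LP64 64-bit machines), though that's a platform convention rather than 
a guarantee:

    #include <stdio.h>

    int main(void)
    {
        /* typically 4 on 32-bit machines and 8 on LP64 64-bit machines -
           a rough proxy for the word size, not a guarantee (Win64 keeps
           long at 4 bytes, for instance) */
        printf("sizeof(void *): %lu\n", (unsigned long) sizeof(void *));
        printf("sizeof(long):   %lu\n", (unsigned long) sizeof(long));
        return 0;
    }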

This gets confusing with CPUs like the 386SX, which has a 32-bit internal 
'word size', a 16-bit external bus 'word size' and a 24-bit memory 
addressing 'word size', although I think the 386SX was considered to 
have a 32-bit word size due to the width of its internal bus.

-Pete

>On Mon, 28 Feb 2005 23:33:23 +1300, Pete Black <pete at marchingcubes.com> wrote:
>
>>Also, while 8 bits is the conventional size of a byte (char) in modern 
>>computer systems, it is by no means the only size used for the byte.
>>
>>I believe older (like, 1970s old) IBM systems used a 12-bit byte, so you 
>>can't even be sure you are talking about specific quantities of bits 
>>when specifying kilobytes/sec etc.
>
>My impression (I could be wrong) was that bytes were always eight bits,
>but that various architectures had different /word/ sizes.  The PDP-8
>had a 12-bit word size, for instance.
>
>donald



