intel CPU history 

intel has made a fuck ton of CPUs. do you know how many? lots of them. every new process requires new factories and all kinds of waste. but did you know there's a DARK secret? ok maybe not DARK or a secret, but something really worth digging into. so let's look at the history here

intel CPU history 

1971 intel releases their first microprocessor, the 4004. it's a 4-bit CPU max clocked around 740 KILOHERTZ. it has about 2300 transistors and 16 pins. you'd use it with ROM, DRAM and shift registers. this bad boi could do some math shit, hit up 16 registers (though you could only store like one decimal digit in 4 bits) and address 640 BYTES of RAM. it ran at 15 volts for some reason and used 5v logic levels. this shit was fucking DOPE

intel CPU history 

1972 calls. intel released the 8008, a fucking BYTE-oriented CPU. it was clocked a bit slower but it was a step up: instead of calculators this was intended to drive a CRT terminal. but instead of 16 registers it had 7, with the program counter and call stack living in a separate little stack of on-chip registers. yes, the call stack went in registers on the chip, not in memory. the 4004 had a 3 level call stack, the 8008 had 7. it had 18 pins, with 8 of them being the data bus (the 14-bit addresses got multiplexed over those same pins), and it supported 48 instructions

intel CPU history 

separate chips would be used to actually handle things like mapping memory, bank switching, IO registers and stuff. this is the kind of architecture you would have in a retrocomputer, with many discrete chips each handling a separate task, while a modern 'system on a chip' packs all of these onto one die. this saves cost and power and also allows faster computers: electrical signals can only travel so fast, so the closer together the modules are, the better

intel CPU history 

what we'd call modern x86 'CPUs' are actually systems on chips like that but with a TON of shit packed in, including what used to be the northbridge/southbridge chips. another more pressing reason for this kind of integration is pin count. each chip needs power pins, data pins, clock pins, etc. this still adds up, to the point that we have 'system on module' and 'system in package' designs, which are just a bunch of chips on one little board or in one package, with the useful pins brought out on a connector

intel CPU history 

anyway it's now 1974. intel released the 4040 which was a straight upgrade from the 4004. you get 8 more registers, INTERRUPTS and a larger subroutine stack. interrupts are a key feature in CPUs: they let devices make the CPU jump to certain sections of code, 'interrupt handlers'. this is a blessing and a curse because on one hand interrupts are necessary but on the other they're hard to handle. let me explain *pulls up whiteboard*

intel CPU history 

say you have a program that needs to check email. it can check it when it has free time, or get a notification (an interrupt) and decide to do something about it. the main problem here is that when the CPU gets an interrupt, it jumps to a different code address and that's 'all'. the programmer has to write code to save all the current task's state and be very careful about which resources it touches. and then what if you get an interrupt during that?
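
here's a rough sketch of that email idea in C. everything here is made up for illustration (there's no real 'mail device', and real handlers get hooked up through interrupt vectors or compiler-specific attributes), but the shape is the classic one: the handler does the bare minimum and the main loop picks up the actual work later.

#include <stdint.h>

/* set by the interrupt handler, read by the main loop. volatile because
   it changes 'behind the program's back'. */
volatile uint8_t mail_waiting = 0;

/* hypothetical interrupt handler: the hardware makes the CPU jump here
   when the (imaginary) mail device raises its interrupt line. on old
   CPUs you had to save any registers you touched yourself; modern
   compilers can generate that save/restore for you. */
void mail_interrupt_handler(void)
{
    /* do as little as possible: note that mail arrived, then return so
       the interrupted code can carry on exactly where it left off */
    mail_waiting = 1;
}

int main(void)
{
    for (;;) {
        if (mail_waiting) {
            mail_waiting = 0;
            /* the slow part (actually reading the mail) happens here,
               outside the handler */
        }
        /* ...do other work... */
    }
}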

intel CPU history 

you effectively have a multitasking CPU that will switch tasks, and as such you need to program as if your code is running in a concurrent environment. you have to lock resources that are in use and deal with things like critical sections. this is the price of being able to be responsive to things like mouse moves and hard drives being ready with data
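
a critical section, sketched in C: disable_interrupts() and enable_interrupts() here are stand-ins i made up for whatever your CPU or OS actually provides (on the 8080 and friends they'd be the DI and EI instructions).

#include <stdint.h>

/* placeholders for the real mechanism on your CPU or OS */
extern void disable_interrupts(void);
extern void enable_interrupts(void);

/* data shared between normal code and an interrupt handler */
volatile uint32_t bytes_received = 0;

void record_bytes(uint32_t n)
{
    /* if an interrupt fired halfway through this read-modify-write, the
       handler could see (or clobber) a half-updated value. so keep
       interrupts off for the few instructions where the shared data is
       inconsistent, and turn them back on as soon as possible. */
    disable_interrupts();
    bytes_received += n;
    enable_interrupts();
}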

intel CPU history 

ANYWAY it's still 1974 and intel releases the legendary 8080 CPU. it's an 8-bit CPU with a 16-bit address bus. 40 pins, 16 for addressing, 8 for data. this thing did away with the on-chip call stack (the stack now lives in memory, via a stack pointer) and left you with a painful 7 8-bit general purpose registers, 6 of which can be used in pairs to make 16-bit addresses. it also supports IO 'ports' separate from the memory address space, so you don't need extra chips to carve your peripherals out of the precious memory map
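
the port idea survives in x86 to this day, so here's a sketch of it using x86 inline assembly (gcc/clang only, and you'd need IO privileges or to be the OS to actually run it). the port number-free wrappers are mine and the memory address below is made up, it's just to contrast port IO with memory-mapped IO.

#include <stdint.h>

/* OUT writes a byte to the separate IO space, not to memory */
static inline void port_write(uint16_t port, uint8_t value)
{
    __asm__ volatile ("outb %0, %1" : : "a"(value), "Nd"(port));
}

/* IN reads a byte back from the IO space */
static inline uint8_t port_read(uint16_t port)
{
    uint8_t value;
    __asm__ volatile ("inb %1, %0" : "=a"(value) : "Nd"(port));
    return value;
}

/* memory-mapped IO, for contrast, is just a read or write through a
   pointer to a magic address that the hardware decodes */
void memory_mapped_example(void)
{
    volatile uint8_t *status_reg = (volatile uint8_t *)0x8000; /* made-up address */
    uint8_t status = *status_reg;
    (void)status;
}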

intel CPU history 

running at 2MHz instead of the 8008's 500kHz, this thing is what armchair historians like me collectively call 'hot shit' and luddites call 'ground zero'. this chip went on to be used in the first really successful microcomputer: the Altair 8800. then it (or a clone of it) was used in various 8-bit home microcomputers during the 70s and 80s. this shit was what they call 'cash', to the point intel decided compatibility with the 8080 was important from then to now

intel CPU history 

where do you go from here? by releasing the 8085 in 1976. what's the difference? it runs off a single 5v supply, instead of the 8080's weird mix of +12v, +5v and -5v. you know what else happened that year? zilog released the z80, which is what most home microcomputers used. this shit had more useful instructions, a secondary set of registers handy for servicing interrupts, and extra good stuff. this CPU is still produced to this day and used in calculators and shit

intel CPU history 

1978: with zilog effectively taking the 8080 and saying 'this is mine now', intel had to think fast and improve. naturally intel added more bits and made the 8086! it's 16-bit! it was intended as a stop-gap until they could work out a better architecture. narrator: they didn't

so you have 8 16-bit CPU registers, but 4 of them can also be accessed as pairs of 8-bit registers. it also borrowed the z80's idea of having index registers for addressing.
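
here's a toy model in C of that 'one register, two halves' thing, with AX splitting into AH and AL. it's only an illustration (you can't actually poke CPU registers like this), and the struct layout assumes a little-endian machine, which x86 is.

#include <stdint.h>
#include <stdio.h>

typedef union {
    uint16_t x;        /* the full 16-bit register, e.g. AX */
    struct {
        uint8_t l;     /* low byte,  e.g. AL */
        uint8_t h;     /* high byte, e.g. AH */
    } half;
} reg16;

int main(void)
{
    reg16 ax;
    ax.x = 0x1234;
    ax.half.l = 0xFF;  /* writing AL only touches the bottom 8 bits */
    printf("AX=%04X AH=%02X AL=%02X\n", ax.x, ax.half.h, ax.half.l);
    return 0;
}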

intel CPU history 

the 8086 was used by the IBM PC to give us ... PC shit i guess? DOS? that's super cool. but i have a slight bone to pick: memory segmentation. see, 16-bit memory addresses only go up to 64k. not great right? so x86 used two registers, a segment and an offset, to get access to up to 1 megabyte of memory. you're probably wondering how two 16-bit registers only get you 1 megabyte when 32 bits of address gets a 32-bit system 4GB of memory. well, intel made a tradeoff
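
the tradeoff, sketched in C: the 8086 doesn't glue the two registers together into one 32-bit address, it shifts the segment left by 4 bits and adds the offset. that gives 20-bit physical addresses, hence 1 megabyte, plus lots of different segment:offset pairs that name the same byte. (the address formula is the documented 8086 behaviour; the little program around it is just mine.)

#include <stdint.h>
#include <stdio.h>

/* 8086 real-mode address formation: physical = segment * 16 + offset */
static uint32_t physical_address(uint16_t segment, uint16_t offset)
{
    return ((uint32_t)segment << 4) + (uint32_t)offset;
}

int main(void)
{
    /* two different segment:offset pairs hitting the same physical byte */
    printf("%05X\n", (unsigned)physical_address(0xB800, 0x0000)); /* B8000 */
    printf("%05X\n", (unsigned)physical_address(0xB000, 0x8000)); /* B8000 */
    /* FFFF:000F lands on FFFFF, the top of the 1MB range (bigger
       offsets would just wrap around on a real 8086) */
    printf("%05X\n", (unsigned)physical_address(0xFFFF, 0x000F));
    return 0;
}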

intel CPU history 

@jookia they used the 8088 in the first PC, but the differences are minor to your overall point.
