The dawn of the integrated circuit

SAKSHAM SIDANA

Cue 1958. The Second World War is over and peacetime has been restored, thanks in part to computing machines far less powerful than the smartphones we use today. Computing has already seen significant advances, from the codebreaking machines that defeated Enigma all the way to ENIAC - the world’s first general-purpose electronic computer.

Groundbreaking as ENIAC was, it relied on more than 17,000 vacuum tubes (glass-bodied devices that carry electrons from a heated cathode to an anode), along with tens of thousands of other discrete components, all laboriously wired together by hand. As a result, the computer occupied an entire room, and the tubes were highly vulnerable to damage and required regular replacement.

Partly in response to these problems, the transistor was invented at Bell Labs in 1947 - a device that acts as an ‘on’/‘off’ switch when given the right electrical signal. Although this was an extraordinary breakthrough, it still didn’t solve the space problem. Components still had to be wired together individually, which took time and held back the performance of digital systems simply because of the sheer quantity of connections involved. Jack Morton, a Bell Labs executive, referred to this as the ‘tyranny of numbers’.

We’re now launched back to 1958, when Jack Kilby, an engineer at the renowned Texas Instruments, came up with a solution known as the integrated circuit (IC). By allowing a complex circuit to be housed on a single miniature chip, the invention removed the need for a multitude of wires to be connected individually. Soon afterwards, Robert Noyce of Fairchild Semiconductor iterated on the prototype that Kilby had patented, choosing silicon instead of the germanium Kilby had used because of its greater stability and higher melting point - Noyce’s version became known as the monolithic integrated circuit. Engineers had finally found a way to produce more powerful systems whilst saving energy, time, and money.

The Apollo missions of the 1960s were among the first real-world applications of ICs; microchips provided an economical, lightweight way of building the Apollo Guidance Computer (AGC). The success of the lunar landing in 1969 was not only historic for space exploration but also a perfect exemplar of the integrated circuit’s efficacy.

Little did Kilby and Noyce know that their invention - which would later earn Kilby the Nobel Prize in Physics - would become the backbone of the coming technological revolution. Microchips first found their way into hearing aids and soon spread to most digital devices. Gordon Moore, best known as a co-founder of Intel, observed that the number of transistors on a chip doubles roughly every 24 months - this became known as ‘Moore’s Law’, and though it is not a ‘law’ in any physical sense, it has remained a remarkably accurate prediction for the past 50 years.
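To get a sense of what that doubling rate implies, here is a rough back-of-the-envelope sketch in Python. The starting figure of 2,300 transistors (roughly the Intel 4004 of 1971) and the 50-year span are illustrative assumptions, not figures taken from this article:

    # Rough illustration of Moore's Law: one doubling every 24 months.
    # Starting count (~Intel 4004, 1971) and the 50-year span are assumptions.
    transistors_1971 = 2_300
    years = 50
    doublings = years / 2                      # one doubling every two years

    projected = transistors_1971 * 2 ** doublings
    print(f"{doublings:.0f} doublings -> about {projected:,.0f} transistors")
    # 25 doublings multiply the count by roughly 33 million, giving a figure
    # in the tens of billions - the same order of magnitude as the largest
    # processors manufactured today.

The fact that such a crude projection lands in the right ballpark is exactly why the prediction is said to have held for half a century.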

Furthermore, the newfound ability to package millions, and eventually billions, of components into a central processing unit (CPU) allowed computers to process instructions at record speeds. Correspondingly, mass production made microchips increasingly affordable. Suddenly, it dawned upon the world.

A far-fetched dream of affordable personal computing had now become a reality.

By the 1980s, developments in microchip technology had allowed the PC to become a common fixture in homes across the globe. Software firms were prospering, the Internet was about to revolutionise how we communicate, and an endless world of opportunity had suddenly opened up.

In effect, technology firms had begun their own ‘space race’: a competition to cram the most components into the smallest possible space.

This gave rise to the era of the flat-screen television, the smartphone, the tablet … all things we take for granted today. Beyond consumer electronics, the IC is also used in a plethora of industrial and healthcare applications, from intelligent drug-delivery systems to the adaptive cruise control in modern vehicles.

Looking forward, it is debatable whether Moore’s Law can keep up the same pace - after all, how physically small can a transistor get? Experts have proposed alternatives to silicon (such as carbon nanotubes), but for now, the future of the microchip remains an open question.

To conclude, the integrated circuit is a multifaceted invention that has revolutionised the way humans interact with both technology and each other; without many of us ever knowing of its existence, it has quietly shaped the way we conduct our lives. Period.