
The dawn of the integrated circuit
SAKSHAM SIDANA

Cue 1958. The Second World War is more than a decade in the past, won in part with the help of computers less powerful than the smartphone in your pocket. Computing has already seen significant advances, from the code-breaking machines that defeated Enigma all the way to ENIAC, the world's first general-purpose electronic computer.

Groundbreaking as ENIAC was, it was built from a huge number of discrete components, including over 17,000 vacuum tubes (glass devices that pass electrons from a heated cathode to an anode), all laboriously wired together by hand. As a result, the computer occupied a whole room, and the tubes were highly vulnerable to damage and required regular replacement.

The transistor, invented in 1947, offered a way out: a solid-state component that acts as an 'on'/'off' switch when given the right signal. Although this was an extraordinary breakthrough, it still didn't solve the space problem. Components still had to be wired together individually, which consumed time and held back the performance of digital systems simply through the sheer quantity of connections involved. Jack Morton, a Bell Labs executive, referred to this as the 'tyranny of numbers'.
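The idea that a switch can compute is easy to sketch in code. The gate construction below is a standard textbook illustration, not something described in the article: each transistor is modelled as a boolean switch, and wiring switches together yields logic gates.

```python
# Toy model of transistors as on/off switches (booleans) wired into gates.
# This is an illustrative sketch, not a description of any real circuit.

def nand(a: bool, b: bool) -> bool:
    # Two switches in series pull the output low only when both are 'on'.
    return not (a and b)

def and_gate(a: bool, b: bool) -> bool:
    # An inverter is a NAND with its inputs tied together,
    # so NAND followed by an inverter gives AND.
    return nand(nand(a, b), nand(a, b))

for a in (False, True):
    for b in (False, True):
        print(f"{a!r:5} AND {b!r:5} -> {and_gate(a, b)}")
```

Every digital device mentioned in this article, from the AGC to a modern smartphone, is ultimately built by repeating this trick billions of times.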

We're now launched back to 1958, when Jack Kilby, an engineer at the renowned Texas Instruments, came up with a solution known as the integrated circuit (IC). By allowing a complex circuit to be housed on a miniature chip, the invention overcame the need for a multitude of cables to be individually connected. Soon afterwards, Robert Noyce of Fairchild Semiconductor iterated on the prototype Kilby had patented, choosing silicon over the germanium Kilby had used because of its greater stability and higher melting point; this version became known as the monolithic integrated circuit. Engineers had finally found a way to produce more powerful systems whilst saving energy, time, and money.

The Apollo missions of the 1960s were among the first real-world applications of ICs: microchips provided an economical and lightweight way to build the Apollo Guidance Computer (AGC). The success of the 1969 lunar landing was not only historic for space exploration but also the perfect exemplar of the efficacy of the integrated circuit.

Little did Kilby and Noyce know that their invention, which would later earn Kilby a Nobel Prize, would become the backbone of the coming technological revolution. Microchips first found their way into hearing aids and soon spread into most digital devices. Gordon Moore, renowned for co-founding Intel, observed that the number of transistors on a chip doubles roughly every 24 months. This became known as 'Moore's Law', and though it was never technically a 'law', it has remained a remarkably accurate prediction for over fifty years.
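Moore's observation is simple compound-doubling arithmetic. The sketch below is purely illustrative; the starting figure (the Intel 4004's roughly 2,300 transistors in 1971) is a well-known historical data point used here as an assumption, not something cited in this article.

```python
# Illustrative sketch of Moore's Law: transistor counts doubling every
# 24 months. Starting point (~2,300 transistors, Intel 4004, 1971) is a
# commonly quoted historical figure, used here purely for illustration.

def transistors(start_count, start_year, target_year, doubling_years=2):
    """Project a transistor count assuming one doubling per `doubling_years`."""
    doublings = (target_year - start_year) / doubling_years
    return start_count * 2 ** doublings

# 50 years = 25 doublings: about 77 billion transistors,
# in the right ballpark for today's largest chips.
print(round(transistors(2_300, 1971, 2021)))
```

Exponential growth is the whole story here: twenty-five doublings turn a few thousand transistors into tens of billions.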

Furthermore, the newfound ability to package millions, and later billions, of components into a central processing unit (CPU) allowed computers to execute instructions at record speeds. Correspondingly, mass production made microchips increasingly affordable. Suddenly, it dawned upon the world.

A far-fetched dream of affordable personal computing had become a reality.

By the 1980s, developments in microchip technology had allowed the PC to become a common article in homes across the globe. Software firms were prospering, the Internet was about to revolutionise how we communicate, and an endless world of opportunity had suddenly opened up.

In other words, technology firms had now begun their own ‘space race’: a competition to cram the most components into the smallest possible dimensions.

This gave rise to the era of the flat-screen television, the smartphone, the tablet … all things we take for granted today. Beyond consumer gadgets, the IC is also used in a plethora of industrial and healthcare applications, from intelligent drug delivery systems to adaptive cruise control in modern vehicles.

Looking forward, it is debatable whether Moore's Law can hold its pace. After all, how physically small can a transistor get? Experts have proposed silicon alternatives (such as carbon nanotubes), but for now, the future of the microchip remains a grey area.

To conclude, the integrated circuit is a multi-faceted invention that has revolutionised the way humans interact with both technology and each other; without many even knowing of its existence, it has quietly shaped the way we conduct our lives.

