Our review of the history of computers so far (in the first two articles of this series on their origins) has taken us from their inception almost five thousand years ago to the first generation of mechanical and electronic computers. In today's review, we string together the key inventions and technological advances that brought us to the very heart of the contemporary computer of the modern digital age. The computers described in this review of their origins differed so much in appearance that it would be hard to draw visible parallels between them by external observation alone.
Their purpose, however, was the same: all of these computers were refined with a single goal – to serve humanity in its progress, to help it solve complex mathematical problems and puzzles, and thus to contribute to a more advanced society. Today's modern computers have added an extra dimension to this: global networking, information sharing, socialising, and personal entertainment.
A decisive turning point in computer science, and the emergence of the modern computer, was made possible by the extraordinary development of electronics and electrical engineering. With the invention of vacuum tubes and electronic circuits, first-generation computers were created, and further advances in electronic technology launched second-generation computers into orbit. What invention do we have to thank for this new leap towards modern computing?
The invention of the transistor paves the way for second-generation computers.
Vacuum tubes brought a fresh wind into the computer story, but computers needed a great many of them to operate, and these tubes generated so much heat that they warmed the machines up like ovens. There was an urgent need for something better. This is where the semiconductor enters the computer scene: transistors far outperform vacuum tubes. They are much smaller, only a few millimetres in size, consume far less energy and therefore produce very little heat, and they are more reliable and have a longer service life. After 1955, transistors began to be built into computers, and such computers are now commonly called second-generation computers.
The transistor concept itself dates back to 1925, when Julius Edgar Lilienfeld first proposed it. The first working transistor was assembled by John Bardeen, Walter Brattain, and William Bradford Shockley at Bell Labs in New Jersey in 1947. But the transistor's path did not end there; it took several more years before it became reliable enough to be installed in computers.
In 1954, IBM introduced its first commercial computers with built-in transistors.
MOSFET transistors triggered a new computer revolution.
At Bell Labs, researchers wanted to improve the already excellent properties of the transistor. They succeeded in 1959, when Mohamed Atalla and Dawon Kahng invented the metal–oxide–semiconductor field-effect transistor, or MOSFET. This was the first truly miniature transistor that could be mass-produced for a wide range of uses. It operated at very low power, low voltage, and low current, which made it significantly more energy-efficient and far easier to integrate onto a silicon wafer (why this feature is crucial will be explained below).
With MOSFET transistors, the door to one of the most important discoveries in computer technology stood wide open – a discovery that led to third-generation computers.
The third generation of computers opens new doors – the “culprit” is the integrated circuit.
What exactly is an integrated circuit, and what did it mean for computer development? Put simply, an integrated circuit is a large number of miniature transistors packed onto a single silicon wafer (with a set of output wires) – also called a chip. Jack Kilby invented the integrated circuit in 1958 (he received the Nobel Prize in Physics for it in 2000), and soon afterwards Robert Noyce perfected it.
The first computer based on silicon integrated circuits guided astronauts to the Moon: it was built specifically for the Apollo space programme.
The invention of the integrated circuit fundamentally shook up and changed computers: they became faster, smaller, more powerful, and cheaper. Unlike their predecessors, these third-generation computers already had a keyboard and a screen. And we all know where this is going, right?
Chips, or integrated circuits, were the fundamental step of the 1970s, from which we climbed faster and faster towards smaller and more mobile computers, and so travelled ever deeper into the digital society that has significantly changed our lives, both business and personal. And how little time we had to prepare for it!
Modern computers have flooded the world – with microprocessors at their heart.
Technology, of course, was not satisfied with what had been achieved. Development rushed forward, and with a new refinement of the integrated circuit it sealed our destiny and turned a new page in human history. The microprocessor arrived: for the first time in history, engineers managed to place an enormous number of transistors – eventually millions – on a single integrated circuit, a single chip.
With the microprocessor, all the logical and arithmetic operations, memory, program code, and communication channels – previously spread across several interconnected processors within the computer – ended up on one tiny chip. In 1971, the first commercially available microprocessor came to market: the Intel 4004. Alongside powerful business computers and supercomputers, the mass production of computers for personal use began, and Apple and IBM personal computers entered the market.
The invention of the microprocessor brought fourth-generation computers onto the scene. These computers became small enough to sit on a desk, and we still use them today in countless improved forms (laptops, smartphones, tablets). Personal computers did not just change technically; with the spread of the internet in the 1990s, they also found an extraordinary place in the life of modern humanity.
They have become our window onto the world, our connection with fellow human beings, our personal allies, available day and night, our personal advisers and, in a way, even our friends. We can no longer imagine everyday life without digitalisation, as it has brought us so much – it has handed us the whole world on a plate. It will be an essential tool for new generations, through which they will anchor new knowledge and show humanity new paths.
They say the fifth generation of computers is already on the horizon, waiting to be developed: advanced microelectronic technologies that will enable parallel data processing and much faster computing, along with nanotechnology, artificial intelligence, and quantum computing. So the story of computer history is by no means over…
This article is part of a joint project of the Wilfried Martens Centre for European Studies and the Anton Korošec Institute (INAK), “Following the path of digitalisation in Slovenia and Europe”. This project receives funding from the European Parliament. The information and views set out in this article are those of the author and do not necessarily reflect the official opinion of the European Union institutions, the Wilfried Martens Centre for European Studies, or the Anton Korošec Institute. The organisations mentioned above assume no responsibility for the facts or opinions expressed in this article or for any subsequent use of the information contained therein.