During the decade leading up to 2020, high-quality internet access became available almost everywhere in the world – for those who could afford it. Mobile phones were built into clothing and projected sounds directly into the ears of their owners. The user was now interfaced with the Noos in a truly mobile sense.
Computers became "ambient" - sensitive and responsive to the presence of people - and then "ubiquitous" - completely integrated into commonplace objects and activities. Technology practically disappeared into our everyday surroundings.
During the previous 120 years or so, five world-changing leaps forward in computer technology had occurred in quick succession. From the electromechanical era that began in 1900, we moved to relay-based technology; from there to the vacuum tube; onwards to the transistor and, finally, thanks to a largely unknown radar scientist at the British Ministry of Defence by the name of Drummer, we had arrived by 1952 at the age of the integrated circuit. The basic building block for technological advances previously considered the realm of science fiction had arrived... and the world would never be the same again.
The purpose of Drummer’s integrated circuit was to cram as many transistors as possible onto a microchip, making the technology far smaller than had ever been thought possible - and capable of being mass-produced. From 1965, transistor densities, hard-drive capacities and the amount of information that could be transmitted along an optical fiber were all doubling approximately every two years.
By 1995, the ground-breaking chips of the day had more than nine million transistors. In 2001, a cutting-edge microprocessor had around 40 million. By 2015, each such processor contained more than 15 billion transistors. Computers that were the size of a room in the year 1900 were now the size of a walnut, and, quite literally, billions of times more powerful (than the room-sized computer, that is - not the walnut).
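The "doubling every two years" rule can be checked with a little arithmetic. A minimal sketch (the starting figure and years are taken from the text above; the function itself is purely illustrative):

```python
# Project transistor counts under the "doubling approximately every
# two years" rule, starting from ~9 million transistors in 1995.

def projected_transistors(start_year, start_count, target_year, period=2):
    """Return the projected count after doubling every `period` years."""
    doublings = (target_year - start_year) // period
    return start_count * 2 ** doublings

# Ten doublings between 1995 and 2015:
count_2015 = projected_transistors(1995, 9_000_000, 2015)
print(f"{count_2015:,}")  # 9,216,000,000 - the same order of magnitude
                          # as the 15 billion the chips of 2015 packed
```

The projection undershoots the 15 billion quoted above, which is consistent with the "approximately" in the doubling rule rather than a contradiction of it.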
In 2017, under instructions from the US military, IBM built the first supercomputer capable of performing more calculations per second than the average human brain. By 2020, mass-produced personal computers were at the same level.
The Turing Test, proposed by Alan Turing in 1950, required a human judge to question both a computer and a human and try to work out which was which. In 2018 a computer passed the test for the first time. Two years later, a computer wrote and successfully ran its own version of the Turing Test.
But then, in the year 2020, scientists hit a startling and somewhat unexpected brick wall: they realised that 150 billion transistors was, for reasons that remained largely unclear, the uppermost physical limit of an integrated circuit. Transistors had by now become so small that they operated at the atomic scale. There was, it seemed, nowhere left to go...
It was time for the sixth paradigm shift – three-dimensional molecular computing.
Instead of flat, "two-dimensional" microchips, two Russian computer scientists operating out of a poorly-funded and mostly volunteer research facility in Khabarovsk built the world’s first three-dimensional "micro cube" processor, operating at the atomic level. It was the much-anticipated (and, for many scientists, long-overdue) breakthrough into advanced nanotechnology.
Suddenly, across the world, technology exploded.