Once upon a time, computers were humans hired to perform computations by hand for any number of tasks. As early as 1757, a team led by the mathematician Alexis Clairaut made parallel calculations to predict when the comet named for the astronomer Edmund Halley would return.

They got it right.

Even though NASA was an early adopter of electronic computers, it still retained hundreds of women to check the machines' work, a story chronicled in the book and film Hidden Figures. In the first half of the last century, engines were designed to effect physical action. Advances were constrained by the physical laws of thermodynamics, friction, inertia, and gravity.

Today, logic engines etched on silicon chips don't produce physical action; they are designed instead to manipulate ideas. Gains are emerging in each of three features relevant to the future performance of these silicon engines: (1) raw computational power (calculations per second), (2) the logic they use to operate (the mathematical structure of the software), and (3) the kinds of materials available to fabricate the engines themselves. The intersection of contemporaneous advances in all three has produced a phase change in silicon engines. In 1965, Intel co-founder Gordon Moore predicted, based on his knowledge of silicon wafers, that silicon's capacity for ever speedier calculation would double every two years, and so it has. That is not a law of physics, but it has proven an accurate prediction by a computer genius.
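To feel the force of that two-year doubling, it helps to run the compounding out over decades. Here is a minimal sketch in Python; the 1965 start year and the starting count of 2,000 transistors are illustrative assumptions chosen only to show the shape of the curve, not figures from Moore's paper:

```python
# A minimal sketch of Moore's-law compounding: a quantity that doubles
# every two years grows by a factor of 2**(n/2) after n years.
# The 1965 start year and starting count are illustrative assumptions.

def projected_count(start_count: float, start_year: int, year: int,
                    doubling_period: float = 2.0) -> float:
    """Count after compounding one doubling per doubling_period years."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

for year in (1965, 1985, 2005, 2025):
    print(year, f"{projected_count(2_000, 1965, year):,.0f}")
```

Sixty years of two-year doublings is thirty doublings, a factor of about a billion, which is why the curve looks flat for decades and then vertical.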

The famous physicist Richard Feynman anticipated the exponential growth of computing capacity in his 1959 lecture "There's Plenty of Room at the Bottom." That is proving true as we approach the manipulation of individual atoms. Chip manufacturers now fabricate over 50 billion microprocessors a year, and calculations per second have increased a trillionfold.

The implications for applications are akin to the difference between a spreadsheet and a voice command, or between calculating and inferring. The migration beyond pure digital logic is as consequential as the migration, a half century ago, from slide rules (one hung from my belt throughout college) to digital microprocessors.

In 1986, Geoffrey Hinton, a psychologist and computer scientist, envisioned artificial intelligence, or AI, but it was 20 years before processing chips became powerful enough to carry out his vision of machine learning.

Now, though still computationally demanding, AI enables the reading of CT scans, the understanding of voice commands, the navigation of robots, and the analysis of patterns in biological functions. The history of progress in silicon manufacturing has been similarly materials-centric. It began with the invention of a process to fabricate something that didn't exist in nature: semiconductor-grade pure silicon.

What followed was a nonstop search for elements, and combinations of them, that could advance computational speed by shrinking the dimensions of the switches. A computer circa 1980 used a couple dozen elements from the periodic table; today's machines use over 70. That expansion was largely in pursuit of ways to keep collapsing the size of transistors without introducing noise, while still allowing them to function at high speed. There is an ultimate physical limit to shrinking those dimensions regardless of materials, but engineers have a long way to go.

Since 1990, the raw, even simplistic metric of transistor density has increased some 10,000-fold. But the world's fastest supercomputer, brought online in 2020, clocked in at three million times more powerful than the top machine of 1990.
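Those two figures tell different stories once each growth factor is converted into an implied doubling time over the same 30 years. A quick sketch of the arithmetic, using only the figures above:

```python
import math

def doubling_time(factor: float, years: float) -> float:
    """Years per doubling implied by a total growth factor over a span."""
    return years / math.log2(factor)

# Figures from the text: both spans run 1990-2020, i.e., 30 years.
print(f"transistor density (10,000x): doubles every {doubling_time(1e4, 30):.1f} years")
print(f"supercomputer speed (3,000,000x): doubles every {doubling_time(3e6, 30):.1f} years")
```

The gap between a roughly 2.3-year and a roughly 1.4-year doubling time reflects gains in architecture and software compounding on top of raw transistor counts.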

In the mainframe era of the 1960s, every dollar spent on hardware bought the capability to perform about one calculation per second. By the year 2000, a single dollar bought 10,000 calculations per second. Today, a single dollar buys 10 billion calculations per second, and that capacity is rentable at any time, from anywhere. Few of us can get our heads around such advances, but they explain how nearly everything can be improved with computation. For example, my errors in typing this essay are commonly corrected before I notice them.
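Put differently, the cost of a calculation collapsed by ten orders of magnitude. A small sketch of that arithmetic; the anchor years standing in for "the 1960s" and "today" are assumptions chosen only to make the rates computable:

```python
import math

# Calculations per second bought by one dollar, per the figures above.
# The anchor years 1965 and 2020 are illustrative assumptions.
calcs_per_dollar = {1965: 1.0, 2000: 1e4, 2020: 1e10}

years = sorted(calcs_per_dollar)
for a, b in zip(years, years[1:]):
    factor = calcs_per_dollar[b] / calcs_per_dollar[a]
    rate = (b - a) / math.log2(factor)  # implied doubling time in years
    print(f"{a}-{b}: {factor:,.0f}x per dollar, doubling every {rate:.1f} years")
```

Notably, the price-performance curve implied by these figures steepened after 2000, doubling roughly every year rather than every two and a half.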

Today, billions of smartphone users take for granted computers with AI-driven natural-language voice recognition and voice-controlled machines from TVs to toasters. AI systems well beyond Watson-class now populate data centers in the cloud. And we no longer hear so much about AI turning on us; those were headline-grabbing scare stories from people who do not really understand computers.

So far, the use of AI has been dominated by making search smarter, providing advice on things like consumer purchasing decisions, or hyper-personalizing ads to match buyers and sellers. The still largely untapped deployment of AI will involve far more difficult computational problems and will facilitate more important tasks: real-time control of manufacturing and transportation systems, collaborative research with scientists, collaborative medical diagnostics with clinicians, and much more.

The future of ambient computing is epitomized by the smartwatch. While there are a myriad of smartwatch makers, Apple again defined the category when it introduced its watch in 2015. Global sales of watches of all kinds total just over one billion units a year. Within just five years of introduction, smartwatches captured nearly 10% of that market, and Apple alone now sells more watches than all the Swiss watchmakers combined.

It took 30 years to shrink a room-sized mainframe down to a desktop. Another 30 years would pass before that much computing power collapsed into a wristwatch connected to the cloud, through which it can access more computing in a single 10-square-foot server rack than existed on the planet in 1980. Long before another thirty years pass, the compute-communicate-display features of computing, again linked to the Cloud, will shrink enough in size to disappear inside, rather than outright replace, so many everyday things.

The future is indeed bright, and it will get brighter when society shrugs off efforts to control it with alarmist climate claims and coercive vaccine programs.

Note: Portions of this essay are excerpted from the outstanding book The Cloud Revolution, with permission of the author, Mark Mills, and the publisher, Encounter Books.