The Singularity
The digital era has been dubbed the “Second Machine Age” and hailed as ushering in an economic transformation that will see brilliant technologies deliver deep and wide prosperity.
However, there are understandable fears that this revolution will destroy more jobs than it will create. Only time will demonstrate whether such fears are justified.
Over the last 12 months, however, attention has focused on a small but influential group of people at the leading edge of the digital revolution who believe that recent advances in technology raise issues of grave and global concern.
This small but high-profile camp of digital doubters gained a new recruit last week when Andrew Haldane, Chief Economist of the Bank of England, joined its ranks.
Digital doubters can be loosely described as public intellectuals who believe the transition from a physical to a digital economy, populated with machines operating with artificial intelligence (AI), poses a serious and even existential threat to humanity.
Putting it bluntly, the concern is that machines, equipped with artificial intelligence and processing power greater than the human brain, will take over.
You can’t control the demon
Three prominent proponents who have gone public with their concerns during the last 12 months are the internationally respected British scientist Stephen Hawking; the entrepreneur, engineer and inventor Elon Musk, who co-founded the financial payments platform PayPal and is also bankrolling the Tesla electric car project; and Microsoft founder Bill Gates.
Hawking, in a December appearance on the BBC, said: “The development of full artificial intelligence could spell the end of the human race.” According to Hawking, success in creating artificial intelligence would be the biggest event in human history.
“Unfortunately, it might also be the last,” he warned.
Elon Musk spoke at a Massachusetts Institute of Technology AeroAstro symposium in October. He told his audience that if he had to guess our biggest existential threat, he would nominate artificial intelligence, ranking it more dangerous and more probable than nuclear war.
“With artificial intelligence we are summoning the demon,” Musk said. “You know those stories where there’s the guy with the pentagram and the holy water. He’s sure he can control the demon. Doesn’t work out!”
Bill Gates shared his concerns when he appeared as a guest on “Ask Me Anything”, a Q and A session on Reddit.
“I am in the camp that is concerned about artificial intelligence,” Gates told the session. “First, the machines will do a lot of jobs for us and not be super intelligent. That should be positive if we manage it well. A few decades after that though artificial intelligence is strong enough to be a concern. I don’t understand why some people are not concerned.”
Growing, fast and slow
Andrew Haldane outlined his views in a speech at the University of East Anglia entitled ‘Growing, fast and slow.’
The speech, as he described it, is “a cocktail of economics, history, sociology and psychology.”
That’s a fair description. But the impact of growth - be it negative or positive - finds its way into every nook and cranny of human activity.
In neo-classical economic theory, technology and innovation are the drivers of growth. It is Haldane’s contention that growth is not a function of innovation alone: other factors, loosely grouped under the heading of sociological forces, are also necessary.
Nor is it always the case that innovation is benign in either its appearance or impact.
Up until 1750 and the arrival of the Industrial Revolution, growth was virtually non-existent. For three millennia prior to the Industrial Revolution, growth per head averaged only 0.01 per cent a year. Global living standards were essentially flat.
As Haldane describes: “Since 1750, it has taken around 50 years for living standards to double. Prior to 1750, it would have taken 6000 years.”
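The arithmetic behind those doubling times follows from compound growth. A minimal sketch (the function name and figures below are illustrative, not from the speech) confirms that growth of 0.01 per cent a year implies a doubling time of the same order as Haldane’s 6,000 years, while a 50-year doubling corresponds to annual growth of roughly 1.4 per cent:

```python
import math

def doubling_time(annual_growth_rate):
    """Years for output to double at a constant annual growth rate."""
    return math.log(2) / math.log(1 + annual_growth_rate)

# Pre-1750: growth per head averaged roughly 0.01 per cent (0.0001) a year.
pre_1750_years = doubling_time(0.0001)   # about 6,900 years

# Post-1750: a 50-year doubling implies this annual growth rate.
post_1750_rate = 2 ** (1 / 50) - 1       # about 1.4 per cent a year
```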
The second transformative revolution, the era of mass industrialization, started in the second half of the 19th century and the third is the IT revolution which began in the second half of the 20th century.
A common denominator of these three revolutions is they took quite some time, usually decades, for their growth potential to take off.
Even with Moore’s Law operating, it is only recently that the digital revolution may have reached critical velocity. There has clearly been a significant step up in the commercialization of technologies such as robotics, genetics, 3D printing, Big Data and the ‘internet of things’.
The new industrial revolution
Haldane quotes what he calls a beautiful metaphor fashioned by economist Brian Arthur, describing the transition now under way from a physical economy to a digital one.
Arthur likens earlier industrial revolutions to the body developing its physiological or muscular system. By analogy, the economic system was defining and refining its motor skills, largely through investment in physical capital.
The digital revolution is different. It is akin to the economic body developing its neurological or sensory system, defining and refining its cognitive skills through investment in intellectual capital.
The success of the recent wave of transformative technologies is built on their creating a neural – brain-like – network of connections.
The “internet of things” uses multiple sensors (like a brain’s neurons) connected through the web (like a brain’s synapses) to create, in effect, a machine brain.
It is that brain-like wiring which has given rise to thinking as well as doing – the move from artificial intelligence (AI) to artificial general intelligence (AGI).
We have entered the fourth industrial revolution.
Haldane points out that Moore’s Law means the processing power of the machine brain is ever rising.
This, he says, leads to the intriguing possibility that, at some point in the future, the processing power of the machine brain could exceed the human brain.
Back in 1958, John von Neumann, regarded by many as the father of modern computing, called such an event the “singularity”.
As we see with Hawking, Musk and Gates, there is growing speculation about its likelihood.
Computer processing speeds have grown at an exponential rate. Even so, they still fall well short of the brain’s computational capacity.
But, if Moore’s Law remains operational, the gap will close.
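How quickly that gap could close is simple to sketch. The figures below are hypothetical, chosen purely for illustration: if machine processing power trailed the brain’s by a factor of 1,000 and doubled every two years on the classic Moore’s Law cadence, parity would arrive in roughly two decades.

```python
import math

# Hypothetical figures for illustration only: a 1,000-fold shortfall
# and a doubling of processing power every two years.
shortfall = 1_000
doubling_period_years = 2

# Number of doublings needed, times the length of each doubling period.
years_to_parity = math.log2(shortfall) * doubling_period_years
# log2(1000) is just under 10 doublings, so roughly 20 years
```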
As Haldane told his audience: “Were the singularity to be reached, the sky becomes the limit innovation-wise.
Machine rather than man then becomes the mother of all invention.
With exponential rises in processing power, the economy could become super-intelligent and super efficient.”
“This would indeed be a fourth industrial revolution. But unlike its predecessors, and almost by definition, it would be near impossible for the human brain to imagine where the machine brain would take innovation and hence growth next.
The world would be one of blissful ignorance.”
Blissful? I doubt it.