As we have moved through the 20th century and into the 21st, and computer processing power has increased exponentially, popular culture has been progressively gripped by the what-if scenario of computing machines becoming sentient. Fictional examples run from saviours to annihilators, but all characterizations hinge on one assumption: that computing machines can become sophisticated enough to accurately replicate the processes of the human brain, and therefore spring to ineffable internal life. Some thinkers, including Stephen Hawking, see the transition happening so soon that they have started publicly warning us away from developing this “Strong A.I.”
But over at Psychology Today, cognitive neuroscientist Bobby Azarian argues that strong A.I. — that is, the kind that we puny humans need to worry about taking over the world and wiping us out — is a non-starter: what he describes as a “myth.” And it has everything to do with what plagued Jill Watson in the tale we related last week: the gulf between the ability to replicate, which computers can do quite well, and the ability to understand.
This gulf, it has been proposed, is technical in origin, rooted in the binary decision-making process that is the deepest foundational building block of even the most sophisticated computing. As Azarian puts it:
“[A] strict symbol-processing machine can never be a symbol-understanding machine. […]
[Physicist and public intellectual Richard] Feynman described the computer as ‘A glorified, high-class, very fast but stupid filing system,’ managed by an infinitely stupid file clerk (the central processing unit) who blindly follows instructions (the software program). Here the clerk has no concept of anything—not even single letters or numbers. In a famous lecture on computer heuristics, Feynman expressed his grave doubts regarding the possibility of truly intelligent machines, stating that, ‘Nobody knows what we do or how to define a series of steps which correspond to something abstract like thinking.’
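Feynman’s “infinitely stupid file clerk” can be made concrete with a toy sketch (our own illustration, not from Azarian’s piece): a loop that blindly executes instructions on symbols. The clerk copies, swaps, and compares tokens flawlessly, yet nothing in the loop involves meaning; the symbols could be words, numbers, or gibberish and the machinery would run identically.

```python
# Illustrative sketch of Feynman's "file clerk": a processor that follows
# instructions on symbols it has no concept of. All names here are invented
# for illustration.

def file_clerk(program, tape):
    """Blindly execute a list of (operation, *args) instructions on the tape."""
    for op, *args in program:
        if op == "COPY":       # duplicate a symbol from one slot to another
            src, dst = args
            tape[dst] = tape[src]
        elif op == "SWAP":     # exchange the contents of two slots
            a, b = args
            tape[a], tape[b] = tape[b], tape[a]
        elif op == "SAME?":    # write a marker recording whether two slots match
            a, b, dst = args
            tape[dst] = "YES" if tape[a] == tape[b] else "NO"
    return tape

# The clerk "answers" a question without understanding it:
tape = {"q1": "cat", "q2": "cat", "answer": None}
program = [("SAME?", "q1", "q2", "answer")]
print(file_clerk(program, tape)["answer"])  # prints YES
```

The point of the sketch is that every step is pure symbol shuffling: swap `"cat"` for any other token and the clerk behaves the same, because nowhere does the program represent what a symbol *is*, only where it sits.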
While a powerful computer might one day replicate the physical processes of the human brain exactly in virtual space, something would still be missing to bring it into full, self-aware life of its own. Scientists speculate that this missing part may be electrochemical, unique to the organic machines that we call our brains. Which means there is hope for humanity’s survival yet!