As quantum computing hits the public headlines, research into neuromorphic networks accelerates

By Martyn Warwick

Nov 30, 2021

  • With AI regarded as vital to the expansion of humanity’s capabilities, neuromorphic computing is seen as a prime enabler
  • Imitates the biology of the human brain and makes logical deductions like a human being
  • A necessary alternative to, but not a replacement for, von Neumann architecture
  • Technology poised for massive growth over the next four years

Last week the Biden administration in the US added Chinese quantum computing companies to the nation’s Entity List and coverage of the technology and its potential application to artificial intelligence (AI) quickly spread across traditional news media. What has yet to receive much, if any, popular public coverage though is the growing scientific debate about the relative merits of quantum computing and neuromorphic computing when it comes to future generations of AI. 

In neuromorphic computing, a computer imitates the biology of some of the neurons and synapses in the human brain via very-large-scale integration (VLSI) systems containing electronic circuits that mimic the neuro-biological architectures of the human nervous system. It uses algorithms to emulate the ways in which the brain understands and interacts with the surrounding world, producing computing power and capabilities considerably closer to those provided by our own cognitive abilities.

The astounding theoretical abilities of quantum computing, and the accumulating evidence from practical experiments, show that for some functions a quantum computer can run through tasks in minutes that would take years for an array of conventional computers to complete. However, one of the big problems with quantum computers is that they operate at temperatures close to absolute zero (around -270°C), a state that is very difficult to achieve and maintain under laboratory conditions, never mind in a commercial environment.

On the other hand, the human biological brain, operating at body temperature and consuming less than 20 watts of power (half that of a modern laptop), is an incredibly efficient processor and, in many respects and instances, can easily outperform a supercomputer. So scientists are working to develop a machine that can learn, store information, access it and use it to make logical deductions just as a human being can. Simultaneously, they are trying, as they go, to frame a theory of how the human brain itself works.

Neuromorphic computing approaches some of the functionality of the human brain via fabricated neural systems of "neurons" (the areas that process information) and "synapses" (the connections between those areas). Electrical signals control and modulate the pulses of electricity between the components.

Today’s neural networks and machine-learning systems work well with traditional algorithms and can be tuned either for low power consumption or for fast processing, but they cannot do both simultaneously. Neuromorphic systems can. Because they are massively parallel, they can deal with myriad instructions and tasks at the same time. They are also event-driven, reacting to changes in their environment so efficiently that only the sections of a neuromorphic computer that need power at any moment actually receive it. And they are fault-tolerant, because data is co-located in many places at once.
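The event-driven behaviour described above can be illustrated with a toy model. The sketch below is a heavily simplified leaky integrate-and-fire neuron, a standard textbook abstraction, not the design of any real neuromorphic chip: the neuron accumulates incoming current, its potential leaks away over time, and it only emits a spike (an "event") when the potential crosses a threshold. All the names and parameter values here are illustrative.

```python
class LIFNeuron:
    """A toy leaky integrate-and-fire neuron (illustrative simplification)."""

    def __init__(self, threshold=1.0, leak=0.9):
        self.threshold = threshold  # potential at which the neuron fires
        self.leak = leak            # per-step decay factor of the potential
        self.potential = 0.0

    def step(self, input_current=0.0):
        """Advance one time step; return True if the neuron spikes."""
        self.potential = self.potential * self.leak + input_current
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return True
        return False

# Feed a constant small current: the neuron stays silent until enough
# charge has accumulated, fires once, resets, and the cycle repeats.
neuron = LIFNeuron(threshold=1.0, leak=0.9)
spikes = [neuron.step(0.4) for _ in range(5)]
print(spikes)  # [False, False, True, False, False]
```

Downstream neurons in such a system only do work when a spike arrives, which is the source of the power efficiency the article describes: silence costs almost nothing.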

The von Neumann model will still work for CPU operations well into the future

Established von Neumann chip architecture comprises a memory unit (MU), a central processing unit (CPU) and the data pathways between them. Because a fetch command and an operation on data cannot be carried out at the same time, data has to be called for, worked on, sent back to memory and retrieved again, time and again, until a task is completed. However, the architecture has served us well for many decades and will continue to do so even as its limitations become more apparent.
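The fetch-operate-store cycle described above can be sketched in a few lines. This is a deliberately crude model, not a real machine simulation: a single shared pathway connects CPU and memory, and every operand must make a separate trip over it before the next fetch can begin. The counter makes the serial bottleneck visible.

```python
# Toy model of the von Neumann fetch/execute cycle (illustrative only).
memory = {"a": 3, "b": 4, "result": 0}
bus_traffic = 0  # counts trips over the single CPU <-> memory pathway


def fetch(addr):
    """Bring one value from memory to the CPU: one bus trip."""
    global bus_traffic
    bus_traffic += 1
    return memory[addr]


def store(addr, value):
    """Send one result from the CPU back to memory: one bus trip."""
    global bus_traffic
    bus_traffic += 1
    memory[addr] = value


# Even the trivial task "result = a + b" needs three serial transfers;
# in this model fetching and computing cannot overlap.
x = fetch("a")
y = fetch("b")
store("result", x + y)
print(memory["result"], bus_traffic)  # 7 3
```

A neuromorphic design sidesteps this queue by co-locating memory with processing, so there is no single pathway for every operand to cross.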

Nonetheless, because a neuromorphic chip holds data in many places, processing is much more powerful and efficient, with each neuron adapting its role to meet the requirements of the task in hand. That said, the chips are still a long way from being able truly to emulate the workings of the human brain, not least because the focus to date has been on developing the necessary hardware, while software development has taken something of a back seat. APIs and programming languages need to be designed and written if the technology is to be used in commercial, non-scientific settings, and that could be an expensive and time-consuming process.

Mooted roles for neuromorphic computing include the cloud environment and the network edge, autonomous vehicles, data analytics, real-time image processing, and even smart home devices. However, it is in the field of AI that neuromorphic computing is likely to have the most impact. A recent report by Research and Markets of Dublin, Ireland, says the sector will experience a compound annual growth rate (CAGR) of 89 per cent between the beginning of this year and 2026, and will be worth US$1.78 billion (and rising) per annum thereafter. This will be because of an increasing reliance on AI, which, to develop further, requires more and more computing power.
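For a sense of how aggressive an 89 per cent CAGR is, the quoted figures can be worked backwards. The arithmetic below is purely illustrative, using only the numbers in the report as cited here (89 per cent compounded over the five years from 2021 to 2026, ending at US$1.78 billion); the implied 2021 base is my own back-calculation, not a figure from the report.

```python
# Back out the implied starting market size from the quoted end point.
cagr = 0.89          # 89 per cent compound annual growth rate
end_value = 1.78e9   # US$1.78 billion by 2026, as quoted
years = 5            # 2021 through 2026

start_value = end_value / (1 + cagr) ** years
print(round(start_value / 1e6, 1))  # ≈ 73.8, i.e. roughly US$74 million in 2021
```

In other words, the forecast has the market multiplying about 24-fold in five years, which underlines how early-stage the sector still is.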

Ray Kurzweil, formerly of the Massachusetts Institute of Technology (MIT), computer scientist, inventor, futurologist and the man who has been described as “the true heir of Edison”, has long prophesied the “singularity”, the point at which humankind will merge with the technology it has created. He believes the “singularity” is inevitable and forecasts that machines with human-level intelligence will be in existence by the early 2030s. He said, "I realise that most inventions fail not because the R&D department can’t get them to work, but because the timing is wrong - ‌not all of the enabling factors are at play where they are needed. Inventing is a lot like surfing, you have to anticipate and catch the wave at just the right moment." Maybe for neuromorphic computing, surf’s up right now.

As a philosopher, Kurzweil says he remains agnostic about the existence of a human soul. Asked about the possibility of a divine intelligence being behind everything, he responded, “Does God exist? I would say, 'Not yet.’” Think about that, and about where the singularity might come in.
