Neuromorphic Computing: The Scope and Possibilities of a New Era in Computing

Brains are far ahead of the competition as far as classical computing systems go. Apart from being small, lightweight and amazingly adaptable, they consume very little energy, and they are set to be the model for the next wave of advanced computing.

These brain-inspired designs are known collectively as 'neuromorphic computing'. Even the most advanced computers come nowhere near the human brain, or most mammal brains for that matter, but our grey matter can give engineers and developers some tips on how to make computing infrastructure more efficient, by mimicking the brain's synapses and neurons.

First, the biology. Neurons are nerve cells, and they act as the cabling that carries messages from one part of the body to another. Those messages are passed from one neuron to the next until they reach the exact part of the body where they can produce an effect, by making us aware of pain, moving a muscle, or forming a sentence, for example. Neurons pass messages to each other across a gap called a synapse. Once a neuron has received enough input to trigger it, it passes a chemical or electrical impulse, known as an action potential, on to the next neuron, or onto another cell such as a muscle or gland.

Next, the tech. Neuromorphic computing software seeks to recreate these action potentials through spiking neural networks (SNNs). SNNs are made up of neurons that signal to other neurons by generating their own action potentials, conveying information as they go. The strength and timing of the messages cause the neurons to remap the connections between them, allowing the SNN to 'learn' as inputs change, much as the brain does.
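The spiking behaviour described above can be sketched in a few lines of code with a leaky integrate-and-fire (LIF) neuron, the simplest widely used SNN neuron model. This is a minimal illustration with made-up parameter values, not the model of any particular neuromorphic platform:

```python
def simulate_lif(input_current, threshold=1.0, leak=0.9, reset=0.0):
    """Simulate a single leaky integrate-and-fire (LIF) neuron.

    The membrane potential accumulates input, leaks a little each
    time step, and emits a spike (an action potential) whenever it
    crosses the threshold, after which it resets.
    """
    potential = 0.0
    spikes = []
    for current in input_current:
        potential = leak * potential + current  # integrate input, with leak
        if potential >= threshold:
            spikes.append(1)    # fire an action potential
            potential = reset   # reset the membrane after spiking
        else:
            spikes.append(0)
    return spikes

# A steady sub-threshold input eventually drives the neuron over
# threshold, producing a regular spike train.
print(simulate_lif([0.4] * 10))  # -> [0, 0, 1, 0, 0, 1, 0, 0, 1, 0]
```

Note that information sits in *when* the spikes occur, not just in their count, which is what lets SNNs encode timing the way biological neurons do.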

Neuromorphic architecture with level-tuned neurons. The internal state of a primary neuron is used to enable a set of level-tuned neurons. Credit: Pantazi et al. ©2016, IOP Publishing


On the hardware side, neuromorphic chips are a fundamental departure from the CPUs and GPUs used in most computing hardware today. Traditional architectures have been creaking for some time now, with manufacturers finding it harder and harder to cram more transistors onto a single chip as they run up against the limits of physics, power consumption and heat generation. At the same time, we're generating more and more data and consuming more and more computing power, meaning that the super-adaptable, super-powerful, super-low-energy computer in our heads is starting to look increasingly interesting as a technology model.

Neuromorphic computing has its roots in computing systems developed in the late 1980s that were designed to model the workings of animal nervous systems. Since then, the field has been gathering pace, to the extent that many of the biggest tech names, including Intel and IBM, have already produced neuromorphic hardware.

Recently, a team of engineers at Penn State University in the US has been attempting to pioneer a type of computing that mimics the efficiency of the brain's neural networks while exploiting the brain's analogue nature. Like the synapses connecting neurons in the brain, which can be reconfigured, the artificial neural networks the team is building can be reconfigured by applying a brief electric field to a sheet of graphene, the one-atom-thick layer of carbon atoms. In this work, they demonstrate at least 16 possible memory states, as opposed to the two in most oxide-based memristors, or memory resistors.
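To see why 16 states matter, consider what happens when a continuous synaptic weight has to be stored in a device with only a handful of conductance levels. The toy sketch below (a rough illustration only; the real device physics is far richer) snaps a weight to the nearest available state:

```python
def quantize_weight(w, levels):
    """Snap a synaptic weight in [0, 1] to the nearest of `levels`
    evenly spaced conductance states a memristive device can store."""
    w = min(max(w, 0.0), 1.0)       # clamp to the device's range
    step = 1.0 / (levels - 1)       # spacing between adjacent states
    return round(round(w / step) * step, 4)

# A binary (2-state) memristor loses almost all precision...
print(quantize_weight(0.62, levels=2))   # -> 1.0
# ...while 16 states (roughly 4 bits) preserve much more of it.
print(quantize_weight(0.62, levels=16))  # -> 0.6
```

More states per device means higher-precision synaptic weights without adding more devices, which is exactly the advantage the paper's title, "high precision neuromorphic computing", refers to.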

Saptarshi Das, the team leader and Penn State professor of engineering science and mechanics, said, "We have powerful computers, no doubt; the problem is you have to store the memory in one place and do the computing in another place." The shuttling of data from memory to logic and back again takes a great deal of energy and slows the speed of computing. This computer architecture also requires a lot of space. If computation and memory storage could be located in the same place, this bottleneck could be eliminated.
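The idea of co-locating memory and compute can be illustrated with the classic memristor-crossbar trick: store a weight matrix as device conductances and let the physics do the multiply-accumulate in place, with no data shuttling. The following is a schematic toy in ordinary Python, purely to show the principle, not a model of the Penn State device:

```python
def crossbar_mac(conductances, voltages):
    """Toy model of an analogue memristor crossbar.

    Each stored conductance G[i][j] multiplies its input voltage V[j]
    (Ohm's law), and the resulting currents along each output row sum
    together (Kirchhoff's current law). The matrix-vector product thus
    happens *where the weights are stored*, instead of moving data
    between separate memory and logic units.
    """
    return [sum(g * v for g, v in zip(row, voltages))
            for row in conductances]

weights = [[0.25, 0.50],    # conductance states programmed into the array
           [0.50, 0.75]]
inputs = [1.0, 2.0]         # input voltages applied to the columns
print(crossbar_mac(weights, inputs))  # -> [1.25, 2.0]
```

In a real crossbar this whole computation is a single analogue read operation, which is why in-memory computing removes the memory-to-logic bottleneck Das describes.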

The team believes that scaling this technology up to a commercially viable level is possible. With many of the biggest semiconductor companies actively pursuing neuromorphic computing, Das believes they will find this work of interest.

Neuromorphic computing could also help spawn a new wave of artificial intelligence (AI) applications. Current AI is mostly narrow: it is developed by learning from stored data, developing and refining algorithms until they reliably match a particular outcome. Using neuromorphic tech's brain-like strategies, however, could allow AI to take on new tasks. Because neuromorphic systems can work like the human brain, able to cope with uncertainty, adapt, and use messy, confusing data from the real world, they could lay the foundations for AI to become more general.

For neuromorphic computing to make a more substantial impact, a variety of changes will need to happen across the broader tech industry. Sensor technologies, for instance, aren't set up in a way that works well with neuromorphic systems, and may need to be redesigned so that data can be delivered in a form that neuromorphic chips can process.

What's more, it's not just the hardware that has to change; it's people too. As Mike Davies, director of Intel's Neuromorphic Computing Lab, says, "These are kinds of functions that conventional computing isn't so efficient at, so we were looking for new architectures that can provide breakthroughs, and that's where we really need a true partnership with neuroscientists, and with a new, open-minded breed of machine-learning data scientists, to think about rethinking computation in this way."

While the hardware is comparatively mature, one of the challenges facing the field lies in the basic software programming models and in algorithmic maturity. Beyond that, neuromorphic computing could lead to a much more integrated, collaborative technology industry, one where computing becomes an end-to-end system design problem. Greater collaboration with neuroscientists seems likely, because the brain still has much more to tell us about what computing could do better, particularly around algorithms.

We can expect that someday, given the huge pace of research behind it, neuromorphic computing may even improve upon, or replace, whole areas of artificial intelligence, and spark a new wave of industrial revolution itself.

Reference: Thomas F. Schranghamer, Aaryan Oberoi, Saptarshi Das. Graphene memristive synapses for high precision neuromorphic computing. Nature Communications, 2020; 11(1). DOI: 10.1038/s41467-020-19203-z

Thumbnail credit: (created by) MF3d | Getty Images/iStockphoto