Brain-Inspired Computing Needs A Master Plan


Introduction

With demand for computing power rising, the energy consumption of modern computing systems is rising too, making it hard to sustain the future development of artificial intelligence technologies. This energy problem largely stems from the classic von Neumann architecture of traditional digital computing systems, in which data processing and storage occur in different locations. In contrast, the human brain performs processing and storage in the same place, massively in parallel. Inspired by biology, brain-inspired computing and neuromorphic systems can process unstructured data, perform image recognition, classify noisy and uncertain datasets, and help build better learning and inference systems, fundamentally changing how we process signals and data in terms of both energy efficiency and handling real-world uncertainty. Yet attention and investment in neuromorphic computing lag far behind digital AI and quantum technologies. How can we unlock its potential? A recent article published in Nature argues that brain-inspired computing urgently needs a master plan.

Research Areas: Brain-Inspired Computing, Neuroscience, Artificial Intelligence


A. Mehonic & A. J. Kenyon | Authors

Ren Kana | Translation

JawDrin | Proofreading

Deng Yixue | Editing


Paper Title:

Brain-inspired computing needs a master plan

Paper Link:

https://www.nature.com/articles/s41586-021-04362-w

Abstract

Brain-inspired computing (or neuromorphic computing) paints a promising vision: processing information in fundamentally different ways with extremely high energy efficiency, handling the ever-growing volumes of unstructured and noisy data that humans generate. To realize this vision, we urgently need a master plan that coordinates different research groups and provides support in funding, research focus, and more. We walked this road yesterday with digital technology; we are walking it today with quantum technology; will we do the same tomorrow with brain-inspired computing?

Background

Modern computing systems consume too much energy to serve as sustainable platforms for the widespread development of artificial intelligence. We have focused on speed, accuracy, and the number of parallel operations per second while neglecting sustainability, especially for cloud-based systems. The habit of instant information retrieval makes us forget the energy and environmental costs of providing it. For example, the data centers that support services such as Google searches consume about 200 terawatt-hours of energy annually, a figure expected to grow by roughly an order of magnitude by 2030. Similarly, high-end AI systems such as DeepMind's AlphaGo and AlphaGo Zero, which can defeat human experts at complex strategy games, require thousands of parallel processing units, each consuming about 200 watts, far more than the roughly 20 watts consumed by the human brain.

While not all data-intensive computation requires AI and deep learning, we should still worry about the environmental cost of deep learning's increasingly widespread application. We should also reduce the energy consumption of applications that do not require compute-intensive deep learning, such as the Internet of Things (IoT) and autonomous robotic agents: an IoT built on countless devices becomes impractical if their energy demands are too high. Analyses show that the surging demand for computing power far outstrips the improvements delivered by Moore's Law; the demand for compute now doubles in about two months. Through intelligent architecture and hardware-software co-design, we are making significant progress. For example, the efficiency of NVIDIA's graphics processing units (GPUs) has improved 317-fold since 2012, far outpacing Moore's Law, although over the same period processor power consumption has risen from 25 W to 320 W. The striking performance gains achieved by systems still in research and development suggest we can do better. Unfortunately, traditional computing solutions alone cannot meet long-term needs. This shortfall is especially evident in the astonishingly high training costs of the most complex deep learning models. We therefore need new pathways.
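As a back-of-envelope illustration of the doubling times quoted above, the following sketch compares cumulative growth under a 24-month and a 2-month doubling time; the six-year window is an arbitrary illustrative choice, not a figure from the article.

```python
# Compare cumulative growth in compute demand under a 24-month doubling
# time (the Moore's-law era) and the ~2-month doubling time cited for
# the post-2012 era. The 6-year window is illustrative.

def growth_factor(months: float, doubling_time_months: float) -> float:
    """Multiplicative growth after `months`, given a doubling time."""
    return 2.0 ** (months / doubling_time_months)

years = 6
months = 12 * years
moore_era = growth_factor(months, 24.0)   # doubles every 24 months
post_2012 = growth_factor(months, 2.0)    # doubles every ~2 months

print(f"After {years} years: ~{moore_era:.0f}x (Moore era) "
      f"vs ~{post_2012:.1e}x (post-2012 demand)")
```

Over six years, an 8-fold increase becomes a factor of roughly 7 × 10^10, which is why hardware efficiency gains alone cannot keep pace with demand.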


Figure 1: The rapid growth in demand for computing power. (a) Growth in demand for computing power over the past 40 years, measured in petaFLOPS-days: until 2012, demand doubled every 24 months, in line with Moore's Law; recently the doubling time has shortened to about 2 months. Colors denote different application areas. (b) Improvement in AI hardware efficiency over the past five years: state-of-the-art solutions have improved computing efficiency more than 300-fold, and solutions still in research and development are expected to improve it further. (c) The rising cost of training AI models since 2011.

The energy problem largely stems from the classic von Neumann architecture of digital computing systems, in which data processing and storage occur in different locations, forcing the processor to spend most of its time and energy moving data. Fortunately, biology offers a new pathway: performing storage and processing in the same place, encoding information in completely different ways, or manipulating signals directly and exploiting massive parallelism, as elaborated in Box 1. Indeed, energy efficiency and advanced functionality can coexist; our brain is proof. Of course, much remains to be discovered about how the brain achieves both. Our goal is not merely to mimic biological systems but to find pathways inspired by the significant advances in neuroscience and computational neuroscience over recent decades. Our understanding of the brain is already sufficient to inspire us.

Biological Inspiration

In biology, data storage is not separate from data processing; the human brain, composed chiefly of neurons and synapses, performs both functions in a massively parallel and adaptive structure. The human brain contains on average 10^11 neurons and 10^15 synapses and consumes about 20 watts; by contrast, a digital simulation of an artificial neural network of the same size consumes 7,900 kilowatts. This difference of roughly six orders of magnitude is a formidable challenge. The brain processes noisy signals directly and efficiently, in sharp contrast to traditional computer systems, whose signal-to-data conversion and high-precision computation consume substantial time and energy. Even the most powerful digital supercomputers cannot match the brain. Brain-inspired (or neuromorphic) computing systems are therefore expected to fundamentally change how we process signals and data, both in energy efficiency and in handling real-world uncertainty.
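A quick sanity check of the figures quoted above. The 1 Hz average synaptic event rate used at the end is an illustrative assumption, not a number from the article.

```python
import math

# Back-of-envelope check of the quoted figures: the brain (~1e11 neurons,
# ~1e15 synapses) runs on ~20 W, while a digital simulation of a
# same-sized network is quoted at ~7,900 kW.

brain_power_w = 20.0
digital_power_w = 7.9e6      # 7,900 kW
synapses = 1e15

ratio = digital_power_w / brain_power_w
print(f"power ratio ~{ratio:.2e} (~{math.log10(ratio):.1f} orders of magnitude)")

# Assuming (purely for illustration) an average synaptic event rate of
# 1 Hz, the brain's energy budget per synaptic event:
energy_per_event_j = brain_power_w / (synapses * 1.0)
print(f"~{energy_per_event_j * 1e15:.0f} fJ per synaptic event")
```

The femtojoule-scale energy per synaptic event is the benchmark that neuromorphic devices aspire to approach.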

Of course, the idea has a traceable history. The term "neuromorphic", coined by Carver Mead at the California Institute of Technology in the late 1980s, describes devices and systems that mimic some functions of biological neural systems. The inspiration came from decades of work modelling the nervous system as equivalent circuits and building analogue electronic devices and systems that provide similar functionality.

Box 1

What are we talking about when we say neuromorphic systems?

Inspiration from the brain lets us process information in completely different ways from existing conventional computing systems. Different brain-inspired (neuromorphic) computing platforms combine methods foreign to von Neumann computers: analogue data processing, asynchronous communication, massively parallel processing, or spiking neural networks, among others.

The term “neuromorphic” encompasses at least three general research groups, distinguishable by their research goals: simulating neural functions (reverse engineering the brain), simulating neural networks (developing new computational methods), and designing new electronic devices.

(1) Simulating Neural Functions

Neuromorphic engineering studies how the brain uses the physical properties of biological structures such as synapses and neurons to perform computation. Neuromorphic engineers exploit the physics of analogue electronics (carrier tunnelling, charge retention on silicon floating gates, and the exponential dependence of various device and material properties on applied fields) to emulate the functions of biological neurons and synapses, defining the basic operations needed to process audio, video, or smart-sensor signals. More detail can be found in reference [41].

(2) Simulating Neural Networks

Brain-inspired computing seeks new data-processing methods from a biological perspective; it can also be understood as the computational science of neuromorphic systems. Research in this direction focuses on simulating the structure and function of biological neural networks, completing storage and processing in the same place as the brain does, or on entirely different computational methods that use voltage spikes to mimic the action potentials of biological systems.
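As an illustration of the spike-based computation described above, here is a minimal leaky integrate-and-fire (LIF) neuron, a standard simplified neuron model; the parameter values are illustrative and not taken from the article.

```python
import numpy as np

# A minimal leaky integrate-and-fire (LIF) neuron: the membrane voltage
# leaks toward rest, integrates input current, and emits a spike (then
# resets) when it crosses a threshold. Parameters are illustrative.

def simulate_lif(input_current, dt=1e-3, tau=20e-3, v_rest=0.0,
                 v_thresh=1.0, v_reset=0.0):
    """Integrate a current trace; return the voltage trace and spike times."""
    v = v_rest
    voltages, spikes = [], []
    for step, i_in in enumerate(input_current):
        # Leaky integration: decay toward rest, charge with input.
        v += (-(v - v_rest) + i_in) * (dt / tau)
        if v >= v_thresh:            # threshold crossing: spike and reset
            spikes.append(step * dt)
            v = v_reset
        voltages.append(v)
    return np.array(voltages), spikes

# Constant suprathreshold input produces a regular spike train.
current = np.full(200, 1.5)          # 200 ms of constant drive
v_trace, spike_times = simulate_lif(current)
print(f"{len(spike_times)} spikes in 200 ms")
```

Information is carried in the timing of the spikes rather than in high-precision numbers, which is what makes this style of computation so cheap in energy.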

(3) Designing New Electronic Devices

Of course, a skilled craftsman cannot work without materials. We need the devices and materials to support the realization of biomimetic functions. Recently developed electronic and photonic components with customizable features can help us mimic biological structures such as synapses and neurons. These neuromorphic tools provide exciting new technologies that can expand the capabilities of neuromorphic engineering and computing.

Among these new components, the most important are memristors, whose resistance depends on the history of the current that has flowed through them. The complex dynamic electrical response of memristors means they can serve as digital memory elements, variable weights in artificial synapses, cognitive processing elements, optical sensors, and devices that emulate biological neurons. They may reproduce some functions of real dendrites, and their dynamic responses can produce brain-like oscillatory behavior, operating at the edge of chaos. They can be linked with biological neurons within a single system and can perform these functions with minimal energy.
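To make the history-dependent resistance concrete, here is a sketch of the linear ion-drift memristor model (in the spirit of the HP memristor); all parameter values, including the large time step, are illustrative choices rather than figures from the article.

```python
import numpy as np

# Linear ion-drift memristor model: the device is a series combination
# of a doped (low-resistance) and an undoped (high-resistance) region;
# current moves the boundary between them, so resistance depends on the
# history of the charge that has flowed through the device.

R_ON, R_OFF = 100.0, 16e3   # fully doped / undoped resistances (ohms)
D = 10e-9                   # device thickness (m)
MU_V = 1e-14                # dopant mobility (m^2 s^-1 V^-1)

def simulate_memristor(voltage, dt=1e-3, w0=0.1):
    """Drive the device with a voltage trace; return the resistance trace."""
    w = w0 * D              # position of the doped/undoped boundary
    resistances = []
    for v in voltage:
        x = w / D
        r = R_ON * x + R_OFF * (1 - x)   # series combination of both regions
        i = v / r
        resistances.append(r)
        # Linear drift: the boundary moves in proportion to the current.
        w += MU_V * (R_ON / D) * i * dt
        w = min(max(w, 0.0), D)          # boundary stays inside the device
    return np.array(resistances)

# A positive pulse lowers the resistance; the change persists at zero bias.
pulse = np.concatenate([np.full(1000, 1.0), np.zeros(500)])
r_trace = simulate_memristor(pulse)
print(f"R before: {r_trace[0]:.0f} ohm, after pulse: {r_trace[-1]:.0f} ohm")
```

The persistence of the new resistance at zero bias is exactly the property that lets a memristor act as a non-volatile synaptic weight.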

A note on data. We use the term "data" to describe information encoded in analogue signals or the physical responses of sensors, as well as the more conventional digital information of purely symbolic computation. When we speak of the brain "processing data", we mean a complete set of signal-processing tasks that does not rely on the traditional notion of digitizing signals. Imagine brain-inspired computing systems operating at levels ranging from analogue signal processing to processing massive databases: in the former case we can avoid generating large datasets in the first place; in the latter we can greatly improve processing efficiency while escaping the constraints of the von Neumann model. There are good reasons why a digital representation of signals is necessary in many scenarios: high precision, reliability, and determinism. However, the digital abstraction discards most of the information in the underlying transistor physics, retaining only a minimal, quantized unit: the bit. In doing so we pay an enormous energy cost, trading efficiency for reliability. Since AI applications are inherently probabilistic, we must ask whether this trade still makes sense. The computational tasks supporting AI applications on traditional von Neumann computers are highly compute-intensive and therefore energy-hungry, whereas spike-based analogue or hybrid systems may perform similar tasks with high energy efficiency. Advances in AI systems and the emergence of new devices have thus rekindled interest in neuromorphic computing. These new devices offer exciting ways to emulate some capabilities of biological neural systems, as detailed in Box 1.

Definitions of "neuromorphic" vary. In short, it is a story about hardware: brain-inspired chips aim to integrate and exploit various useful features of the brain, including in-memory computing, spike-based information processing, fine-grained parallelism, signal processing resilient to noise and randomness, adaptability, hardware-based learning, asynchronous communication, and analogue processing. There is no consensus on how many of these capabilities a system needs to qualify as neuromorphic, but it is clearly a form of AI implemented differently from mainstream computing systems. We should not get lost in terminology, however, but focus on whether the methods work.
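Of the features listed above, in-memory computing is perhaps the easiest to illustrate: in a memristive crossbar, weights stored as conductances multiply an input voltage vector via Ohm's and Kirchhoff's laws, with no data shuttled between memory and processor. A toy simulation, with all values illustrative:

```python
import numpy as np

# In-memory matrix-vector multiplication in a crossbar: weights are
# stored as conductances G, inputs are applied as voltages V, and each
# output line sums the currents of its devices (Ohm + Kirchhoff), so the
# analogue array computes I = G @ V in place.

rng = np.random.default_rng(0)

G = rng.uniform(1e-6, 1e-3, size=(4, 8))   # conductance matrix (siemens)
V = rng.uniform(0.0, 0.2, size=8)          # input read voltages (volts)

I = G @ V                                   # output line currents (amperes)

# The same product computed "digitally", element by element, for comparison:
I_digital = np.array([sum(g * v for g, v in zip(row, V)) for row in G])

assert np.allclose(I, I_digital)
print("crossbar output currents (A):", I)
```

In hardware, the multiply-accumulate happens in the physics of the array itself, which is why this architecture sidesteps the von Neumann data-movement cost described earlier.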

Current neuromorphic approaches range between the poles of deconstruction and construction: deconstruction attempts to reverse-engineer the deep connections between the brain's structure and function, while construction seeks inspiration in our limited understanding of the brain. On the deconstruction side, perhaps the most important effort is the Human Brain Project (HBP), a high-profile and ambitious ten-year initiative funded by the EU since 2013. The HBP adopted two existing brain-inspired computing platforms, SpiNNaker at the University of Manchester and BrainScaleS at Heidelberg University, and continues to develop them as open-access platforms. Both implement highly complex silicon models of brain structure to help us better understand how the biological brain works. Construction is arduous too, with many teams using brain-inspired methods to enhance the performance of digital or analogue electronic devices. Figure 2 summarizes existing neuromorphic chips, classifying them into four categories according to their place on the analysis-synthesis axis and their technology platform. More importantly, neuromorphic engineering is not only for advanced cognitive systems; it can also deliver energy, speed, and security benefits in small edge devices with limited cognitive capability (at the very least by removing the need for continuous communication with the cloud).


Figure 2. The landscape of brain-inspired chips. Mainstream brain-inspired chips fall into four categories according to their role in deconstruction-construction (analysis-synthesis) research and their technology platform: chips that simulate biological systems versus chips for brain-inspired computing, each further subdivided into digital CMOS implementations with novel architectures (for example, digital spiking rather than analogue voltages) and implementations using analogue circuits. Wherever they fall, these chips possess at least some of the characteristics listed on the right, setting them apart from conventional CMOS chips. A superscript "a" marks chips that use memristors.

References related to brain-inspired chips:

  • Neurogrid: Benjamin, B. V. et al. Neurogrid: a mixed-analog–digital multichip system for large-scale neural simulations. Proc. IEEE 102, 699–716 (2014).
  • BrainScaleS: Schmitt, S. et al. Neuromorphic hardware in the loop: training a deep spiking network on the BrainScaleS wafer-scale system. In 2017 Intl Joint Conf. Neural Networks (IJCNN) https://doi.org/10.1109/ijcnn.2017.7966125 (IEEE, 2017).
  • DVS: Lichtsteiner, P., Posch, C. & Delbruck, T. A 128 × 128 120 dB 15 μs latency asynchronous temporal contrast vision sensor. IEEE J. Solid-State Circuits 43, 566–576 (2008).
  • DYNAP: Moradi, S., Qiao, N., Stefanini, F. & Indiveri, G. A scalable multicore architecture with heterogeneous memory structures for dynamic neuromorphic asynchronous processors (DYNAPs). IEEE Trans. Biomed. Circuits Syst. 12, 106–122 (2018).
  • DYNAP-SEL: Thakur, C. S. et al. Large-scale neuromorphic spiking array processors: a quest to mimic the brain. Front. Neurosci. 12, 891 (2018).
  • ROLLS: Qiao, N. et al. A reconfigurable on-line learning spiking neuromorphic processor comprising 256 neurons and 128K synapses. Front. Neurosci. 9, 141 (2015).
  • Spirit: Valentian, A. et al. in 2019 IEEE Intl Electron Devices Meeting (IEDM) 14.3.1–14.3.4 https://doi.org/10.1109/IEDM19573.2019.8993431 (IEEE, 2019).
  • ReASOn: Resistive Array of Synapses with ONline Learning (ReASOn) Developed by NeuRAM3 Project https://cordis.europa.eu/project/id/687299/reporting (2021).
  • DeepSouth: Wang, R. et al. Neuromorphic hardware architecture using the neural engineering framework for pattern recognition. IEEE Trans. Biomed. Circuits Syst. 11, 574–584 (2017).
  • SpiNNaker: Furber, S. B., Galluppi, F., Temple, S. & Plana, L. A. The SpiNNaker Project. Proc. IEEE 102, 652–665 (2014). An example of a large-scale neuromorphic system as a model for the brain.
  • IBM TrueNorth: Merolla, P. A. et al. A million spiking-neuron integrated circuit with a scalable communication network and interface. Science 345, 668–673 (2014).
  • Intel Loihi: Davies, M. et al. Loihi: a neuromorphic manycore processor with on-chip learning. IEEE Micro 38, 82–99 (2018).
  • Tianjic: Pei, J. et al. Towards artificial general intelligence with hybrid Tianjic chip architecture. Nature 572, 106–111 (2019).
  • ODIN: Frenkel, C., Lefebvre, M., Legat, J.-D. & Bol, D. A 0.086-mm2 12.7-pJ/SOP 64k-synapse 256-neuron online-learning digital spiking neuromorphic processor in 28-nm CMOS. IEEE Trans. Biomed. Circuits Syst. 13, 145–158 (2018).
  • Intel SNN chip: Chen, G. K., Kumar, R., Sumbul, H. E., Knag, P. C. & Krishnamurthy, R. K. A 4096-neuron 1M-synapse 3.8-pJ/SOP spiking neural network with on-chip STDP learning and sparse weights in 10-nm FinFET CMOS. IEEE J. Solid-State Circuits 54, 992–1002 (2019).

Future Outlook

We do not mean that neuromorphic systems will (or should) replace traditional computing platforms. Rather, digital computing should be retained where precision is required, while neuromorphic systems process unstructured data, perform image recognition, classify noisy and uncertain datasets, and help build better learning and inference systems; in autonomous and IoT systems, for instance, they can save more energy than traditional approaches. Quantum computing is also part of the future landscape. Even on the most optimistic estimates its practical application is still some way off, but it will undoubtedly revolutionize many computational tasks. However, IoT smart sensors, edge computing devices, and autonomous robotic systems that do not rely on the cloud cannot exist without low-power computing components capable of handling uncertain and noisy data. Readers may imagine a future in which digital, brain-inspired, and quantum systems work together, each playing its role and complementing the others' strengths.
Brain-inspired computing is a broad interdisciplinary field, just as the development of semiconductor microelectronics drew on many disciplines, including solid-state physics, electronic engineering, computer science, and materials science. Physicists, chemists, engineers, computer scientists, biologists, and neuroscientists all need to take part. The road is long and challenging; even getting researchers from different disciplines to understand one another's terminology is difficult, as this very study attests. As work linking computer science (especially AI) and neuroscience (initially computational neuroscience), it took the people in the "same room" considerable time and effort to ensure that everyone understood terms and concepts in the same way. After all, many concepts in today's advanced AI systems, though they do not require complete biological fidelity, trace back to the neuroscience of the 1970s and 1980s. We must embrace other disciplines, recognizing that many advances in AI and neuroscience were driven jointly by innovations in fields such as materials science, nanotechnology, and electronic engineering. We must also recognize that traditional CMOS technology may not be the best substrate for implementing brain-inspired algorithms efficiently, so comprehensive innovation is needed. Interdisciplinary exchange from the outset can prevent teams from repeating failures in directions already explored, or from achieving nothing at all.
Moreover, we should not overlook the challenges of integrating new neuromorphic technologies at the system level. Besides developing brain-inspired algorithms and tools, a more pressing issue is how to replace existing mainstream AI systems with functionally equivalent neuromorphic alternatives. This further emphasizes the need for a comprehensive approach to brain-inspired computing.
Despite this potential, there are as yet no commercially successful cases of brain-inspired technology to point to. Existing systems and platforms serve mainly as research tools, as does quantum computing, which is a longer-term vision. But this status quo should not impede the development of brain-inspired computing; the demand for low-power computing systems is urgent. Moreover, we are close to the goal, because all the additional functionality flows from the fundamentally different computing approach that brain-inspired computing provides. A commercial ecosystem will inevitably emerge.

Seizing Opportunities

If neuromorphic computing is necessary, how can it be realized? Technical requirements come first. Gathering different research groups is necessary but not sufficient; incentives, opportunities, and infrastructure are also needed. The neuromorphic community remains small: it enjoys neither the attention lavished on quantum computing nor the commercial weight of the semiconductor industry. Projects around the world have begun to assemble the necessary expertise, and momentum is building, yet investment in neuromorphic research is far smaller than in digital AI or quantum technology. Given the maturity of digital semiconductor technology this is not surprising, but it is a missed opportunity. There are several examples of medium-scale neuromorphic research and development investment, such as the brain-inspired work at IBM's AI Hardware Center (including the TrueNorth chip in Figure 2), the development of Intel's Loihi processor, and the U.S. BRAIN Initiative, but current investment falls far short of what such technologies deserve, given their potential to disrupt digital AI.
The neuromorphic community is thriving and growing, but it inevitably lacks a common focus. Conferences, symposiums, and journals continually summarize and update our understanding of the field, yet much remains unfinished, such as persuading funding agencies and governments of the field's importance.
The time is ripe for significant action in the neuromorphic and brain-inspired fields. At the national level, governments should foster collaboration among industry, academia, and research institutes to create mission-oriented research centres that advance neuromorphic technologies. This strategy has proven effective in fields such as quantum technology and nanotechnology, the U.S. National Nanotechnology Initiative being an excellent example. Such centres can be physical or virtual institutions, but they must bring together the best researchers from different fields. These researchers need approaches that differ from traditional information technology, where each layer of abstraction (materials, devices, circuits, systems, algorithms, and applications) is supported by a different community. We need comprehensive, synchronized design across the whole community. It is not enough, for instance, for electronic engineers to consult computational neuroscientists only before designing a system; engineers and neuroscientists should collaborate throughout the process so that principles from each field are embedded in the hardware, and in the work built on it, as fully as possible. All of this underscores that interdisciplinary co-creation is the top priority. Research centres should therefore actively attract a broad range of researchers.
Beyond material and financial foundations, a cohort of researchers accustomed to collaboration is critical. Electronic engineers rarely encounter ideas from neuroscience, and vice versa. While electronic engineers and physicists may know something of the biological properties of neurons and synapses, they are unlikely to understand cutting-edge computational neuroscience. A worthwhile approach is to establish master's programmes and doctoral training projects to cultivate neuromorphic engineers. The UK research councils sponsor Centres for Doctoral Training (CDTs) in areas requiring focused attention. CDTs can be independent institutions or collaborative entities, and building complementary teams across institutions greatly benefits interdisciplinary work. Such programmes often work closely with industry, producing high-quality researchers in ways conventional doctoral training cannot match. This is an experience worth emulating for the emerging neuromorphic engineering community: it would promote interaction among communities and could produce the next generation of researchers and research leaders. Pioneering examples include the University of Groningen's Cognitive Systems and Materials research programme, the Technical University of Munich's master's programme in Neuroengineering, ETH Zurich's course on analogue circuit design for neuromorphic engineering, Stanford University's course on large-scale neural models, and the course on neuromorphic vision systems at the Instituto de Microelectrónica de Sevilla.
International cooperation is also urgently needed: as in any research field, collaboration among the best minds across borders is crucial to success. In interdisciplinary research like neuromorphic computing, international research networks and projects have a clear role to play. Early examples include the European NEUROTECH consortium, focused on neuromorphic computing technologies, and the Chua Memristor Centre at TU Dresden, which gathers leading researchers in materials, devices, and algorithms. Here, as elsewhere, it is better to think ahead than to play catch-up.
So how can we attract government support? Committing to high-efficiency bio-inspired computing is part of delivering large-scale energy savings and emissions reduction. It would not only help address climate change but also accelerate the incubation of emerging low-carbon industries built around big data, the IoT, healthcare analytics, modelling for drug and vaccine development, and robotics. If existing industries instead rely on ever-larger-scale traditional digital data analysis, energy costs will rise while performance remains suboptimal. We can instead create a virtuous circle: dramatically reducing the carbon footprint of knowledge technologies will drive the next generation of disruptive industries while creating the environment for a family of emerging neuromorphic industries.
If this path sounds like a pipe dream, consider the success of quantum technology. Through the National Quantum Technologies Programme, the UK government has invested roughly £1 billion in a series of quantum projects, creating research hubs that unite industry and academia to translate quantum science into technologies for sensing and metrology, imaging, communication, and computation. An independent, fully equipped National Quantum Computing Centre is being built on the work of these hubs and other researchers to develop general-purpose quantum computers. China has established a multibillion-dollar national laboratory for quantum information science, and in 2018 the U.S. commissioned a national strategic overview of quantum information science and launched a five-year, $1.2 billion plan supporting a series of national quantum research centres. Thanks to this public funding and policy support, a wave of quantum technology start-ups has emerged worldwide; one analysis found that private financing for such companies reached $450 million across 2017 and 2018, clearly illustrating the guiding power of policy. Although neuromorphic computing is more mature than quantum technology and could disrupt existing AI technologies on a shorter timescale, no comparable support for it yet exists. Neuromorphic computing is the "Cinderella" among the three major directions of future computing.
Finally, consider the potential impact of the COVID-19 pandemic on this outlook. The crisis has accelerated changes in how the world works, as we have all witnessed: more people working from home, for example. The reduction in commuting and travel brings direct benefits (some studies estimate that the pandemic temporarily cut global carbon dioxide emissions by as much as 17%), but the new ways of working carry their own costs. To what extent are the emissions saved by reduced commuting and travel offset by the increased emissions of data centres? In essence, the pandemic further underlines the need to develop low-carbon computing technologies such as neuromorphic computing systems.
Box 2

Prospects for AI Financing

Thanks to strong demand for data processing and the development of hardware supporting today's compute- and memory-intensive algorithms, investment in digital AI is booming. In April 2018 the UK government announced an additional £950 million to support the digital AI industry, on top of established research council funding. The French government announced an investment of €1.8 billion in AI from 2018 to 2022. Germany committed €3 billion from 2018 to 2025. Japan invested ¥26 billion in 2017. The U.S. government allocated $973 million of civilian AI technology funding for 2020; funding for AI from the U.S. military is hard to pin down, because analyses frequently mix in non-AI projects. China is estimated to be investing up to $8 billion across civilian and military AI, with a $2.1 billion AI research park being built near Beijing. The European Commission committed to investing $1.5 billion in AI from 2018 to 2020. These sums are modest compared with commercial investment: by conservative estimates, U.S. commercial investment in AI companies reached $19.5 billion in 2019, and global commercial investment is expected to reach around $98 billion by 2023. We should weigh the risk to investments of this scale if current hardware systems cannot support continued innovation, and the corresponding opening for neuromorphic algorithms and architectures. If neuromorphic technology delivers the efficiency savings and performance improvements it promises, savvy investors will bet on the new technologies and architectures alongside digital systems.
Because neuromorphic technology currently lacks comparable attention and government-level recognition, its research funding is piecemeal, awarded mainly at the project level rather than the strategic level. Optimistic forecasts do exist, such as the global brain-inspired chip market growing from $22.7 million in 2021 to $5.506 billion by 2026, but for now the conclusion stands: investment in neuromorphic systems lags far behind digital AI and quantum technology.
Figure 3. Comparison of recent public research funding in digital AI technologies worldwide. Figures are in millions of U.S. dollars at 2021 exchange rates. Some countries’ funding is counted annually (for example, UKRI’s committed funding for 2020), some has no specific time frame (for example, the UK AI sector deal), and some spans multi-year plans. The figure illustrates the scale of public funding for digital technologies; it also represents an existing ecosystem of AI research investment that more efficient neuromorphic and brain-inspired technologies would threaten.

Conclusion

Our discussion of how to unlock the potential of neuromorphic computing points to a clear path: establish centers of excellence that give targeted support to collaborative research; provide flexible funding mechanisms that enable rapid progress; create mechanisms for close collaboration with industry, drawing on the development of quantum technology, to attract commercial funding and incubate new industries and enterprises; develop training programs for the next generation of neuromorphic researchers and leaders; and, above all, promote faster and broader deployment.
Brain-inspired computing promises to change how artificial intelligence is realized. With technology advancing and demand growing, the opportunity is here; seizing it will take bold thinking and courageous advocates for these ideas. Will we seize the opportunity?

References

  1. Jones, N. How to stop data centres from gobbling up the world’s electricity. Nature https://doi.org/10.1038/d41586-018-06610-y (12 September 2018).

  2. Wu, K. J. Google’s new AI is a master of games, but how does it compare to the human mind? Smithsonian https://www.smithsonianmag.com/innovation/google-ai-deepminds-alphazero-games-chess-and-go-180970981/ (10 December 2018).

  3. Amodei, D. & Hernandez, D. AI and compute. OpenAI Blog https://openai.com/blog/ai-and-compute/ (16 May 2018).

  4. Venkatesan, R. et al. in 2019 IEEE Hot Chips 31 Symp. (HCS) https://doi.org/10.1109/HOTCHIPS.2019.8875657 (IEEE, 2019).

  5. Venkatesan, R. et al. in 2019 IEEE/ACM Intl Conf. Computer-Aided Design (ICCAD) https://doi.org/10.1109/ICCAD45719.2019.8942127 (IEEE, 2019).

  6. Wong, T. M. et al. 10¹⁴. Report no. RJ10502 (ALM1211-004) (IBM, 2012). The power consumption of this simulation of the brain puts that of conventional digital systems into context.

  7. Mead, C. Analog VLSI and Neural Systems (Addison-Wesley, 1989).

  8. Mead, C. A. Author Correction: How we created neuromorphic engineering. Nat. Electron. 3, 579 (2020).

  9. Hodgkin, A. L. & Huxley, A. F. A quantitative description of membrane current and its application to conduction and excitation in nerve. J. Physiol. 117, 500–544 (1952). More complex models followed, but this seminal work remains the clearest and an excellent starting point, developing equivalent electrical circuits and circuit models for the neural membrane.

  10. National Nanotechnology Initiative. US National Nanotechnology Initiative https://www.nano.gov (National Nanotechnology Coordination Office, accessed 18 August 2021).

  11. About Groningen Cognitive Systems and Materials. University of Groningen https://www.rug.nl/research/fse/cognitive-systems-and-materials/about/ (accessed 9 November 2020).

  12. Degree programs: Neuroengineering. Technical University of Munich https://www.tum.de/en/studies/degree-programs/detail/detail/StudyCourse/neuroengineering-master-of-science-msc/ (accessed 18 August 2021).

  13. Course catalogue: 227-1033-00L Neuromorphic Engineering I. ETH Zürich http://www.vvz.ethz.ch/Vorlesungsverzeichnis/lerneinheit.view?lerneinheitId=132789&semkez=2019W&ansicht=KATALOGDATEN&lang=en (accessed 9 November 2020).

  14. Brains in Silicon. http://web.stanford.edu/group/brainsinsilicon/ (accessed 16 March 2022).

  15. Neuromorphs. Instituto de Microelectrónica de Sevilla http://www2.imse-cnm.csic.es/neuromorphs (accessed 9 November 2020).

  16. Neurotech. https://neurotechai.eu (accessed 18 August 2021).

  17. Chua Memristor Center: Members. Technische Universität Dresden https://cmc-dresden.org/members (accessed 9 November 2020).

  18. Subcommittee on Quantum Information Science. National Strategic Overview for Quantum Information Science. https://web.archive.org/web/20201109201659/https://www.whitehouse.gov/wp-content/uploads/2018/09/National-Strategic-Overview-for-Quantum-Information-Science.pdf (US Government, 2018; accessed 17 March 2022).

  19. Smith-Goodson, P. Quantum USA vs. quantum China: the world’s most important technology race. Forbes https://www.forbes.com/sites/moorinsights/2019/10/10/quantum-usa-vs-quantum-china-the-worlds-most-important-technology-race/#371aad5172de (10 October 2019).

  20. Gibney, E. Quantum gold rush: the private funding pouring into quantum start-ups. Nature https://doi.org/10.1038/d41586-019-02935-4 (2 October 2019).

  21. Le Quéré, C. et al. Temporary reduction in daily global CO2 emissions during the COVID-19 forced confinement. Nat. Clim. Change 10, 647–653 (2020).

  22. Gokmen, T. & Vlasov, Y. Acceleration of deep neural network training with resistive cross-point devices: design considerations. Front. Neurosci. 10, 333 (2016).

  23. Marinella, M. J. et al. Multiscale co-design analysis of energy latency area and accuracy of a ReRAM analog neural training accelerator. IEEE J. Emerg. Selected Topics Circuits Systems 8, 86–101 (2018).

  24. Chang, H.-Y. et al. AI hardware acceleration with analog memory: microarchitectures for low energy at high speed. IBM J. Res. Dev. 63, 8:1–8:14 (2019).

  25. ARK Invest. Big Ideas 2021 https://research.ark-invest.com/hubfs/1_Download_Files_ARK-Invest/White_Papers/ARK–Invest_BigIdeas_2021.pdf (ARK Investment Management, 2021; accessed 27 April 2021).

  26. Benjamin, B. V. et al. Neurogrid: a mixed-analog–digital multichip system for large-scale neural simulations. Proc. IEEE 102, 699–716 (2014).

  27. Schmitt, S. et al. Neuromorphic hardware in the loop: training a deep spiking network on the BrainScaleS wafer-scale system. In 2017 Intl Joint Conf. Neural Networks (IJCNN) https://doi.org/10.1109/ijcnn.2017.7966125 (IEEE, 2017).

  28. Lichtsteiner, P., Posch, C. & Delbruck, T. A 128 × 128 120 dB 15 μs latency asynchronous temporal contrast vision sensor. IEEE J. Solid-State Circuits 43, 566–576 (2008).

  29. Moradi, S., Qiao, N., Stefanini, F. & Indiveri, G. A scalable multicore architecture with heterogeneous memory structures for dynamic neuromorphic asynchronous processors (DYNAPs). IEEE Trans. Biomed. Circuits Syst. 12, 106–122 (2018).

  30. Thakur, C. S. et al. Large-scale neuromorphic spiking array processors: a quest to mimic the brain. Front. Neurosci. 12, 891 (2018).

  31. Qiao, N. et al. A reconfigurable on-line learning spiking neuromorphic processor comprising 256 neurons and 128K synapses. Front. Neurosci. 9, 141 (2015).

  32. Valentian, A. et al. in 2019 IEEE Intl Electron Devices Meeting (IEDM) 14.3.1–14.3.4 https://doi.org/10.1109/IEDM19573.2019.8993431 (IEEE, 2019).

  33. Resistive Array of Synapses with ONline Learning (ReASOn) Developed by NeuRAM3 Project https://cordis.europa.eu/project/id/687299/reporting (2021).

  34. Wang, R. et al. Neuromorphic hardware architecture using the neural engineering framework for pattern recognition. IEEE Trans. Biomed. Circuits Syst. 11, 574–584 (2017).

  35. Furber, S. B., Galluppi, F., Temple, S. & Plana, L. A. The SpiNNaker Project. Proc. IEEE 102, 652–665 (2014). An example of a large-scale neuromorphic system as a model for the brain.

  36. Merolla, P. A. et al. A million spiking-neuron integrated circuit with a scalable communication network and interface. Science 345, 668–673 (2014).

  37. Davies, M. et al. Loihi: a neuromorphic manycore processor with on-chip learning. IEEE Micro 38, 82–99 (2018).

  38. Pei, J. et al. Towards artificial general intelligence with hybrid Tianjic chip architecture. Nature 572, 106–111 (2019).

  39. Frenkel, C., Lefebvre, M., Legat, J.-D. & Bol, D. A 0.086-mm2 12.7-pJ/SOP 64k-synapse 256-neuron online-learning digital spiking neuromorphic processor in 28-nm CMOS. IEEE Trans. Biomed. Circuits Syst. 13, 145–158 (2018).

  40. Chen, G. K., Kumar, R., Sumbul, H. E., Knag, P. C. & Krishnamurthy, R. K. A 4096-neuron 1M-synapse 3.8-pJ/SOP spiking neural network with on-chip STDP learning and sparse weights in 10-nm FinFET CMOS. IEEE J. Solid-State Circuits 54, 992–1002 (2019).

  41. Indiveri, G. et al. Neuromorphic silicon neuron circuits. Front. Neurosci. 5, 73 (2011).

  42. Mehonic, A. et al. Memristors—from in‐memory computing, deep learning acceleration, and spiking neural networks to the future of neuromorphic and bio‐inspired computing. Adv. Intell Syst. 2, 2000085 (2020). A review of the promise of memristors across a range of applications, including spike-based neuromorphic systems.

  43. Li, X. et al. Power-efficient neural network with artificial dendrites. Nat. Nanotechnol. 15, 776–782 (2020).

  44. Chua, L. Memristor, Hodgkin–Huxley, and edge of chaos. Nanotechnology 24, 383001 (2013).

  45. Kumar, S., Strachan, J. P. & Williams, R. S. Chaotic dynamics in nanoscale NbO2 Mott memristors for analogue computing. Nature 548, 318–321 (2017).

  46. Serb, A. et al. Memristive synapses connect brain and silicon spiking neurons. Sci. Rep. 10, 2590 (2020).

  47. Rosemain, M. & Rose, M. France to spend $1.8 billion on AI to compete with U.S., China. Reuters https://www.reuters.com/article/us-france-tech-idUSKBN1H51XP (29 March 2018).

  48. Castellanos, S. Executives say $1 billion for AI research isn’t enough. Wall Street J. https://www.wsj.com/articles/executives-say-1-billion-for-ai-research-isnt-enough-11568153863 (10 September 2019).

  49. Larson, C. China’s AI imperative. Science 359, 628–630 (2018).

  50. European Commission. A European approach to artificial intelligence. https://ec.europa.eu/digital-single-market/en/artificial-intelligence (accessed 9 November 2020).

  51. Artificial intelligence (AI) funding investment in the United States from 2011 to 2019. Statista https://www.statista.com/statistics/672712/ai-funding-united-states (accessed 9 November 2020).

  52. Worldwide artificial intelligence spending guide. IDC Trackers https://www.idc.com/getdoc.jsp?containerId=IDC_P33198 (accessed 9 November 2020).

  53. Markets and Markets.com. Neuromorphic Computing Market https://www.marketsandmarkets.com/Market-Reports/neuromorphic-chip-market-227703024.html?gclid=CjwKCAjwlcaRBhBYEiwAK341jS3mzHf9nSlOEcj3MxSj27HVewqXDR2v4TlsZYaH1RWC4qdM0fKdlxoC3NYQAvD_BwE. (accessed 17 March 2022).


Neurodynamics Model Reading Club

With the development of technical methods such as electrophysiology, network modeling, machine learning, statistical physics, and brain-inspired computing, our understanding of how brain neurons interact and connect, and of functions such as consciousness, language, emotion, memory, and social behavior, is gradually deepening, and the mysteries of the brain as a complex system are being unveiled. To promote communication and cooperation among researchers in neuroscience, systems science, computer science, and other fields, we initiated the Neurodynamics Model Reading Club.

The Wisdom Club reading clubs are a series of paper-reading activities for researchers, intended to explore a scientific topic in depth, spark research inspiration, and foster research collaboration. The Neurodynamics Model Reading Club, jointly initiated by the Wisdom Club and the Tianqiao Brain Science Research Institute, began on March 19 and meets every Saturday afternoon from 14:00 to 16:00 (or Friday evening from 19:00 to 21:00, adjusted as needed), and is expected to run for 10–12 weeks. Discussions will focus on multiscale modeling of neural networks and its applications in brain diseases and cognition.


For more details, please see:

Neurodynamics Model Reading Club Launch: Integrating Multidisciplinary Approaches in Computational Neuroscience

Recommended Reading

  • Frontiers in Brain-Inspired Computing: Biological Signal Classification Based on Organic Electrochemical Networks
  • Frontiers in Network Neuroscience: How Does the Brain Efficiently Process Information Locally and Globally?
  • Fei-Fei Li’s Team: How to Create Smarter AI? Let Artificial Life Evolve in Complex Environments
  • Complete Online Launch of “Zhangjiang: 27 Lectures on Complex Science”!
  • Become a Wisdom VIP and unlock all site courses/reading clubs
  • Join Wisdom and explore complexity together!

