What Is the Future of Brain-like Computing?

New technologies in brain-like computing promise to process information in a completely different way, be extremely energy efficient and be able to handle the massive amounts of unstructured and noisy data that we generate at an accelerated rate. To realize this promise, a courageous and coordinated program is needed to bring together the diverse research community and provide them with the funding, focus, and support they need. We've done this in the past with digital technology, and we're doing it with quantum technology. Can we now do this for brain-like computing?
Modern computing systems consume too much energy and are not sustainable platforms for the complex AI applications that are increasingly becoming part of our lives. We often don't see this, especially for cloud-based systems, because we focus on features such as how fast they are, how accurate they are, and how many parallel operations they perform per second. We are so used to instant access to information that we ignore the energy and environmental consequences of the computing systems that provide it. Yet every Google search has a cost: data centers currently use about 200 terawatt-hours of energy per year, a figure expected to grow by roughly an order of magnitude by 2031. Similarly, the stunning achievements of high-end AI systems such as DeepMind's AlphaGo and AlphaZero, which can beat human experts at complex strategy games, rely on thousands of parallel processing units, each consuming around 200 watts.
While not all data-intensive computing requires AI or deep learning, deep learning applications are now so widespread that we must worry about their environmental cost. We should also consider applications such as the Internet of Things (IoT) and autonomous robotic agents, which may not always need computationally intensive deep learning algorithms but must still keep their energy consumption low. The vision of the IoT cannot be realized if the energy demand of countless connected devices is too high. Recent analysis shows that demand for computing power is growing far faster than Moore's Law can deliver: demand now doubles every two months (Figure 1a). Significant progress has been made through a combination of intelligent architecture and hardware-software co-design. For example, since 2012, NVIDIA GPUs (graphics processing units) have improved performance by a factor of 317, far exceeding what Moore's Law alone would predict (Figure 1b), although the power consumption of these units has risen from about 25 W to about 320 W over the same period. Further impressive performance improvements have been demonstrated at the research and development stage (Figure 1b, red), and more progress may yet be possible. Unfortunately, traditional computing solutions alone are unlikely to meet long-term needs. This is especially evident when we consider the startlingly high training costs of the most complex deep learning models (Figure 1c); alternative approaches are needed.
a, The growth in demand for computing power, expressed in petaflop/s-days, over the past four decades. Until 2012, demand for computing power doubled every 24 months; recently, the doubling time has shortened to approximately every two months. The color legend indicates the different application domains. b, Improvements in AI hardware efficiency over the past five years. State-of-the-art solutions have improved computational efficiency by more than 300 times; solutions in development are expected to improve further. c, Increase in AI model training costs since 2011. Such exponential growth is clearly unsustainable.
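To get a feel for the mismatch described above, a quick back-of-the-envelope comparison of the two quoted doubling times (24 months for Moore's Law, two months for compute demand) is sketched below; the six-year horizon is an illustrative choice, not a figure from the article.

```python
# Compare the growth rates quoted above: hardware capability doubling every
# 24 months (Moore's Law) versus compute demand doubling every 2 months.

def growth_factor(months: float, doubling_period_months: float) -> float:
    """Multiplicative growth after `months`, given a doubling period."""
    return 2 ** (months / doubling_period_months)

horizon_months = 6 * 12  # six-year horizon, chosen purely for illustration

moore = growth_factor(horizon_months, 24.0)   # supply: doubles every 24 months
demand = growth_factor(horizon_months, 2.0)   # demand: doubles every 2 months

print(f"Moore's-law growth over 6 years:    x{moore:.0f}")       # x8
print(f"Compute-demand growth over 6 years: x{demand:.2e}")      # ~x6.9e10
print(f"Gap between demand and supply:      x{demand / moore:.2e}")
```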
The energy problem is largely the result of digital computing systems storing data separately from where they process it. This is the classic von Neumann architecture of digital computing, in which the processor spends most of its time and energy shuttling data back and forth. Fortunately, we can improve this situation by taking inspiration from biology, which takes a completely different approach: co-locating memory and processing, encoding information in a completely different way, manipulating signals directly, and using massively parallel processing, for example (Box 1). There is one system that combines energy efficiency with advanced functionality: the brain. There is, of course, still much to learn about how the brain works, and our goal is not simply to model biological systems; but we can still draw on the significant advances in neuroscience and computational neuroscience over the past few decades. What we know about the brain is enough to use it for inspiration.
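One widely studied way of co-locating memory and processing, not detailed in this article, is the resistive crossbar array, in which stored conductances act as weights and basic circuit laws perform a matrix-vector multiplication in place. Here is a minimal numerical sketch, assuming an idealized crossbar with no wire resistance or device non-idealities:

```python
import numpy as np

# Idealized resistive crossbar: each cross-point stores a conductance G[i, j].
# Applying voltages V[i] to the rows produces column currents
#   I[j] = sum_i G[i, j] * V[i]   (Ohm's law plus Kirchhoff's current law),
# so the array computes a matrix-vector product where the data already sit.
rng = np.random.default_rng(0)

G = rng.uniform(1e-6, 1e-4, size=(4, 3))   # conductances in siemens (the "weights")
V = rng.uniform(0.0, 0.2, size=4)           # input voltages in volts

I = V @ G  # column currents: the computation happens inside the memory array

print("Column currents (A):", I)
```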
What do we mean by "neuromorphic" systems?
Taking inspiration from the brain allows us to process information in a completely different way from existing traditional computing systems. Different brain-inspired ("neuromorphic") platforms use a combination of approaches: analog data processing, asynchronous communication, massively parallel information processing, or spike-based information representation. These properties distinguish them from von Neumann computers.
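As a concrete, deliberately simplified illustration of the spike-based representation mentioned above, the sketch below simulates a leaky integrate-and-fire neuron, a standard abstraction of the biological action potential; the model choice and all parameter values are our illustrative assumptions, not specifics from the article.

```python
import numpy as np

def lif_spikes(input_current, dt=1e-3, tau=20e-3, r=1e7,
               v_rest=0.0, v_thresh=0.02, v_reset=0.0):
    """Leaky integrate-and-fire neuron.

    Integrates tau * dV/dt = -(V - v_rest) + R * I(t) and emits a spike
    (then resets) whenever the membrane potential crosses v_thresh.
    """
    v = v_rest
    spike_times = []
    for step, i_in in enumerate(input_current):
        v += (-(v - v_rest) + r * i_in) * dt / tau
        if v >= v_thresh:
            spike_times.append(step * dt)   # spike time in seconds
            v = v_reset
    return spike_times

# A constant 3 nA input for 200 ms: the neuron fires at a regular rate,
# converting an analog input level into a train of discrete spikes.
current = np.full(200, 3e-9)
print("Spike times (s):", lif_spikes(current))
```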
The term "neuromorphic" covers at least three broad research communities, distinguished by whether their goal is to emulate neural function (reverse-engineering the brain), simulate neural networks (developing new computational methods), or design novel electronic devices.
"Neuromorphic engineering" studies how the brain uses the physics of biological synapses and neurons to "compute". Neuromorphic engineers work to emulate the function of biological neurons and synapses, defining fundamental operations through the physical properties of analog electronics: carrier tunneling, charge retention on silicon floating gates, and the exponential dependence of various device or material properties on applied fields. Transistors are used as analog circuit elements with rich dynamic properties, rather than as binary switches.
"Neuromorphic computing" looks to biology for new ways of processing data, and can be thought of as the computational science of neuromorphic systems. The research simulates the structure and/or operation of biological neural networks: this could mean co-locating storage and computation, as the brain does, or it could mean a completely different computational approach based on voltage spikes that mimic the action potentials of biological systems.
Underpinning everything are the devices and materials needed to achieve bionic function. In this regard, recent developments herald new electronic and photonic devices whose properties we can tailor to mimic biological elements such as synapses and neurons. These "neuromorphic devices" can provide exciting new technologies to extend the capabilities of neuromorphic engineering and computation.
The most important of these new devices is the memristor (memory resistor): an electronic device whose resistance is a function of its history. Their complex dynamic electrical responses mean they can be used as digital memory elements, variable weights in artificial synapses, cognitive processing elements, optical sensors, and devices that mimic biological neurons. In addition, they can reproduce some of the functions of biological dendrites, their dynamic responses can produce brain-like oscillatory behavior and, albeit controversially, they may operate at the "edge of chaos", a regime that has also been associated with biological neural systems. And they can do all of this with very little energy.
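The history-dependent resistance described above can be illustrated with the simple linear ionic-drift model often used for memristors (after Strukov et al.); the model choice and the device parameters below are illustrative assumptions rather than details from the article.

```python
import numpy as np

def simulate_memristor(voltage, dt=1e-3, r_on=100.0, r_off=16e3,
                       d=10e-9, mu_v=1e-14, x0=0.5):
    """Linear ionic-drift memristor model.

    The internal state x = w/d (fraction of the device in its low-resistance
    phase) integrates the current that has flowed, so the resistance at any
    moment depends on the history of the applied voltage.
    """
    x = x0
    resistances = []
    for v in voltage:
        r = r_on * x + r_off * (1.0 - x)
        i = v / r
        x += mu_v * (r_on / d**2) * i * dt   # state drifts with charge flow
        x = min(max(x, 0.0), 1.0)            # keep the state physical
        resistances.append(r)
    return np.array(resistances)

# Drive the device with one cycle of a 1 Hz sine wave: the resistance on the
# way up differs from the way down -- the history dependence of a memristor.
t = np.arange(0.0, 1.0, 1e-3)
v = np.sin(2 * np.pi * 1.0 * t)
r = simulate_memristor(v)
print(f"Resistance swings between {r.min():.0f} ohm and {r.max():.0f} ohm")
```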
Bioinspiration
In biology, data storage and processing are inseparable. The same elements - chiefly neurons and synapses - perform both functions in massively parallel and adaptive structures. The roughly 10¹¹ neurons and 10¹⁵ synapses in a typical human brain consume approximately 20 W of power, whereas a digital simulation of an artificial neural network of roughly the same size consumes 7.9 MW: a gap of some six orders of magnitude that presents a challenge. Moreover, the brain processes noisy signals directly and with extremely high efficiency. This is in contrast to the signal-to-data conversion and high-precision computation of traditional computer systems, which cost enormous amounts of energy and time even on the most powerful digital supercomputers. Brain-inspired, or "neuromorphic", computing systems could therefore change the way we process signals and data, both in terms of energy efficiency and in the ability to deal with real-world uncertainty.
This is not a new idea; in the late 1980s, Carver Mead of the California Institute of Technology coined the term "neuromorphic" to describe devices and systems that mimic certain functions of the biological nervous system. The inspiration came from decades of earlier work modeling the nervous system as equivalent electrical circuits and building analog electronics and systems to provide similar functionality (Box 1).
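The analog-electronics approach Mead pioneered leans heavily on the subthreshold (weak-inversion) regime of MOS transistors, where the current depends exponentially on gate voltage, echoing the exponential device behavior mentioned earlier. The sketch below uses the textbook subthreshold expression, with illustrative values for I0 and the slope factor n (assumptions, not taken from the article):

```python
import math

def subthreshold_current(v_gs: float, i0: float = 1e-12, n: float = 1.5,
                         temperature_k: float = 300.0) -> float:
    """Drain current of a MOSFET in weak inversion (subthreshold), ignoring
    drain-voltage effects: I_D ~ I0 * exp(V_GS / (n * V_T))."""
    k_b = 1.380649e-23    # Boltzmann constant, J/K
    q = 1.602176634e-19   # elementary charge, C
    v_t = k_b * temperature_k / q   # thermal voltage, ~25.9 mV at 300 K
    return i0 * math.exp(v_gs / (n * v_t))

# Roughly every extra ~90 mV of gate voltage (for n = 1.5) multiplies the
# current by ten: rich, analog, exponential behavior from a single device.
for v in (0.10, 0.15, 0.20, 0.25):
    print(f"V_GS = {v:.2f} V -> I_D ~ {subthreshold_current(v):.3e} A")
```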
A word about "data". We use this term to describe information encoded in analog signals or the physical responses of sensors, as well as the more familiar computation-centric digital data. When we say the brain "processes data", we mean a whole set of signal-processing tasks that do not rely on digitizing signals in any conventional sense. We can imagine brain-inspired systems operating at different levels: from analog signal processing to the use of large digital data sets. In the former case, we can avoid generating large data sets in the first place; in the latter, we can greatly improve processing efficiency by moving away from the von Neumann model.
Of course, there is a reason we represent data digitally in many applications: we need high accuracy, reliability, and determinism. However, the digital abstraction discards the vast amount of information available in the physics of transistors in exchange for the smallest possible quantum of information: a single bit. We trade efficiency for reliability, and we pay a considerable energy cost for that trade. AI applications tend to be probabilistic in nature, so we must ask whether the trade-off is justified. When performed on traditional von Neumann computers, the computational tasks underpinning AI applications are extremely compute-intensive (and therefore energy-intensive). On analog or hybrid systems using spike-based information representations, however, we can perform similar tasks far more energy-efficiently. As a result, there has been a recent resurgence of interest in neuromorphic computing, driven by the development of artificial intelligence systems and by the emergence of new devices that offer exciting new ways to emulate certain functions of biological nervous systems (Box 1).
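To make the precision-versus-cost trade-off concrete, the toy sketch below (our illustration, not from the article) encodes an analog value as a Poisson spike train and recovers it by counting spikes: a few spikes give a cheap, rough estimate, while many spikes buy precision at the price of more activity, a rough proxy for energy.

```python
import numpy as np

def rate_encode(value, n_steps, max_rate=100.0, dt=1e-3, rng=None):
    """Encode a value in [0, 1] as a Poisson spike train of n_steps time steps."""
    if rng is None:
        rng = np.random.default_rng()
    p_spike = value * max_rate * dt           # spike probability per time step
    return rng.random(n_steps) < p_spike      # boolean spike train

def rate_decode(spikes, max_rate=100.0, dt=1e-3):
    """Estimate the encoded value from the observed spike count."""
    return spikes.sum() / (len(spikes) * max_rate * dt)

rng = np.random.default_rng(42)
true_value = 0.6

for n_steps in (50, 500, 5000):               # longer observation = more spikes
    spikes = rate_encode(true_value, n_steps, rng=rng)
    print(f"{n_steps:5d} steps, {int(spikes.sum()):4d} spikes -> "
          f"estimate {rate_decode(spikes):.3f} (true value {true_value})")
```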
Definitions of "neuromorphic" vary widely. Broadly speaking, it is a hardware story: neuromorphic chips are designed to integrate and exploit a variety of useful features of the brain, including in-memory computing, spike-based information processing, fine-grained parallelism, signal processing, tolerance of noise and stochasticity, adaptability, hardware learning, asynchronous communication, and analog processing. While the number of such features a system must implement before it counts as neuromorphic is debatable, this is clearly a different approach from implementing AI on mainstream computing systems. We should not get lost in terminology, however; the key question is whether the approach is useful.
Neuromorphic technologies span a spectrum between reverse engineering (analyzing) the structure and function of the brain and, given how much we still do not know about the brain, taking inspiration from (synthesizing with) what we do know. At the former end of the spectrum, perhaps the most important effort is the Human Brain Project, a high-profile and ambitious ten-year program funded by the European Union since 2013. This project supports the adoption and further development of two existing neuromorphic hardware platforms - SpiNNaker (at the University of Manchester) and BrainScaleS (at the University of Heidelberg) - as open-access neuromorphic platforms. Both systems implement highly complex in silico models of brain structure to better understand the workings of the biological brain. At the other end of the spectrum, many research groups use selected bio-inspired methods to enhance the performance of digital or analog electronics. Figure 2 summarizes the range of available neuromorphic chips, grouped into four categories by their position on the analysis-synthesis spectrum and by their technology platform. It is important to remember that neuromorphic engineering is not only about advanced cognitive systems; it can also deliver energy, speed, and security gains in small edge devices with limited cognitive capability (not least by eliminating the need for continuous communication with the cloud).
Neuromorphic chips can be classified as simulating biological systems or as applying brain-inspired principles to novel computing applications. They can be further subdivided into those based on digital CMOS with novel architectures (for example, spikes may be simulated in the digital domain rather than implemented as analog voltages) and those implemented with some degree of analog circuitry. In all cases, however, they have at least some of the characteristics listed on the right, which distinguish them from conventional CMOS chips. Here, we classify recently developed neuromorphic chips; details of each can be found in the relevant references: Neurogrid, BrainScaleS, MNIFAT, DYNAP, DYNAP-SEL, ROLLS, Spirit, Reason, DeepSouth, SpiNNaker, IBM TrueNorth, Intel Loihi, Tianjic, ODIN, and the Intel SNN chip.
Prospects
We do not believe that neuromorphic systems will, or should, replace traditional computing platforms. Instead, exact computation should remain digital, while neuromorphic systems can process unstructured data, perform image recognition, classify noisy and uncertain data sets, and underpin new learning and inference systems. In autonomous and IoT systems, they can save significant amounts of energy compared with traditional systems. Quantum computing is also part of this vision. A practical quantum computer, although probably still some years away, would certainly revolutionize many computing tasks. However, IoT smart sensors, edge computing devices, and autonomous robotic systems are unlikely to adopt quantum computing other than through the cloud, so the need for low-power computing components that can handle uncertain and noisy data will remain. We can imagine a three-way synergy between digital, neuromorphic, and quantum systems.
Just as the development of semiconductor microelectronics drew on many disciplines, including solid-state physics, electrical engineering, computer science, and materials science, neuromorphic computing is profoundly interdisciplinary and transdisciplinary in nature. Physicists, chemists, engineers, computer scientists, biologists, and neuroscientists all have important roles to play. Simply getting researchers from different disciplines to speak a common language is challenging; in our own work, we spend a lot of time and effort making sure that everyone in the room understands terms and concepts in the same way. The case for bridging the gap between the computer science (especially artificial intelligence) and neuroscience (particularly computational neuroscience) communities is obvious: after all, many of the concepts found in today's most advanced AI systems emerged from neuroscience in the 1970s and 1980s, even though AI systems need not be entirely bio-realistic. We must also bring in other disciplines, recognizing that many of the advances made in AI or neuroscience were enabled by other fields - innovations in materials science, nanotechnology, or electrical engineering, for example. In addition, traditional CMOS (complementary metal-oxide-semiconductor) technology may not be the best platform on which to implement new brain-inspired algorithms efficiently, so innovation is needed across the board. Engaging these communities early reduces the risk of wasting effort on directions that have already been explored and found wanting, as well as the risk of reinventing the wheel.
Nor should we overlook the challenge of integrating new neuromorphic technologies at the system level. Beyond the development of brain-inspired devices and algorithms, there are pressing questions about how to replace existing mainstream AI systems with functionally equivalent neuromorphic alternatives, further emphasizing the need for a fully integrated approach to brain-like computing.
We should point out that, despite the potential described above, there is as yet no convincing demonstration of a commercial neuromorphic technology. Existing systems and platforms are primarily research tools; the same is true of quantum computing, which remains a longer-term prospect. That is no reason to delay the development of brain-like computing. There is an urgent need for low-power computing systems, we are on the cusp of achieving this with a fundamentally different approach to computing, and commercial systems are on the horizon.
Seize the opportunity
If neuromorphic computing is needed, how can it be made to happen? First, the technical requirements. Bringing the diverse research community together is necessary, but it is not sufficient; incentives, opportunities, and infrastructure are also needed. The neuromorphic community is diverse and dispersed, lacking the focus of the quantum computing community and the clear roadmap of the semiconductor industry. Early-stage momentum is building as projects around the globe begin to assemble the necessary expertise. To make this happen, funding is key. The scale of investment in neuromorphic research is far smaller than in digital artificial intelligence or quantum technologies (Box 2). While this is not surprising given the maturity of digital semiconductor technology, it is a missed opportunity. There are some medium-sized investments in neuromorphic research and development, such as the series of brain-inspired projects at IBM's AI Hardware Center (including the TrueNorth chip), the development of Intel's Loihi processor, and the US BRAIN Initiative, but total investment remains far below the level merited by a technology that promises to disrupt digital AI.
The neuromorphic field is large and growing, but it lacks a focal point. Although conferences, symposia, and journals are appearing in the field, much work remains to be done in bringing together experts from different disciplines and persuading funding agencies and governments of the field's importance.
The time is ripe for a bold initiative. At the national level, governments need to work with academic researchers and industry to establish mission-oriented research centers to accelerate the development of neuromorphic technologies. This approach has worked well in areas such as quantum technology and nanotechnology - the US National Nanotechnology Initiative (NNI) demonstrates this well and provides focus and incentives. These centers can be physical or virtual, but they must bring together the best researchers in different fields. Their approach must be different from that of traditional electronics, where each level of abstraction (materials, devices, circuits, systems, algorithms, and applications) belongs to a different domain. We need to design holistically and in parallel across the entire stack. It is not enough for circuit designers to consult with computational neuroscientists before designing a system; engineers and neuroscientists must collaborate throughout the process to ensure the fullest possible integration of bio-inspired principles into the hardware. Interdisciplinary co-creation must be the focus of our approach, and research centers must accommodate a broad range of researchers.
In addition to the necessary physical and financial infrastructure, we need a well-trained workforce. Electronic engineers are rarely exposed to the ideas of neuroscience, and vice versa. Circuit designers and physicists may have some understanding of neurons and synapses but are unlikely to be familiar with cutting-edge computational neuroscience. There is a strong case for establishing master's programs and Ph.D. training programs to produce neuromorphic engineers. UK Research and Innovation (UKRI) sponsors Centres for Doctoral Training (CDTs), which support areas with an identified need for well-trained researchers. CDTs can involve one institution or several, and institutions collaborating on these programs gain substantial benefits by building complementary teams. Such programs often work closely with industry to build cohorts of highly skilled researchers in a way that is rarely possible with traditional Ph.D. programs. Something similar could be developed to stimulate interaction across the nascent neuromorphic engineering field and to provide the next generation of researchers and research leaders. Pioneering examples include the Cognitive Systems and Materials research program in Groningen, which aims to train dozens of Ph.D. students specializing in materials for cognitive (AI) systems; the neuroengineering master's program at the Technical University of Munich; the neuromorphic engineering and analog circuit design courses at ETH Zurich; large-scale neural modeling at Stanford University; and the development of visual neuromorphic systems at the Institute of Microelectronics in Seville. There is room to do much more in this area.
A similar approach could work across national boundaries. As in all research, collaboration is most successful when the best people work with the best people, regardless of nationality. This is critical in interdisciplinary research such as neuromorphic computing, so international research networks and projects can certainly play a role. Early examples include the European NEUROTECH consortium, which focuses on neuromorphic computing technologies, and the Chua Memristor Center at TU Dresden, which brings together many leading memristor researchers working on materials, devices, and algorithms. Again, much more can and must be done.
How can such projects attract the attention of governments? A government commitment to more energy-efficient, brain-inspired computing could be part of a broader push toward large-scale decarbonization. This would not only address climate change but also accelerate the emergence of new low-carbon industries around big data, the Internet of Things, healthcare analytics, drug and vaccine discovery modeling, and robotics. If those industries rely on ever larger-scale conventional digital data analytics, energy costs will rise while performance remains sub-optimal. Instead, we can create a virtuous circle in which the carbon footprint of the knowledge technologies driving the next generation of disruptive industries is significantly reduced, and a new set of neuromorphic industries is nurtured in the process. If this sounds like a daunting task, consider quantum technologies. To date, the UK government has invested around £1 billion in a range of quantum initiatives, largely under the umbrella of the National Quantum Technologies Programme. A network of research centers, bringing together industry and academia, translates quantum science into technologies targeting sensors and metrology, imaging, communications, and computing. An independent National Quantum Computing Centre builds on the work of these centers and other researchers to provide demonstrator hardware and software for the development of general-purpose quantum computers. China has established a multibillion-dollar National Laboratory for Quantum Information Science, and the United States set out a National Strategy for Quantum Information Science in 2018, which led to a five-year, $1.2 billion investment in addition to support for a series of national quantum research centers. On the back of this research activity, there has been a global boom in the creation of quantum technology companies; one analysis found that private companies raised $450 million in 2017 and 2018. No comparable coordinated support exists for neuromorphic computing, despite the fact that it is more mature than quantum technology and could disrupt existing AI technologies on a much shorter timescale. Of the three branches of future computing we envision, neuromorphic computing is severely underinvested.
Finally, a word about the possible impact of the COVID-19 pandemic on our argument. There is a growing consensus that the crisis has accelerated changes that were already under way: working from home, for example. While there are immediate benefits from reduced commuting and travel - some estimates suggest the crisis cut global CO2 emissions by up to 17 percent - the new ways of working come at a cost. To what extent are the carbon savings from reduced travel offset by increased data-center emissions? If anything, the experience of COVID-19 further underscores the need to develop low-carbon computing technologies such as neuromorphic systems.
Our message on how to realize the potential of neuromorphic systems is clear: provide targeted support for collaborative research through research centers of excellence; provide flexible funding mechanisms that enable rapid progress; provide mechanisms for working closely with industry to bring in commercial funding and generate spin-offs and startups, similar to the programs already in place for quantum technologies; train the next generation of neuromorphic researchers and entrepreneurs; and do all of this quickly and at scale.
Neuromorphic computing has the potential to transform our approach to artificial intelligence. Thanks to the combination of new technologies and the enormous and growing demand for efficient AI, we have new opportunities that require bold ideas, and bold initiatives to support them. Will we seize the opportunity?
Artificial Intelligence Funding Landscape
Investment in 'traditional' digital AI is booming, driven by the need to handle ever-increasing volumes of data and to develop hardware that supports existing compute- and memory-intensive algorithms. The UK government announced a £950 million digital AI Sector Deal in April 2018, supported by existing research councils. France announced €1.8 billion of government AI investment for 2018-2022, Germany committed €3 billion for 2018-2025, and Japan invested ¥26 billion in 2017. US government funding for civilian AI technologies in 2020 was $973 million; US military AI funding is harder to pin down because published analyses often include non-AI projects. China is estimated to be investing up to $8 billion in civilian and military AI and is building a $2.1 billion AI research park near Beijing, and the European Commission committed to investing €1.5 billion over the 2018-2020 period. Commercial investment dwarfs all of this: in the US, some estimates put total investment in AI companies at $19.5 billion in 2019, with global investment expected to reach about $98 billion by 2023. Much of this investment must be considered at risk if current hardware systems cannot support potentially disruptive neuromorphic algorithms and architectures. If neuromorphic technologies deliver the efficiency savings and performance gains they promise, smart investors will be betting on new technologies and architectures beyond digital systems.
Comparable data are not available for neuromorphic technologies, which currently lack the same attention and government-level visibility; research funding is also fragmented at the project level rather than coordinated strategically. Although various projections have been published - for example, that the global market for neuromorphic chips will grow from $22.7 million in 2021 to $550.6 million in 2026 - the safest conclusion is that funding for neuromorphic systems lags far behind that for digital artificial intelligence or quantum technologies.
Figure: Comparison of recent public research funding for digital artificial intelligence technologies worldwide. Data are expressed in millions of US dollars (2021 exchange rates). Some figures are single-year snapshots (e.g., UKRI's 2020 funding commitment), some have no fixed duration (e.g., the UK AI Sector Deal), and some are multi-year programs, but the chart conveys the scale of public funding for digital technologies. Disruption of this AI ecosystem by efficient neuromorphic technologies would put much of this investment at risk.