Exploring the Business Potential of Neuromorphic Computing
Ever since the dawn of the computing age, scientists and engineers have broadly shared a single long-term goal: to build a computer that rivals the human brain in complexity and capacity. For years, that lofty goal remained distant, a pie-in-the-sky idea that would require a massive technological leap to realize. Fast-forward to today, however, and the idea of a computer that emulates the human brain no longer seems so far off.
That is the logic behind a growing subsector of AI development known as neuromorphic computing. It represents a movement toward building software and hardware that functions like the human brain and nervous system. The idea is that such computers would make a natural fit for powering next-generation AI. After all, the human brain powers real intelligence—why not the artificial variety?
For businesses, this is a branch of technology that is not exactly right around the corner. However, it is something that will begin trickling into the business technology ecosystem sooner rather than later. Therefore, it makes sense for businesses to get a head start on understanding neuromorphic computing right now. To aid in that process, here is an overview of the business implications of neuromorphic computing. We will discuss what it is, how far along the technology is on the development curve, how businesses might apply it, and the challenges associated with doing so. Let’s dive in.
What Is Neuromorphic Computing?
Broadly speaking, neuromorphic computing is a computer engineering approach in which hardware and software are modeled after the human brain and nervous system. That means building processors that rely on electronic neurons and synapses rather than transistors and conventional logic circuits.
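To make the idea concrete, neuromorphic chips typically communicate through spikes rather than clocked logic. A common abstraction is the leaky integrate-and-fire (LIF) neuron: it accumulates incoming charge, slowly leaks it away, and fires a spike when a threshold is crossed. The following is a minimal illustrative sketch; the parameter values are arbitrary and not tied to any particular chip.

```python
def simulate_lif(input_current, threshold=1.0, leak=0.9):
    """Return the spike train produced by a stream of input currents.

    A toy leaky integrate-and-fire model: membrane potential decays by
    a leak factor each step, integrates the incoming current, and emits
    a spike (then resets) whenever it crosses the threshold.
    """
    potential = 0.0
    spikes = []
    for current in input_current:
        potential = potential * leak + current  # leak, then integrate
        if potential >= threshold:              # threshold crossed: fire
            spikes.append(1)
            potential = 0.0                     # reset after the spike
        else:
            spikes.append(0)
    return spikes

if __name__ == "__main__":
    # A steady weak input accumulates until the neuron periodically fires.
    print(simulate_lif([0.3] * 20))
```

Note how information is carried in the timing of spikes rather than in continuously clocked binary states, which is a large part of why neuromorphic hardware can be so power-efficient: circuits only do work when spikes occur.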
The advantages of that approach are obvious. While a transistor has three connections and only two states, each neuron in the human brain may have up to 10,000 connections to other neurons. This allows for a far denser connective web, one that can simultaneously analyze and store vast quantities of data. Additionally, the human brain runs on about 12 watts of power, making it the single most energy-efficient computer on Earth.
The Benefits of Neuromorphic Computing for Businesses
Once commercialized, neuromorphic computing could revolutionize the way businesses handle their computing needs. One of its biggest advantages would be energy efficiency. By flipping the architecture of today’s microchips on its head, replacing an ever-growing quantity of sparsely interconnected transistors with relatively few neurons joined by vast numbers of synapses, power needs would drop considerably.
On top of that, neuromorphic computing would place massive computing capacity at the disposal of businesses. How much? Consider that in 2013, a simulation run on the world’s fourth-most-powerful supercomputer managed to replicate only about 1% of the computing power of the real thing. In other words, a business onboarding machines built on mature neuromorphic computing technology would be like upgrading from a UNIVAC I to the Frontier supercomputer. Even that may understate the leap involved.
This would give businesses the ability to develop products and services that require vast real-time data processing capabilities, such as wide-scale autonomous vehicle deployments and other services that depend on massive edge-deployed computing. Additionally, commercialized neuromorphic computing technology is almost certainly needed to facilitate true Artificial General Intelligence (AGI) and the Artificial Super Intelligence (ASI) that would follow it.
The State of Neuromorphic Computing Development
Right now, multiple well-known—and well-funded—tech giants are at various stages of developing neuromorphic computing technology. For example, IBM already has a functioning neuromorphic processing chip known as TrueNorth, featuring a million neurons and 256 million synapses. It operates on just 65–100 mW of power, putting it in an energy-efficiency class of its own.
Intel also has a neuromorphic computing chip called Loihi 2, featuring over a million neurons and 120 million synapses. It also includes six Lakemont management cores, borrowing architecture from Intel’s low-power Quark series of microcontrollers. This provides a ready bridge to existing x86-based infrastructure. As a result, Loihi 2 is somewhat more power-hungry, consuming a bit under one watt in ordinary use.
Beyond the tech giants, there is also an effort underway by an EU-funded research initiative called the Human Brain Project aimed at developing neuromorphic computing technology. It has already yielded a working neuromorphic supercomputer called BrainScaleS, which boasts almost 4 million neurons and almost 880 million synapses.
For the sake of comparison, the human brain contains around 100 billion neurons and over 100 trillion synapses. So, while recent breakthroughs in neuromorphic computing show promise, the technology still has quite a long way to go before it reaches true revolutionary status.
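To put that gap in perspective, here is a quick back-of-the-envelope comparison using the approximate figures cited above (real counts vary by source, so treat these as rough orders of magnitude rather than precise measurements):

```python
# Rough comparison of the chips discussed above against the human brain,
# using the approximate figures cited in this article.

def gap_to_brain(neurons, synapses,
                 brain_neurons=100e9, brain_synapses=100e12):
    """Return how many times more neurons and synapses the brain has."""
    return brain_neurons / neurons, brain_synapses / synapses

chips = {
    "IBM TrueNorth": (1e6, 256e6),   # neurons, synapses
    "Intel Loihi 2": (1e6, 120e6),
    "BrainScaleS":   (4e6, 880e6),
}

for name, (neurons, synapses) in chips.items():
    n_gap, s_gap = gap_to_brain(neurons, synapses)
    print(f"{name}: brain has ~{n_gap:,.0f}x the neurons, "
          f"~{s_gap:,.0f}x the synapses")
```

Even the largest of these systems sits four to five orders of magnitude short of the brain on both counts, which is what the "long way to go" above amounts to in concrete terms.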
The Business Challenges Represented by Neuromorphic Computing
As the nascent neuromorphic computing technology described above starts filtering down into real-world business use, it will present a series of challenges to those who would make use of it. First and foremost is the fact that, like most cutting-edge technologies, early neuromorphic computing infrastructure likely will not feature much standardization.
That is the reasoning behind the inclusion of x86 management cores in Intel’s Loihi 2 chip: an attempt to link the already-standard conventional microchip architecture with the next generation of neuromorphic processors. For early-adopting businesses, this will mean making a fairly large bet on whose vision of the future of neuromorphic computing will win the day. At scale, this could represent billions of dollars in potentially dead-end investments.
On top of that, finding people with the skills necessary to operate and exploit the technology will prove challenging. Right now, there is already a large and growing AI and machine learning skills gap that businesses must contend with. The addition of neuromorphic computing will unleash capabilities that will only make that worse, not better.
All of this means businesses trying to stay ahead of the curve may need to start strategic planning for what neuromorphic computing could mean to their operations right now. Doing so could and should inform things like internal talent development, recruiting efforts, and current-generation technology procurement. With the right early-stage planning, it should be possible to navigate the challenges presented by neuromorphic computing while putting it to effective use.
Your Next-Generation Technology Partner
Although widespread adoption of neuromorphic computing devices and the applications they enable is not imminent, businesses can expect to see real-life applications within the next three to five years. At the outset, these should take the form of technology demonstrations and early product concepts. However, like most technological breakthroughs, expect this one to gather significant momentum thereafter.
When the time comes, Outsource IT can be a critical partner to businesses navigating the new neuromorphic computing landscape. In the meantime, we can help businesses streamline and improve the management of their current IT infrastructure. To get started, contact a knowledgeable Outsource IT account manager, and ask them about our fixed cost managed IT services.