A Microchip To Reshape Artificial Intelligence
The computer giant IBM has announced a new microchip design that could power a new generation of much smarter devices that don’t rely on the Cloud or the Internet for their intelligence.
Named NorthPole, the new design has the potential to enable smarter, more efficient, network-independent devices. The prototype AI chip is said to be more than a dozen times more energy efficient than current industry-leading products.
Modeled on the human brain, the new chip paves the way for a different sort of AI, one that doesn’t rely on big cloud and data companies like Amazon or Google. It could also have military applications, powering drones, robots and augmented-reality equipment in environments where adversaries can target electronic emissions.
IBM claims its 14nm analogue AI chip, which packs 35 million phase-change memory devices per component, can model up to 17 million parameters. It mimics how the human brain operates, performing computations directly within memory.
Unlike traditional chips that separate memory from processing circuits, the NorthPole chip combines the two, much as synapses in the brain both hold and process information through their connections to other neurons.
Publishing in the journal Science, IBM researchers say it is a “neural inference architecture that blurs this boundary by eliminating off-chip memory, intertwining compute with memory on-chip, and appearing externally as an active memory.”
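To make that distinction concrete, here is a minimal, purely illustrative sketch (not IBM's design or code) contrasting a conventional flow, where weights are copied out of a separate memory for every operation, with a compute-in-memory flow, where the stored weights never leave the array that performs the multiply-accumulate. All names, sizes and numbers are assumptions for illustration only.

```python
# Conceptual sketch only: compares a von Neumann-style layer, which moves
# weights out of memory before computing, with a compute-in-memory array,
# which performs the multiply-accumulate where the weights are stored.
import numpy as np

rng = np.random.default_rng(0)
weights = rng.standard_normal((256, 128))   # stand-in for stored synaptic weights
activations = rng.standard_normal(128)      # stand-in for incoming activations

def von_neumann_layer(weight_store, x):
    """Conventional chip: fetch the weights from separate memory, then compute."""
    w = np.array(weight_store, copy=True)   # models the costly memory transfer
    return w @ x                            # compute happens away from storage

class InMemoryArray:
    """Compute-in-memory: the array holds the weights and does the MAC in place."""
    def __init__(self, w):
        self.w = w                          # weights stay resident in the array

    def multiply_accumulate(self, x):
        return self.w @ x                   # only the result is read out

print(np.allclose(von_neumann_layer(weights, activations),
                  InMemoryArray(weights).multiply_accumulate(activations)))
```

Both paths produce the same result; the difference NorthPole targets is where the work happens and how much data has to move to get there.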
This is an important development because current computers have at least two characteristics that limit AI progress.
Power Consumption: The human brain, running on the equivalent of about 12 watts, can retain and retrieve the information needed to sustain a detailed conversation while simultaneously absorbing, correctly interpreting, and making decisions about the enormous amount of sensory data required to drive a car. In contrast, a conventional computer can draw 175 watts to perform a single simple task, such as processing a spreadsheet.
This is a huge limiting factor for autonomy, and energy inefficiency is one reason why many of today’s AI tools depend on vast cloud server farms that consume enormous amounts of power.
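A quick back-of-the-envelope calculation, using only the wattage figures quoted above (the one-hour window is an illustrative assumption), shows the scale of that gap.

```python
# Rough comparison using the article's own wattage figures; the one-hour
# operating window is an illustrative assumption, not measured data.
BRAIN_WATTS = 12        # approximate power budget of the human brain
COMPUTER_WATTS = 175    # conventional computer on a simple task

ratio = COMPUTER_WATTS / BRAIN_WATTS
print(f"The computer draws roughly {ratio:.0f}x the brain's power budget")

hours = 1
print(f"Energy over {hours} hour: brain {BRAIN_WATTS * hours} Wh, "
      f"computer {COMPUTER_WATTS * hours} Wh")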
Physical Constraints: Moore's Law is an observation that the number of transistors in a computer chip doubles every two years or so. As the number of transistors increases, so does processing power. The law also states that, as the number of transistors increases, the cost per transistor falls.
Moore's Law is still delivering exponential improvements, albeit at a slower pace. Industry experts have not reached a consensus on exactly when Moore's Law will cease to apply, but they broadly agree that the atomic limit on how many transistors can fit on a chip is likely to be reached soon.
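The doubling rule itself is easy to sketch; the starting transistor count and the year range below are illustrative assumptions rather than figures for any real chip.

```python
# Illustrative projection of Moore's Law: transistor counts doubling
# roughly every two years. Starting count and years are assumptions.
def transistors(start_count, start_year, year, doubling_period_years=2):
    """Project a transistor count forward under a fixed doubling period."""
    periods = (year - start_year) / doubling_period_years
    return start_count * 2 ** periods

start = 1_000_000  # hypothetical chip with one million transistors
for year in range(2000, 2021, 4):
    print(year, f"{transistors(start, 2000, year):,.0f}")
```

Ten two-year doubling periods multiply the count by roughly a thousand, which is why the eventual end of that curve matters so much for future AI hardware.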
The future of AI requires innovations in energy efficiency, from the way models are designed down to the hardware that runs them. It is becoming evident that big improvements in AI energy efficiency are essential if AI’s rapidly expanding carbon footprint is to be kept in check. With designs like NorthPole, however, the vision of less power-hungry AI is beginning to emerge.
IBM:   Princeton University:   University of Washington:   DefenseOne:   ScienceMuseumGroup:
CarandDriver:   Synopsis:   RealClearDefense:   Techradar:   GROUND:
Image: Mohamed Nohassi
You Might Also Read:
Neuromorphic Computing Changes Machine Learning:
___________________________________________________________________________________________
If you like this website and use the comprehensive 6,500-plus service supplier Directory, you can get unrestricted access, including the exclusive in-depth Directors Report series, by signing up for a Premium Subscription.
- Individual £5 per month or £50 per year. Sign Up
- Multi-User, Corporate & Library Accounts Available on Request
- Inquiries: Contact Cyber Security Intelligence
Cyber Security Intelligence: Captured Organised & Accessible