Researchers develop spintronics platform for energy-efficient generative AI
Researchers at Tohoku University and the University of California, Santa Barbara have developed new computing hardware that utilizes Gaussian probability bits made from stochastic spintronic devices. This innovation is expected to provide an energy-efficient platform for power-hungry generative AI.
As Moore’s Law slows down, domain-specific hardware architectures, such as probabilistic computers built from naturally stochastic devices, are gaining attention as a way to tackle difficult computational problems. Just as quantum computers are suited to problems rooted in quantum mechanics, stochastic computers are designed to handle algorithms that are inherently probabilistic.
These algorithms can be applied in areas such as combinatorial optimization and statistical machine learning. Notably, the 2024 Nobel Prize in Physics was awarded to John Hopfield and Geoffrey Hinton for their groundbreaking work in machine learning.
Stochastic computers have traditionally been limited to binary variables, or probability bits (p-bits), making them inefficient for applications involving continuous variables. Researchers at the University of California, Santa Barbara and Tohoku University extended the p-bit model by introducing Gaussian probability bits (g-bits), which complement p-bits with the ability to generate Gaussian random numbers. Like p-bits, g-bits serve as a fundamental building block for stochastic computing, enabling optimization and machine learning with continuous variables.
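In software terms, the distinction can be emulated as follows: a p-bit produces a binary sample whose probability of being +1 is a sigmoid of its input, while a g-bit emits a continuous Gaussian sample whose mean tracks its input. The sketch below is only an illustrative analogue of this behavior; the function names and the exact input-to-distribution mappings are assumptions for illustration, not the device equations from the presented work.

```python
import numpy as np

rng = np.random.default_rng(0)

def p_bit(input_signal):
    """Emulate a probabilistic bit: a binary sample whose
    probability of being +1 is a sigmoid of the input."""
    prob_one = 1.0 / (1.0 + np.exp(-input_signal))
    return 1 if rng.random() < prob_one else -1

def g_bit(input_signal, sigma=1.0):
    """Emulate a Gaussian probabilistic bit: a continuous sample
    drawn from a normal distribution centered on the input."""
    return rng.normal(loc=input_signal, scale=sigma)

# A p-bit fluctuates between -1 and +1; a g-bit fluctuates continuously.
print([p_bit(0.5) for _ in range(5)])
print([round(g_bit(0.5), 2) for _ in range(5)])
```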
One machine learning model that benefits from g-bits is the Gauss-Bernoulli Boltzmann machine (GBM). The g-bit allows the GBM to run efficiently on stochastic computers, opening new opportunities for optimization and learning tasks. For example, current generative AI models, such as the diffusion models widely used to create realistic images, videos, and text, rely on computationally expensive iterative processes. Using g-bits allows the stochastic computer to process these iterative stages more efficiently, reducing energy consumption and speeding up the production of high-quality outputs.
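As a rough software analogue, the alternating updates such a model requires can be written as block Gibbs sampling between binary (p-bit-like) hidden units and Gaussian (g-bit-like) visible units. The sketch below assumes a generic Gaussian-Bernoulli restricted Boltzmann machine with unit visible variance and randomly initialized weights; the variable names and parameterization are illustrative assumptions, not details taken from the presented work.

```python
import numpy as np

rng = np.random.default_rng(1)

n_visible, n_hidden = 4, 8            # continuous visibles, binary hiddens
W = 0.1 * rng.standard_normal((n_visible, n_hidden))
b_v = np.zeros(n_visible)             # visible biases
b_h = np.zeros(n_hidden)              # hidden biases

def sample_hidden(v):
    """Binary (p-bit-like) hidden units given continuous visibles."""
    prob = 1.0 / (1.0 + np.exp(-(v @ W + b_h)))
    return (rng.random(n_hidden) < prob).astype(float)

def sample_visible(h):
    """Gaussian (g-bit-like) visible units given binary hiddens,
    assuming unit variance for simplicity."""
    mean = W @ h + b_v
    return rng.normal(loc=mean, scale=1.0)

# Block Gibbs sampling: each sweep alternates the two conditional samplers,
# mirroring how p-bits and g-bits would be updated in hardware.
v = rng.standard_normal(n_visible)
for _ in range(100):
    h = sample_hidden(v)
    v = sample_visible(h)
print("sample from the model:", np.round(v, 2))
```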
Other potential applications include portfolio optimization and mixed-variable problems, where the model must handle both binary and continuous variables. Traditional p-bit systems struggle with such tasks because they are discrete in nature and require complex approximations to handle continuous variables, leading to inefficiencies. Combining p-bits and g-bits overcomes these limitations and allows stochastic computers to directly and effectively address a wider range of problems.
This research was presented at the 70th IEEE International Electron Devices Meeting (IEDM).
Details: Nihal Sanjay Singh et al, Beyond Ising: Mixed Continuous Optimization with Gaussian Probabilistic Bits using Stochastic MTJs. 70th Annual IEEE International Electron Devices Meeting: iedm24.mapyourshow.com/8_0/ses … s.cfm?ScheduleID=418
Provided by Tohoku University
Citation: Researchers develop spintronics platform for energy-efficient generative AI (December 11, 2024). Retrieved December 11, 2024 from https://phys.org/news/2024-12-spintronics-platform-energy-efficient-generative.html