
New chip uses AI to shrink large language models’ energy footprint by 50%

Ramin Javadi. Credit: Karl Maasdam

Oregon State University College of Engineering researchers have developed a more efficient chip as an antidote to the vast amounts of electricity consumed by large-language-model artificial intelligence applications like Gemini and GPT-4.

“We have designed and fabricated a new chip that consumes half the energy compared to traditional designs,” said doctoral student Ramin Javadi, who, along with Tejasvi Anand, associate professor of electrical engineering, presented the technology at the IEEE Custom Integrated Circuits Conference in Boston.

“The problem is that the energy required to transmit a single bit is not being reduced at the same rate as the data rate demand is increasing,” said Anand, who directs the Mixed Signal Circuits and Systems Lab at OSU. “That’s what is causing data centers to use so much power.”

The new chip itself is based on AI principles that reduce electricity use for signal processing, Javadi said.

“Large language models need to send and receive tremendous amounts of data over wireline, copper-based communication links in data centers, and that requires significant energy,” he said. “One solution is to develop more efficient wireline communication chips.”

When data is sent at high speeds, Javadi explained, it gets corrupted at the receiver and has to be cleaned up. Most conventional wireline communication systems use an equalizer to perform this task, and equalizers are comparatively power-hungry.
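To make that concrete, the sketch below is a minimal, purely illustrative Python model of the conventional approach: a feed-forward equalizer (FFE) whose taps are fit by least squares to undo a toy dispersive channel acting on PAM-4 symbols (the modulation named in the paper title below). The channel response, tap count, and noise level are assumptions chosen for illustration, not parameters of the OSU chip.

```python
import numpy as np

# Illustrative only: a toy PAM-4 link with inter-symbol interference (ISI)
# and a least-squares-trained feed-forward equalizer (FFE). The channel,
# tap count, and noise level are assumptions, not the OSU chip's design.

rng = np.random.default_rng(0)
levels = np.array([-3.0, -1.0, 1.0, 3.0])            # the four PAM-4 levels
symbols = rng.choice(levels, size=5000)               # transmitted symbols

# Simple dispersive channel: each received sample mixes neighboring symbols.
channel = np.array([0.1, 0.8, 0.3, 0.1])
received = np.convolve(symbols, channel, mode="same")
received += rng.normal(scale=0.2, size=received.shape)  # additive noise

# Feed-forward equalizer: learn FIR taps that map a window of received
# samples back to the transmitted symbol (least-squares fit on a training run).
n_taps = 7
half = n_taps // 2
X = np.array([received[i - half:i + half + 1]
              for i in range(half, len(received) - half)])
y = symbols[half:len(symbols) - half]
taps, *_ = np.linalg.lstsq(X, y, rcond=None)

# Apply the equalizer, then slice each sample to the nearest PAM-4 level.
equalized = X @ taps
decided = levels[np.argmin(np.abs(equalized[:, None] - levels), axis=1)]
print("symbol error rate:", np.mean(decided != y))
```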

“We are using those AI principles on-chip to recover the data in a smarter and more efficient way by training the on-chip classifier to recognize and correct the errors,” Javadi said.
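The article does not describe the on-chip classifier's architecture, but the general idea of replacing equalization with classification can be sketched in the same toy setting: a small classifier is trained to map a window of corrupted samples directly to the transmitted PAM-4 symbol. The logistic-regression model below is a hypothetical software stand-in, not the chip's actual circuit; the point is only the change in framing, from filtering the waveform back to clean levels to classifying which symbol was sent.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical sketch: recover PAM-4 symbols with a trained classifier
# instead of an equalizer. The channel, noise, and model choice are
# assumptions for illustration; the paper's classifier is a custom circuit.

rng = np.random.default_rng(1)
levels = np.array([-3.0, -1.0, 1.0, 3.0])
symbols = rng.choice(levels, size=6000)

channel = np.array([0.1, 0.8, 0.3, 0.1])              # toy dispersive channel
received = np.convolve(symbols, channel, mode="same")
received += rng.normal(scale=0.2, size=received.shape)

# Build (window of corrupted samples -> transmitted symbol class) pairs.
n_taps, half = 7, 3
X = np.array([received[i - half:i + half + 1]
              for i in range(half, len(received) - half)])
y = np.searchsorted(levels, symbols[half:len(symbols) - half])  # classes 0..3

# Train on the first part of the stream, evaluate on held-out symbols.
split = 4000
clf = LogisticRegression(max_iter=1000).fit(X[:split], y[:split])
print("held-out symbol error rate:", 1.0 - clf.score(X[split:], y[split:]))
```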

Javadi and Anand are working on the next iteration of the chip, which they expect to bring further gains in energy efficiency.

More information:
A 0.055pJ/bit/dB 42Gb/s PAM-4 Wireline Transceiver with Consecutive Symbol to Center (CSC) Encoding and Classification for 26dB Loss in 16nm FinFET, presented at the IEEE Custom Integrated Circuits Conference.

Provided by Oregon State University


Citation: New chip uses AI to shrink large language models' energy footprint by 50% (2025, May 8), retrieved 8 May 2025

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.

