New chip uses AI to shrink large language models’ energy footprint by 50%

Ramin Javadi. Credit: Karl Maasdam

Oregon State University College of Engineering researchers have developed a more efficient chip as an antidote to the vast amounts of electricity consumed by large-language-model artificial intelligence applications like Gemini and GPT-4.

“We have designed and fabricated a new chip that consumes half the energy compared to traditional designs,” said doctoral student Ramin Javadi, who, along with Tejasvi Anand, associate professor of electrical engineering, presented the technology at the IEEE Custom Integrated Circuits Conference in Boston.

“The problem is that the energy required to transmit a single bit is not being reduced at the same rate as the data rate demand is increasing,” said Anand, who directs the Mixed Signal Circuits and Systems Lab at OSU. “That’s what is causing data centers to use so much power.”

The new chip itself is based on AI principles that reduce electricity use for signal processing, Javadi said.

“Large language models need to send and receive tremendous amounts of data over wireline, copper-based communication links in data centers, and that requires significant energy,” he said. “One solution is to develop more efficient wireline communication chips.”

When data is sent at high speeds, Javadi explained, it gets corrupted at the receiver and has to be cleaned up. Most conventional wireline communication systems use an equalizer to perform this task, and equalizers are comparatively power-hungry.

“We are using those AI principles on-chip to recover the data in a smarter and more efficient way by training the on-chip classifier to recognize and correct the errors,” Javadi said.
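The idea of replacing an equalizer with a trained on-chip classifier can be illustrated with a toy software sketch. This is not the published 16nm design: the channel model, noise levels, and the nearest-centroid "classifier" below are all invented for illustration, standing in for whatever the actual chip implements in hardware.

```python
import random

# Toy model: PAM-4 symbols are distorted by inter-symbol interference
# (ISI) from a lossy copper link, and a small trained classifier
# recovers them instead of an analog equalizer. All parameters invented.

LEVELS = [-3, -1, 1, 3]   # the four PAM-4 amplitude levels
ISI = 0.4                 # fraction of the previous symbol leaking in

def transmit(symbols):
    """Model a lossy link: each received sample is the current level
    plus ISI from the previous symbol plus small Gaussian noise."""
    rx, prev = [], 0.0
    for s in symbols:
        rx.append(s + ISI * prev + random.gauss(0, 0.1))
        prev = s
    return rx

def train_centroids(train_syms, rx_samples):
    """'Training': learn the mean received value for each
    (previous symbol, current symbol) context -- a tiny lookup table."""
    sums = {}
    prev = 0
    for s, r in zip(train_syms, rx_samples):
        key = (prev, s)
        total, n = sums.get(key, (0.0, 0))
        sums[key] = (total + r, n + 1)
        prev = s
    return {k: total / n for k, (total, n) in sums.items()}

def classify(rx_samples, centroids):
    """Decode each sample by picking the level whose learned centroid
    (given the previously decoded symbol) is nearest."""
    out, prev = [], 0
    for r in rx_samples:
        best = min(LEVELS, key=lambda s: abs(r - centroids.get((prev, s), s)))
        out.append(best)
        prev = best
    return out

random.seed(0)
train = [random.choice(LEVELS) for _ in range(2000)]
centroids = train_centroids(train, transmit(train))
data = [random.choice(LEVELS) for _ in range(1000)]
decoded = classify(transmit(data), centroids)
errors = sum(a != b for a, b in zip(data, decoded))
print(f"symbol errors: {errors}/{len(data)}")
```

Because the classifier learns the ISI-shifted decision regions from training data rather than trying to undo the channel, it can recover symbols that fixed thresholds would misread; this is the spirit, if not the substance, of the approach described in the quote.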

Javadi and Anand are working on the next iteration of the chip, which they expect to bring further gains in energy efficiency.

More information:
A 0.055pJ/bit/dB 42Gb/s PAM-4 Wireline Transceiver with Consecutive Symbol to Center (CSC) Encoding and Classification for 26dB Loss in 16nm FinFET.
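The headline figures in the paper title can be unpacked with a quick back-of-envelope calculation. The arithmetic below uses only the numbers from the title (0.055 pJ/bit/dB, 26 dB of channel loss, 42 Gb/s); the resulting per-lane power is our own derived illustration, not a figure reported in the article.

```python
# Back-of-envelope from the paper's headline figures (illustrative only).
energy_per_bit_per_db = 0.055e-12  # J/bit/dB, from the paper title
channel_loss_db = 26.0             # dB of loss, from the paper title
bit_rate = 42e9                    # bits/s, from the paper title

energy_per_bit = energy_per_bit_per_db * channel_loss_db  # ~1.43 pJ/bit
power_watts = energy_per_bit * bit_rate                   # ~60 mW per lane
print(f"{energy_per_bit * 1e12:.2f} pJ/bit, {power_watts * 1e3:.1f} mW per lane")
```

Multiplied across the thousands of wireline links in a data center, halving a per-lane figure like this is where the claimed energy savings accumulate.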

Provided by
Oregon State University


Citation: New chip uses AI to shrink large language models’ energy footprint by 50% (2025, May 8), retrieved 8 May 2025


