
This new method could reduce the energy needs of AI applications by 95% — but may also need whole new forms of hardware

Engineers reveal alternative to floating-point multiplication

New method could reduce AI energy consumption by up to 95%

But new calculation method would also need alternative hardware to existing GPUs

As artificial intelligence (AI) technologies evolve, the demand for computing power – and consequently, electricity – has surged, as have concerns about its energy consumption.

Now, engineers from BitEnergy AI offer a potential solution – a new method of computation which could reduce the energy needs of AI applications by up to 95%.

Linear-Complexity Multiplication reportedly achieves this saving by changing how AI calculations are performed, moving away from the traditional use of floating-point multiplication (FPM) in favor of integer addition.

From floating-point multiplication to linear-complexity multiplication

FPM is typically used in AI computations because it allows systems to handle very large or small numbers with high precision – however, it is also one of the most energy-intensive operations in AI processing. The precision FPM offers is necessary for many AI applications, particularly in areas like deep learning, where models require detailed calculations.
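The idea of trading multiplications for additions can be illustrated with a classic trick from logarithmic arithmetic (this is an illustrative sketch of the general principle, not BitEnergy AI's published algorithm): because an IEEE-754 float's bit pattern roughly encodes its base-2 logarithm, adding two bit patterns as integers approximates multiplying the floats – at the cost of a small relative error.

```python
import struct

# Exponent bias of IEEE-754 float32 (127), shifted to its bit position.
BIAS = 127 << 23

def float_to_bits(x: float) -> int:
    """Reinterpret a float32 value as its raw 32-bit integer pattern."""
    return struct.unpack("<I", struct.pack("<f", x))[0]

def bits_to_float(b: int) -> float:
    """Reinterpret a 32-bit integer pattern as a float32 value."""
    return struct.unpack("<f", struct.pack("<I", b & 0xFFFFFFFF))[0]

def approx_mul(a: float, b: float) -> float:
    """Approximate a * b for positive floats using one integer addition.

    bits(x) ~ 2**23 * (127 + log2(x)), so adding the bit patterns and
    subtracting the bias approximates the bit pattern of the product.
    """
    return bits_to_float(float_to_bits(a) + float_to_bits(b) - BIAS)

print(approx_mul(2.0, 8.0))  # exact for powers of two: 16.0
print(approx_mul(3.0, 5.0))  # approximate: 14.0 instead of 15.0
```

Powers of two multiply exactly, while other operands pick up a bounded relative error – which is why such methods suit error-tolerant workloads like neural network inference but need careful analysis before replacing full-precision FPM.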

The researchers claim that, despite cutting energy consumption, the method has no impact on the performance of AI applications. However, while Linear-Complexity Multiplication shows great promise, its adoption faces certain challenges.

One significant drawback is that the new technique requires different hardware from what is currently in use. Most AI applications today run on hardware optimized for floating-point computations, such as GPUs made by companies like Nvidia. The new method would require redesigned hardware to function effectively.

The team notes the hardware needed for its method has already been designed, built, and tested. However, this new hardware will need to be licensed, and it is unclear how it will be made available to the broader market.


Estimates suggest ChatGPT alone currently consumes approximately 564 MWh of electricity daily, enough to power 18,000 US households. Some critics predict that in just a few years, AI applications could consume around 100 TWh of electricity annually, putting them on par with the energy-hungry Bitcoin mining industry.
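The per-household figure is easy to sanity-check from the numbers quoted above: 564 MWh spread across 18,000 households works out to roughly 31 kWh per household per day, which is close to the average US household's daily electricity use.

```python
# Figures quoted in the article above.
daily_mwh = 564      # estimated daily ChatGPT electricity consumption, in MWh
households = 18_000  # number of US households that energy could power

# Convert MWh to kWh and divide across the households.
kwh_per_household = daily_mwh * 1_000 / households
print(round(kwh_per_household, 1))  # ~31.3 kWh per household per day
```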

Via TechXplore

Original Author: Efosa Udinmwen | Source: TechRadar

Published by
Akshit Behera
