AI researchers have made another breakthrough.

The study, published in the journal Nature Computational Science, proposes a new computing approach that could let large language models (LLMs) run up to 100 times faster and 10,000 times more energy-efficiently than they do today.

The idea is simple. Today's GPUs must constantly shuttle data between the compute units and memory. It is like reading a book one sentence at a time, walking to the bookcase to pull the book out and then putting it back after every sentence. The proposed approach instead performs storage and computation in the same place. Eliminating that data movement cuts latency and reduces power consumption dramatically.
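The principle behind compute-in-memory can be sketched numerically. In an analog crossbar, weights are stored as conductances G; applying input voltages V produces output currents I = G @ V in a single physical step, via Ohm's and Kirchhoff's laws, so the weight matrix never leaves memory. This toy simulation illustrates that idea only; it is an assumption-laden sketch, not the design from the paper, and the array sizes and values are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Programmed" conductance crossbar holding the weights.
# The weights stay in place; they are never fetched by a processor.
G = rng.uniform(0.0, 1.0, size=(4, 3))  # 4 output lines x 3 input lines

# Input activations encoded as voltages on the input lines.
V = np.array([0.5, -0.2, 0.8])

# Each output line sums its currents in the analog domain;
# mathematically, the whole array computes one matrix-vector product.
I = G @ V

# A conventional processor computes the same result, but only after
# fetching every single weight from memory first:
fetched = [sum(G[i, j] * V[j] for j in range(3)) for i in range(4)]
assert np.allclose(I, fetched)
print(I)
```

The matrix-vector product is the dominant operation in LLM inference, which is why removing the per-weight memory traffic around it translates into such large speed and energy gains.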

More surprisingly, the researchers have already run a GPT-2-class model on this hardware without retraining it.

If this technology becomes reality, we can expect a future in which models like GPT-5 run in the palm of your hand, offline, on a fraction of today's energy, rather than in a data center.

Technology moves fast, but even by that standard, this is astonishingly fast.

Source: https://www.nature.com/articles/s43588-025-00854-1

Published by tslaaftermarket