AI researchers have made another breakthrough.

The study, published in the journal Nature Computational Science, describes a new computing approach that could let large language models (LLMs) run up to 100 times faster and 10,000 times more energy-efficiently than they do today.

The idea is simple. Existing GPUs must constantly shuttle data between the processor and memory. It's like reading a book by walking to a bookcase, fetching one sentence at a time, and putting it back. The proposed method instead performs storage and computation in the same place. Eliminating that back-and-forth cuts latency and drastically reduces power consumption.
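The bottleneck described above can be sketched with a toy data-movement count (a simplified illustration only, not the paper's actual hardware model; the function names and layer size below are hypothetical):

```python
# Toy model of why compute-in-memory helps. For a matrix-vector product
# y = W @ x with an M x N weight matrix, a conventional processor must
# fetch every weight across the memory bus, while an in-memory design
# keeps W in place and only moves the input and output vectors.

def conventional_transfers(m: int, n: int) -> int:
    # every weight crosses the bus once, plus the input in and output out
    return m * n + n + m

def in_memory_transfers(m: int, n: int) -> int:
    # weights never move; only the vectors cross the memory boundary
    return n + m

M, N = 4096, 4096  # hypothetical layer size
ratio = conventional_transfers(M, N) / in_memory_transfers(M, N)
print(f"data movement reduced ~{ratio:.0f}x")
```

Even in this crude count, data movement drops by roughly the width of the weight matrix, which is why keeping weights stationary matters so much at LLM scale.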

More surprisingly, the researchers have already demonstrated GPT-2-level performance on this hardware without retraining.

If this technology becomes a reality, models like GPT-5 could one day run in the palm of your hand, offline and on far less energy than today, rather than in a data center.

Technology moves fast, but this is faster than anyone expected.

Source: https://www.nature.com/articles/s43588-025-00854-1

Published by tslaaftermarket