AI researchers have made another breakthrough.

The study, published in the journal Nature Computational Science, proposes a new form of computation that could let large language models (LLMs) run up to 100 times faster and 10,000 times more energy-efficiently than they do today.

The core idea is simple. Today's GPUs must constantly shuttle data between their compute units and memory. It is like reading a book by walking to the bookcase for every sentence and putting the book back each time. The proposed approach instead performs storage and computation in the same place. Removing that round trip cuts latency and slashes power consumption.
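To see why data movement dominates, here is a back-of-envelope sketch in Python. The energy figures are illustrative assumptions, roughly in line with commonly cited estimates for 45 nm silicon, and are not taken from the paper itself:

```python
# Rough comparison of energy per multiply-accumulate (MAC) when operands
# must be fetched from off-chip DRAM versus when the computation happens
# where the data already lives (compute-in-memory).
# NOTE: these per-operation energies are illustrative assumptions, not
# figures from the Nature Computational Science paper.

DRAM_ACCESS_PJ = 640.0   # assumed energy to fetch one 32-bit word from DRAM (pJ)
MAC_PJ = 4.6             # assumed energy of one 32-bit multiply-accumulate (pJ)

def energy_von_neumann(n_macs: int) -> float:
    """Each MAC fetches two operands from DRAM, then computes."""
    return n_macs * (2 * DRAM_ACCESS_PJ + MAC_PJ)

def energy_in_memory(n_macs: int) -> float:
    """Operands already sit where the computation happens: no DRAM round trip."""
    return n_macs * MAC_PJ

n = 1_000_000  # MACs in a small matrix-vector product
ratio = energy_von_neumann(n) / energy_in_memory(n)
print(f"data movement inflates energy by roughly {ratio:.0f}x")
```

Under these assumed numbers, moving data costs orders of magnitude more energy than the arithmetic itself, which is why keeping computation inside the memory array can yield such large efficiency gains.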

More surprisingly, the researchers have already demonstrated GPT-2-level performance on this hardware without any retraining.

If this technology becomes a reality, models like GPT-5 could one day run in the palm of your hand, offline, on a fraction of today's energy, rather than in a data center.

Technology moves fast, almost too fast.

Source: https://www.nature.com/articles/s43588-025-00854-1

Published by tslaaftermarket