
Llama 3 LLM Model

Meta has unveiled its Llama 3 models: 8B, 70B, and 400B. What makes them different from OpenAI's models is that:

  1. I can run the server myself.
  2. I can fine-tune it and add my own extensions, such as a Korean language pack.
  3. It can learn from the data I already have.
  4. I only pay for electricity, not per-use API fees.
  5. Token usage is unlimited, whereas GPT-4 cuts you off after a certain amount.

That is the difference. In "8B", the B is an abbreviation for billion. LLM inference is usually performed with 4-bit (quantized) operations:

16 billion parameters × 4 bits = 8GB required

  • 16B × 0.5 byte = 8GB
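The 0.5-bytes-per-parameter rule above can be sketched in a few lines of Python as a sanity check; the function name is my own for illustration, not from any library:

```python
def weight_memory_gb(params_billion: float, bits_per_param: int = 4) -> float:
    """Approximate memory for model weights alone (ignores KV cache and activations).

    At 4 bits, each parameter takes 0.5 bytes, so memory in GB is simply
    billions of parameters times bytes per parameter.
    """
    bytes_per_param = bits_per_param / 8
    return params_billion * bytes_per_param

print(weight_memory_gb(16))   # 8.0 GB, matching the 16B * 0.5 byte example
print(weight_memory_gb(70))   # 35.0 GB for the 70B model
```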

Then what about the 400B model that many people are interested in?

By the same 4-bit rule, about 200GB of memory is required. I think 400B will run properly with 3-4 H100 80GB GPUs connected via NVLink. The H200 offers 141GB at 4.8TB/s, so connecting 2-3 H200s would also run the 400B model.
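Applying the 4-bit rule from the text, the GPU count can be estimated by dividing total weight memory by per-card memory; the 20% headroom for KV cache and activations is my own assumption, not a figure from the text:

```python
import math

def gpus_needed(params_billion: float, gpu_mem_gb: float,
                bits_per_param: int = 4, overhead: float = 1.2) -> int:
    """GPUs required to hold the quantized weights, plus ~20% headroom."""
    weight_gb = params_billion * bits_per_param / 8
    return math.ceil(weight_gb * overhead / gpu_mem_gb)

print(gpus_needed(400, 80))    # H100-class cards with 80GB each
print(gpus_needed(400, 141))   # H200-class cards with 141GB each
```

This only covers holding the weights; serving throughput also depends on memory bandwidth and interconnect speed.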

In other words, if 400B is faster and more accurate than GPT-4, any medium-sized company can operate the service itself. At that scale, choosing between the cloud and on-premise (its own servers) is not such a burdensome decision.

Many people are reluctant to upload their code to OpenAI, which collects data, while using GPT-4. I upload some for debugging purposes, but I don't feel comfortable handing my source to others.

Wherever GPT-class models are needed in fields such as defense, public institutions, government, public corporations, large corporations, mid-sized companies, healthcare, patents, R&D, and so on, there is a need for a model that runs on one's own data.

How much data accumulates at OpenAI every day? Meta, in the end, is said to have opened up: it did not open the source code, but it opened the operating environment. Where else is something this open, usable even on-premise?

Someone else might yet come along and build a completely open-source project, the way Linux did. Linux might never have been released if Microsoft Windows had not existed. The way the United States and Europe open things up step by step while still keeping their own advantage is impressive.

It’s a complicated world.

Perhaps servers or edge chips with built-in NPUs for 4-bit LLM and CNN inference, offering 512GB of GDDR7 and 1TB/s of bandwidth on a 256-bit bus, will become popular. If Llama 3 400B can be ported to a single chip and module, there will be demand.
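The 1TB/s figure checks out against the usual bus arithmetic: bandwidth equals bus width in bits divided by 8, times the per-pin data rate. The 32 Gbps per-pin rate below is my assumption of a plausible GDDR7 speed, not a figure from the text:

```python
def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    """Peak memory bandwidth in GB/s: (bus_width_bits / 8) * per-pin rate in Gbps."""
    return bus_width_bits / 8 * gbps_per_pin

print(bandwidth_gb_s(256, 32))  # 1024.0 GB/s, about the 1TB/s in the text
```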

These days, our company has been trying to run the latest YOLO model and an LLM on our NPU at the same time. Having been worn down by several such projects, I no longer have the motivation to open everything up and do research together. Is there any VC willing to fund this? If I had kept the export sales in cash instead of doing national projects, I could have bought a few buildings. So why am I running R&D at a deficit?

Humans do not live on bread alone, but on the word of God. LLM stands for large language model. The world was created through the Word; the Word is the Logos, translated as wisdom, and it is said that there is life in it, and that life is the light that shines on humans.

Perhaps the LLM is a principle that models this world of the Word and shows it to us. It's really fun to study the Bible with GPT-4. If I get the chance, I'll show that in another column.

People will not live by buildings alone; artificial intelligence and LLMs will bring people the light of welfare and convenience.

#llama3 #400B #LLM #LMM #VC #YOLO #NPU #AI #ArtificialIntelligence #Compiler #GPT4 #InvestmentIncentive

Published by
tslaaftermarket
