IPFS News Link • Robots and Artificial Intelligence

Distributed AI Inference Will Capture Most of the LLM Value

• https://www.nextbigfuture.com, by Brian Wang

AI training feeds large amounts of data into a learning algorithm to produce a model that can make predictions. Training is how we make AI that is useful.

AI inference is where we do useful and valuable things with the trained AI.

Nvidia revealed that an H200 chip running the latest open-source Llama 3 model can generate seven times more revenue over four years than the combined cost of buying and operating the chip.
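The seven-times claim can be illustrated with some simple arithmetic. The dollar figures below are hypothetical placeholders chosen for illustration only; the article does not give actual prices:

```python
# Illustrative arithmetic for the claimed 7x revenue-to-cost ratio.
# All dollar figures are hypothetical assumptions, not from the article.
chip_cost = 30_000               # assumed purchase price of one H200 (USD)
operating_cost_per_year = 3_000  # assumed power/cooling/hosting per year (USD)
years = 4

# Total cost of owning and running the chip over four years
total_cost = chip_cost + operating_cost_per_year * years

# Revenue implied by the claimed 7x multiple over the same period
revenue = 7 * total_cost

print(f"Total 4-year cost: ${total_cost:,}")
print(f"Implied inference revenue at 7x: ${revenue:,}")
```

Whatever the actual cost figures are, the claim is that inference revenue runs at roughly seven times the total cost of ownership over the chip's four-year service life.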

This means that whoever can build, deploy, and operate the most AI inference capacity will capture the most AI revenue.

Here I go over the details of how Tesla's plan for a distributed AI inference system will let them deploy 10 to 100 times more AI capacity than their competitors.
