
IPFS News Link • Robots and Artificial Intelligence

Tesla Expanding AI Data Centers to 500 Megawatts and AI Inference Will Be 1000s of Times Today...

• https://www.nextbigfuture.com, by Brian Wang

Elon Musk announced Tesla's plans to expand its AI hardware capabilities, aiming to increase power and cooling capacity from 130 MW to over 500 MW within the next 18 months. The expanded facility will house a mix of Tesla's own AI hardware and Nvidia and other chips, with the goal of dedicating roughly half of the AI compute building to each. The move is part of Tesla's broader AI strategy, which includes the Tesla AI5 (HW5) computer, expected to be 10 times more powerful than the current HW4 and slated for release in the second half of 2025.

Tesla will create new AI builds using Dojo and then immediately validate each build on a huge data-center inference cluster, feeding it stored videos and simulations to determine whether the new build is better. This would form a rapid-fire training flywheel. By bringing build validation into the data center, Tesla expects to speed it up by 100 times.
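In outline, the flywheel described above is a train-validate-promote loop. The sketch below is a minimal illustration of that structure only; the Dojo training and cluster evaluation steps are hypothetical stand-ins, not Tesla's actual pipeline or APIs.

```python
# A minimal, hypothetical sketch of the train -> validate -> promote loop described
# above. The Dojo training and cluster evaluation steps are stand-in stubs, not
# real Tesla APIs; only the flywheel structure is the point.

def train_on_dojo(base_build: int) -> int:
    """Stand-in for producing a new candidate build from the current best."""
    return base_build + 1

def evaluate_on_inference_cluster(build: int, clips: list[str]) -> float:
    """Stand-in for scoring a build by replaying stored videos and simulations."""
    return float(build * len(clips))

def training_flywheel(clips: list[str], iterations: int = 5) -> tuple[int, float]:
    """Repeatedly train a candidate build and keep it only if it validates better."""
    best_build, best_score = 0, float("-inf")
    for _ in range(iterations):
        candidate = train_on_dojo(best_build)                    # new AI build
        score = evaluate_on_inference_cluster(candidate, clips)  # validate in data center
        if score > best_score:                                   # promote only improvements
            best_build, best_score = candidate, score
    return best_build, best_score

print(training_flywheel(["clip_a", "clip_b", "clip_c"]))
# -> (5, 15.0)
```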

Tesla should add 10-20 million cars in 2026 and 2027, and those will carry Hardware 5 (HW5 – AI5), representing roughly 10-20 gigawatts of distributed inference hardware. If the HW5 chips are comparable to Nvidia H100s or B200s, each car would have about 4-20 petaflops of compute. At the low end, 10 million cars at 4 petaflops each would total 40,000 exaflops of inference compute; at the high end, 20 million cars at 20 petaflops each would total 400,000 exaflops.
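As a back-of-envelope check on those totals (the 4-20 petaflops-per-car figure is the assumption quoted above, not a confirmed HW5 spec):

```python
# Back-of-envelope fleet inference estimate using the ranges quoted above.
# The 4-20 petaflops-per-car figure is an assumption, not a confirmed HW5 spec.

def fleet_exaflops(cars: int, petaflops_per_car: float) -> float:
    """Total fleet inference compute in exaflops (1 exaflop = 1,000 petaflops)."""
    return cars * petaflops_per_car / 1_000

low = fleet_exaflops(10_000_000, 4)    # 10 million cars at 4 petaflops each
high = fleet_exaflops(20_000_000, 20)  # 20 million cars at 20 petaflops each
print(f"2026-2027 fleet: {low:,.0f} to {high:,.0f} exaflops")
# -> 2026-2027 fleet: 40,000 to 400,000 exaflops
```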

Elon has said that a future fleet of 100 million cars (say around 2032-2034) would have five to ten times more compute.

That would be 1,000 to 10,000 times the scale of AI computing today.
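Reading "five to ten times more compute" as a per-car multiplier on the assumed 4-20 petaflop HW5 range (an interpretation of the quote, not a stated spec), the same arithmetic puts that 100-million-car fleet in the low zettaflops:

```python
# Extends the back-of-envelope estimate to the ~2032-2034 fleet described above.
# Assumes "five to ten times more compute" applies per car, on top of the assumed
# 4-20 petaflop HW5 range; both figures are assumptions, not confirmed specs.

def fleet_exaflops(cars: int, petaflops_per_car: float) -> float:
    """Total fleet inference compute in exaflops (1 exaflop = 1,000 petaflops)."""
    return cars * petaflops_per_car / 1_000

low = fleet_exaflops(100_000_000, 5 * 4)     # 5x the 4 petaflop low end
high = fleet_exaflops(100_000_000, 10 * 20)  # 10x the 20 petaflop high end
print(f"~2032-2034 fleet: {low:,.0f} to {high:,.0f} exaflops")
# -> ~2032-2034 fleet: 2,000,000 to 20,000,000 exaflops (2 to 20 zettaflops)
```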

