
Six Big AI Projects in the Race Beyond 2 Trillion Parameters

• https://www.nextbigfuture.com, by Brian Wang

The contenders are OpenAI, Anthropic, Google/DeepMind, Meta, a UK government project, and a stealth project.

It takes $1-2 billion per year in resources to be in the game: roughly $1 billion for hardware, which will need to be updated every 2-3 years, plus hundreds of AI specialists and support staff. Some of the lead people will command $1-2 million in pay plus stock options.

There are over a hundred other projects, and some of them could step up and affect the race.

GPT-5 should be finished by the end of 2023 and released early in 2024. It should have 2-5 trillion parameters.

Anthropic plans to build a model called Claude-Next, which should be 10X more capable than today's most powerful AI. Anthropic has already raised $1 billion, plans to raise another $5 billion, and will spend $1 billion over the next 18 months. Anthropic estimates its frontier model will require on the order of 10^25 FLOPs (floating point operations), several orders of magnitude more than even the biggest models today. Anthropic relies on clusters with "tens of thousands of GPUs." Google is one of Anthropic's funders.
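The 10^25 FLOPs figure can be roughly sanity-checked with the widely cited rule of thumb that training compute is about 6 x N x D, where N is the parameter count and D is the number of training tokens. This is a generic approximation, not Anthropic's published methodology, and the parameter and token counts below are illustrative assumptions only:

```python
# Sanity check of a ~1e25 FLOP training run using the common
# approximation: training FLOPs ~= 6 * N * D
# (N = parameters, D = training tokens).
# The specific numbers are illustrative assumptions, not Anthropic figures.

def training_flops(params: float, tokens: float) -> float:
    """Approximate total training FLOPs via the 6*N*D rule of thumb."""
    return 6 * params * tokens

# e.g. a 2-trillion-parameter model trained on ~1 trillion tokens
flops = training_flops(2e12, 1e12)
print(f"{flops:.1e}")  # -> 1.2e+25, the same order of magnitude as the estimate
```

Under these assumed inputs, a multi-trillion-parameter model trained on a trillion-token-scale dataset lands in the 10^25 FLOPs range, consistent with the estimate quoted above.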

Google and DeepMind are working together to develop a GPT-4 competitor called Gemini. The Gemini project is said to have begun in recent weeks, after Google's Bard failed to keep up with ChatGPT. Gemini will be a large language model with trillions of parameters, like GPT-4 or GPT-5. It will be trained on tens of thousands of Google's TPU AI chips, and training could take months to complete. Whether Gemini will be multimodal is unknown.
