Meta Platforms to deploy in-house custom chips this year to power AI drive – memo

Facebook owner Meta Platforms plans to deploy a new version of a custom chip in its data centers this year aimed at supporting its artificial intelligence (AI) push, according to an internal company document seen by Reuters on Thursday.

The chip, the second generation of Meta's in-house silicon line announced last year, will help Meta reduce its reliance on market-dominating Nvidia chips and control the spiraling costs associated with AI workloads in the race to launch AI products.

The world's biggest social media company is spending billions of dollars to boost its computing capacity for power-hungry generative AI products, which it is pushing into the Facebook, Instagram and WhatsApp apps and into hardware devices such as its Ray-Ban smart glasses, amassing specialized chips and reconfiguring data centers accordingly.

At Meta's operating scale, successfully deploying its own chip could cut hundreds of millions of dollars in annual energy costs and billions in chip purchasing costs, said Dylan Patel, founder of the silicon research group SemiAnalysis.

The chips, infrastructure and power needed to run AI applications have become a big sinkhole of investment for tech companies, partially offsetting the gains made in the rush of excitement around the technology.

A Meta spokesman confirmed plans to bring the updated chip into production in 2024, saying it would work in coordination with the hundreds of thousands of off-the-shelf graphics processing units (GPUs) — the go-to chips for AI — the company is buying.

“We see our internally developed accelerators as highly complementary to commercially available GPUs in delivering the right mix of performance and efficiency on Meta-specific workloads,” the spokesperson said in a statement.

Meta CEO Mark Zuckerberg said last month that the company plans to have about 350,000 flagship “H100” processors from Nvidia, which produces GPUs used mostly for AI, by the end of this year. Combined with supplies from other vendors, Meta will accumulate the equivalent of 600,000 H100s in total computing capacity, he said.

Deploying its own chip as part of that plan marks a positive turn for Meta's in-house AI silicon project, coming after executives decided in 2022 to pull the plug on the chip's first iteration.

The company instead chose to buy billions of dollars' worth of Nvidia's GPUs, which have a near-monopoly on an AI process called training, in which enormous data sets are fed into models to teach them how to perform tasks.

The new chip, referred to internally as “Artemis,” can, like its predecessor, perform only a process known as inference, in which models draw on their trained algorithms to make ranking judgments and generate responses to user prompts.

Reuters reported last year that Meta was also working on a more ambitious chip that, like GPUs, could handle both training and inference.

The Menlo Park, Calif.-based company shared details about the first generation of its Meta Training and Inference Accelerator (MTIA) program last year. The announcement portrayed that version of the chip as a learning opportunity.

Despite those initial missteps, the inference chip is more efficient at crunching Meta's recommendation models than power-hungry Nvidia processors, according to Patel.

“A lot of money and energy is being spent that could be saved,” he said.
