Summary: Meta, led by CEO Mark Zuckerberg, is investing billions of dollars in Nvidia H100 GPUs to build out massive compute infrastructure for its AI research and projects. By the end of 2024, Meta aims to have 350,000 of these GPUs, an expenditure that could reach $9 billion. The buildout is part of Meta's push toward developing artificial general intelligence (AGI), putting it in competition with firms like OpenAI and Google DeepMind. AI and compute investments are a key part of Meta's 2024 budget, with the company calling AI its largest investment area.

  • qupada@kbin.social · 10 months ago

    The estimated training time for GPT-4 is around 90 days, though.

    Assuming you could scale that linearly with the amount of hardware, you'd get it down to about 3.5 days: GPT-4 was reportedly trained on roughly 25,000 A100s, so 350,000 H100s would be 14× the GPU count, with each H100 roughly twice as fast on top of that. From four times a year to twice a week.
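    A quick back-of-the-envelope check of that claim (the ~25,000-A100 baseline and the ~2× per-GPU H100 speedup are assumptions from public reporting, not figures from the article):

    ```python
    # Rough scaling estimate under the (optimistic) assumption that
    # training time scales perfectly linearly with effective compute.

    BASELINE_DAYS = 90        # reported GPT-4 training time
    BASELINE_GPUS = 25_000    # reported A100 count for GPT-4 (assumption)
    H100_SPEEDUP = 2.0        # rough H100-vs-A100 per-GPU factor (assumption)
    META_GPUS = 350_000       # Meta's end-of-2024 H100 target

    effective_scale = (META_GPUS / BASELINE_GPUS) * H100_SPEEDUP   # ~28x
    print(f"~{BASELINE_DAYS / effective_scale:.1f} days per run")  # ~3.2 days
    ```

    In practice, scaling efficiency falls well below linear at that cluster size, so real runs would take longer, but the order of magnitude holds.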

    If you’re scrambling to get ahead of the competition, being able to iterate that quickly could very much be worth the money.