Zuckerberg says Meta will need 10x more computing power to train Llama 4 than Llama 3

  • AI
  • August 1, 2024

Meta, which develops one of the biggest foundational open-source large language models, Llama, believes it will need significantly more computing power to train models in the future. Mark Zuckerberg said on Meta’s second-quarter earnings call on Tuesday that to train Llama 4 the company will need 10x more compute than what was needed to train […]

© 2024 TechCrunch. All rights reserved. For personal use only.


