
Meta Unveils a Collection of New Llama 4 AI Models
Meta has unveiled a collection of AI models from the Llama 4 family. The new suite includes Llama 4 Scout, Llama 4 Maverick, and Llama 4 Behemoth. Scout and Maverick are open-source and available on Llama.com and Hugging Face. Behemoth, the largest model, is still in training.
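For readers who want to try the downloadable models, here is a minimal sketch of pulling one from Hugging Face with the transformers library. The repository name and model class below are assumptions for illustration only; check Meta's Hugging Face organization for the exact model IDs, access requirements, and hardware guidance.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Illustrative repository name -- verify the real ID and license terms on Hugging Face.
model_id = "meta-llama/Llama-4-Scout-17B-16E-Instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",           # shard layers across whatever GPUs are available
    torch_dtype=torch.bfloat16,  # half precision to fit the weights in memory
)

prompt = "Summarize the main differences between Llama 4 Scout and Maverick."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))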
Llama 4 Scout is the smallest model in the family, a general-purpose model that Meta says can run on a single NVIDIA H100 GPU. It uses 17 billion active parameters and 16 experts, for 109 billion total parameters, and supports a 10-million-token context window.
Parameter count is a rough proxy for a model's capacity: models with more parameters can generally handle more complex tasks, though more parameters alone don't guarantee better reasoning. The "experts" are the specialized sub-networks of a mixture-of-experts architecture; only a subset of them is activated for each query, which is why the "active" parameter count is far smaller than the total.
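A toy mixture-of-experts layer makes that distinction concrete. This is a conceptual sketch in PyTorch, not Meta's actual architecture: a small router scores every expert for each token, but only the top-scoring expert runs, so only its parameters are "active" for that token.

import torch
import torch.nn as nn

class ToyMoELayer(nn.Module):
    def __init__(self, d_model=64, num_experts=16, top_k=1):
        super().__init__()
        self.router = nn.Linear(d_model, num_experts)   # scores each expert per token
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(d_model, 4 * d_model),
                           nn.GELU(),
                           nn.Linear(4 * d_model, d_model))
             for _ in range(num_experts)]
        )
        self.top_k = top_k

    def forward(self, x):                      # x: (tokens, d_model)
        scores = self.router(x)                # (tokens, num_experts)
        weights, picks = scores.topk(self.top_k, dim=-1)
        weights = weights.softmax(dim=-1)
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e in range(len(self.experts)):
                mask = picks[:, k] == e        # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, k:k+1] * self.experts[e](x[mask])
        return out

layer = ToyMoELayer()
tokens = torch.randn(8, 64)
print(layer(tokens).shape)  # torch.Size([8, 64]) -- each token used only 1 of 16 experts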
Meta says Scout outperforms Google’s Gemma 3 and Mistral 3.1 in tasks like summarization, personalization, and code reasoning.
Llama 4 Maverick also uses 17 billion active parameters but scales to 128 experts and roughly 400 billion total parameters. Its context window, at 1 million tokens, is significantly shorter than Scout's.
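As a rough back-of-the-envelope, the published active and total counts imply a split between shared parameters and per-expert parameters, assuming one routed expert per token. Meta has not published the exact breakdown, so the derived sizes below are illustrative only.

def split_params(active_b, total_b, num_experts, experts_per_token=1):
    # active = shared + experts_per_token * per_expert
    # total  = shared + num_experts      * per_expert
    per_expert = (total_b - active_b) / (num_experts - experts_per_token)
    shared = active_b - experts_per_token * per_expert
    return shared, per_expert

for name, active, total, experts in [("Scout", 17, 109, 16),
                                     ("Maverick", 17, 400, 128)]:
    shared, per_expert = split_params(active, total, experts)
    print(f"{name}: ~{shared:.0f}B shared + {experts} experts x ~{per_expert:.0f}B each")
# Scout:    ~11B shared + 16 experts  x ~6B each
# Maverick: ~14B shared + 128 experts x ~3B each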
According to Meta, Maverick exceeds comparable models such as OpenAI’s GPT-4o and Google’s Gemini 2.0 in coding, reasoning, multilingual, long-context, and image benchmarks. It also delivers coding and reasoning performance comparable to the much larger DeepSeek v3.1, from the Chinese lab whose models sent US tech stocks tumbling. Still, it’s not as advanced as Google’s Gemini 2.5 Pro, Anthropic’s Claude 3.7 Sonnet, or OpenAI’s GPT-4.5.
Meta also previewed its largest-scale AI model, Llama 4 Behemoth, which is still in training. Behemoth features 288 billion active parameters and 16 experts, for a total of roughly 2 trillion parameters. It serves as a “teacher” model used to distill the smaller Llama 4 models.
According to Meta’s benchmarks, Behemoth outperforms GPT-4.5, Claude 3.7 Sonnet, and Gemini 2.0 Pro (but not 2.5 Pro) on several STEM benchmarks, such as math problem-solving. However, a former member of Meta’s AI team has alleged that some benchmark results were artificially inflated.
Meta is serious about establishing a leading position in the AI space. It reportedly holds substantially more AI compute than its competitors, with around 350,000 Nvidia H100 chips, and it’s developing its own AI chips. By comparison, OpenAI and xAI are each reported to have around 200,000 H100s powering their AI projects.
Beyond language models, Meta has also been working on AI-powered agents for businesses and humanoid robots to tackle household chores.