Ai2’s new small AI model outperforms similarly-sized models from Google, Meta

By admin | May 1, 2025


‘Tis the week for small AI models, it seems.

On Thursday, Ai2, the nonprofit AI research institute, released Olmo 2 1B, a 1-billion-parameter model that Ai2 claims beats similarly-sized models from Google, Meta, and Alibaba on several benchmarks. Parameters, sometimes referred to as weights, are the internal components of a model that guide its behavior.

Olmo 2 1B is available under a permissive Apache 2.0 license on the AI dev platform Hugging Face. Unlike most models, Olmo 2 1B can be replicated from scratch; Ai2 has provided the code and data sets (Olmo-mix-1124, Dolmino-mix-1124) used to develop it.
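
For readers who want to try it, a minimal sketch of loading the model through the Hugging Face transformers library might look like the following; the repo ID allenai/OLMo-2-0425-1B, the prompt, and the generation settings are assumptions for illustration, not details taken from Ai2's announcement.

# Minimal sketch: load and prompt Olmo 2 1B via Hugging Face transformers.
# Assumptions: the repo ID "allenai/OLMo-2-0425-1B" and an installed
# transformers version that includes OLMo 2 support.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMo-2-0425-1B"  # assumed Hugging Face repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Small language models are useful because"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))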

Small models might not be as capable as their behemoth counterparts, but importantly, they don’t require beefy hardware to run. That makes them much more accessible for developers and hobbyists contending with the limitations of lower-end and consumer machines.

There’s been a raft of small model launches over the past few days, from Microsoft’s Phi 4 reasoning family to Qwen’s 2.5 Omni 3B. Most of these — and Olmo 2 1B — can easily run on a modern laptop or even a mobile device.

Ai2 says that Olmo 2 1B was trained on a data set of 4 trillion tokens from publicly available, AI-generated, and manually created sources. Tokens are the raw bits of data models ingest and generate — 1 million tokens is equivalent to about 750,000 words.
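
As a rough illustration of that token-to-word relationship, the snippet below counts tokens and words for a sample sentence with a Hugging Face tokenizer; the repo ID is the same assumption as above, and the roughly 0.75 words-per-token ratio is only an approximation that varies by tokenizer and text.

# Rough illustration of tokens vs. words (the ratio varies by tokenizer and text).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("allenai/OLMo-2-0425-1B")  # assumed repo ID

text = "Small models can run comfortably on a modern laptop or phone."
token_count = len(tokenizer(text)["input_ids"])
word_count = len(text.split())
print(f"{word_count} words -> {token_count} tokens "
      f"(~{word_count / token_count:.2f} words per token)")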

On a benchmark measuring arithmetic reasoning, GSM8K, Olmo 2 1B scores better than Google’s Gemma 3 1B, Meta’s Llama 3.2 1B, and Alibaba’s Qwen 2.5 1.5B. Olmo 2 1B also eclipses the performance of those three models on TruthfulQA, a test for evaluating factual accuracy.
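
To put results like these in context, one common way to run such comparisons yourself is EleutherAI's lm-evaluation-harness; the sketch below assumes that library is installed (pip install lm-eval) and that the task names and repo ID are as shown, none of which comes from Ai2's announcement.

# Sketch: benchmark a small model with EleutherAI's lm-evaluation-harness.
# Assumptions: lm-eval is installed and the task/repo names below are valid.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=allenai/OLMo-2-0425-1B",  # assumed repo ID
    tasks=["gsm8k", "truthfulqa_mc2"],
)
for task, metrics in results["results"].items():
    print(task, metrics)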


Ai2 described the release in a post on X:

“This model was pretrained on 4T tokens of high-quality data, following the same standard pretraining into high-quality annealing of our 7, 13, & 32B models. We upload intermediate checkpoints from every 1000 steps in training. Access the base model: https://t.co/xofyWJmo85”

— Ai2 (@allen_ai), May 1, 2025

Ai2 warns that Olmo 2 1B carries risks, however. Like all AI models, it can produce “problematic outputs,” including harmful and “sensitive” content, the organization says, as well as factually inaccurate statements. For these reasons, Ai2 recommends against deploying Olmo 2 1B in commercial settings.




