MetaDaily – Breaking News in Crypto, Markets & Digital Trends
Crypto

China’s DeepSeek launches new open-source AI after R1 took on OpenAI

By admin · April 30, 2025 · 4 min read


Chinese artificial intelligence development company DeepSeek has released a new open-weight large language model (LLM).

DeepSeek uploaded its newest model, Prover V2, to the hosting service Hugging Face on April 30. The latest model, released under the permissive open-source MIT license, aims to tackle math proof verification.

DeepSeek-Prover-V2 HuggingFace repository. Source: HuggingFace

Prover V2 has 671 billion parameters, making it significantly larger than its predecessors, Prover V1 and Prover V1.5, which were released in August 2024. The paper accompanying the first version explained that the model was trained to translate math competition problems into formal logic using the Lean 4 programming language — a tool widely used for proving theorems.

The developers say Prover V2 compresses mathematical knowledge into a format that allows it to generate and verify proofs, potentially aiding research and education.
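As a toy illustration of the kind of Lean 4 statement a theorem-proving model is asked to state and prove (this example is ours, not from DeepSeek's paper):

```lean
-- A minimal Lean 4 theorem of the sort a proof model might emit:
-- commutativity of natural-number addition, closed by a library lemma.
theorem add_comm_example (a b : Nat) : a + b = b + a :=
  Nat.add_comm a b
```

A prover model's job is to turn an informal problem statement into a formal theorem like this and then produce a proof term that Lean's kernel can check mechanically.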

Related: Here’s why DeepSeek crashed your Bitcoin and crypto

What does it all mean?

A model, often loosely called its "weights" (strictly, the weights are only the numeric parameters the files contain), is the file or collection of files that lets you run an AI locally without relying on external servers. Still, it's worth pointing out that state-of-the-art LLMs require hardware that most people don't have access to.

This is because those models tend to have a large parameter count, which results in large files that require a lot of RAM or VRAM (GPU memory) and processing power to run. The new Prover V2 model weighs approximately 650 gigabytes and is expected to run from RAM or VRAM.

To get the files down to this size, Prover V2's weights have been quantized to 8-bit floating point precision, meaning each parameter is stored in half the space of the usual 16 bits (a bit being a single binary digit). This roughly halves the model's bulk.
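The arithmetic behind those file sizes is straightforward. A back-of-envelope sketch (illustrative only; actual memory use also includes activations and the KV cache, and reported repository sizes vary with file-format overhead):

```python
# Rough weight-storage estimate for a large language model.
def model_size_gb(n_params: float, bytes_per_param: float) -> float:
    """Approximate weight storage in gigabytes (decimal GB)."""
    return n_params * bytes_per_param / 1e9

N = 671e9  # Prover V2's reported parameter count

fp16 = model_size_gb(N, 2)  # 16-bit floats: 2 bytes per parameter
fp8 = model_size_gb(N, 1)   # 8-bit floats: 1 byte per parameter

print(f"fp16: ~{fp16:.0f} GB, fp8: ~{fp8:.0f} GB")
```

The 8-bit figure lands close to the roughly 650 gigabytes reported for the published weights, and the halving from 16 bits is exactly the compression the quantization buys.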

Prover V1 is based on the seven-billion-parameter DeepSeekMath model and was fine-tuned on synthetic data, that is, training data that was itself generated by AI models, as human-generated data is increasingly seen as a scarce source of higher-quality training material.

Prover V1.5 reportedly improved on the previous version by optimizing both training and execution and achieving higher accuracy in benchmarks. So far, the improvements introduced by Prover V2 are unclear, as no research paper or other information has been published at the time of writing.

The parameter count of the Prover V2 weights suggests that it is likely based on the company's previous R1 model. When it was first released, R1 made waves in the AI space with performance comparable to OpenAI's then state-of-the-art o1 model.

Related: South Korea suspends downloads of DeepSeek over user data concerns

The importance of open weights

Publicly releasing the weights of LLMs is a controversial topic. On one side, it is a democratizing force that allows the public to access AI on their own terms without relying on private company infrastructure.

On the other side, it means that the company cannot step in and prevent abuse of the model by enforcing certain limitations on dangerous user queries. The release of R1 in this manner raised security concerns, and some described it as China’s “Sputnik moment.”

Open-source proponents rejoiced that DeepSeek picked up where Meta left off with its LLaMA series of open models, showing that open AI is a serious contender to OpenAI's closed models. The accessibility of those models also continues to improve.

Accessible language models

Now, even users without access to a supercomputer that costs more than the average home in much of the world can run LLMs locally. This is primarily thanks to two AI development techniques: model distillation and quantization.

Distillation refers to training a compact “student” network to replicate the behavior of a larger “teacher” model, so you keep most of the performance while cutting parameters to make it accessible to less powerful hardware. Quantization consists of reducing the numeric precision of a model’s weights and activations to shrink size and boost inference speed with only minor accuracy loss.

An example is Prover V2's reduction from 16-bit to 8-bit floating point numbers; further reductions, to 4 bits or fewer, are also possible. Both techniques cost some model performance but usually leave the model largely functional.
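The core idea of quantization can be sketched in a few lines. This is a toy symmetric int8 scheme of our own for illustration; production methods (per-channel scales, fp8 formats, calibration) are considerably more involved:

```python
import numpy as np

def quantize_int8(w: np.ndarray) -> tuple[np.ndarray, float]:
    """Map float weights onto int8 levels using one shared scale factor."""
    scale = np.max(np.abs(w)) / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights from the int8 representation."""
    return q.astype(np.float32) * scale

w = np.random.randn(4, 4).astype(np.float32)  # stand-in weight tensor
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
print("max abs error:", np.max(np.abs(w - w_hat)))  # bounded by scale / 2
```

Each weight now occupies one byte instead of four (or two), and the reconstruction error is bounded by half the scale step, which is why moderate quantization degrades model quality only slightly.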

DeepSeek’s R1 was distilled into versions built on retrained LLaMA and Qwen models, ranging from 70 billion down to as few as 1.5 billion parameters. The smallest of those can even run reliably on some mobile devices.

Magazine: ‘Chernobyl’ needed to wake people to AI risks, Studio Ghibli memes: AI Eye


