IntraMind LLC
IntraBlog

685B AI Just Dropped & It's ABSOLUTELY INSANE! (2025)

DeepSeek-V3.2: Sparse Attention, 671B Parameters, and 50% Lower API Costs for Efficient Long-Context AI

Dec 1, 2025 (Updated Feb 18, 2026) - Written by Lorenzo Pellegrini


Artificial Intelligence
Image: DeepSeek-V3.2 AI model interface on a desktop monitor and mobile app showing 685B parameter model details (image generated by Gemini)


Author's thought

While DeepSeek-V3.2's DeepSeek Sparse Attention (DSA) promises efficiency for long-context tasks, its true disruption lies in the V3.2-Speciale variant's RL-driven reasoning supremacy, which outpaces GPT-5 on IMO gold-medal benchmarks and exposes the Achilles' heel of proprietary models: unsustainable compute costs for elite performance. This pivot challenges the industry to rethink open source as not just cheaper, but strategically superior at the reasoning frontier.

Lorenzo Pellegrini