A discussion thread on Product Hunt about DeepSeek-V4 is starting to gain attention in AI circles, and not without reason. DeepSeek has done it again: released a model that, on paper, shouldn't be possible to deliver at these prices.
Let's start from the top: DeepSeek-V4 comes in two variants, Pro and Flash. The Pro version has 1.6 trillion total parameters but activates only 49 billion per token thanks to a Mixture-of-Experts architecture. MoE itself isn't new for DeepSeek; what is new is a hybrid attention mechanism combining what they call CSA and HCA, which reportedly lets the model get by with only 27% of the FLOPs of its predecessor, DeepSeek-V3.2, on long contexts. The Flash version pushes this even further, down to 10%.
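To put those sparsity claims in perspective, here is a quick back-of-envelope calculation. All figures come from the thread, not from any verified spec:

```python
# Back-of-envelope sketch of the sparsity claims above.
# Numbers are from the community thread, not an official spec.
TOTAL_PARAMS = 1.6e12   # 1.6T total parameters (Pro variant)
ACTIVE_PARAMS = 49e9    # 49B parameters active per token via MoE routing

# Fraction of the weights actually touched for any given token.
active_fraction = ACTIVE_PARAMS / TOTAL_PARAMS
print(f"Active per token: {active_fraction:.1%}")  # ~3.1%

# Claimed long-context attention cost relative to DeepSeek-V3.2.
pro_flops_ratio = 0.27    # Pro: 27% of V3.2's FLOPs
flash_flops_ratio = 0.10  # Flash: 10% of V3.2's FLOPs
print(f"Flash vs Pro relative cost: {flash_flops_ratio / pro_flops_ratio:.2f}x")
```

In other words, only about 3% of the network fires per token, which is what makes the compute economics plausible at all.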
And the model is natively multimodal. Trained from the ground up on text, images, video, and audio — not bolted on afterwards.
On LiveCodeBench, it beats Claude by a good margin: 93.5% vs 88.8%. On GPQA, it lands at 90.1%. These are numbers that, six months ago, belonged solely to the closed frontier models.
The pricing is what really raises eyebrows. The Flash variant costs $0.14 per million input tokens and $0.28 per million output tokens; Pro is at $1.74 and $3.48. That is a small fraction of what GPT-4-class APIs from OpenAI or Anthropic cost.
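To make the price list concrete, here is a minimal cost sketch at the rates quoted above. The workload figures are purely illustrative:

```python
# Hedged cost sketch using the per-million-token prices quoted in the post.
def monthly_cost(input_mtok: float, output_mtok: float,
                 in_price: float, out_price: float) -> float:
    """USD cost for a workload measured in millions of tokens."""
    return input_mtok * in_price + output_mtok * out_price

# Hypothetical workload: 500M input tokens, 100M output tokens per month.
flash = monthly_cost(500, 100, in_price=0.14, out_price=0.28)
pro = monthly_cost(500, 100, in_price=1.74, out_price=3.48)

print(f"Flash: ${flash:,.2f}/mo")  # $98.00
print(f"Pro:   ${pro:,.2f}/mo")    # $1,218.00
```

At these rates, even the Pro tier handles a fairly heavy monthly workload for what a closed frontier API typically charges for a fraction of the tokens.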
The weights are open under Apache 2.0 or MIT licensing, and community members have already started testing local execution and fine-tuning.
It's worth emphasizing: this is an early signal based on community discussion and available technical documentation — not an independent, peer-reviewed analysis. Benchmark figures from the model producer itself should always be taken with a grain of salt until others confirm them.
But the sentiment is clear: reproduction and testing efforts are already underway on r/LocalLLaMA and in HN circles. If the numbers hold up under independent evaluation, this is a new data point that puts significant pressure on the pricing of the closed models.
Keep an eye on this. Mainstream tech media has not yet picked it up.
