The Download: Linux 486 retirement, DeepSeek v4, TanStack AI & more

GitHub | 00:06:05 | May 1, 2026
Chapters: 7
The episode covers Linux adoption in the public sector, highlights China’s open source cost-saving efforts, and notes an open source SDK from TanStack aimed at reducing vendor lock-in.

France pushes Linux across government, DeepSeek V4 cuts inference costs, and TanStack AI delivers open, framework-agnostic tooling for developers.

Summary

Andrea guides viewers through a packed week of open source and developer news. France’s DINUM is steering ministries toward Linux to secure data control and sovereignty, building on the Gendarmerie Nationale police force’s Ubuntu rollout. Canonical’s Ubuntu feature focuses on local model inference to boost privacy and reduce latency. DeepSeek releases V4 with cost-effective models that compete with OpenAI and Anthropic, underscoring a trend toward cheaper, on-device or near-device AI tasks. TanStack AI emerges as a truly open toolkit, offering adapters and isomorphic tools that promise type-safe cross-boundary tooling across server and client. The episode also marks Linux 7.1’s removal of Intel 486 support, a symbolic end of an era, while reassuring users that long-term support kernels remain usable. Across these stories, the throughline is clear: cost, control, and open ecosystems are driving the next wave of developer tooling and infrastructure choices. If you’re building internal apps, open source pipelines, or cloud-agnostic tools, there’s actionable insight here for how to navigate today’s AI and Linux landscapes. Join Andrea as she flags what to watch, what to try first, and where the community is headed in this rapidly evolving space.

Key Takeaways

  • France’s DINUM plans Linux migration for every ministry, following the Gendarmerie Nationale police force’s 100,000+ Ubuntu workstations and €2 million annual savings.
  • Canonical’s Ubuntu feature enables local inference to reduce latency and token costs, with snaps handling consistent permissions.
  • DeepSeek V4 (released April 24) targets open-source customers with Claude Code, OpenClaw, and opencode, aiming for cost parity or advantage over frontier models from OpenAI and Anthropic per million tokens.
  • DeepSeek pricing: V4 Pro launched at $3.48 per million output tokens, with promo discounts and aggressive undercutting of closed APIs.
  • TanStack AI bills itself as the “Switzerland of AI tooling,” delivering framework-agnostic, open-source tooling with adapters for OpenAI, Anthropic, Gemini, and more, plus isomorphic tooling for server/client type safety.
  • Removing Intel 486 support in Linux 7.1 marks a milestone in kernel history, while long-term support kernels remain usable for those systems, reflecting pragmatic maintenance decisions.
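
The model-mixing economics behind these takeaways can be sketched with a little arithmetic. A minimal sketch, assuming the $30 and $25 frontier rates quoted in the episode, an order-of-magnitude-cheaper DeepSeek rate of $3.48 per million output tokens, and purely illustrative token volumes:

```typescript
// Per-million-output-token prices from the episode's figures (illustrative).
const PRICE_PER_M_TOKENS = {
  "deepseek-v4-pro": 3.48,
  "openai-frontier": 30,
  "anthropic-frontier": 25,
} as const;

type ModelId = keyof typeof PRICE_PER_M_TOKENS;

// Cost in dollars for a given number of output tokens on a given model.
function cost(model: ModelId, outputTokens: number): number {
  return (outputTokens / 1_000_000) * PRICE_PER_M_TOKENS[model];
}

// The "mix models" pattern: route routine tasks to the cheap model and
// hard-reasoning tasks to a frontier model, then compare against
// sending everything to the frontier model.
function mixedCost(routineTokens: number, hardTokens: number): number {
  return cost("deepseek-v4-pro", routineTokens) + cost("openai-frontier", hardTokens);
}

const allFrontier = cost("openai-frontier", 10_000_000); // $300
const mixed = mixedCost(8_000_000, 2_000_000);           // ~$87.84

console.log({ allFrontier, mixed, savings: allFrontier - mixed });
```

Even with only 80% of traffic routed to the cheap model, the blended bill drops by well over half, which is the whole argument for mixing.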

Who Is This For?

Developers and IT decision-makers interested in open source AI tooling, Linux governance in government, and cost-competitive AI infra. Perfect for practitioners weighing on-device inference, vendor lock-in avoidance, and framework-agnostic tooling.

Notable Quotes

"France is going all-in on digital sovereignty. The government's digital agency DINUM is rolling out a plan for every ministry to migrate to Linux."
Sets up the central theme of government-wide Linux adoption and sovereignty.
"The idea is to run models on the device instead of round tripping to a cloud API. The pitch is simple. Privacy stays local. Latency drops."
Highlights Canonical’s local inference approach and its value proposition.
"TanStack AI is the Switzerland of AI tooling."
Characterizes the open, framework-agnostic philosophy and approach to tooling.
"The 486 is 37 years old."
Notes the historical milestone of Linux dropping i486 support in 7.1.

Questions This Video Answers

  • How does France's DINUM Linux migration plan affect government procurement and data sovereignty?
  • What makes DeepSeek V4 cost-effective compared to frontier models from OpenAI and Anthropic?
  • What is TanStack AI’s isomorphic tooling and why does it matter for type safety across server and client?
  • Why is Linux removing Intel 486 support in Linux 7.1, and what should 486 users do?
  • How does local inference in Ubuntu reduce latency and token costs?
France Linux DINUM, Ubuntu local inference, DeepSeek V4, DeepSeek open source models, TanStack AI, isomorphic tooling, Linux 7.1 Intel 486 removal, open source AI tooling
Full Transcript
France goes all-in on Linux for its public sector. China's newest open source model is dropping inference costs again, and TanStack shipped an open source SDK that takes on vendor lock-in. All this and more in today's episode of The Download. Welcome back to another episode of The Download, the show where we cover the latest developer news and open source projects. Please like and subscribe. I'm Andrea and I debug all this news so you don't have to. Let's get into it.

France is going all-in on digital sovereignty. The government's digital agency DINUM is rolling out a plan for every ministry to migrate to Linux as part of a broader effort to keep data and infrastructure decisions in hands France controls. This is not new ground for them. The Gendarmerie Nationale, their police force, has been running a custom Ubuntu build called GendBuntu since 2008. As of last year, it was on more than 100,000 workstations across the force, saving roughly €2 million a year in licensing. That is the model DINUM is now pointing to for the rest of the government. The thing to watch here is the procurement signal: a G7 country picking Linux at this scale. Open source just got one heck of a customer.

Canonical laid out its plan for local inference features in Ubuntu. The idea is to run models on the device instead of round-tripping to a cloud API. The pitch is simple: privacy stays local, latency drops, you stop paying per token for stuff that does not need a frontier model, and if your network is flaky, the model still runs. Canonical's VP of Engineering Jon Seager said the rollout will happen across the next year, focused on open-weight models, open-source harnesses, and local inference where possible. Distribution is through snaps, which keeps confinement consistent with how the rest of Ubuntu handles permissions. If you're building developer tools or internal apps, it's worth a look.
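
The local-first pitch can be sketched as a simple router: send routine prompts to a local endpoint and only escalate to a cloud API when frontier reasoning is needed. The endpoint URLs, model names, and routing rule below are illustrative assumptions, not Canonical's actual design (the localhost URL follows the Ollama-style OpenAI-compatible convention):

```typescript
// Sketch of the "run it locally first" idea. Everything here is an
// illustrative assumption, not a real product API.

type Task = { prompt: string; needsFrontierReasoning: boolean };

const LOCAL_ENDPOINT = "http://localhost:11434/v1/chat/completions"; // Ollama-style local server
const CLOUD_ENDPOINT = "https://api.example.com/v1/chat/completions"; // hypothetical cloud API

// Decide where a task goes and build the request payload for it.
function routeTask(task: Task): { url: string; body: object } {
  const url = task.needsFrontierReasoning ? CLOUD_ENDPOINT : LOCAL_ENDPOINT;
  return {
    url,
    body: {
      model: task.needsFrontierReasoning ? "frontier-model" : "local-small-model",
      messages: [{ role: "user", content: task.prompt }],
    },
  };
}

// Routine work never leaves the machine: no per-token cost, no network
// round trip, and it still works when the network is flaky.
const routed = routeTask({ prompt: "Summarize this changelog", needsFrontierReasoning: false });
console.log(routed.url);
```

The interesting design choice is that the fallback boundary is per-task, not per-app, which is what makes the cheap local model usable for the bulk of requests.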
Sometimes the cheapest, fastest, most private inference is the kind that never leaves your machine. All the links are in the show notes.

DeepSeek dropped a preview of V4 on April 24th: open-source Pro and Flash variants tuned for popular agent tools. The release notes name-check Claude Code, OpenClaw, and opencode, which tells you exactly who they think their customers are. Hint: it's me. It's us. Strong capabilities at prices that seem to keep dropping. V4 Pro launched at $3.48 per million output tokens, and DeepSeek has since cut that further with a limited-time promotional discount running through May at the time of recording. For comparison, that same volume of output tokens will cost around $30 on OpenAI and $25 on Anthropic. So DeepSeek is undercutting the closed-source frontier by close to an order of magnitude. The real story here, of course, is cost. Paying frontier API rates for every little task adds up fast, and OpenClaw even made V4 Flash its default model just two days after launch. The pattern that matters is mixing models: use a cheap one for routine things, a frontier one for hard reasoning. Well, DeepSeek just made the cheap end of that mix a lot more interesting.

In tooling and open source news, TanStack, the beloved team behind some of the most used libraries in the React ecosystem, has been building a framework-agnostic AI toolkit called TanStack AI. Fully open source, no service layer, no platform fees. The team is calling it the Switzerland of AI tooling. The alpha already ships with adapters for OpenAI, Anthropic, Gemini, and Ollama, server libraries in TypeScript, PHP, and Python, and client libraries for vanilla JS, React, and Solid, with Svelte on the way. The piece that stood out to me here is what they're calling isomorphic tools. You define a tool once with metadata definitions, then provide isolated server and client implementations. That gives you type safety across both sides of the connection, which is genuinely rare in this space.
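
The isomorphic-tools idea can be sketched in TypeScript: one shared definition carries the input and output types, and the server and client each provide an implementation against that same contract. All names here are hypothetical; TanStack AI's actual API will differ:

```typescript
// Sketch of "isomorphic tools": one shared tool definition, separate
// server and client implementations that must both satisfy it.
// Illustrative only, not TanStack AI's real API.

// Shared definition: name, input type, and output type live in one place.
interface ToolDef<In, Out> {
  name: string;
  run: (input: In) => Out;
}

type WeatherIn = { city: string };
type WeatherOut = { city: string; tempC: number };

// Server implementation: in a real app this could hit a database or API.
const weatherServer: ToolDef<WeatherIn, WeatherOut> = {
  name: "getWeather",
  run: ({ city }) => ({ city, tempC: 21 }), // stubbed lookup
};

// Client implementation: same contract, e.g. reading a local cache.
const weatherClient: ToolDef<WeatherIn, WeatherOut> = {
  name: "getWeather",
  run: ({ city }) => ({ city, tempC: 21 }),
};

// Because both sides share ToolDef<WeatherIn, WeatherOut>, a mismatch in
// input or output shape is a compile-time error, not a runtime surprise.
console.log(weatherServer.run({ city: "Paris" }));
```

The payoff is exactly the pitch from the episode: the type checker, not a runtime error, tells you when server and client drift apart.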
Per-model typing means your IDE knows which modalities each model supports: less guessing, fewer runtime surprises. It's still an alpha, so expect some sharp edges, but the through line is consistent. You own your stack, and no vendor decides for you.

And we'll close with one for the kernel nerds. A new patch series merging into Linux is removing Intel 486 support from the kernel. Linux 7.1 is the version where the i486 starts walking off the stage. The 486 is 37 years old. It shipped back in 1989, and the technical reasoning tracks: the 486 doesn't have some of the instructions modern kernel code leans on, so the kernel has been carrying emulation glue to basically fake them for years. That glue was causing more problems than the architecture was worth. Linus has been arguing for dropping it since 2022, and this patch series by Ingo Molnar is what finally moved the needle. If you're one of the people running a current Linux kernel on a 486, don't worry: the long-term support kernels will still work. You're not stranded. I have to admit there is something kind of beautiful about a kernel that ran on a 486 back in 1991 still running on that same lineage today. Linux outlasted the chip it was born on, and that is one heck of a run for a piece of software.

And that's it for this week. Let me know in the comments your thoughts on any of the topics we discussed this week. And if you like this episode, please give us a thumbs up and leave me a comment. I reply to every single one of them. And please subscribe to this YouTube channel for all your nerdy needs. See you next time.
