Vite's Next.js Killer Just Dropped

Program With Erik | 00:07:52 | Mar 23, 2026
Chapters
Introduces VNext as a Vite-based alternative to Turbopack for Next.js, promising faster builds and smaller bundles, and outlines the speaker's benchmarks and findings.

VNext promises dramatic speedups for Next.js apps by swapping in Vite-powered tooling, delivering smaller bundles and faster builds—watch for current experimental caveats.

Summary

Program With Erik walks through VNext, a Vite-based approach to running Next.js apps. Erik demonstrates how VNext can cut client bundles by up to 57% and slash build times dramatically, using a Next.js 16 app with the app router, React 19, and TypeScript 5. He shows a practical workflow: running npx vnext check, initializing with vnext init, adjusting package.json, and resolving a few config quirks (like an SSR external flag for better-sqlite3). The walkthrough includes a real-world test with a Paw and Claw dog grooming dashboard, where Erik compares production output and dev experience with and without caching. He emphasizes that VNext is experimental and still under active development, and that rendering modes, static pre-rendering, and cloud integrations are evolving. He also notes that the project keeps Turbopack and the original Next.js setup usable during migration, and highlights the sizes: output shrinking from 181 MB to 1.2 MB, and production size from 19 MB to 1.2 MB. At the end, Erik weighs pros and cons for different site scales and invites viewer opinions on migrating to VNext.

Key Takeaways

  • VNext can reduce client bundles by up to 57% and cut build times significantly (example: production size drops from 19 MB to 1.2 MB).
  • Initial checks report ~88% compatibility with a typical Next.js 16 app, with minor issues like font handling and a need to set "type": "module" in package.json.
  • Initialization with vnext init adjusts config and package.json, enabling Vite-based builds while keeping Turbopack and Next.js intact.
  • Rendering modes differ: Next.js pre-renders HTML at build time, while VNext renders on first request with ISR caching; this is the main area of ongoing work.
  • Caching and deployment features exist (e.g., memory cache handler, ISR, and Cloudflare deployment options), but some options are still experimental or partially implemented.
  • The VNext project is rapidly evolving; there are open PRs to improve static pre-rendering and overall migration flow.
  • For larger projects, performance gains must be weighed against compatibility and migration complexity; a careful benchmark is advised.
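According to the video, the init step adds vnext scripts to package.json alongside the existing ones and requires the "type": "module" flag. A sketch of what the relevant fields might look like; the script names (dev:vnext, build:vnext) come from the demo, while the commands they run are assumptions:

```json
{
  "type": "module",
  "scripts": {
    "dev": "next dev --turbopack",
    "build": "next build",
    "dev:vnext": "vnext dev",
    "build:vnext": "vnext build"
  }
}
```

Because the original dev and build scripts are untouched, the migration stays non-destructive: you can keep running Turbopack builds while you evaluate VNext.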

Who Is This For?

Frontend engineers and Next.js developers curious about cutting build times and bundle sizes, especially those evaluating a move to Vite-based tooling for production apps. It’s especially relevant for teams with moderate to large Next.js projects who want to validate gains before full migration.

Notable Quotes

"What if I could tell you you could take your Next.js application and build it four times faster and reduce the bundle size by 57% by just adding in this one thing?"
Opening pitch highlighting the claimed gains of VNext.
"The first thing I wanted to do is to actually run a check on my application. And you can do this through npx vnext check."
Shows the initial compatibility assessment step.
"It's experimental, it's under heavy development."
Flagging the current maturity level of VNext.
"The build time went from 7 seconds to 1.1 seconds after removing the cache; production size dropped from 19 megs to 1.2 megs."
Concrete performance and size gains shown in the demo.

Questions This Video Answers

  • How does VNext affect Next.js 16 app routing and server components vs TurboPack?
  • What are the current limitations of VNext for static pre-rendering?
  • Can VNext be safely migrated in stages without breaking a Next.js project?
  • What are the best practices for benchmarking VNext on a real app (e.g., font handling, external flags)?
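On the benchmarking question above, a minimal sketch of a wall-clock comparison harness. timeCommand is a hypothetical helper, and the npm script names in the commented example assume the scripts the video's init step created:

```typescript
// Run a shell command several times and report the median wall-clock time
// in milliseconds, smoothing over warm-cache and OS-noise outliers.
import { execSync } from "node:child_process";

function timeCommand(cmd: string, runs = 3): number {
  const samples: number[] = [];
  for (let i = 0; i < runs; i++) {
    const start = Date.now();
    execSync(cmd, { stdio: "ignore" }); // discard the build's own output
    samples.push(Date.now() - start);
  }
  samples.sort((a, b) => a - b);
  return samples[Math.floor(samples.length / 2)]; // median sample
}

// Example (assumed script names from the init step):
// console.log("turbopack:", timeCommand("npm run build"));
// console.log("vnext:", timeCommand("npm run build:vnext"));
```

Clearing any build cache between runs, as Erik does before quoting his 7 s vs 1.1 s numbers, keeps the comparison honest.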
Tags: VNext, Vite, Next.js 16, App Router, React 19, TypeScript 5, Turbopack, Cloudflare Workers, ISR, Static Rendering, External flags (better-sqlite3)
Full Transcript
What if I could tell you you could take your Next.js application and build it four times faster and reduce the bundle size by 57% by just adding in this one thing? In this video, I'm going to explore VNext, a new way to use Vite inside your Next.js application instead of Turbopack. I'm going to show you my benchmarks and how much faster it made my application. I created this Paw and Claw dog grooming dashboard using Next.js. I vibe-coded it a little bit, but I wanted to have a few technologies I think that we're all familiar with. I'm using Next.js 16 with the app router. I'm using React 19, TypeScript 5. I'm using server components, but I wanted to make sure I'm also using API routes. So, I have a bunch of things inside here. I'm also using cache. So, I wanted it to be an application beyond just the real basics. I wanted to use a few of the features of Next.js 16. So there's this really cool story about how Steve Faulner within one week was able to rebuild Next.js with AI and a whole bunch of tokens. You can see he was able to get the client bundles up to 57% smaller. And the way he did it was he used Vite and he created something he's calling VNext. You can read all about it in this blog post. I'll link it below. I took a look at the repo. One thing it keeps saying everywhere is that it is experimental. It's under heavy development. But I think this is so cool that with AI, people are creating cool forks and projects like this. So I thought I would try it out in one of my projects. So the first thing I wanted to do is to actually run a check on my application. And you can do this through npx vnext check. I'll just hit enter here. It's going to say yes, do I want to install it, and it's going to do a check on my Next.js application to see what it can come up with when it converts it over to Vite. So here is what it found. It shows that I have pretty good compatibility with this application except some fonts through next/font/google. That's not a big deal for me.
I also need to add type module in my package.json. I think this is pretty good. It also says I basically have 88% compatibility with it: 10 supported, one partial, one issue. So there's a couple of ways I can do this. I'm using Kiro as my agentic IDE. I also have Kiro CLI, but I thought I would just do this manually. To get this working, we just basically need to run this one command. So instead of vnext, I'm going to do vnext init. So let's see how this goes. All right, it finished the init and it made some changes. Not that many. It added the Vite config. It added some new things in the package.json. You can see here that now we have this new dev:vnext and build:vnext. And it looks like it figured everything else out for us. So let's try this out. I'm going to run npm run dev:vnext. What's nice is also we still have the Turbopack and the normal Next.js scripts inside here. So it's not like it erased them or anything. So now we ran this command. It's running on port 3001. So let's take a look at it. Okay, so it gave us an error. Definitely didn't work on the first try for this app, but I actually know what the problem is. So let me fix it real quick. So in my case, it was just a simple configuration. I needed to add in the Vite config this new external for better-sqlite3, and then it should fix it. So here it is. Here's my application. Now, you're probably thinking, why is this worth it? It's experimental. But let me show you some statistics of why I think this is really awesome and why you should try it out in your Next.js app. It's also worth mentioning real quick that I could have gone through and added in these Cloudflare VNext skills. This works great with Claude Code, Kiro, all your agentic IDEs, and then just run "migrate this project to VNext." Agentic IDEs are really, really good at migrations right now. I highly recommend using them if you're upgrading from one old version to another.
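The fix Erik describes maps onto Vite's standard ssr.external option, which keeps a dependency (here, a native module) out of the server-side bundle so Node resolves it at runtime instead. A minimal sketch of the config addition, assuming VNext honors a standard Vite config file:

```typescript
// vite.config.ts (sketch) — leave better-sqlite3 unbundled for SSR,
// since native addons cannot be bundled like plain JavaScript.
export default {
  ssr: {
    external: ["better-sqlite3"],
  },
};
```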
But in this case, my app didn't really need it, so I just showed you how to do it manually. Okay, so I just changed to a different branch and I want to show you I created these docs. I ran a bunch of tests on this. So if I open this up in preview: the build time went from 7 seconds, this is after removing the cache, to 1.1 seconds. And then if I don't have type checking, it went from four and a half to 1.1. You can see the output size is incredibly smaller. It went from 181 megs to 1.2 megs. And for production size, it went from 19 megs to 1.2. You can see just that alone is a huge improvement. However, there are trade-offs. First, I'll mention this again: it's experimental. Things are changing. Like, for example, the rendering modes. So for static pages, Next.js pre-renders at build time and serves the HTML, while VNext renders on first request and then caches via ISR and a cache handler. Next.js actually handles it a little bit better when you first open up your website. Let me show you. There's actually a GitHub issue about this. So there's something called static pre-rendering at build time. Next.js pre-renders static pages to HTML at build time, and then pages with dynamic data fetching get rendered once during next build. VNext currently doesn't do this. So, there is an open PR to make this issue go away, which I very much enjoy. It looks like they actually pushed it into main. However, they left it open because it's still not quite done. There's still more testing that needs to be done with it. But it's cool that the VNext project is moving so fast that this may not be an issue in the future. The other rendering modes are pretty on par between VNext and Next.js. It supports use cache and all the different types of caching. There's even this new vnext deploy experimental TPR, which is a Cloudflare-only option that requires a custom domain and an API token with zone analytics access. There's also ISR in VNext.
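The render-on-first-request behavior Erik contrasts with build-time pre-rendering can be sketched as a tiny in-memory cache handler. This illustrates the ISR idea only; it is not VNext's actual handler, and all names here are hypothetical:

```typescript
// ISR sketch: the first request pays the render cost, later requests are
// served from memory, and stale entries are re-rendered after a TTL.
type CacheEntry = { html: string; renderedAt: number };

class MemoryCacheHandler {
  private cache = new Map<string, CacheEntry>();
  constructor(private revalidateMs: number) {}

  get(path: string, render: () => string): string {
    const entry = this.cache.get(path);
    const now = Date.now();
    if (!entry) {
      // Cold cache: render now (the cost build-time pre-rendering avoids).
      const html = render();
      this.cache.set(path, { html, renderedAt: now });
      return html;
    }
    if (now - entry.renderedAt > this.revalidateMs) {
      // Stale: regenerate the entry, mimicking ISR revalidation.
      this.cache.set(path, { html: render(), renderedAt: now });
    }
    return this.cache.get(path)!.html;
  }
}
```

The cold-cache branch is exactly what the static pre-rendering PR Erik mentions aims to eliminate: with build-time HTML, even the first visitor gets a cached response.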
That's one thing we can do: incremental static regeneration, which is supported with this new memory cache handler. I suppose if you have a large site and performance is extremely important for you, you would need to look into the details of each one of these cache differences between VNext and Next.js to make sure that you're handling everything correctly. It looks like it works pretty well out of the box for most smaller sites, but I can imagine ginormous sites are going to have to look into this very carefully. And for those of you wondering why the output is so much smaller, there are a few things. There's no pre-rendered HTML or RSC payloads; that's why VNext is smaller. It doesn't bundle the server runtime with the default output. There are no self-hosted fonts, no dev artifacts. Rolldown tree shaking also really helps. So, if I was going to look at the pros and cons real quickly: pros with Next.js, Next.js pre-renders static pages, so the first time to byte is instant. That's what we were talking about. That PR seems to have fixed that, but it looks like it's still experimental, hasn't been implemented all the way. If you have larger sites, moving to VNext is going to make your build time so much faster. I don't even know why you would not want to do that. Some of the pros: the faster build times, the smaller output, non-destructive migration, which is really nice. So, you don't have to worry about destroying your app by moving to VNext. And if you're using Cloudflare, it has native Cloudflare Workers support. But once again, it doesn't have build-time pre-rendering. Some fonts aren't supported yet. It looks like most of the API layer is covered, but not all of it. So you are going to have some minor issues, like I had with better-sqlite3, where I had to add in that SSR external flag. So overall I think it's still worth moving over, and this is what the recommendation was when I ran it through all my different AI tools, and I agree with it.
For larger projects you may want to just double-check and compare the performance before you migrate all the way over. I want to hear what you guys think. Leave a comment below if this is something you're going to do with your Next.js apps, to move to VNext.
