Open Source Friday with Remotion
Overview of managing cloud agent sessions and assigning tasks to Copilot and other agents from GitHub and IDEs.
Remotion’s creator, Johnny Burger, demonstrates building programmatic videos with AI agents, Remotion Studio, and a hybrid code-editing flow that blends React editing with agent-driven scaffolds.
Summary
GitHub’s Open Source Friday kicks off with a lively overview of how Remotion blends video production with AI agents. Johnny Burger walks Kedasha through the project’s origins, from a 2021 idea to parameterize videos using CSS, to Remotion becoming a full-fledged company with a hybrid rendering approach. The chat dives into Remotion Studio, a browser-based editor that renders React components as video, and into how agents scaffold projects from prompts, generating code and then syncing edits back and forth with the codebase. We see a demo where an agent creates a scaffold, and edits made live in the Remotion Studio are mapped back to code. Johnny explains the dual rendering paths: a server-side renderer (with a Lambda-backed, distributed approach) and an evolving client-side renderer that minimizes server dependency. The conversation also touches on branding, templates (like Stargazer), performance considerations, and the vision for more designer-friendly, less “AI slop” videos. The session closes with reflections on sustainability, the licensing model, and the broader shift toward AI-assisted video creation, encouraging developers to try Remotion via Copilot CLI and the Remotion docs.
Key Takeaways
- Remotion started as a 2021 experiment to parameterize videos with CSS instead of After Effects, evolving into a React-based framework for programmatic video creation.
- Remotion Studio is a browser-based video editor that maps UI edits back to React code, enabling hybrid workflows where users edit in code or via an AI-powered scaffold.
- Remotion supports server-side rendering (Node.js with headless browsers and FFmpeg) and a growing client-side renderer that drives video output directly in the browser to reduce server load.
- The team uses AWS Lambda-style distributed rendering (Remotion Lambda) to accelerate large renders by splitting work into multiple chunks, improving scalability for big projects.
- Branding and templates matter: Remotion offers templates (e.g., Stargazer) and is exploring deeper integration with brand guides to produce more customized, non-generic outputs.
- Remotion’s licensing model allows free usage but requires licensing for larger businesses building apps or editors on top of Remotion, aiming for sustainability without stifling community adoption.
- The conversation highlights a trend toward HTML-in-canvas workflows to enable richer, canvas-based effects and filters in video rendering, signaling a forward-looking enhancement path for visuals.
Who Is This For?
This is essential viewing for developers and product teams building AI-assisted video tools, and for anyone curious about programmatic video workflows with Remotion, especially those considering AI agents or hybrid editor setups in production.
Notable Quotes
"With GitHub Copilot, you can shift your development tasks from sequential to simultaneous with the agent of your choice in Agent HQ."
—Opening context on using AI agents (Copilot, Claude, Codex) to parallelize development tasks.
"Remotion allows you to create videos programmatically. And now you can create videos not only programmatically, but just by prompting with text, because of AI agents."
—Johnny summarizes the core premise: programmatic video with AI-assisted prompting.
"The Remotion Studio is this playback of the code. This is actual web content."
—Description of Remotion Studio’s live, code-linked video editing interface.
"We can prompt the agent to choose a specific design system... maybe GitHub branding."
—Discussing how agents can tailor visuals to brand guidelines.
"You can use any AI agent with Remotion. I prefer to use Copilot CLI."
—Practical guidance on agent integration and preferred tools.
Questions This Video Answers
- How do I start using Remotion with Copilot CLI to generate a video scaffold?
- What is Remotion Studio and how does it map edits back to React code?
- What is the Remotion licensing model and who needs a license?
- How does Remotion Lambda work to speed up video rendering?
- What is HTML in Canvas and how will it impact Remotion's future features?
Tags: Remotion, Remotion Studio, AI agents, Copilot, React, HTML in Canvas, Remotion Lambda, video rendering, branding templates, licensing model
Full Transcript
With GitHub Copilot, you can shift your development tasks from sequential to simultaneous with the agent of your choice in Agent HQ. The Agents tab on GitHub is where you can manage all your cloud agent sessions. From here you can assign ad hoc tasks to Copilot, to Claude by Anthropic, to Codex, or to a custom agent. We can also use this tool to monitor our tasks, all from one centralized location on GitHub. Now, most change requests come via issues, and you can assign issues directly to an agent like Copilot, Codex, or Claude. Choose the branch you wish to use as the base and, again, the agent you want, and let the agent do the rest.
Inspiration can strike from anywhere, including away from your desktop. We can manage agents and even assign tasks right from GitHub Mobile. Given that developers spend the bulk of their time in an IDE, it's natural to expect to have access to the same functionality from there. In VS Code, you can orchestrate all your agent sessions, local and cloud, from your IDE. Just like before, we can assign to Claude or Codex, or to Copilot using a custom agent or specified model. These tools allow you to assign and manage parallel coding agent sessions, all within your existing workflow.
Hello everyone. Welcome to Open Source Friday, the show where we talk to maintainers about the apps that they're maintaining. I'm Kedasha, and I'm so happy that you're here with me today. Today we have a really, really good chat with the maintainer of the Remotion project. If you don't know Remotion, I'm kind of obsessed with it. I've been using it to create videos programmatically for a lot of my videos. And so I'm so excited to be chatting with Johnny here today. I also wanted to let you all know that May is Maintainer Month at GitHub, and so we have a lot of programming ready for you throughout the month.
So definitely go to our maintainermonth.github.com site to see all the goodies we have in store. Let me just pop the link in here so you can go ahead and check it out. And you know we always have fun on these streams, so I'm just going to bring up Johnny right now so we can get into it. Let me know where you're joining from, let me know how you're doing, and let us know if you've ever tried Remotion in the comments. Lots of requests coming in; we'll see if we can answer those questions in the comments. But let me bring up Johnny.
Hey Johnny, welcome to the stream. Hello. Thanks a lot, Kedasha, for inviting me. So glad to be here. Yes, of course. So tell us, who is Johnny Burger and where are you joining us from? Yeah, maybe let's start with the basics. I'm joining from beautiful Zurich, Switzerland. Just a guy from Switzerland who started an experiment in 2021 to make videos programmatically, and I am still doing it. Yeah. So, Johnny, tell us, how did you come up with the idea for Remotion back in 2021?
I'm just going to jump right into the conversation because I I know that at GitHub, we've we've been using Remotion for a few years now to create videos programmatically for a few things. And so, just seeing the explosion that's happening right now, it's it's actually incredible for us to see that for you. So, how did you come up with the idea? what were you trying to build or get done? Sure. Um, yeah, I mean 2021 was a different time. We didn't have the tools that we have nowadays. Um uh my initial uh goal was really to just like have some way to like parameterize videos and maybe use like CSS to center something um in a div uh rather than to have to use After Effects um which does not have components which cannot use CSS.
So yeah. But now I think the vision has become a lot bigger. It's about producing videos, building video editors. But it was a very simple use case back then. Yeah. And so how long did it take you to build the initial, I guess, V1 of Remotion back in the day? And do you know how many people were actually using it back then? Sure. So I think back then I still had a job, and I know there was a stretch over Christmas where I would not work.
I went to my parents, took some time off, and finally managed to finish Remotion, make a trailer for it, and launch it at the beginning of 2021. I would say maybe one and a half or two months, where I would work on it mostly after work. I didn't really launch it with that big of intentions; only in the last minute did I even decide to buy a domain for it. And then I put it up on GitHub and the magic happened. Awesome. Awesome.
Yeah. So, um I guess like let's get into a demo. I know you have a demo for us um to show folks how uh Remotion works. So, if you if you if you if you're just joining, we're talking to Johnny from the Remotion project. And Remotion allows you to build um like to create videos programmatically. And now you can create videos not only programmatically, but just by prompting with text because of AI agents. So, it's it's been pretty pretty cool to see the growth. You want to share your screen again? Um, yes, absolutely. So, give me one second.
Here we are. You need to show it to everybody. Okay, perfect. So, how most people use Remotion nowadays is with a coding agent, and they would just prompt a video to be made. And one very simple thing that you can do is copy this prompt from our homepage, which just says to install the Remotion skills. These are essentially all of the instructions, all of the best practices, all of the knowledge that the AI needs to have in order to create a video.
And I'm not just going to paste it in. I'm going to edit it. Um, you know what what's uh really cool about these podcasts is that um you organize them all on on GitHub. I'm going to quickly um show show this as well. So, uh, this is, um, a GitHub issue where you kind of like, uh, use it as a system to track all of the the guests you have on your podcast. I thought it would be cool to, um, just copy this link in. And hold on, I'm going to switch the screen again. And we are now in a terminal.
I'm going to say: make an animated speaker card. Oh, nice. I can post on social media. Use 1080 by 1080 and large text. Okay, so let's slow down here for a bit. So what you're doing is: you've copied the instructions to install the skills for a coding agent to use Remotion, and not just install the skills but also give it instructions on what you wanted to create. Huh. Yeah, exactly. So this is a great way to get started, to make a scaffold. Is this going to be slop?
Yes, absolutely. But it's a very simple way to set up a project with some sample data and a sample animation. And what it will do now is set up some React components. Remotion is React-based, and it's going to write essentially a more or less regular React component. So you can use web technology, of which you can see a glimpse here. And there's a layer on top that Remotion provides, which is its animation system, and that will allow you to define an animation. In a moment I will hopefully be able to show you what it looks like.
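The animation layer described here boils down to mapping the current frame number to visual values, in the spirit of Remotion's `interpolate(frame, inputRange, outputRange)`. Below is a simplified, standalone sketch of that idea, not Remotion's actual implementation (the real API also supports easing functions and configurable extrapolation; this toy version simply clamps at the edges):

```typescript
// Minimal sketch of frame-based interpolation, the core idea behind
// animating with Remotion. Handles a single [in0, in1] -> [out0, out1]
// segment and clamps outside the range (Remotion's default extends instead).
function interpolate(
  frame: number,
  [in0, in1]: [number, number],
  [out0, out1]: [number, number],
): number {
  if (frame <= in0) return out0;
  if (frame >= in1) return out1;
  const progress = (frame - in0) / (in1 - in0);
  return out0 + progress * (out1 - out0);
}

// Example: fade a title in over the first 30 frames (1 second at 30fps).
const opacityAtFrame15 = interpolate(15, [0, 30], [0, 1]); // 0.5
```

In a real Remotion component, the frame would come from the `useCurrentFrame()` hook and the result would feed a style property like `opacity`.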
Cool. Any reason you chose to use React for um video? Um so I was just a React developer um back then and I kind of know like liked it. For me, um, it was like obvious that it would like the the community would move into this direction that React would get more popular. Um, just because you can like have real code and your markup and, you know, to me it just felt natural and it it clicked. Um I would say maybe back then it was not yet as obvious that um React would be like winning as hard at it as it is nowadays.
Um there were a bunch of other frameworks as well. Um and yeah I mean luckily we I mean not standardized on React but I think React is the the biggest framework out of all of them. Mhm. So, um I'm going to open the project. I'm going to show the code if you um would like to proceed. Let's do it. Um all right. So um as you can see um the code that it has generated um it's like regular CSS. We have um a couple of shadows. Um really nothing out of the ordinary. Um, but what is now uh special about it is that I can um open this in a video editor like interface that I'm going to show you now.
Here we go. So, you know, you're able to share your entire screen. I can't share my entire screen. The thing is, my screen is this long. Oh, it's wide. It's as wide as this span of my arms. So this is why you have to excuse me. Gotcha. Got you, for doing this switching all of the time. So now we're in the Remotion video editor, right? Yes, indeed. We call it the Remotion Studio, and this is just a playback of the code. This is actual web content.
You can also set up parameterization. It seems like the AI has already done it for us; I'm not sure if it has done it correctly. Okay. Yeah. Nice. Seems like it. So it's also kind of a form that I can edit now. And we also do some black magic to map everything back to the code. So if I edit some stuff here, it should go back to the code. And I guess the AI made a mistake here. And we can also see that this composition (a composition is essentially just a video) has been defined; that's this line in the code. And now I've jumped back into the code and I could edit it here as well.
So, kind of two ways to use Remotion. One is by writing the code in your code editor. One way is to use an agent, like we have used to scaffold the video. And what I will be doing next, what my current focus is, is to also allow some of the things you see right here to be moved around as well. So yeah. I think that's the end vision: to have kind of a hybrid video editor which is partly agentic, which you can steer with an agent, but also manually, with your hands, with your creativity.
Um, I guess this what what we have right now it's a bit um too much slop in my opinion. I'll I'll be the first one to to admit it that um it's like super cool that you can generate um these videos. Uh yet um I've now seen so many of them and they feel generic. So, I'm kind of like I kind of want to go back a bit and um you know embrace um bring your your ideas that you have into code or like to visually edit it and um give more opportunities to make something unique.
Yeah. So, how would we improve this video here? What would be the process that you would use to make it less I guess AI slop as you say? like if I wanted to add like my own um like branding and colors um um to kind of make it more personal, how would I how would you go about doing that? well, two things. We can we can prompt the agent um to um choose a specific design system. Um, maybe I can just prompt to um make it look more like the GitHub design. Um, I think this could work well.
Um, but also um, you know, our our skills are are pretty light. It pretty much like defaults on what it thinks is like good good web design. And if you ask ask me, according to my taste, it does like overload it a bit with um with content. So maybe I would just like go into the code and uh and remove some stuff. Um I'll quickly show you what the agent is doing. But uh maybe the right answer is to to ask you uh do you have suggestions on how we can make make the video better?
Yeah, I guess, you know, I've been working on creating a way for us to do change log videos. And so, just like what you do, just point it to a change log link and it creates a video. And so I fed it the GitHub brand guide: the fonts, the colors, the style. I had the agent watch a sample video of what output I'm expecting. And then we were able to get somewhere that's pretty passable. And then integrating ElevenLabs with the voice to get something that's working.
And it's still a work in progress, and I'm still looking for ways to make it better, but it's definitely GitHub-branded now, even if it could use some more work. So that's why I'm wondering, you know, how would you improve it? Yeah, you know, one very popular template that is very GitHub-centric, that I absolutely love and that I see a lot of people using, is our Stargazer template. Oh, Remotion has templates? We do have them indeed.
I'll pull it up. Maybe you have seen videos that look like this. Big fan of her. Hey, Andrea. Thank you. It's not playing right now, so can I just say something for a minute while you pull it up? Definitely check out the Remotion project. I have the link here: if you go to gh.io/remotion, you'll be able to go forth and use it. So I would encourage you to use Remotion with your favorite agent. It's so quick to set up and get started if you're interested in learning how to create videos programmatically.
That's a really good way to start. You just copy the one line that you see Johnny did earlier, pop it into your terminal, and have, you know, I use Copilot CLI. Like, I just have Copilot CLI, install, and go forth and do. It's so good. It can add captions. You can, you know, integrate like voice commands. You can do some really, really cool animations with it. It's just a matter of like learning how to do videos, like learning how to build videos and having the language to explain to the agents how to do videos. Um, that's like the only way you're going to get like a better output.
But it's pretty cool. Well, thanks a lot for the nice description. You actually described it way better than I could. And with that, now I've managed to pull up what I wanted to show. So maybe one or two people from the audience have already seen these videos. Oh, what's this? It's a Remotion template where you can enter your repository name and celebrate a stargazer milestone. So for example, if you want to post a video and celebrate that you have received 100 stars on GitHub, you could make an animation like this.
This one is also parameterized. And let's say I'm going to use GitHub events, Open Source Friday. Here we go. So, your repo has reached 36 stars so far. So I think, since that's such a nice round number, we should celebrate and render a video for this. And so it's using, what, the GitHub API to pull in the stars? It is, yeah. Well, GitHub has a really nice API that allows us to get it without an API key. If you don't use an API key, it's very heavily rate-limited.
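The repository lookup described here uses GitHub's REST API, which exposes the star count as `stargazers_count` on the repository object (unauthenticated calls are heavily rate-limited, which is why the template accepts an optional token). A sketch with illustrative function names, not the template's actual code:

```typescript
// Illustrative sketch: the GitHub REST API's /repos/{owner}/{repo}
// endpoint returns a repository object with a `stargazers_count` field.
// Unauthenticated requests are limited to 60/hour; a token raises that.
type RepoResponse = { stargazers_count: number };

function starCountFromResponse(json: RepoResponse): number {
  return json.stargazers_count;
}

async function fetchStarCount(
  owner: string,
  repo: string,
  token?: string,
): Promise<number> {
  const res = await fetch(`https://api.github.com/repos/${owner}/${repo}`, {
    headers: token ? { Authorization: `Bearer ${token}` } : {},
  });
  return starCountFromResponse(await res.json());
}
```

The parsed count would then be passed into the composition as a prop, which is what "parameterized" means in this context.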
Yes. Like, when we made the video for 30,000 stars, we actually loaded all of the data, and not getting rate-limited by GitHub was pretty hard. But then, you know, with this template you can also add an API token to... Very interesting. ...fetch all of them. Okay. So tell me more about how you built the Remotion Studio. Is it pure React? How did you get this editor going? Yeah, sure. So the trickiest part, honestly, is recognizing the fact that if you let your users mount React components, they can completely mess with the styling of their page.
Like, if the users install Tailwind or anything else that does a global reset, then every element on the page is going to be styled. And that is one of the biggest pains when building this, because for every single element that you see here, all of these menus, these render dialogues, these dropdowns, this quick switcher, these assets: you essentially have to put inline styles on all of the elements so that they have the highest precedence over the styles that the user might inject globally.
So that makes development on the Remotion Studio vastly slower than it would be to code a normal website. But that's okay; that's just one thing that I have to put up with. And, well, I guess I just said the first thing that came to my mind. The more straightforward answer to your question is that we take the React component that the user has in their code, we mount it, and we wrap it in some React contexts that we control, and through these contexts we manipulate the time.
We expose an interface for um scrubbing through and that component receives our context read the time uh from the context and is able to draw an image. Gotcha. And when it comes to like the rendering pipeline, right? So like how are you how's Remotion rendering the videos because it gives us such a clean link like here's here it is. Here you go. Go check it out. How is that working under the hood? So, we have a serverside renderer uh which I've just demoed now and we also have like a client side renderer which would also be able to um render this video.
With the server-side renderer, you need to have Node.js, which spawns a headless browser and uses FFmpeg to concatenate everything, and it has an audio mixing pipeline in Node.js which we've iterated on over the years. Maybe the fanciest version of that renderer is our Remotion Lambda solution, which splits up the render into up to 200 separate chunks that can be rendered on AWS Lambda, and then they get concatenated together in the end to make the video rendering faster. So that's pretty cool.
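The chunked approach described here, splitting one render across up to 200 Lambda functions, can be sketched as dividing the frame range into contiguous slices. Remotion Lambda's real scheduling is more involved, so this is only an illustration of the idea:

```typescript
// Sketch of splitting a frame range into contiguous chunks for
// distributed rendering, in the spirit of Remotion Lambda's approach
// (the real implementation balances chunk sizes differently).
// Returns inclusive [from, to] frame ranges.
function splitIntoChunks(
  totalFrames: number,
  maxChunks: number,
): Array<[number, number]> {
  const chunkCount = Math.min(maxChunks, totalFrames);
  const chunks: Array<[number, number]> = [];
  let start = 0;
  for (let i = 0; i < chunkCount; i++) {
    // Spread any remainder frames evenly across the remaining chunks.
    const size = Math.ceil((totalFrames - start) / (chunkCount - i));
    chunks.push([start, start + size - 1]);
    start += size;
  }
  return chunks;
}

// 900 frames (30 seconds at 30fps) across up to 200 workers:
const chunks = splitIntoChunks(900, 200);
```

Each chunk would be rendered independently on its own Lambda, and the resulting segments concatenated into the final video.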
However, the client-side renderer is way more black magic. From our side, we essentially re-implement all of the stuff: we analyze the DOM, we recognize that there's text here and that it has this type of styling. We recognize the borders, the backgrounds, the shadows. We need to add support for every CSS property manually, and then we draw that onto a canvas. And if we can do that, then we can capture the canvas, get rid of all of the Node.js and Puppeteer parts, and render a video directly in the browser. And, you know, many people use Remotion to build an app, to build video editors.
There are users of Remotion who make a video editor that renders like a million videos per month. And that's kind of hard and expensive to scale up on the server infrastructure; you cannot host it on a single VPS or on a serverless function. So we think, if the client-side renderer matures more, then we will be promoting it more. Gotcha. Gotcha. And so, when you launched Remotion, was it always browser-side and server-side rendering that was happening, or did you come upon that eventually as more and more people were using the project? And how are you managing memory?
I have a lot of questions; sorry, I'm just curious about how it's operating under the hood. Yeah. So indeed, the render is being triggered from the server side. But then the actual front-end code is client-side, which is sometimes kind of an annoying bridge that you have to make. You have to pass data between the two, and we have to optimize if you want to pass a large payload, for example. And having that extra headless browser is also kind of a pain, because we want the version of that browser to be the same for all users so that we get deterministic renders.
And we want to fit that in the serverless function. And we also try to eliminate the risk of custom FFmpeg builds messing with the determinism. So it is hard, and that's why the client-side renderer is kind of more fun: you completely eliminate all servers. When I launch the client-side renderer, and there's still a bit of work ahead, once it's done I'm just going to tweet out: hey guys, you can delete all of the servers.
You can just live on the web. It's way more fun, and no more bridge needed. Yeah. And about managing memory: very tricky, because with video you can have a memory leak really fast if you don't clean up the video frames. So imagine you have a 1080p video, and sometimes you have to process these frames uncompressed. Let's say you have a 1080p video that's 1080 by 1920; that's about 2 million pixels. Each pixel can have a red, green, blue, and maybe a transparency channel.
So we calculate with one byte for each channel, so four bytes per pixel, and suddenly you have about 8 megabytes per frame. You have 30 frames per second, and a 5-second video is more than a gigabyte. So one interesting thing that you sometimes have to do: when you render a video, you on one hand produce the frames, and on the other hand you encode the frames into H.264 or any other video codec. So it's two separate processes that are happening at the same time.
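The arithmetic walked through here (four bytes per RGBA pixel) can be captured in a small helper. For 1080 by 1920 at 30fps, five seconds of uncompressed frames comes out to roughly 1.24 GB, in the ballpark quoted in the conversation:

```typescript
// Uncompressed RGBA frame memory: width x height pixels, 4 bytes per
// pixel (one byte each for red, green, blue, and alpha), per frame.
function uncompressedBytes(
  width: number,
  height: number,
  fps: number,
  seconds: number,
): number {
  const bytesPerFrame = width * height * 4;
  return bytesPerFrame * fps * seconds;
}

// A single 1080x1920 frame is ~8.3 MB uncompressed; 5 seconds at 30fps
// is ~1.24 GB if every frame were held in memory at once.
const fiveSecondsOfFrames = uncompressedBytes(1080, 1920, 30, 5);
```

This is exactly why frames must be handed off to the encoder and freed promptly rather than accumulated.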
And if the frame-rendering part is too fast, it might be that it produces so many uncompressed frames that you run out of memory. And so it can make sense to artificially slow down the rendering if the encoder cannot keep up, and that is the concept of back pressure, which I totally did not know when I first launched Remotion. I had problems, and now I've learned a lot of things and we do better. I'm sure, I'm sure, because, you know, video processing is so heavy.
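Back pressure, as described here, means the producer slows down when the consumer falls behind. A toy sketch of the idea as a bounded queue, illustrative only and not Remotion's actual pipeline:

```typescript
// Toy back-pressure queue: the frame producer awaits when the encoder
// falls behind, so uncompressed frames never pile up past `capacity`.
class BoundedQueue<T> {
  private items: T[] = [];
  private waiters: Array<() => void> = [];
  constructor(private capacity: number) {}

  async push(item: T): Promise<void> {
    while (this.items.length >= this.capacity) {
      // Producer blocks here until the consumer takes a frame.
      await new Promise<void>((resolve) => this.waiters.push(resolve));
    }
    this.items.push(item);
  }

  pop(): T | undefined {
    const item = this.items.shift();
    const waiter = this.waiters.shift();
    if (waiter) waiter(); // wake one blocked producer
    return item;
  }

  get size(): number {
    return this.items.length;
  }
}
```

A renderer would `await queue.push(frame)` while the encoder calls `pop()` at its own pace; the await is what "artificially slows down the rendering."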
That's the only word I can think of. It's such a heavy process. So I'm curious: what is the full stack of Remotion, and how are you managing the cost? I'm sure you have corporate clients, and you said people are using it to build their own applications, so that's probably funding it. But is it just you? Do you have a team? What does that look like? Yeah, so Remotion is a company nowadays, but it's just me and my co-founder Mehmet, who is a friend from university, and then we have also hired one guy who made a great Remotion app, and we were so amazed by it that we wanted to get him on our team.
um he's working like part-time, so it's like two and a half full-time employees. Um doing all of it and we we have turned it in into a business. Um, and we've managed to do that through like a pivotal decision that that I made in the in the very beginning, which was to like put a clause into the Remotion license that you can use it for free. But if you are a bigger company uh doing specific things with Remotion, doing like for example like building an app, building a video editor, then you have to like get a license from us.
Um because and it's like it's a big topic for for maintainers. Um yeah, like how do you become sustainable? And I I personally um don't believe so much in in the sponsorship model. Um, I have a lot of friends for for whom it it works, but I I I think there's kind of like a ceiling to how much you can achieve. You you probably will only at best um get, you know, like as much money as you need to like barely get by, right? Um, and you know, like if you have like a bigger vision, right, then then you you'd be grateful for it.
And also like it's um you know, like it's it it's your right to like charge for your work like um you can in in every other profession and you know you can make the license whatever you want it to be. you you don't have to like pick between MIT and Apache and you know it's just like a markdown file and you can write down whatever you want. So, um I'm I'm really glad that I made this decision before before I put up like a lot of um truly truly free open source software on on GitHub and you know about this one I had like a feeling that you know if I would not make it sustainable that I would burn out and um I agree.
Yeah, I was very happy about that decision, and now I can call it my job. Yeah. This little experiment that you started years ago turned into a full job. I see even a company like HeyGen is using it for their video creator tool, right? That's super cool. Their video agent. Yeah, totally. So this prompt-to-video space is actually exploding right now. We saw Replit launch a product, Lovable launch a product, Claude launch a product. Quite intense, and some of these companies use Remotion behind the scenes.
But also, some of them are waking up and realizing that they can build a product like Remotion themselves. So it felt so weird that until this year, after almost five years, we had no big, true competitors. Yeah. Which almost got us wondering whether we are even building something useful. And in the past two weeks or so, some major competitors are coming up. You just mentioned one before; they did use Remotion, but they now came up with their own competing framework.
Oh wow. And uh well well they have some some really good ideas. I I give uh credit to them and um yeah I'm fired up because um you know now we can uh measure each other and and see who is best. Yeah. Well, I you know, hopefully Remotion can stay going and hopefully, you know, like everyday builders and creators will turn to Remotion for their video and animation needs, especially as you and the team work to um improve how it works with agents cuz that's how most people are going to use it, right? They're going to just tell their agent, "Hey, I want this video.
Can you use Remotion to do it?" And editing. I've seen like a lot of people doing video editing lately with um tools like Remotion. So um fingers crossed that it will keep growing and you know become one of the top video prompt video tools in the ecosystem because you know you've been here. You've been here and yeah I I hope so too. Um, I'm I'm overall really excited about, you know, like the the new world and um now I I do feel seen that that more people are are flocking to Remotion because um the agents made it um accessible and uh I'll keep working hard on you like make making that workflow smooth.
My goal is that users can really realize their creativity and not get a generic slop video out of it. I'm one of my own biggest critics. I think this is where we are at the moment. You know, I'm certainly not yet satisfied, and I'll try to make it ten times better and allow users to bring in their creativity. Yeah. So, one person is asking: can I clone and apply via Claude Max?
Yeah, you can use any AI agent with Remotion. I prefer to use Copilot CLI. Remotion has really good documentation, so you can just grab the installation link, pop it into your agent, and it will go ahead and set up the repository or directory for you to create your videos. So, do you ever think, Johnny, of refactoring the code using AI? Oh, that sounds like a big ask. Yeah, sure. The question is: what should the refactor achieve? Refactor into what?
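As a rough sketch of that agent-driven workflow (the scaffolding and Studio commands come from the Remotion docs; the Copilot CLI invocation and the prompt text are illustrative assumptions, so check each tool's current documentation), it might look like:

```shell
# Scaffold a new Remotion project from the official starter (interactive)
npx create-video@latest my-video
cd my-video

# Hand the project to an agent, e.g. GitHub Copilot CLI, and describe the
# video you want in plain language (flag and prompt are just examples)
copilot -p "Build a 10-second intro: fade in the logo, then slide in the title"

# Preview and tweak the result in Remotion Studio
npx remotion studio
```

The point of the flow is that the agent only scaffolds and edits React code; the visual feedback loop happens in Remotion Studio.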
Maybe, Amy, if you can clarify or give me a specific idea, then I can tell you if I would do it. But for smaller refactors? Yeah, all the time. It's good for that. I think life is too short to refactor by hand nowadays. Life is too short to refactor by hand, for real. Everything is moving too fast to refactor by hand. Okay. So Nicola said, "Amazing work. What's one thing that you find super promising and exciting for the future of programmatic video creation and AI?"
Oh, nice question. Yes, great question. I'll say that what I'm hyped about the most at the moment, maybe because I've been working on it this week, is that on Monday we're going to announce and show a couple of demos of using HTML in canvas in Remotion. Why this is exciting: traditional motion graphics software like After Effects has a canvas-based effect system where you can drag filters on top of your footage if you want, say, a distortion filter, a vintage filter, or an effect that makes your markup look like it's seen through a CRT display.
With the current web technologies, we are pretty much stuck there. But there is a technology called HTML in canvas which allows us to bring that markup into a canvas and then apply all kinds of crazy filters to it. You do have to upgrade Chrome to the beta version and enable a flag, but it's going to be totally worth it. We'll share some cool demos about that soon. And we've really streamlined it; last week I compiled Google Chrome four times from scratch just to get a version with this feature enabled.
So this is what I'm excited about right now. Nice, sounds good. So, what is the performance of Remotion right now? What would you say is the biggest performance bottleneck? I would say the performance overall is pretty good, pretty optimized. Compared to others in the space, we can properly make use of multithreading. We launched a more efficient video tag this year, which can make use of WebCodecs GPU-accelerated decoders, and the web renderer is also pretty fast.
So I'm pretty satisfied. Regarding the preview: if you put a lot of videos into a Remotion composition, we still struggle a little bit, and I'm trying to optimize and get us as good as we possibly can, until we are no longer the bottleneck. Yeah, nice.
And I think Amy said: refactoring for better performance, or just to be more competitive. So maybe not the whole thing. I think if Remotion leaned into being the animation platform for creators, that could be really, really good. Because as a creator myself, I'm not a video editor, but I constantly need B-rolls or animations to show what I'm teaching or conveying, and platforms like After Effects are so confusing that I don't know how to use them.
And so that's one thing I've been using Remotion to do. It's okay, but I don't have the language for video. So I think finding an area where you could lock in and really differentiate yourself, and just be hyper-specific. Yeah. I do feel you, and the whole space is not really accessible at the moment. I mean, the best you have is something like CapCut. They understood how to make it accessible, but you get terrible performance and it's all paywalled. So I'm seeing this, and I'm also looking at what we can do. We are already Remotion; the question in my head is how we become more like After Effects. So stay tuned on that front. And maybe Amy meant something else.
Maybe she wondered not about performance in terms of speed, but about how better videos can be created. Amy, if you were asking that: this morning I woke up early and wondered myself why Remotion, with this skill, did not generate a nice video where the other tool did. I made a nice optimization and changed the skill, and then it made a nicer video. This is such a tedious strain of work, but I think a lot of it now is just changing the markdown files to allow users to get the best videos.
So, not as exciting as I wish it would be, but I think the answer is: editing markdown files. Yeah. Really, really cool stuff. Keep up the work, Johnny. I know it's not easy. And for everyone who's watching, be sure to check out the Remotion project. Go to gh.io/remotion to check it out, contribute, try it out. If you like it, this is a project you could really lock into and say: you know what, I'm going to contribute to this project and see if I can make it better for my liking or for the community.
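To make the "editing markdown files" point concrete: an agent skill here is essentially a markdown instruction file the agent reads before generating video code. The file name and every rule below are purely hypothetical, a sketch of the kind of edits being described rather than an actual Remotion file (though `spring()`, `interpolate()`, and `<Series>` are real Remotion APIs):

```markdown
<!-- SKILL.md (hypothetical): instructions an agent reads before generating a video -->

## When asked to create a video
- Scaffold one React component per scene; compose scenes with `<Series>`.
- Animate with `spring()` and `interpolate()` rather than CSS transitions,
  so motion stays deterministic per frame.
- Keep on-screen text visible for at least 2 seconds; use at most two
  typefaces per video.
- Prefer the project's brand colors over library defaults, to avoid
  generic-looking output.
```

Tuning a file like this is the "tedious strain of work" described above: change a rule, regenerate, compare the resulting videos, and keep the wording that produces less generic output.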
Anything else you want to share before we wrap up? Before that, I did see one question: could you recommend how to use it with Claude Code? So, whether it's Claude Code, Codex, Copilot CLI, or whatever other agent you're using: at the beginning of the stream, Johnny showed how he started his agent, popped in the command and what he wanted the agent to do, and the agent was able to do it. Once we finish the stream, you'll be able to watch the video again on LinkedIn and see how to use it with your coding agents.
So, any last words you want to share with the folks, Johnny, before we wrap up here? You know, I'm going to leave it; I'm not going to give any plug. I'm just happy that people already got a bit of recognition of it, and if you want to try it out, I think you will find it. And maybe I can close by saying thank you to GitHub for being such an awesome host, not just here on the podcast, but also for being the collaborative platform for the Remotion community.
And I know that it's hard to scale, since you guys are also exploding at the moment, but I think you are handling it extraordinarily well. Thank you. We are doing our best, and more than our best. So thank you for saying that, and thank you for being on the stream, Johnny, to chat with us about Remotion. It's really good work, and I wish you explosive growth this year, so that you can expand your team even more, get more corporate clients, and all the things that support Remotion.
Thanks for watching, y'all. Thanks for coming on, Johnny. And I believe that's it for today's stream. Have a great day. Bye, everyone.