Open Source Friday with Unity-MCP
Chapters: 12
Introduce MCP as a bridge that enables AI assistants to interact with the Unity editor and assets.
Unity MCP opens a bridge for AI agents to interact with Unity, enabling automated scene setup, scripting, and UI work inside the editor with open-source tooling.
Summary
GitHub’s Open Source Friday features Kevin discussing Unity MCP, a bridge that lets AI agents directly interact with the Unity editor. The project originated from a hobbyist VR developer and grew through collaborations with Coplay, Aura, and Ramen VR to support both Unity and Unreal workflows. Kevin draws on a long career in AAA game development, including Blizzard and a collaboration with Google DeepMind on StarCraft II, to explain how MCP abstracts diverse asset types into a format usable by language models. The core promise is twofold: first, serialize and expose assets in Unity so the LLM can reason about them; second, provide interfaces for the LLM to output changes that affect scenes, scripts, and UI. He emphasizes that Unity MCP targets both asset manipulation (scenes, scripts, widgets, materials) and the ways to connect these pieces into a coherent game flow, while acknowledging the ongoing maturation of AI tooling and the need for good evaluation loops. The conversation also compares Unity MCP to Blender MCP, clarifying that MCP is a protocol for exposing program functionality to LLMs, with Unity acting as the domain-specific target. A live demo showcases a simple Unity scene where an AI agent reads the scene, adds objects in a circle, and attaches a script to spin those objects, illustrating the data flow between Claude Code, MCP tools, and the Unity editor. Overall, Kevin invites developers to experiment freely, contribute via GitHub or Discord, and pursue practical, well-scoped prompts that yield verifiable results.
Key Takeaways
- MCP acts as a protocol to expose Unity (and other apps) functionality to LLMs, enabling scripted interactions with assets, scenes, and UI.
- Unity MCP uses asset-type specific tools (e.g., scene understanding, UI widgets, scripting) to translate diverse Unity assets into LM-understandable representations.
- The workflow supports a modular approach where multiple MCP clients (Unity, Blender, etc.) can be orchestrated, enabling cross-tool pipelines and shared learnings.
- A practical demo shows an AI agent inspecting a scene, adding cubes in a circle, and attaching a spinning script, illustrating end-to-end data flow from Claude Code to the Unity editor.
- The project emphasizes verifiable prompts and feedback loops (verification of changes, error handling) to reduce model hallucinations and improve reliability.
- Community involvement is encouraged via GitHub, and a Coplay Discord channel serves as a primary hub for tutorials, setup, and contributions.
- The tool is positioned for both hobbyist experimentation and industry-facing usage, recognizing ongoing AI tooling maturation while delivering tangible value today.
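The first two takeaways hinge on MCP's core pattern: a server advertises named, described tools, and the agent lists and calls them. Here is a toy sketch of that pattern in plain Python; it is not the real MCP SDK, and the tool name and canned data are invented for illustration.

```python
import json

# Toy registry mimicking the MCP pattern: each tool has a name, a
# description the LLM reads, and a handler the host program executes.
TOOLS = {}

def tool(name, description):
    def register(fn):
        TOOLS[name] = {"description": description, "handler": fn}
        return fn
    return register

@tool("get_scene_hierarchy", "List the objects in the open Unity scene.")
def get_scene_hierarchy():
    # A real bridge would query the editor; here the data is canned.
    return [{"name": "Cube", "position": [0, 0, 0]}]

def list_tools():
    """What the agent sees when it asks the server for its tools."""
    return [{"name": n, "description": t["description"]} for n, t in TOOLS.items()]

def call_tool(name, **kwargs):
    """Dispatch a tool call requested by the agent."""
    return TOOLS[name]["handler"](**kwargs)

print(json.dumps(list_tools()))
print(call_tool("get_scene_hierarchy"))
```

The real protocol adds a transport, JSON-RPC framing, and input schemas, but the list/call shape is the essence of how an editor's functionality gets exposed to a model.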
Who Is This For?
Unity and Unreal developers exploring AI-assisted game development, AI researchers prototyping editor-level automation, and hobbyists curious about open-source MCP tooling for game design workflows.
Notable Quotes
"This project has an interesting history. It's kind of like jumped around between multiple different like maintainers and collaborators."
—Kevin explains the open-source origins and community-driven evolution of Unity MCP.
"The core problems are very similar between Unity and Unreal: you have many asset types and you have to hook them all together."
—Highlights shared engineering challenges across engines and the value of MCP as a unifying approach.
"One of the main benefits is there’s separate MCP tools for different asset types, like scene understanding and UI widgets."
—Describes the modular toolset that makes AI interactions with Unity feasible.
"With LM tools, there’s a learning curve, but prompt clarity up front dramatically improves outcomes."
—Emphasizes prompt engineering and setup discipline when guiding AI in game dev tasks.
"Just try to make a game with it and give us feedback on where the holes are—there are no barriers since it’s all open source."
—Encourages hands-on experimentation and community contribution to improve the project.
Questions This Video Answers
- How can Unity MCP help me automate scene setup and scripting in Unity?
- What is MCP and how does it connect LLMs to Unity or Blender?
- What are practical examples of AI-assisted tasks in Unity using MCP?
- How do you verify and test AI-generated changes in a Unity project to avoid crashes?
Unity MCP · MCP (Model Context Protocol) · Open Source Friday · Unity · Unreal Engine · AI in game development · C# scripting · UI widgets · Asset serialization · Claude Code
Full Transcript
[music] Hey. Hey. Hey. Hello, hello. Good morning, good evening, and good afternoon. And welcome to everyone on Friday, Open Source Friday with uh Unity MCP. We're having Kevin join us today to talk about uh Unity MCP, which is a bridge uh allowing AI assistants to directly integrate with the Unity game development editor with uh an MCP client. So, super excited to kind of talk about game development and AI tools and have a little fun today. Kevin, would love to have you introduce yourself to the team uh across Open Source Friday and looking forward to some fun.
Yeah. Yeah. Hey, thanks for having me. As you said, I'm also named Kevin. Um, I come from a long history of um, game development. I've been in the industry almost 20 years from an engineering perspective. Um, most of my experience has been in large AAA game development. Um, I was at Blizzard Entertainment for a long time. Worked on various projects. Um, but in the last couple years, I've been more focused on kind of the tooling space and AI applications and gamedev. How do we accelerate the industry? And I was just realizing right now like this is actually the third company I've been at where I've been working on open source tools for like AI and game development like you're saying right now.
Yeah, I just realized [laughter] that. So this we're talking about the uh the Unity MCP tools, and at my last company we were working on more of the art pipelines, trying to make open source package management for sharing art and PCG tools for games. And before that at Blizzard, I was one of the main developers behind the collaboration with Google DeepMind where we made an AI that could play StarCraft II, and we tried to cultivate a community of kind of hobbyists to kind of build AIs that can play video games. So I've seen many eras of, you know, open source AI and gamedev.
Yeah, I mean the intersection uh of all of that has definitely uh adjusted over a decade. I remember when I first started um kind of in the AI space probably about almost over a decade ago, we were looking at how AI could be within the tooling space within game development too. Mostly around like simplifying the process, making it easier to produce games. Uh and it's interesting to see how far we've come in 15 years or so. I imagine you've kind of thought the same thing. Um awesome. Well, well, thank you for joining us today.
Uh maybe just to kind of kick it off a little bit, I'd love to get your background not just in kind of uh open source and gamedev, but like how did Unity MCP come together? Like how did you think about the project? Where where did the idea come from? And then uh yeah, we'll kind of jump in from there. Yeah. Yeah. This project has an interesting history. It's kind of like jumped around between multiple different like maintainers and collaborators. It's really been that like open source perspective. I think it kicked off like early last year.
there was a VR developer Justin who just did it as like a side project as a hobby to show it off and we put it up on GitHub you know free open source and it it kind of exploded like people found a lot of value out of it um like nowadays in the industry there's there's so many hobbyists um just thinking back on my own career like 10 20 years ago the accessibility of these tools is like so much greater now you can just download Unity try it out there's so many more people that can watch YouTube videos tutorials try it out um so the intersection of the accessibility of this stuff plus this like really like great project from Justin really like exploded.
Um so as he kind of started working more on that um he started collaborating with this company called Coplay. Um so there's a number of developers there like Yos um that he worked with for a while, and they tried to build more of a product around it but still keeping the open source core, you know, free and maintained. And then like last month we kind of brought in more people into the team. So there's another company we're now collaborating with, which I kind of come in from, um called Aura and Ramen VR. So we work more on kind of Unreal Engine development tools.
Okay. So we're now working with them on the Unreal space as well to kind of build this platform to you know empower game developers across the whole ecosystem to you know make great games. That's awesome. And it all and it all comes Yeah. Like maybe just double clicking with that like I mean coming across Unity and Unreal, like how are you seeing these tools kind of come across the intersection of of of AI tooling specifically? Like I yeah I'm kind of curious about that as a topic. Yeah, I mean like they're very different ecosystems, Unity and Unreal, but in reality like the core problems are very similar. Like so the applications are very um well like I kind of explain it in terms of like when you work on games, a lot of people think about it as I want to put a character in my scene that can run around.
That seems like a pretty simple operation but but under the hood it's actually pretty complicated. There's a lot of different types of assets and data files that can be configured. There's like scripts, there's like text assets like JSON, there's binary assets like images and 3D models. And you have to also configure them in the right way and also hook them all together to make it work. So a simple character like placing my scene, walk around. There might be like an input system you have to configure. It might be like a 3D model. You got to hook in a script for the movement you got to hook in.
Animation you got to put on top. Um, and those problems are very similar between Unity and Unreal. And the solution to those problems and how you approach it from AI perspective is also very similar. So there's a lot of overlap. Even though the tech is different, the the high level problems are are very similar. That makes sense. And maybe kind of double clicking. So you're kind of coming at this tool from two different ecosystems right now. Yeah. Um, h how are the communities kind of fusing around it? Do you see this becoming like the Unity bridge to Unreal or like Yeah.
How does how does that work together? Um I I think people kind of exist on one side of the fence or the other. Um when you're making a game, you kind of commit to one and make your game in this kind of tool set. The stuff you build doesn't really translate. So it's more so that like we're taking the learnings we've had from Unity and Unreal to kind of like um share across both of those. So if we build really good tech for understanding like the spatial layout of a scene, that's going to be applicable to both tool sets.
So we can kind of like give value to both communities. And the same thing from the community perspective. If someone's using the Unity tool set and they give us really great feedback about the UX, how this tool should work, that feature is probably going to be something desired by the Unreal side as well. They're very similar developer experiences. So there's kind of commonality even though the communities are kind of like independent. Yeah, that makes sense. Um. Yeah. Yeah. Cool. Cool. I know I kind of derailed that into like an ecosystem discussion, but uh may maybe coming back to the the actual tool itself like Yeah.
Maybe maybe talk a little bit more about um kind of where it's at in terms of features, capabilities, like where you see it going, and we can kind of jump into some of the details around the project. Yeah. Yeah. So, so I think like you mentioned, this is a MCP project. So the main goal of this is to allow agents and LLMs to interact with Unity, to allow you to kind of modify and inspect um Unreal and Unity assets. And for Unity, like I mentioned earlier, one of the main challenges to working in these ecosystems is there's such a diversity of different asset types, and understanding how they all work and how to configure them is kind of complicated.
And then the second layer is once you know how to kind of configure a script, how to hook it all together, it's complicated. Um, so I would say those are kind of the two main things that AI can help empower developers for. So by having MCP tools, we can take all this diversity of assets and expose them to the LLM in a format it can understand. So that's the first value the kind of project provides is like there's all these different asset types. We've done the work to kind of build tools to kind of cleanly serialize them into a form that LLMs can understand to kind of read it.
And then from the other perspective, if you want the LLM to kind of help you build something, there needs to be an interface for the LLM to like output some text that can then affect your scene. So there's kind of an input process as well on a per asset basis. Um, so there's that kind of like that bridge on a per asset type. And then as kind of a second layer, there's like how do you connect things together? So there's just, you know, some general best practices that we kind of like encode in the tools to help it learn how to like connect a model to a script.
Just kind of some common patterns that, you know, you can help it along with. So those are kind of the two categories I think the tools are focused on helping with. Interesting. And then maybe um as you think about like how it's helping, like how is the community like distribution of it right now? Like how are you finding people find the project, engage with it? Um yeah, maybe talk about that a little bit. Yeah. Yeah. It's always hard to tell like where the community comes from. Like I mentioned, when the project hit GitHub, it kind of exploded.
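The two layers Kevin outlines here (serialize assets into a form the model can read, and give the model a narrow interface whose output changes the scene) can be sketched over a toy scene model. The field names and edit format below are invented for illustration and are not Unity MCP's actual wire format.

```python
import json

# Toy stand-in for a Unity scene. The "read" half serializes it to text
# the model can reason about; the "write" half applies a structured edit
# the model emits back.
scene = {"objects": [{"name": "Player", "components": ["Transform", "MoveScript"]}]}

def serialize_scene(scene):
    # Read path: expose the asset in an LLM-friendly textual format.
    return json.dumps(scene, indent=2)

def apply_edit(scene, edit):
    # Write path: a narrow, validated interface for model output
    # to change the scene, rather than letting it write raw files.
    if edit["op"] == "add_object":
        scene["objects"].append({"name": edit["name"], "components": ["Transform"]})
    return scene

apply_edit(scene, {"op": "add_object", "name": "Enemy"})
print(serialize_scene(scene))
```

Keeping the write path structured like this is what lets the bridge validate and report errors back to the agent instead of silently corrupting assets.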
And uh being engineers, we're not really good like marketing business people. So [laughter] it just kind of like blew up. So thank you to GitHub, I guess, for being a good platform to give us visibility. Um but yeah, so there's a GitHub community and you know the standard, you know, pull requests and comments and feedback, but we also have a Discord community that we cultivate. Um so people in there kind of sharing feedback and collaborating on ideas as well. Those are kind of the two main communities and touch points we have to maintain the project.
Yeah, there there's a question that came up and maybe you can kind of help clarify, but someone asked is it similar to Blender MCP? And it might be good just to kind of provide a little bit of clarity around like how the tool differs, like what it does, why it does different things. Yeah. So, so Blender MCP. So MCP is a just a general protocol for how you expose like a program to an LLM. So, there's a lot of different MCP projects that can, you know, make that bridge to different programs. So, Blender MCP, I haven't seen it myself, but I can imagine it's a similar concept.
They have an agent that wants to interact with Blender and its functionality that it can provide. So by making that MCP bridge it can allow the LLM to kind of interact with Blender. Um so that concept is the same. Um in terms of Blender, the functionality of Blender is more focused around 3D modeling and sculpting models and for kind of scenes and lighting. Um so that tool is often used in gamedev pipelines. So you might like make your model and sculpt it in Blender and then you might export it and then put it into Unity to actually play the game.
Um, but the Unity program is less about 3D modeling, but more about the gameplay, connecting things together. So, they're similar tools and MCP is a similar kind of like application of that pattern, but they're a bit different. That makes sense. And I mean, maybe maybe double clicking just a little bit is like I I imagine um developers may use different tools at different times. Like there's nothing that necessarily like predisposes them to do 3D rendering for other development aspects within game development, but like for this specific application like the Unity MCP will help with developing the world within the game engine itself.
Is that like a good Yeah. It's more it's more for like the layout of the level and like the gameplay behavior of how things behave when you interact with them. Um, but one of the nice things about MCP kind of backing out is that you can hook in multiple different MCP clients into like your Claude Code or whatever your agent is. So it actually enables workflows like you were saying might be useful, of you just ask Claude to make a model and put it in my scene, where I might be able to kind of call into the Blender MCP to kind of make a model and then call into the Unity MCP to kind of hook it up in the game.
So these things actually kind of collaborate pretty well in terms of a modular system. That makes sense. Um, cool. And maybe maybe like before we jump into like the the the details of it, but like how um talk about a little bit how the project actually integrates with Unity today, like what is the capability and like how do you see, you know, obviously it's like installing an MCP server, um I imagine from like the Claude Code side, but like maybe talk through a little bit about the semantics and the capabilities of how it integrates with it.
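Hooking multiple MCP servers into one agent, as described above, is client-side configuration. For Claude Code, a project-level `.mcp.json` along these lines would register both servers; the command paths here are placeholders, so check each project's install docs for the real launch commands.

```json
{
  "mcpServers": {
    "unity": {
      "command": "/path/to/unity-mcp-server",
      "args": []
    },
    "blender": {
      "command": "/path/to/blender-mcp-server",
      "args": []
    }
  }
}
```

Once both are registered, a single prompt can fan out across them: the agent picks the Blender tools to author a model and the Unity tools to wire it into the scene.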
Yeah. Yeah. Once once I get to the demo, I can give you like a more specific technical kind of a description of where the pieces are. Um but in terms of the capabilities, um I think one of the I'm kind of saying the same thing, but like one of the main benefits is there's separate MCP tools for different asset types. So to be more specific like the scene spatial understanding to kind of place things in the scene, another tool for like the UI to make a widget layout of like a health bar for your character.
Another one for adding a script into the level. So it's kind of a suite of tools for every individual asset type. That's kind of how I would describe the high level of what the feature set is. Super helpful. And then I'm going to pull up this question too. Um, does the Unity MCP tool work with ProBuilder? Um, I'm not sure offhand what ProBuilder is. Is that a uh I think it's more of a um uh a Unity like custom tool for building like specific objects in Unity. Uh here I'll I'll drop the ProBuilder link here just so folks have it.
But um essentially yeah it's a uh custom building tool uh within Unity itself. So I imagine there's some like rendering of the objects in the world within Unity and then how that would interact with the actual MCP server. So if you're building Yeah. the Unreal or Unity MCP, it's designed to work with kind of the built-in Unity types because it doesn't really know other plugins, but if another plugin, you know, the result of it creates objects in your scene, then yeah, you can then interact with them with Unity MCP as long as they're the standard, you know, Unity file types.
Um, but I'm not as familiar offhand, so I [laughter] Yeah. And then there was another comment here too um specifically around like sending augmented scripts and automating functions. So it's sounding interesting, but like maybe talk about how you've used it yourself for some of these like custom scripts that you might have and like what that looks like. Yeah, I think that's one of the other things that these tools are quite good at. Um if you want to do some operations that are kind of tedious in your level, like you want to go through every light in my scene and I want to adjust the brightness by like 20%.
uh it's kind of a pain to kind of go through and do manually. So, so tools like this can be really useful to kind of make a script that it can execute to kind of batch operate on your level. And there might have been ways to do it before in terms of like manually writing the script and then plugging it in, but with LLMs nowadays, they're really good at writing code. [laughter] So, by having the LLM then write the code for you, then execute it, it can really accelerate your workflow. Yeah, that makes sense. And I imagine like part of this um I'd be curious, there's another question here.
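The batch edit described above (scale every light's brightness by 20%) is exactly the kind of script an agent can generate and run. In Unity this would be a C# editor script looping over `Light` components; here the same loop is sketched over toy data in Python.

```python
# Toy scene lights standing in for Unity Light components.
lights = [
    {"name": "KeyLight", "intensity": 1.0},
    {"name": "FillLight", "intensity": 0.5},
]

def scale_intensity(lights, factor):
    """Batch-edit every light in place, the tedious-by-hand operation."""
    for light in lights:
        light["intensity"] = light["intensity"] * factor
    return lights

scale_intensity(lights, 1.2)  # +20% brightness across the scene
print(lights)
```

The value of the MCP loop is that the agent both writes this script and executes it against the live scene, so a one-sentence request replaces clicking through every light by hand.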
I won't pull it up on the screen, but like around beginner versus like experienced developer friendliness. And I imagine to that point like using an AI powered coding tool allows you to accelerate for beginners of like nontechnical game development to build these custom gaming scripts that then can go into the Unity engine without having to have the the knowledge of Unity or even like the scripts themselves, right? Is probably like a benefit of this a little bit. So yeah. Yeah, I I've given a couple framings there around that. I think one of the one of the main benefits, one of the really great benefits of these tools for newcomers is that it's really good for like explaining how things work.
If you get a tutorial, you can ask the AI, hey, how does this how's this level kind of set up? And it's really good at making like documentation for you of explaining it. So, it's a really good learning tool just kind of like talk about how the game works. I will say on the other side of the fence, I I don't think AI tech is quite at the point where it can like make a game for you and you have like no knowledge of how to make games. it can probably get things kind of stood up quickly.
But as with every other codegen tool, um I would kind of classify them as like it's kind of like a new college grad. It has a lot of knowledge on all the systems. It's really eager to help you, but doesn't really have like the core architecture experience of how to build things that are like really robust. So if you kind of come to this tool and you're like, make me a fun shooter game, it's not going to do a very good job at making you a fun shooter game right away. All the same kind of prompt engineering best practices that you might have gotten through like Claude Code or like Cursor do also apply here. It's still kind of a skill, how do you like craft the right question to get what you want. But that's actually like some of the biggest challenges, right, is like you have to be clear enough to guide and structure like the planning process before AI kicks off. Not that you couldn't get to the same outcome with like prompt-smithing and stuff like that. Obviously, the more clear you are up front, the better the outcome down the road, right?
Yeah. Yeah. But but I will say to a newcomer with no experience, Yeah. the experience now is like significantly easier than it was five years ago to come in and just make something fun. [laughter] It's it's still a lot better. Yeah. Um I'm going to drop in another question here which I think is interesting around agent orchestration. It's like is this necessary? Like how do you actually manage agents within the coding um environment in this capability? Like are you seeing that happen more? Do you have to add different delegation frameworks, harnesses, things like that to make the AI work well with the MCP server?
Um, I kind of see MCP as kind of a separate class of problem than agent orchestration. MCP is just like providing the functionality for like how to modify, you know, things in your program. But I think the agent orchestration kind of world is kind of changing a lot, like month to month, on what best practices work the best. Sure. So you know over the last couple years we went from, you know, type in a single prompt to get output, to more of like an agentic loop where it has a set of tools that it can kind of loop on to kind of call into uh your program.
So we're now into a space with Claude Code where it's getting really good at tools and it's getting much better at like writing scripts to do tool calls. It's doing more programmatic tool calling. So there's a whole ecosystem of like agent orchestration that like it's changing so fast it's hard to give like a clear answer of like what's good, because what's good today is probably different than what was a month ago. But the benefit of MCP is you can plug it into whatever the newest like best practices are and it still works and keeps up with the latest tech.
That's one of the benefits of having MCP. Yeah, that makes sense. It kind of gives you like um a little bit of an abstraction layer to that and allows you a lot more flexibility. So that makes sense. Um there was another question too. Where'd it go? Uh this one um how do you see it being used to simulate distribution shift and edge cases around like ML and vision systems? And I don't know actually if you actually do anything in that space. Um mostly around like well this is my naive aspect, but like I don't know if Unity has any vision um training algorithms already built in for like custom models being built into it yet, but I don't know. So it's about custom models to help you develop games? Or maybe like I think that's it, like essentially using like the Unity data as like a training uh ML system, and so like can you build like custom models around it, like simulated worlds.
Yeah. I think one of the challenges for building like custom models for this stuff is we need a lot of training data to kind of like build robust models, and there's not that much data on like Unity scripts. Versus like, like nowadays agents are really good at web development because there's like tons of JavaScript and HTML that exists and it's easy to scrape. But a lot of game development tools, it's a smaller community and also a lot of the assets tend to be more binary file formats. So they're not naturally in the LLMs' training data set.
Um so you're kind of at a disadvantage to start. Um so then okay, you have to navigate a data set like you're saying to do like, I guess, vision training to help you with games. Um, I think there probably are people exploring that. And there's a whole category of world model generation AI research going on right now that I think is going to kind of become more relevant in a couple years. If you do want to make a game, if they actually have trained models that have general knowledge of world layout and scene understanding and spatial understanding, that might actually be a really great tool to, you know, put together games.
But that tech is kind of a couple years away, I think. So that probably is coming up, but it's it's still like it is coming up, I think. Yeah, coming up, but not quick, you know. Uh super super helpful. Um [snorts] okay, maybe shifting gears. I know we've kind of like covered a lot of the project. Maybe maybe jumping in. Do you want to do the demo and then we can kind of continuing on with the conversation as as folks see? So I'll pull up the demo on the screen and so people can see it.
Here we go. So I I just have a very like simple example set up just to kind of show the core the core data flow of like what this thing even is and I can explain it from both the kind of user perspective of what you're going to use this tool for and I can also dig into the technical details of how it works. I know this is like a coding stream so I'm sure a lot of people are are more technical. Um so at a high level what I have on my screen here is I have a Unity window and I have a basic starter project.
So, I have a scene with some boxes, you know, nothing nothing fancy. And then I have a Claude Code window here that I'm going to be using to interact with the scene, and the core project, the Unity MCP project like we were talking about, it helps create that bridge between your AI agent and your scene. And in the upper right, I just have our Unity MCP server, which I'll explain in a minute. Um, so as like the most basic thing, if I'm in Claude Code and I have it all hooked up, I can just say, "Hey, tell me tell me about my scene, like what's in my what's in my project right now." And you know, by default, you know, agents don't have connections to anything.
They can't see anything. But one of the benefits of this project is it allows it to kind of call tools to reach into the scene, understand the state, and reason about it. So, it's calling some tools. It kind of queried the scene. It kind of queried the hierarchy of objects in my scene. It's thinking about it and it'll give me a little summary of, okay, I have some boxes. Here's what color they are. There's some little clickable stars in the level. Very basic example of, you know, there's a data flow here where it can read assets.
Um, where I think it gets more interesting is it actually taking action in the scene. So, I'll give it another kind of very simple prompt. I'll just say, "Hey, I want to add some more boxes to my scene and put them in a circle." Um, if you're trying to make a level layout for a game, you probably would have a more interesting, you know, prompt. But just an example of it also has the ability to use these MCP tools to reach into the scene to kind of add objects and modify the scene. So, it's uh Oh, there we go.
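A prompt like "put them in a circle" reduces to simple placement math. This is roughly what the agent's generated script would compute, assuming objects are spread on the ground (XZ) plane:

```python
import math

def circle_positions(n, radius, center=(0.0, 0.0)):
    """Evenly space n objects on a circle in the XZ (ground) plane."""
    cx, cz = center
    return [
        (cx + radius * math.cos(2 * math.pi * i / n),
         cz + radius * math.sin(2 * math.pi * i / n))
        for i in range(n)
    ]

# One cube per computed position; the MCP tool call would carry these
# coordinates into the editor as each object's transform.
for x, z in circle_positions(8, 5.0):
    print(f"cube at x={x:.2f}, z={z:.2f}")
```

The interesting part of the demo is not the math but the plumbing: the model emits this logic, the MCP tool executes the resulting placements, and the editor reflects them live.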
So, it reached in. It added some more more cubes to my scene. Cool. Um, so the that's kind of the basic input and output. Um, how so so like while you're doing that real quick, like obviously you're adding things into it. Um, and maybe you'll get to this, but like how do you think about like adaptive agents uh within this environment? So that is like continuously learning based off of input or or any like metadata you're coming from Unity to like feed back into your your agent like h does that work today or are you seeing that?
So I think um there there's two things there. There there's one you could frame that question as a verification loop where it can kind of check its work and do a better job by inspecting what it did and get feedback on itself. A second tier there is like actually training the model based on what it's done. That's kind of a harder problem. But I think like on the verify, let's start with that because then like we can we can have a side conversation on on like the Yeah. the other modeling aspect. Yeah. That's kind of that's kind of the harder deep end.
But like that verification loop I think is one of the key insights that is coming up lately. I think that's a good call out that um one of the biggest issues using LLM tools in general right now is they kind of hallucinate. They try to do their best effort to make a change, but they kind of often get it wrong. So, you end up being in this loop where like you ask it for something, you got to check again in 10 seconds and kind of correct it and give it feedback.
Um, but one of the best ways to get leverage out of these AI tools is to have a verification loop like that. Um, so for example, if if you had if you're trying to have it make a gameplay script for you, but then you have like a simple like test harness or you have like unit tests or some way it can like programmatically verify its work. Or even in this case, another useful tool is if it can take a screenshot of the screen and like look at it when it did, is it actually in a circle?
If it could verify that visually, that would give it feedback. That's the next step, I think, for this class of tools. We have a basic integration, but the more you can give it those feedback loops, the more effective it can be at checking its own work.

So part of that feedback loop is knowing how to create the right evals to benchmark against. I'm curious, for a nontraditional game developer, how do you think about good evals in this space, so that you can actually get the right outcomes?
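The verification loop described here can be sketched in a few lines. This is a hedged, engine-agnostic Python sketch, not Unity MCP's actual API: `run_agent` and `verify` are hypothetical stand-ins for the LLM call and the check (a unit test, or a screenshot inspection).

```python
def run_with_verification(task, run_agent, verify, max_attempts=3):
    """Ask the agent for a change, verify the result, and feed the
    concrete error back as extra prompt context until it passes."""
    feedback = ""
    for attempt in range(1, max_attempts + 1):
        result = run_agent(task + feedback)
        ok, error = verify(result)
        if ok:
            return result, attempt
        # Append the raw error so the next attempt can self-correct.
        feedback = f"\nPrevious attempt failed: {error}"
    raise RuntimeError(f"gave up after {max_attempts} attempts: {error}")
```

In the circle example, `verify` might render a screenshot and check object positions; the agent then retries with the failure message appended.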
Yeah, we do a bunch of thinking about that internally as we build these tools. It's hard. Making evals is surprisingly hard. The goal, obviously, is to test the quality of your product. So you want questions that are representative of what users will do; that's the first thing. You also want questions that are easy to verify, where there's a clear solution you can say yes or no to, to give that feedback. And you want something that runs quickly, so it doesn't take too long.

You also want a problem where the diversity of possible output is more restricted. If you had an eval like "make my game more fun," that's a very broad output. [laughter] Good luck. It's hard to verify when the output is that abstract. So you also want evals to be very precise: "make this box follow the player within 2 meters; make a script to do that." That's easy to verify, which makes it a good example of an eval.

So there are these qualities you try to hit, and if you do, it really helps as you're developing these tools and tuning the system prompts. That's one of the biggest challenges of building these systems: LLMs are very black-box, and if you change a system prompt here, it could have unknown effects on all the other behavior. Yeah, they're pretty random. So having clean evals to always verify that you're not breaking things is really useful when you're doing general AI development.
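Those qualities (representative prompts, quick to run, a precise yes/no check) fit in a tiny harness. A hedged Python sketch; the `Eval` type, `run_agent` callable, and the "FollowPlayer" check are illustrative, not part of Unity MCP or C-Play.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Eval:
    prompt: str                     # representative, tightly scoped ask
    check: Callable[[str], bool]    # precise yes/no verification

def score(evals: List[Eval], run_agent: Callable[[str], str]) -> float:
    """Fraction of evals passed; rerun after every system-prompt tweak."""
    passed = sum(1 for e in evals if e.check(run_agent(e.prompt)))
    return passed / len(evals)
```

Rerunning a fixed eval set after each system-prompt change is what catches the "unknown effects on all the other behavior" problem described above.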
But I'm digressing into the implementation side there.

No, no, that makes sense. And on that note, on the implementation side, who are you seeing pick this up right now? Is it more seasoned game developers, or the non-seasoned? Who is actually implementing it right now?

There are definitely different groups of people. I think there's a large hobbyist community, because it's just so accessible and it's fun to spend a weekend, you know, vibe coding a game.

So that's definitely a community that's growing. But I also think there are a lot of industry professionals trying to get more and more into these tools. In terms of how the project is structured, there's an open source core to all these tools that you can just use for free. There's also a productized version called C-Play, which is more of a packaged version for sale, for larger customers, or hobbyists too. So we're targeting both of those demographics.

And I will say there's a lot of interest from the industry in general. As I'm sure you're aware, the tech industry right now has tons of layoffs and struggle, and the game development industry especially: a lot of layoffs, studios closing, budget constraints. So there's a lot of eagerness to learn how you could potentially use this tech in production environments. The answer to that is hard, so people are trying to figure it out, try it, and see what actually works.
So it's an evolving target, but there's definitely interest on both sides of that fence, small and large.

Got it, that's super helpful. And I'm going to pull up a question here that I think is really interesting; it goes back to the eval question. How do you distinguish a failure of the LLM's reasoning from a failure of the game state or engine, or just something else that's broken? [laughter] Which I think is an accurate question.

Yeah. Well, you want to have an eval that is a reliable question. If you're making an eval like "make a code change to the core engine state and change how the renderer works," that's a very deep change with a lot of chances to break things. It might be hard to distinguish between the LLM making a mistake and the engine just being fragile. So you want to target your questions and evals to be more like "place things in my scene," one of the common workflows you would use anyway in the UI.

So hopefully it's more stable and not likely to break the engine. And that's commonly not really an issue anyway, because the goal of these tools is to empower people trying to build content, so the evals are around user flows that are well tested and more robust. It's not really an issue in practice. Obviously, if it causes a crash, that's probably a good sign it was an engine problem and not necessarily your LLM. But if it causes a problem where the gameplay is wrong, say the box doesn't follow the player in the right way, that's probably the LLM's fault.
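That triage rule of thumb (a crash points at the engine, wrong runtime behavior points at the generated change) can be written down directly. A hedged sketch; the flags are hypothetical signals an eval harness might collect, not anything Unity MCP exposes.

```python
def classify_failure(crashed: bool, behavior_ok: bool) -> str:
    """Attribute an eval failure, per the rule of thumb above."""
    if crashed:
        return "engine"   # editor/engine instability, not the LLM's change
    if not behavior_ok:
        return "llm"      # e.g. the box doesn't follow the player correctly
    return "pass"
```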
That was more the runtime behavior not being what you desired. So in practice, it's not really a big issue.

Got it, that makes sense. Super interesting. There are some deeper questions here. I don't know if I can fully get through this one: simulating distribution shift by changing objects, lighting, and placement. Does this MCP help with data generation pipelines, or integrating with world models? I'll let you read the question, because I think it's an interesting one.

Yeah, that's not something we're directly focused on, but there are people in the industry doing that. There's a lot of work on synthetic data generation. For example, if you're a startup doing self-driving cars, it's hard to get data on cars driving with your AI, because you don't want to crash. [laughter] So they tend to build a lot of simulations in game engines like Unreal or Unity, where they simulate a car driving around, record video, and use that to train a model they then use in the real world. So there's a whole industry of synthetic data generation that uses this tech, not necessarily even the MCP, but game engines in general.

So that is a thing that's out there, but not our current focus. We're focused mostly on game developers making experiences.

Yep, makes sense. Super cool. Let's go back to the demo. [laughter]

Okay. Yeah, it's fine, I'm happy to digress into any part here. So, the last two things I was going to show: I just showed the basic input and output, but I had also mentioned that one of the main goals of this tool set is to give you access to all the different asset types.
I just showed creating objects. I can also say, hey, maybe do some scripting: make a script that makes my boxes spin. In Unity, a lot of the gameplay is done with C# scripts, so it'll actually write a C# script, attach it onto my boxes, execute it, and then just make sure that worked. Oh, it's still trying to attach the objects as that's going through.

This question came in too: what about the decision-making algorithm of a non-human player? For your agent-based players, how do they respond via the agents, and is that a use of the AI? Do people use it that way today, more like NPCs and how they interact with the AI? But then that's different from the coding side of the MCP, the actual building of the game environment, versus the AI itself.

Yeah, I would say there are two different categories of AI in games. And, just showing off here, my boxes are spinning. It did a good job putting scripts on them. There's the phase of developing the content for the game: you're a developer making the level layout, writing the scripts. And then there's the runtime aspect: while a player is playing the game, do you want dynamic behavior, like NPCs that chat with you with custom dialogue? Those are two separate problems, and the MCP is more focused on the first, a developer making the content up front. It's not so much the runtime NPC behavior.
Super helpful. Interesting. Sorry, okay, back to the demo. [laughter]

That's okay, I'll just ramble through some examples, and if you throw more questions at me, that's fine. So, it can do scripting, and it can also make UI layouts. If you have a game with a health bar and your player name, it can help build the UI layout for that. I'll give it a little prompt to make a health bar for me. It can also do things like change the materials in the level, or change the lighting. So there's a lot of diversity of asset types in a game.

Super cool.

And I think we have pretty good coverage of it, but that's one of the things we could always use more help on from a community perspective: are there things we don't have good coverage of? People can, you know, contribute a new tool to help do a better job at, say, UI widgets. Or, if the behavior isn't tuned very well, or it messes up with some new UI type, they can always help add support for that.

So hopefully this is finished... I think it made a little mistake there. My UI: it made a layout with some text, but didn't quite make my health bars properly. Interesting. It did this in my practice run for this demo, too: it gives me an error message, and if I give it the error message back, last time it was smart enough to fix it.

That's actually one of the things I find really interesting: you can just feed it the raw errors, and it's smart enough to figure that out.
Yeah. There you go, done. It figured it out. So yeah, it made a health bar, HP and mana. Just an example that it can do UI layouts for you as well.

That's awesome. And have you seen any edge cases that it seems to struggle with at all, or does it seem to be pretty easy? I'm just curious if that's been an issue.

Yeah, honestly, coming into developing these tools, I've been pretty impressed by the baseline knowledge of how to hook assets together. It definitely lacks in how to actually interface with and modify an asset. But in terms of the second problem, how do I connect things together (my player, my script, my model, my animation), it does a surprisingly good job at understanding that out of the box. I guess there's a sufficient number of forum posts online of people talking about it that it has good baseline knowledge.

Yeah, that makes sense. Super cool. And maybe adjacent to this: this is just the MCP layer itself.
How are you thinking about other protocols within game dev? Are you looking at A2A, or any of the other AI-specific protocols, whether that's around commerce capabilities or things like that? I'm just curious how you think about where game dev is going, going forward.

I mean, this specific project is scoped to modifying assets and scenes, but I definitely do see a broader ecosystem of AI tools. I think all game companies right now are exploring how to apply this tech, especially over the last couple of years with LLMs and agents. Both in terms of the workflows of developers, the workflows of business people understanding the market, and even in-game NPCs like we were talking about. All that stuff is being worked on by a lot of companies, and I think the next five years are going to be very interesting to see how it all plays out. That's personally why I'm in the pool here working on these tools. It's just a very exciting time as an engineer. Things are changing so much that it's fun to be here for the ride. I don't know exactly where it's going to go, but it's just fun to work on.

It is fun. [laughter] I love it. And maybe double-clicking a little on the project itself: how are you interacting with the community right now? Do you seek other contributors, maintainers, support across the project? What does that look like now, and how can people get involved today?

Yeah, I think one of the main things is our Discord community. There's a Discord chat where you can talk about feature requests, bugs, and support, and share the stuff you made with the tool.
I'd recommend that as the main point of contact. It's a sub-community inside the C-Play Discord. C-Play is the larger company that has the productized version, but we support the core open source as well. So I would check out the C-Play Discord.

Helpful. And then maybe one last question as we close up: how can people get started with the project, use it, and engage with you?

I mean, just try to make a game with it. Have some fun. This is game development: you should jump in, get Unity, try to make a game, and figure out, as a developer trying to make something, where it falls short. It's going to have holes and things it's not as good at. So just testing the product and giving us feedback on where the holes are is super valuable. We get so pigeonholed in what we're working on right now, trying to get the scripting to work better, that we lose perspective on the overall experience of someone fresh coming in and making a game from scratch. So yeah, just have fun. [laughter] That's my advice. And it's all free open source, so no barriers.

Yeah. And obviously this video is a good tutorial, but do you have any other tutorials or documentation to get started, or to point people to?

That is a good question. I think the C-Play website has a documentation section; it definitely has setup instructions. I would also look at the GitHub project. It has a README on the main page with setup instructions, and that links to other things like documentation. So I would say the GitHub README is a good starting point.

I love it. Any other things you want to show in the demo, or call out in the project, before we wrap up for the day?

No, I just have, you know, more examples.

Yeah, feel free to keep going.
I'll just keep running them in the background while you finish up. You know, maybe I want to change my lighting to be nighttime, so it can also change the atmosphere of the scene. But yeah, that's about it. It helps you modify assets, change your scene, and make your game: accelerate your workflow and try to make something fun.

I like it. Oh, that's super cool. It's midnight now. And then, once it's done, how do you publish the game itself? Can you do that from the agent? Do you have to follow standard Unity processes? What does that look like?

Yeah, I think you just follow standard Unity processes. There's a "make a build" button somewhere in here that you can then publish from. It's not really in the scope of the AI tools we're working on.

Sure, totally get it. Sometimes a tool being focused and scoped is a lot better than making it too bloated. I totally get that.
Yeah. [laughter]

Super cool. Well, thank you for joining us today. This was an awesome overview of the project. I think folks who joined got a really good overview of what's coming, both in terms of game development capabilities and how these agents and AI-powered coding assistant tools can interact with Unity through MCP. And, as you called out, open source is clearly a pathway to getting access to these tools early, engaging with them, and building with them as well. So super cool.
Cool. Yeah, thanks for having me. Happy to be here.

Awesome. Thanks, Kevin. I appreciate it. And thanks everyone for joining us. Stay tuned for next week; we'll have more folks, and we're excited for you all to join. Thanks so much. Have a wonderful Friday. Take care.