Your First AI Agent with Laravel 13 - Ship AI with Laravel EP1
Introduces the official AI SDK and demonstrates building a support agent capable of holding conversations, outlining the goals of the series to create a talking, real-time support assistant.
Laravel News shows how to spin up your first AI agent in Laravel 13 using the official AI SDK with OpenAI, including token controls, prompts, and multi-provider testing.
Summary
Harris from Laravel News guides viewers through building a first support AI agent in a fresh Laravel 13 app. He starts by installing the official Laravel AI SDK, publishing its configuration, and migrating a new agent_conversations table to store chats. He demonstrates wiring up an OpenAI API key and scaffolding an agent class named SupportAgent, highlighting the key contracts for conversational behavior and tools. The video walks through tuning max_tokens (set to 500) and temperature (0.7) to balance conciseness and natural responses, and replacing the default instructions with ones written for empathetic customer support. A quick route test shows the agent replying to a realistic order issue, then Harris extends the route to return richer response data, including prompt tokens, completion tokens, provider, and model for cost tracking. He also demonstrates swapping providers on the fly (OpenAI, Anthropic) and changing models without touching business logic. He emphasizes that everything in the example is production-ready and serves as the foundation for the next episode, which will build a ticket classifier for category, priority, and sentiment. This is all set in a Breeze + Livewire + Tailwind 4 stack with PostgreSQL and pgvector in the repo.
Key Takeaways
- Install and configure the Laravel AI SDK in a Laravel 13 app, then publish config and migrations to create the agent_conversations table.
- Scaffold a dedicated agent class (SupportAgent) that implements the conversational and tools contracts, with the Promptable trait to handle prompting and streaming.
- Tune generation with max_tokens = 500 and temperature = 0.7 to keep responses concise yet natural.
- Rewrite the agent's instructions to reflect empathetic, customer-focused support and clear next steps.
- Test via a new support_test route that returns rich response data, including reply text, prompt tokens, completion tokens, provider, and model.
- Switch between providers (OpenAI to Anthropic) and models without changing business logic, demonstrating provider-agnostic code.
- This episode sets the foundation for a ticket classifier in Episode 2, sorting tickets by category, priority, and sentiment.
Who Is This For?
Essential viewing for Laravel developers who want to ship AI features quickly, especially those using Laravel 13 with Breeze, Livewire, and Tailwind, and who want a provider-agnostic AI integration in production.
Notable Quotes
"Laravel just shipped an official AI SDK. A few lines of code and you've got an agent that can hold conversations."
—Introductory claim about the new SDK and capabilities.
"You're a friendly professional customer support agent for an online store because this is what we build, right?"
—Shows how the instructions are customized for a support persona.
"The code is identical no matter which provider you choose."
—Emphasizes provider-agnostic implementation.
"Let's run this in the test console. It's loading down here. I'm sorry for the worry."
—Demonstrates a practical, empathetic response from the agent.
"And you can also change the model as well, of course. So the model will be Claude Sonnet 4. Let's use the 2025-05-14 version."
—Shows how to switch models for the same agent.
Questions This Video Answers
- How do I install and configure the Laravel AI SDK in a Laravel 13 project?
- Can I switch AI providers (OpenAI, Anthropic) without changing my code in Laravel AI SDK?
- What configuration options control the AI response (max_tokens, temperature) in Laravel AI SDK?
- How do I test a new AI agent route in Laravel to ensure it returns structured data like tokens and provider?
- What will Episode 2 cover in the Laravel News AI series on tickets classification and sentiment?
Laravel AI SDK · Laravel 13 · OpenAI API · Anthropic · Agent class · Prompt design · Max tokens · Temperature · Provider switching · Agent conversations table
Full Transcript
Laravel just shipped an official AI SDK. A few lines of code and you've got an agent that can hold conversations. I'm Harris from Laravel News and today we're installing the SDK and building our first support agent. Throughout this series, we're building support AI. It will classify tickets, search a knowledge base, hold conversations, and stream responses in real time. Let's get into it. So I'm starting fresh with a Laravel 13 application: Breeze with the Livewire stack, Tailwind version 4, and PostgreSQL with pgvector installed as well. All of that is in the GitHub repo if you would like to do the exact same setup.
Now let's install the AI SDK: composer require laravel/ai. Great, now this is installed. Next we publish the configuration file and database migrations. So let's run php artisan vendor:publish with the provider option set to the AIServiceProvider. Nice. And now we run the migration: php artisan migrate. We created the agent_conversations table. If you go to our database, you'll notice we have the agent_conversations table, which is empty. We have an ID, the user ID, the title, and created and updated timestamps.
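The setup steps narrated above boil down to three commands. This is a sketch based on the video: the exact service provider class name is an assumption, so check the package's README for the real one.

```shell
# Install the official Laravel AI SDK
composer require laravel/ai

# Publish the SDK's config file and migrations
# (provider class name assumed; verify against the package)
php artisan vendor:publish --provider="Laravel\Ai\AiServiceProvider"

# Create the agent_conversations table
php artisan migrate
```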
Of course, this created the table for the conversation storage that we will use later on. Now we need an API key. The SDK supports OpenAI, Anthropic, Gemini, and more. I'll use OpenAI for this series, but the code is identical no matter which provider you choose. I'm going to go to my .env and, with the power of screencasting, add my OpenAI API key. Last thing, let's look at the configuration. So let's open up config/ai.php. The default provider up top is OpenAI, as you can see, with the credentials below. You only configure the providers you're actually using.
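The configuration described above looks roughly like the sketch below. The option names here are assumptions inferred from the narration (a default provider up top, credentials below), not the SDK's confirmed schema.

```php
<?php

// .env (not PHP, shown here as a comment): add your key, e.g.
//   OPENAI_API_KEY=sk-...

// config/ai.php: a sketch of the shape described in the video.
// Key names ('default', 'providers', 'key') are assumptions.
return [
    // Default provider; only configure the providers you actually use
    'default' => env('AI_PROVIDER', 'openai'),

    'providers' => [
        'openai' => [
            'key' => env('OPENAI_API_KEY'),
        ],
    ],
];
```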
That's the full setup. So now we are ready, let's build something. Let's go back to our terminal and clear this up. In this AI SDK, agents are dedicated classes that define how your AI behaves. Let's scaffold one: php artisan make:agent SupportAgent. As you can see, we now have a SupportAgent. Let's go back and open this up so you can see what we have inside. So here's what we get: the SupportAgent class, which implements the Conversational and HasTools contracts. Then we have the Promptable trait here. If we go deeper, you'll see there are some methods we're going to use for prompting, streaming, queuing, etc.
Let's go back. We have instructions; this is where we set the instructions the agent should follow. In this case, we have the default one: "You are a helpful assistant." Then we also have messages, which is the list of messages comprising the conversation so far. And last but not least, we have an array of tools, the tools that will be available to our agent. Now, regarding those two contracts, Conversational and HasTools, we're going to use them later on. The Promptable trait, as we mentioned, gives us the prompt method to actually talk to the AI.
So for now, let's strip this down to the basics and make it a proper support agent. First we're going to add the maximum number of tokens. You can see we get MaxTokens from the attributes in the AI SDK, so let's import that. The maximum number of tokens in this case will be 500, and this will keep our replies concise. We don't want essays for support responses. The next thing we're going to add is called temperature, so let's import that as well and set it to 0.7.
A temperature of 0.7 gives us a good balance between consistent and natural-sounding answers. Now let's replace the default instructions with something built for customer support. Let's go down to instructions. I'm going to paste in my own instructions, replacing the defaults, and we'll discuss them line by line. Let's see: "You're a friendly, professional customer support agent for an online store," because this is what we're building, right? And we have some guidelines here: be empathetic and acknowledge the customer's concern; ask clarifying questions when needed.
Provide clear, actionable next steps. Keep responses concise but helpful. And if you cannot resolve an issue, let the customer know a human agent will follow up. Okay, that's great. These instructions say everything: empathy first, then helpfulness, just like you'd train a real support agent, right? The more specific you are here, the more consistently the agent behaves. And that's it. Let's take it for a spin and add a quick test route. Let's go to our web routes file and go down here. Let's add a new route; it's going to be a GET request.
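Put together, the agent described above looks roughly like this. The attribute, contract, and trait names follow what is shown in the video; the exact namespaces are assumptions, so check the scaffolded class from make:agent for the real ones.

```php
<?php

namespace App\Agents;

// Attribute class names and namespaces are assumptions based on the video.
use Laravel\Ai\Attributes\MaxTokens;
use Laravel\Ai\Attributes\Temperature;

#[MaxTokens(500)]   // keep replies concise; no essays for support responses
#[Temperature(0.7)] // balance consistent and natural-sounding answers
class SupportAgent // implements Conversational, HasTools (per the video)
{
    // use Promptable; // gives us the prompt() method to talk to the AI

    public function instructions(): string
    {
        return <<<'TEXT'
        You are a friendly, professional customer support agent for an online store.

        Guidelines:
        - Be empathetic and acknowledge the customer's concern.
        - Ask clarifying questions when needed.
        - Provide clear, actionable next steps.
        - Keep responses concise but helpful.
        - If you cannot resolve an issue, let the customer know a human agent will follow up.
        TEXT;
    }

    public function messages(): array
    {
        return []; // the conversation so far
    }

    public function tools(): array
    {
        return []; // tools available to the agent (used later in the series)
    }
}
```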
Let's name this support-test. Now what we're going to return here is a response. We new up the SupportAgent, the one we just added, and we're going to call the prompt method and pass our prompt. So what will our prompt be in this case? We're going to say: "Hi, I placed an order 3 days ago and it still says processing. The order number is 1042." Perfect. And last but not least, we would like to show the text, so we return the response text. We are ready. Let's run this in the test console.
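The test route described above can be sketched like this. The route path and the prompt() call follow the narration; treat the exact signature as an assumption.

```php
<?php

// routes/web.php: a quick test route, as described in the video
use App\Agents\SupportAgent;
use Illuminate\Support\Facades\Route;

Route::get('/support-test', function () {
    $response = (new SupportAgent)->prompt(
        'Hi, I placed an order 3 days ago and it still says processing. The order number is 1042.'
    );

    return $response->text; // just the reply text for now
});
```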
Let's go back to our browser now and hit /support-test. As you can see, it's running, it's loading down here. "I'm sorry for the worry. I know it's frustrating when an order doesn't move. I can help check this. Processing after 3 days can happen if the item is still being prepared," etc., etc. That's so cool. There it is. It's very empathetic: it acknowledges the frustration and offers to help. Our agent works just fine. Now, the response object gives us more than just text. Let's go back to our web routes file, and let me update this route to show what's actually available.
So instead of text only, we're going to return more here, right? We're going to have an array. The first item will be the reply, and the reply will hold the response text, the one we already had. The second thing is going to be the tokens. We want the prompt tokens, and we grab those from the response usage: prompt tokens. As easy as that. Completion tokens work exactly the same; we have access to them as well via response usage completion tokens. Next, we're going to include the provider, because you might want to switch providers and be able to see which provider you used for this response.
So let's go again: response meta provider. We're using meta in this case, and not Meta as in Facebook, just metadata. Last but not least, we have the model as well: response meta model. So we also have access to the model in this case. Now we can see token usage for cost tracking, plus which provider and model handled the request. Good to have in production. Let's go back to our browser and refresh. And now you see we got a more structured response: the reply, the prompt tokens (106), the completion tokens (113), and the provider, which is OpenAI as we mentioned, and we used a GPT-5 thinking model, by the way, as well.
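The richer response payload narrated above can be sketched as follows. The usage and meta property names are inferred from the narration and are assumptions; verify them against the SDK's response object.

```php
<?php

// routes/web.php: return richer response data for cost tracking,
// as described in the video (property names assumed)
use App\Agents\SupportAgent;
use Illuminate\Support\Facades\Route;

Route::get('/support-test', function () {
    $response = (new SupportAgent)->prompt(
        'Hi, I placed an order 3 days ago and it still says processing. The order number is 1042.'
    );

    return [
        'reply'             => $response->text,
        'prompt_tokens'     => $response->usage->promptTokens,
        'completion_tokens' => $response->usage->completionTokens,
        'provider'          => $response->meta->provider, // which provider answered
        'model'             => $response->meta->model,    // which model answered
    ];
});
```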
So great stuff. And you can override the provider for any request. You can go here, for example, and say I want the provider to be Anthropic. Just like that, as easy as that. I'm going to move this down here. And you can also change the model as well, of course. So the model will be Claude Sonnet 4; let's use the 2025-05-14 version, that's the actual model. So let's go back and refresh. And you see, now our response came from Anthropic with Claude Sonnet 4.
Same agent, same instructions, but a different provider. Your code doesn't change at all in this case. Pretty cool stuff. So I'm going to move this back, because I really need to use OpenAI in our examples. And keep in mind that that's our foundation: the AI SDK installed, configured, and our support agent responding to customers. Everything we build in this series starts from here. In the next episode, we are building a ticket classifier that uses structured output to sort tickets by category, priority, and sentiment: automatic triage without any human in the loop.
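The per-request provider and model override demonstrated above can be sketched like this. The exact fluent API is an assumption based on the narration; the point the video makes is that the agent class and its instructions stay untouched.

```php
<?php

use App\Agents\SupportAgent;

// Override provider and model for a single request (method names assumed).
// claude-sonnet-4-20250514 is the dated model ID mentioned in the video.
$response = (new SupportAgent)
    ->provider('anthropic')
    ->model('claude-sonnet-4-20250514')
    ->prompt('Hi, I placed an order 3 days ago and it still says processing.');

// Same agent, same instructions, different provider; no business logic changed.
```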