Your AI Agent Has Amnesia. Let's Fix It. - Ship AI with Laravel EP4

Laravel News | 00:10:00 | Apr 15, 2026
Discusses the lack of memory in the support agent and the goal of adding conversation memory so the agent can handle multi-turn chats.

In this episode of Ship AI with Laravel, the support agent finally remembers past chats, turning a basic lookup bot into a true multi-turn support assistant with conversation memory.

Summary

Harris from Laravel News walks through wiring conversation memory into a Laravel-based AI agent. He first demonstrates the memory problem: a second customer message arrives with zero context, leaving the agent unsure how to respond. The fix is lightweight: add the RemembersConversations trait so the agent loads and stores history in the database tables created in episode one. The video shows three API modes (chat start, chat continue, and chat resume) so a conversation can be created, extended, and later resumed with a persistent conversation ID. Harris then bundles the logic into a chat controller that could power a chat widget, handling first messages (create a new conversation) and subsequent messages (continue using the conversation history). He also notes a fallback path for full control over history: implementing a messages method instead of using the trait. The takeaway is that two lines of code, one interface and one trait, are enough to give the agent real memory, and Harris closes by hinting at embedding-based knowledge retrieval to answer FAQs in future episodes.

Key Takeaways

  • The RemembersConversations trait automatically stores and loads conversation history using the database tables created in episode one.
  • Starting a new chat returns a conversation ID that ties future messages to the exact conversation.
  • Continuing a chat loads the entire conversation history and passes it to the AI to maintain context (e.g., order 1042).
  • The chat resume endpoint stores the conversation ID so users can pick up where they left off on subsequent visits.
  • If needed, you can implement a custom messages method to fully control the conversation history instead of using the trait.
  • Two simple code additions—an interface and a trait—enable real memory for the support agent.
  • The foundation is laid for integrating embeddings later to search FAQs and recommendations.

Who Is This For?

Laravel developers building AI-powered chat assistants who want reliable multi-turn conversations and easy persistence across sessions. Essential viewing for teams upgrading from stateless bots to memory-enabled agents.

Notable Quotes

"The second message doesn't have any context. The agent doesn't know what it means because its call to prompt is completely dependent in this case."
Demonstrates the problem of no memory in a multi-turn chat.
"The remembers conversation trait handles storing and loading conversations history automatically using those database tables we set up in episode one."
Explains the core solution to memory in the agent.
"The API changes slightly of course. So instead of just calling prompt we tell the agent who the user is."
Shows how conversation ID ties into starting and continuing chats.
"The response now includes a conversation ID and that's our handle to come back to this exact conversation later."
Key detail about starting a new chat and tracking it.

Questions This Video Answers

  • How do you implement memory in a Laravel AI agent that persists across sessions?
  • What is the remembers conversations trait and how does it work with Laravel databases?
  • How can I start, continue, and resume a chat using a conversation ID in Laravel AI apps?
  • Can I replace the trait with a custom messages method for full control over chat history?
  • What upcoming features (like embeddings) can enhance a memory-enabled chat in Laravel?
Tags: Laravel News, Laravel AI integration, conversation memory, RemembersConversations trait, chat start, chat continue, chat resume, embeddings
Full Transcript
Our support agent can look up orders and customer data, but every message is a fresh start. It has no memory, so ask a follow-up question and it has no idea what you were just talking about. I'm Harris from Laravel News, and today we're adding conversation memory so our agent can hold real multi-turn support chats. Let's build this. Back in our editor, we have the support agent from last episode with the order lookup and customer history tools. It can pull real data from our database. But what happens when a customer sends two messages? Let's go to our routes file and add a new route called memory-problem, so I can showcase what the problem actually is. Inside a closure we create a new support agent, and the first prompt will be "Hi, my order 1042 seems to be lost." Then we add a second prompt: "Can you just resend it?" We return both results, the first text and the second text. The second prompt has zero context. The agent doesn't know what it means because each call to prompt is completely independent. So for a real support chat, we need the agent to remember the whole conversation. Let's run this. Open the browser and hit memory-problem, and when it's done you'll see that the second message doesn't have any context. The first reply says "I'm sorry you're dealing with that. I checked order 1042," and so on, because the agent could find that in the database. But the second one says "I can help with that. What would you like me to resend: an order, a receipt, or a confirmation email? If it's about an order, please send either your order number or the email used for the purchase and I'll check it for you." So the second message has no context from the first message, and this is what we want to fix.
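The demo route described above might look like the following sketch. The `SupportAgent` class, its `prompt()` method, and the `text` property on the response follow the naming spoken in the video; the namespace and exact response shape are assumptions.

```php
<?php

// routes/web.php (namespace for the agent is assumed)
use App\Agents\SupportAgent;
use Illuminate\Support\Facades\Route;

Route::get('/memory-problem', function () {
    $agent = new SupportAgent();

    // First message: the agent can look up order 1042 via its tools.
    $first = $agent->prompt('Hi, my order 1042 seems to be lost.');

    // Second message: each prompt() call is independent, so the agent
    // has no idea what "it" refers to.
    $second = $agent->prompt('Can you just resend it?');

    return [
        'first'  => $first->text,
        'second' => $second->text,
    ];
});
```

Returning the array lets you eyeball both replies side by side in the browser and see the missing context directly.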
Let's go back to our support agent. The SDK makes this very simple: two things, the Conversational interface and the RemembersConversations trait. Let's add them both. Scroll up here, after HasTools, add the Conversational interface and import it, and also add the RemembersConversations trait and import it as well. So here is the updated agent with everything from previous episodes plus conversation memory. It's exactly the same; it's only those two lines of code. The RemembersConversations trait handles storing and loading conversation history automatically, using those database tables we set up in episode one. Now let's go to our web routes again. The API changes slightly, of course: instead of just calling prompt, we tell the agent who the user is. There are three ways to use it. First, starting a new conversation. Let's add a new route called chat-start. We do two things: we pull the user and we instantiate the support agent, then we use forUser, which accepts a user. This means we start a new conversation tied to that specific user. The response now includes a conversation ID, and that's our handle to come back to this exact conversation later. Let's hit this route. In the result you'll see two properties: the reply, where the agent uses the order lookup tool, gets the real order data, and responds, and the conversation ID. Now let's continue that conversation. Back in our routes file, we add a new route called chat-continue. We take the conversation ID as a parameter and then we use continue. The continue method loads the entire conversation history and passes it to the AI.
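A sketch of the two routes described above, assuming the interface, trait, and method names spoken in the video (`Conversational`, `RemembersConversations`, `forUser`, `continue`) and hypothetical response properties (`text`, `conversationId`):

```php
<?php

// routes/web.php
use App\Agents\SupportAgent;
use Illuminate\Support\Facades\Route;

// In the agent class, memory is just two additions:
//
//   class SupportAgent implements Conversational
//   {
//       use RemembersConversations; // persists history to the episode-one tables
//       ...
//   }

// Start a new conversation tied to the authenticated user.
Route::get('/chat-start', function () {
    $response = (new SupportAgent())
        ->forUser(auth()->user())
        ->prompt('Hi, my order 1042 seems to be lost.');

    return [
        'reply'           => $response->text,
        'conversation_id' => $response->conversationId, // handle for later
    ];
});

// Continue an existing conversation; the full history is loaded and
// passed to the model, so "it" resolves to order 1042.
Route::get('/chat-continue/{conversationId}', function (string $conversationId) {
    $response = (new SupportAgent())
        ->forUser(auth()->user())
        ->continue($conversationId)
        ->prompt('Can you just resend it?');

    return ['reply' => $response->text];
});
```

The conversation ID returned by chat-start is what the chat-continue route expects as its URL parameter.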
The agent knows what a replacement refers to because it remembers everything about order 1042. Let's go to the browser and run this: chat-continue with the conversation ID we kept from before. It says "I'm sorry this happened. I can directly issue a replacement from here because order 1042 is already marked as shipped." There it is. The agent references the specific order, acknowledges what was discussed, and handles the follow-up naturally, like a real conversation. Let's go back to the editor. You could also build a resume-last-chat feature by storing the conversation ID in your database or your session. We have this chat-resume endpoint: again we get the user and the agent, and now we pull the last conversation ID out of the session and just continue based on that. With chat-resume, store the conversation ID from each response and users can pick up right where they left off. It works really well for a chat interface. If we refresh, I'll probably get nothing, because it can't find an ID in my session. Now, let's put all of this into a real controller. In the terminal, run php artisan make:controller and name it ChatController. In the controller, I'm going to paste the whole implementation we have so far, something that could power a chat widget, so we don't have to write all of it by hand. Let's go through it line by line. We have the send function, which accepts a request. We validate the two things we need: a message, which is required, and a conversation ID, which is not required, because we don't want the request to fail
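The resume endpoint described above could be sketched like this. The session key, the resume prompt, and the response property names are hypothetical; the `forUser`/`continue` calls follow the video's naming.

```php
<?php

// routes/web.php
use App\Agents\SupportAgent;
use Illuminate\Support\Facades\Route;

Route::get('/chat-resume', function () {
    // Pull the last conversation ID stored after a previous reply
    // (session key name is an assumption).
    $conversationId = session('last_conversation_id');

    if (! $conversationId) {
        // Nothing in the session yet, e.g. a fresh browser refresh.
        return ['reply' => null, 'conversation_id' => null];
    }

    $response = (new SupportAgent())
        ->forUser(auth()->user())
        ->continue($conversationId)
        ->prompt('Where were we?'); // hypothetical resume message

    // Store the ID again so the next visit can also resume.
    session(['last_conversation_id' => $response->conversationId]);

    return [
        'reply'           => $response->text,
        'conversation_id' => $response->conversationId,
    ];
});
```

Persisting the ID in the database instead of the session would let the thread survive across devices, at the cost of an extra lookup.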
if, for any reason, you couldn't find the conversation ID in your session or your database. Then we instantiate the support agent. We check whether we actually got a conversation ID, and if we did, we continue the previous conversation based on that ID for the authenticated user. If we didn't get a conversation ID, which means this is the first message in the conversation, we create a new response. Either way, we return the reply (the text from the response) and the conversation ID, so the front end can keep the thread going. Let's go back to our routes file and add a new route for chat. It's a post request pointing at the controller's send method, and of course we add the auth middleware like we did in the last episode. We don't have a chat UI yet; that's coming in episode 7 with Livewire and streaming. This controller is the back end that will power it, so keep that in mind. To give you a rough idea of how this will work: the front end sends a post request to our chat endpoint with a message, for example "my order is late," without a conversation ID, because this is the first message. It gets back a reply with a conversation ID, and the next message will include that conversation ID. So the front end sends a request to the chat endpoint with the message and the conversation ID, and gets back a reply with the same conversation ID. That way the agent remembers the entire conversation. Now, a quick tip: if you need full control over the conversation history, maybe because you're loading messages from your own database or a different storage system, you can implement the messages method instead of using the trait. Let's go back to the support agent.
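The controller walked through above might look like the following sketch, with the route registration shown alongside it. Validation rules, JSON key names, and response properties are assumptions based on the video's description.

```php
<?php

namespace App\Http\Controllers;

use App\Agents\SupportAgent;
use Illuminate\Http\Request;

class ChatController extends Controller
{
    public function send(Request $request)
    {
        $validated = $request->validate([
            'message'         => ['required', 'string'],
            // Nullable: absent on the first message of a conversation,
            // so a missing ID never fails validation.
            'conversation_id' => ['nullable', 'string'],
        ]);

        $agent = (new SupportAgent())->forUser($request->user());

        $conversationId = $validated['conversation_id'] ?? null;

        // Continue the existing thread when we have an ID,
        // otherwise start a fresh conversation.
        $response = $conversationId
            ? $agent->continue($conversationId)->prompt($validated['message'])
            : $agent->prompt($validated['message']);

        // Either way, hand back the reply and the ID so the
        // front end can keep the thread going.
        return response()->json([
            'reply'           => $response->text,
            'conversation_id' => $response->conversationId,
        ]);
    }
}
```

The route from the video would then be registered roughly as `Route::post('/chat', [ChatController::class, 'send'])->middleware('auth');`.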
So, instead of using RemembersConversations, you can add a new method called messages. Inside messages you can, for example, fetch your chat history and return it. This will act exactly the same and gives you full control over what context the agent sees. It's very useful if you're migrating from a custom chat system or need to inject specific context. For most cases, though, RemembersConversations handles this perfectly, so I'm going to undo everything here, keep RemembersConversations, and move forward. So: two lines of code, one interface and one trait, and our support agent now holds real conversations. It remembers what the customer said, every order it looked up, and every response it gave. That's the foundation for a real support chat. Next time we're building a knowledge base with embeddings, so the agent can search FAQs and recommendations for answers.
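The custom-history escape hatch mentioned above could be sketched as a method on the agent class. The `messages` method name comes from the video; its signature, the `ChatHistory` model, and the role/content message shape are all hypothetical.

```php
<?php

// Inside the SupportAgent class, replacing the RemembersConversations trait.
// ChatHistory is a hypothetical Eloquent model holding your own message rows.
public function messages(string $conversationId): array
{
    return ChatHistory::where('conversation_id', $conversationId)
        ->orderBy('created_at')
        ->get()
        ->map(fn ($message) => [
            'role'    => $message->role,    // e.g. 'user' or 'assistant'
            'content' => $message->content,
        ])
        ->all();
}
```

This is the pattern to reach for when migrating from an existing chat system: the agent consumes whatever history array you return, instead of the trait's own tables.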
