What if your AI coding assistant could not only write infrastructure code, but also deploy it, test it, and fix issues automatically — all on your local machine? That's exactly what the LocalStack MCP Server makes possible.

In this session, we'll introduce the LocalStack Model Context Protocol (MCP) Server, a new tool that lets AI agents manage your entire local cloud development lifecycle through a conversational interface. You'll learn:

- What MCP is and why it's a game-changer for AI-assisted development
- How the LocalStack MCP Server turns manual cloud tasks into automated workflows
- How to set up and configure the server with your favorite AI editor (Cursor, VS Code, etc.)
- Real-world demos: deploying CDK apps, analyzing logs, running chaos tests, managing state with Cloud Pods, and more

Through hands-on examples, we'll walk through a complete workflow where an AI agent deploys a serverless application, verifies resources, troubleshoots issues, and tests resilience, all without leaving the conversation.

If you've ever wished your AI assistant could do more than just generate code, this talk will show you what's possible when agents can actually manage your local cloud environment.
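For readers who want a sense of what "configuring the server with your AI editor" looks like, MCP-capable clients such as Cursor typically register servers in a JSON config file (e.g. `.cursor/mcp.json`). The sketch below shows the generic shape of such an entry; the `command` and package name are placeholders, not the actual LocalStack distribution mechanism — check the LocalStack MCP Server docs for the real install command.

```json
{
  "mcpServers": {
    "localstack": {
      "command": "npx",
      "args": ["-y", "@localstack/mcp-server"]
    }
  }
}
```

Once registered, the editor's agent can discover the server's tools and invoke LocalStack operations conversationally.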

How much faster could your cloud application release cycles move if your developers didn’t need to deploy code to the cloud?
Local cloud development eliminates the security implications, cost concerns, and access restrictions of traditional cloud development by replicating production-quality application environments on local infrastructure.
Join us on Tuesday, December 16, at 1 p.m. Eastern Time for a live demo webinar to learn more about:
Even if you're not available to join the livestream, sign up here to receive the session recording in your inbox.

Modern software systems operate in complex, dynamic environments where failures are inevitable. Traditional monitoring and manual incident response are no longer sufficient to ensure resilience or customer satisfaction. This talk explores how to design and implement self-healing software systems by combining telemetry data with an AI-driven agentic approach. We’ll start by examining how high-quality telemetry forms the foundation for detecting anomalies and predicting failures. Next, we’ll show how modern GenAI (LLMs) can transform this telemetry into actionable insights for AI agents that interpret data, pinpoint root causes, and apply automated fixes. Through a practical, real-world example, you’ll see how telemetry and AI work together to create adaptive feedback loops that continuously improve system reliability, while freeing engineers from repetitive operational tasks.
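The adaptive feedback loop described above — telemetry in, anomaly detection, automated remediation out — can be sketched in a few lines. This is an illustrative toy, not the talk's implementation: it uses a rolling z-score over recent telemetry (the function names `make_detector` and `feedback_loop` are our own), where a real system would feed anomalies to an LLM-driven agent for root-cause analysis instead of a fixed callback.

```python
from collections import deque
from statistics import mean, stdev

def make_detector(window=20, threshold=3.0):
    """Rolling z-score anomaly detector over recent telemetry values.

    Returns a function that flags a sample deviating more than
    `threshold` standard deviations from the recent window.
    """
    history = deque(maxlen=window)

    def is_anomaly(value):
        anomalous = False
        if len(history) >= 5:  # wait for a minimal baseline
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > threshold:
                anomalous = True
        if not anomalous:
            history.append(value)  # learn only from normal samples
        return anomalous

    return is_anomaly

def feedback_loop(samples, detect, remediate):
    """Feed telemetry through the detector; trigger remediation on anomalies."""
    for step, value in enumerate(samples):
        if detect(value):
            remediate(step, value)

# Example: steady ~100 ms latencies, then a 500 ms spike triggers remediation.
fired = []
feedback_loop(
    [100, 101, 99, 102, 98, 100, 101, 99, 500],
    make_detector(),
    lambda step, value: fired.append((step, value)),
)
```

In a production setting the `remediate` hook is where the agentic layer sits: the agent interprets the anomaly against logs and traces, proposes a fix, applies it, and the loop observes whether telemetry returns to baseline.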

In this hands-on session, you'll learn how to level up your serverless development by integrating the AWS Toolkit for VS Code with LocalStack — enabling faster Lambda development, debugging, and testing without needing a live AWS account.

👨‍💻 Featuring:

Joel Scheuner, Senior Software Engineer at LocalStack, will show you how to:

✅ Configure the AWS Toolkit for VS Code to connect with LocalStack
✅ Deploy and test Lambda functions locally with full AWS emulation
✅ Set breakpoints and debug functions right from your IDE
✅ Iterate quickly on code changes with minimal setup overhead

You'll also hear from an AWS engineer sharing real-world insights and best practices for modern serverless workflows.

Whether you're new to serverless or already deep in AWS development, this session will help you code with confidence and streamline your Lambda workflow from start to finish — all from your laptop.
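One common way to point AWS tooling (including the AWS Toolkit and the AWS CLI/SDKs) at LocalStack is a dedicated profile whose `endpoint_url` targets LocalStack's default edge port. The fragment below is a minimal sketch assuming LocalStack is running on `localhost:4566`; the profile name `localstack` and the dummy credentials are our own choices, and older CLI/SDK versions may not honor `endpoint_url` in the config file.

```ini
# ~/.aws/config
[profile localstack]
region = us-east-1
output = json
endpoint_url = http://localhost:4566

# ~/.aws/credentials
[localstack]
aws_access_key_id = test
aws_secret_access_key = test
```

With the profile selected (e.g. `AWS_PROFILE=localstack`, or chosen in the Toolkit's connection picker), Lambda deploys and invocations go to the local emulator instead of a live account.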