What if your AI coding assistant could not only write infrastructure code, but also deploy it, test it, and fix issues automatically, all on your local machine? That's exactly what the LocalStack MCP Server makes possible.

In this session, we'll introduce the LocalStack Model Context Protocol (MCP) Server, a new tool that lets AI agents manage your entire local cloud development lifecycle through a conversational interface. You'll learn:

- What MCP is and why it's a game-changer for AI-assisted development
- How the LocalStack MCP Server turns manual cloud tasks into automated workflows
- How to set up and configure the server with your favorite AI editor (Cursor, VS Code, etc.)
- Real-world demos: deploying CDK apps, analyzing logs, running chaos tests, managing state with Cloud Pods, and more

Through hands-on examples, we'll walk through a complete workflow where an AI agent deploys a serverless application, verifies resources, troubleshoots issues, and tests resilience, all without leaving the conversation.

If you've ever wished your AI assistant could do more than just generate code, this talk will show you what's possible when agents can actually manage your local cloud environment.
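For orientation, MCP-capable editors typically register servers through a small JSON configuration. The snippet below is a hypothetical sketch of what a Cursor-style `mcpServers` entry for the LocalStack MCP Server could look like; the package name and option names here are placeholders, so take the exact command and settings from the official setup documentation.

```json
{
  "mcpServers": {
    "localstack": {
      "command": "npx",
      "args": ["-y", "@localstack/mcp-server"],
      "env": {
        "LOCALSTACK_AUTH_TOKEN": "<your-auth-token>"
      }
    }
  }
}
```

Once the editor picks up this configuration, the agent can call the server's tools (deploying stacks, reading logs, managing Cloud Pods) as part of a normal chat session.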

When it comes to productivity, developer experience is more than just a buzzword. An intuitive developer experience can help you get more out of LocalStack by democratizing access, cutting out manual tasks, and making environments easily interchangeable between LocalStack and AWS.

On a day-to-day basis, this could mean fewer tickets, less time spent creating environments, and more time on the important work that your environments support.

This demo session will show how LocalStack's new integration with Quali Torque can accelerate deployment on both LocalStack and AWS by:

* Using generative AI to create reusable environment templates that can be deployed to LocalStack and AWS interchangeably in just a few clicks.
* Providing a self-service catalog where your teams can find and provision environments quickly and easily, without needing access to create or modify resource configurations.
* Simplifying the deployment experience by eliminating the complexity and security requirements of running environments on AWS.
* Proactively tracking all activity to identify performance issues in LocalStack deployments and wasted cloud costs in AWS deployments.

Debugging serverless functions has always been challenging, often requiring repeated invocations, extensive log tracing, and cloud deployments to diagnose an issue. The new Lambda Debug Mode in LocalStack changes this by allowing developers to debug AWS Lambda functions directly in their IDE, with breakpoints, variable inspection, and step-through execution, without leaving their local environment.

In this presentation, Marco Edoardo Palma provides a hands-on demo of Lambda Debug Mode, from debugging standalone functions to handling multi-function workflows. Learn how this developer-first approach makes debugging serverless applications faster, smoother, and more intuitive.

## Resources

- Documentation: https://docs.localstack.cloud/user-guide/lambda-tools/debugging/#lambda-debug-mode-preview
- Samples: https://github.com/localstack-samples/localstack-pro-samples/tree/master/lambda-debug-mode
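To illustrate the IDE side of this workflow, attaching a debugger generally means adding an attach configuration that points at the debug port exposed for the function. The VS Code `launch.json` entry below is a hypothetical sketch for a Node.js function; the port, path mappings, and any LocalStack-specific settings are assumptions here and should be taken from the documentation linked above.

```json
{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Attach to LocalStack Lambda",
      "type": "node",
      "request": "attach",
      "address": "localhost",
      "port": 9229,
      "localRoot": "${workspaceFolder}/src",
      "remoteRoot": "/var/task",
      "skipFiles": ["<node_internals>/**"]
    }
  ]
}
```

With the function started in debug mode, launching this configuration lets you set breakpoints and step through the handler exactly as you would for any local process.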

For one-off tasks, AWS Lambda really can be incredibly easy. You write a few lines of code, deploy it, and you have a function running in the cloud, ready to respond to events and scale automatically, all for pennies.

But as your application grows, so does some necessary complexity. When a few one-off functions become a full serverless backend architecture made up of interconnected services, you'll need to pay careful attention to best practices to ensure that your application is easy to debug, maintain, and scale.

That's where AWS Powertools for Lambda fits in. It's a suite of reusable utilities designed to simplify bringing best practices for logging, tracing, metrics, idempotency, and more to your Lambda functions with minimal effort.

This demo session will dive into some of the functionality provided by the AWS Powertools (TypeScript) core libraries, such as:

- Encapsulating best practices into reusable libraries for structured logging, metrics collection, idempotency, and more.
- Leveraging Middy middleware to integrate common cross-cutting concerns, such as injecting Lambda context or automatically flushing metrics.
- Enabling local testing with LocalStack, allowing you to deploy and debug Lambda functions with structured logs, trace data, and embedded metrics.
- Providing modular examples that can be deployed to AWS or LocalStack with ease, enabling developers to explore the libraries.
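To make those core libraries concrete, here is a minimal sketch of a Powertools-instrumented handler that combines the Logger and Metrics utilities with Middy middleware. It assumes Powertools for AWS Lambda (TypeScript) v2-style imports and an illustrative `orders` service name; exact import paths and option names vary between major versions, so treat this as a sketch rather than code from the session itself.

```typescript
import middy from '@middy/core';
import { Logger } from '@aws-lambda-powertools/logger';
import { injectLambdaContext } from '@aws-lambda-powertools/logger/middleware';
import { Metrics, MetricUnit } from '@aws-lambda-powertools/metrics';
import { logMetrics } from '@aws-lambda-powertools/metrics/middleware';
import type { APIGatewayProxyEvent, APIGatewayProxyResult } from 'aws-lambda';

// Shared Powertools instances; serviceName and namespace are illustrative values.
const logger = new Logger({ serviceName: 'orders' });
const metrics = new Metrics({ namespace: 'DemoApp', serviceName: 'orders' });

const baseHandler = async (
  event: APIGatewayProxyEvent
): Promise<APIGatewayProxyResult> => {
  // Structured log entry with extra keys instead of string concatenation.
  logger.info('Processing order request', { path: event.path });

  // Record an embedded-metric-format (EMF) metric; flushed by the middleware below.
  metrics.addMetric('OrdersReceived', MetricUnit.Count, 1);

  return { statusCode: 200, body: JSON.stringify({ ok: true }) };
};

// Middy wires in the cross-cutting concerns: Lambda context on every log line
// and automatic metric flushing at the end of each invocation.
export const handler = middy(baseHandler)
  .use(injectLambdaContext(logger, { logEvent: true }))
  .use(logMetrics(metrics));
```

Wrapping the base handler with Middy keeps the business logic free of boilerplate, while the middleware handles context injection and metric flushing on every invocation; the same function can then be deployed against LocalStack or AWS to inspect the structured logs and metrics it emits.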