

AI assistants promised faster code and happier developers, but real productivity extends far beyond code generation. This session explores how AI reshapes the full developer experience across review, testing, and delivery, and what leaders must rethink to improve DevEx in an AI-enabled environment.



AI has revealed the limits of traditional productivity metrics. Learn how velocity and output measures miss critical shifts in workflow and developer experience, and how to evolve toward signals that reflect meaningful progress.


AI can increase output while quietly expanding review time, rework, and cognitive load. Discover how to separate real productivity gains from activity inflation and ensure AI adoption strengthens, rather than strains, your teams.




AI accelerates code creation, but our feedback systems are struggling to keep pace. When code generation takes seconds but human review, security checks, and testing still take days, teams face a new bottleneck. This session explores how the surge in AI-assisted code stresses legacy validation loops, quietly eroding quality, and what engineering leaders must do to keep iteration fast, reliable, and secure.



Engineering excellence now depends on how smoothly work flows after code is written. This session explores how AI affects cycle time, feedback loops, and cross-team coordination, and how platform engineering reduces friction to improve delivery and DevEx at scale.


After AI adoption, what defines a truly high-performing engineering organization? This closing keynote synthesizes the day’s insights into a clear framework for improving DevEx, aligning AI with platform discipline, and building sustainable performance beyond code generation.