Official Session Summary
Pulled from the live conference page.
We talk about "agents" constantly, but most are still trapped behind a glass screen. This session explores the engineering challenges of Embodied AI by turning the Reachy Mini into a real-time, multimodal Hype Robot. We will move past simple scripted movements to a system in which the robot perceives the audience and generates contextual, physical responses. I'll dive into the technical stack required to bridge the gap between high-level LLM reasoning and low-level servo actuation. Attendees will learn how to manage latency in vision-to-action loops and how to build something interesting and fun with an open-source robot connected to an LLM.
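The perceive-reason-act pipeline the abstract describes can be sketched as a latency-budgeted loop: grab a frame, ask a model for an intent, and only actuate if the result is still fresh. This is an illustrative sketch, not the Reachy Mini SDK; the gesture table, `classify_scene` stub, and `send_servo` callback are all assumptions standing in for the real vision model and servo interface.

```python
import time
from dataclasses import dataclass

@dataclass
class Gesture:
    name: str
    head_yaw_deg: float
    head_pitch_deg: float

# Hypothetical mapping from model intent labels to servo targets.
GESTURES = {
    "wave": Gesture("wave", 0.0, -10.0),
    "nod": Gesture("nod", 0.0, 20.0),
    "look_left": Gesture("look_left", 30.0, 0.0),
}

def classify_scene(frame):
    """Stand-in for a multimodal LLM call; returns an intent label."""
    # In a real loop this would be an (ideally async) model inference.
    return "nod"

def vision_to_action_loop(get_frame, send_servo, budget_s=0.2):
    """One iteration of a perceive -> reason -> act cycle.

    If perception + reasoning exceed the latency budget, the action is
    dropped: acting on a stale frame looks worse than doing nothing.
    """
    start = time.monotonic()
    frame = get_frame()
    intent = classify_scene(frame)
    gesture = GESTURES.get(intent)
    elapsed = time.monotonic() - start
    if gesture is not None and elapsed <= budget_s:
        send_servo(gesture.head_yaw_deg, gesture.head_pitch_deg)
        return gesture.name
    return None  # stale or unrecognized perception: skip this cycle
```

The key design choice here is dropping late actions rather than queueing them, which keeps the robot's physical responses tied to what the audience is doing right now.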
Speaker Background
Quick context on the person or people on stage.
Senior Director of Developer Relations at Akamai, with a practical bent toward developer platforms, demos that move, and embodied AI experimentation.
Why This Slot Matters
A compact framing layer for navigating the conference.
This is one of the more substantive abstract-backed sessions on the schedule; worth reading when you need enough context to decide whether to stay in the room.