
The Hand OS gives developers a universal interface that works across screens, devices, and environments. Voice. Gesture. Spatial awareness. Safe autonomy. One SDK. Infinite possibilities.
The Hand OS sits above every assistant, device, and system — giving users a single expressive interface that feels human, safe, and universal.
The Hand replaces the mouse, keyboard, and touchscreen with voice, gesture, spatial awareness, and safe autonomy.
Developers can build apps, plugins, games, and experiences that feel alive, expressive, and human — without pretending to be human.
The Hand OS includes:
Natural language commands for navigation, editing, control, and gameplay.
Recognizes taps, swipes, grabs, points, thumbs‑up/down, and more.
Understands buttons, fields, warnings, popups, sliders, and more.
Every gesture maps to a clear, predictable intent. Wave, point, heart, alert, guide — each one is expressive and unambiguous.
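For developers, that one-gesture-to-one-intent mapping could look something like the sketch below. The SDK is not yet published, so every name here (Gesture, Intent, onGesture, dispatch) is a hypothetical stand-in, not a real API.

```typescript
// Hypothetical sketch only: The Hand OS SDK is not yet published, so every
// name below (Gesture, Intent, onGesture, dispatch) is an illustrative
// assumption, not a real API.

type Gesture = "tap" | "swipe" | "grab" | "point" | "thumbs-up" | "thumbs-down";
type Intent = "select" | "scroll" | "move" | "focus" | "confirm" | "reject";

// Each gesture resolves to exactly one predictable intent.
const intentFor: Record<Gesture, Intent> = {
  "tap": "select",
  "swipe": "scroll",
  "grab": "move",
  "point": "focus",
  "thumbs-up": "confirm",
  "thumbs-down": "reject",
};

// A minimal handler registry standing in for the SDK's event surface.
type IntentHandler = (intent: Intent) => void;
const handlers: IntentHandler[] = [];

function onGesture(handler: IntentHandler): void {
  handlers.push(handler);
}

// Simulate a recognized gesture being dispatched as its intent.
function dispatch(gesture: Gesture): void {
  const intent = intentFor[gesture];
  handlers.forEach((handle) => handle(intent));
}

onGesture((intent) => console.log(`intent received: ${intent}`));
dispatch("thumbs-up"); // intent received: confirm
```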
Every interaction is filtered through one rule: The Hand will never cause harm, encourage harm, or allow harm — physically, emotionally, socially, or digitally.
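As a rough illustration, that rule can be pictured as a guard every action passes through before it runs. The types and the guardAction function below are assumptions made for the sketch; how The Hand actually assesses harm is not described here.

```typescript
// Hypothetical sketch only: illustrates the "never cause harm" rule as a
// guard applied to every action before execution. The Action type, harm
// categories, and guardAction are assumptions for illustration.

type HarmCategory = "physical" | "emotional" | "social" | "digital";

interface Action {
  description: string;
  // In a real system this assessment would come from the OS's safety layer;
  // here it is just a precomputed flag so the sketch stays self-contained.
  risks: HarmCategory[];
}

type GuardResult =
  | { allowed: true }
  | { allowed: false; reason: HarmCategory };

// Block anything that could cause, encourage, or allow harm in any category.
function guardAction(action: Action): GuardResult {
  if (action.risks.length > 0) {
    return { allowed: false, reason: action.risks[0] };
  }
  return { allowed: true };
}

const openDocument: Action = { description: "open shared document", risks: [] };
const sendInsult: Action = { description: "send hostile message", risks: ["emotional", "social"] };

console.log(guardAction(openDocument)); // { allowed: true }
console.log(guardAction(sendInsult));   // { allowed: false, reason: "emotional" }
```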
The Hand OS blends gestures, mood, and context to create interactions that feel alive — without pretending to be human.
Developers can add new behaviors, gestures, personalities, and integrations. The Hand OS is designed to grow with your ideas.
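One possible shape for that extension point is sketched below. HandExtension and registerExtension are illustrative names, not a published interface; they simply show how a custom gesture and the behavior it triggers might be contributed.

```typescript
// Hypothetical sketch only: shows one way an extension point for new
// gestures and behaviors could look. HandExtension and registerExtension
// are illustrative assumptions, not a published SDK interface.

interface HandExtension {
  name: string;
  // Custom gestures contributed by the extension, each mapped to an intent.
  gestures?: Record<string, string>;
  // Behaviors the extension adds, keyed by the intent that triggers them.
  behaviors?: Record<string, () => void>;
}

const extensions: HandExtension[] = [];

function registerExtension(ext: HandExtension): void {
  extensions.push(ext);
  console.log(`registered extension: ${ext.name}`);
}

// Example: a drawing app contributes a "circle" gesture and its behavior.
registerExtension({
  name: "sketchpad",
  gestures: { circle: "draw-shape" },
  behaviors: { "draw-shape": () => console.log("drawing a shape") },
});

// Later, the OS looks up which extension handles a given intent.
function runBehavior(intent: string): void {
  for (const ext of extensions) {
    const behavior = ext.behaviors?.[intent];
    if (behavior) {
      behavior();
      return;
    }
  }
  console.log(`no extension handles intent: ${intent}`);
}

runBehavior("draw-shape"); // drawing a shape
```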
The Hand OS orchestrates Siri, Alexa, Google Assistant, Copilot, and ChatGPT — giving users one expressive interface across all systems.
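Conceptually, orchestration means routing each request to whichever assistant can handle it while presenting one interface in front of them all. The sketch below models that with a toy Assistant interface and routeRequest function; none of the named assistants expose such an interface today, so this is purely illustrative.

```typescript
// Hypothetical sketch only: models orchestration as routing a single user
// request to whichever assistant backend claims it. The Assistant interface
// and routeRequest are assumptions; the named assistants do not expose
// this interface today.

interface Assistant {
  name: "Siri" | "Alexa" | "Google Assistant" | "Copilot" | "ChatGPT";
  canHandle: (request: string) => boolean;
  handle: (request: string) => string;
}

// Toy backends: each claims requests containing a keyword; the last is a fallback.
const assistants: Assistant[] = [
  {
    name: "Copilot",
    canHandle: (request) => request.includes("code"),
    handle: (request) => `Copilot handled: ${request}`,
  },
  {
    name: "ChatGPT",
    canHandle: () => true,
    handle: (request) => `ChatGPT handled: ${request}`,
  },
];

// One expressive interface in front, many assistants behind it.
function routeRequest(request: string): string {
  const assistant =
    assistants.find((a) => a.canHandle(request)) ?? assistants[assistants.length - 1];
  return assistant.handle(request);
}

console.log(routeRequest("review this code change")); // Copilot handled: ...
console.log(routeRequest("what's the weather?"));     // ChatGPT handled: ...
```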
The SDK includes tools for voice commands, gestures, behaviors, and integrations.
SDK coming soon — join the developer waitlist.
Join the first wave of developers shaping the universal expressive interface for everything.
Sign up for updates, promotions, and behind‑the‑scenes stories. Follow us on social to see how The Hand is growing — and how you can be part of it.