X-OOHRI is a mixed-initiative UI!
Users manipulate life-size, colocated virtual twins of objects in AR to issue precise robot instructions 🪄
When problems arise, robots simulate resolutions or suggest alternatives, and users can help through virtual interactions or physical actions ✨
2/4
Beyond pick-and-place, X-OOHRI exposes abstract robot actions via a radial menu after selecting a real-world object. Users then manipulate virtual twins to specify missing spatial parameters.
This can also support remote teleoperation 🎮
3/4