In #vr users can experience “magical” interactions, such as moving distant virtual objects with the Go-Go Interaction technique. How might we similarly extend people’s abilities in the physical world? 🪄

Excited to share Reality Promises, our #UIST2025 paper, led by the amazing Mo Kari ✨ @hci

🧵1/5

Through a Meta Quest 3 headset view, a user holds up their hand and pinches to open the MagicMake menu, which expands from a small icon into a full panel of virtual objects. They flick through options, skip a Coke Zero, and choose a coconut water. By reaching into the panel, they “pull out” the 2D object into 3D space. The object drops onto the desk and begins to materialize, while the menu icon shrinks back into a point. A progress effect shows the object “coming into reality.” In sync, a hidden mobile robot, also wearing a Meta Quest 3 headset, picks up a real coconut water from the kitchen counter and carries it to the desk. As the robot finishes and retracts, the virtual object fades into the real bottle, now physically present for the user to pick up.

Beyond materializing physical objects (seemingly out of thin air), users can manipulate out-of-reach objects via RealityGoGo — creating the illusion of telekinesis. 🪴

2/5

On the left is a desk; to the right, a far shelf where a small potted plant sits. The user, seated on a couch, reaches out toward the plant. A dashed line from the hand to the plant indicates the targeted object. With a pinch and arm retraction, a virtual duplicate of the plant moves closer while the physical plant disappears. The user then places the duplicate on a surface, such as the desk by a window, where it lands, and a countdown begins. A hidden robot then moves the physical plant to where the virtual twin has been placed, fulfilling the reality promise.
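The Go-Go technique referenced above maps real arm extension nonlinearly onto virtual reach; a minimal sketch of the classic mapping, which RealityGoGo presumably adapts (the threshold d and gain k here are illustrative, not the paper's values):

```python
def go_go_reach(r_real: float, d: float = 0.4, k: float = 6.0) -> float:
    """Classic Go-Go mapping: within distance d of the torso the virtual
    hand tracks the real hand 1:1; beyond d, reach grows quadratically,
    so a modest arm extension can reach far-away objects."""
    if r_real < d:
        return r_real
    return r_real + k * (r_real - d) ** 2

# Retracting the arm (shrinking r_real) pulls the grabbed duplicate back
# toward the user, as in the plant example above.
```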

Even virtual agents’ actions can have physical effects, with motion paths that divert attention from the hidden robot. 🐝

3/5

A bee-like virtual character flies toward a Pringles can on a table to the right of the scene. As it arrives, the real can is visually masked and a virtual duplicate appears. The bee carries the virtual can along a curving path toward the left. Near the end, the bee heads straight to a drop point; at the same moment, the physical can arrives there, moved by a hidden mobile robot. The virtual and physical coincide, the bee exits, and the can seems to materialize.
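A curving carry path like the bee's can be sketched as a quadratic Bézier whose control point bows away from the hidden robot's approach; this is our illustration of the idea, not the system's actual path planner:

```python
def bezier_path(p0, p1, ctrl, steps=20):
    """Quadratic Bezier from p0 to p1, bowing toward ctrl.
    Placing ctrl on the side opposite the hidden robot's approach
    keeps the user's gaze away from it during the hand-off."""
    pts = []
    for i in range(steps + 1):
        t = i / steps
        x = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * ctrl[0] + t ** 2 * p1[0]
        y = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * ctrl[1] + t ** 2 * p1[1]
        pts.append((x, y))
    return pts
```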

In #AR, using real-time on-device 3D Gaussian splatting, we create the illusion that physical changes occur instantaneously, while a hidden robot fulfills the “reality promise” moments later, updating the physical world to match what users already perceive visually. 🤖

4/5
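The "reality promise" idea maps naturally onto a software promise: the virtual change is rendered immediately, and a future resolves once the robot has made the world match. A hypothetical sketch (the names here are ours, not the system's API):

```python
import asyncio

async def reality_promise(show_virtual, robot_task, on_fulfilled):
    """Show the virtual stand-in instantly, then resolve once the
    hidden robot has physically delivered the object."""
    show_virtual()      # user perceives the change right away
    await robot_task()  # robot works out of view in the meantime
    on_fulfilled()      # fade the virtual object into the real one

async def demo():
    log = []
    async def robot():  # stand-in for the robot fetching the object
        await asyncio.sleep(0)
        log.append("robot done")
    await reality_promise(lambda: log.append("virtual shown"),
                          robot,
                          lambda: log.append("fulfilled"))
    return log
```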

A user wearing a headset looks at a mobile robot in a room. From the user's eyes, a viewing cone spans the robot from top to bottom. Point-based "splats" inside the cone are colored red, marking those that are rendered; splats outside are gray and discarded. As the headset or the robot moves, the cone adapts so that it always covers the robot. A second cone covers the robot's arm, growing and shrinking as the arm extends and retracts. Feathering blends the cone's edges with the background, so it acts like an invisibility cloak.
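The masking cone comes down to a per-splat containment test: splats whose direction from the eye falls inside the cone toward the robot get rendered over it, with a feathered falloff at the boundary. A minimal sketch of such a test (parameters are illustrative):

```python
import math

def cone_weight(eye, splat, axis, half_angle, feather=0.1):
    """Return 1.0 for splats inside the masking cone toward the robot,
    0.0 outside, with a smooth feathered band across the boundary so
    the mask blends into the passthrough background."""
    d = [s - e for s, e in zip(splat, eye)]
    n = math.sqrt(sum(c * c for c in d)) or 1e-9
    cos_a = sum(dc * ac for dc, ac in zip(d, axis)) / n  # axis: unit vector
    angle = math.acos(max(-1.0, min(1.0, cos_a)))
    inner, outer = half_angle - feather, half_angle + feather
    if angle <= inner:
        return 1.0
    if angle >= outer:
        return 0.0
    return (outer - angle) / (outer - inner)  # linear feather band
```

Widening `half_angle` is how the second cone can grow to keep covering the arm as it extends.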