Game Developer Deep Dives are an ongoing series with the goal of shedding light on specific design, art, or technical features within a video game in order to show how seemingly simple, fundamental design decisions aren’t really that simple at all.
Earlier installments cover topics such as the sound design of Avatar: Frontiers of Pandora, how camera effects, sound FX, and VFX created a smooth, high-octane movement system in Echo Point Nova, and the technical process behind bringing The Cycle: Frontier to Unreal Editor for Fortnite.
For nearly a decade, Owlchemy Labs has explored what it means to touch the virtual world. From Cosmonius High’s alien high-school hijinks, to Job Simulator’s physical comedy, to Vacation Simulator’s playful expressiveness, every game advanced our studio’s mission: make VR interactions so natural that players forget the interface entirely.
Dimensional Double Shift (DDS) represents the next step in that evolution: a full commitment to hand tracking. The result is a new design language grounded in embodiment, iteration, and accessibility.
“Moving from a controller, where people are used to pressing A or B or pulling a trigger, introduces a lot of new factors,” said Alex Covert, Lead Gameplay Engineer. “Every time we design a new appliance for a dimension, we ask: How will the player actually grab this?”
Expert System Engineer Marc Huet describes this shift as both philosophical and technical. “Our goal was to make hands feel present and believable—to minimize dissonance between what you see and what you feel,” he explained. “That meant always keeping hands visible, supporting many grip types, and ensuring the virtual hand never broke immersion.”
The Challenge
1. Lessons Learned from Early Failures
Early prototypes exposed the hidden complexity of human motion. “We had issues with hand sizing,” Covert explained. “People felt uncomfortable using a giant hand that didn’t match their own.”
The fix wasn’t a player-facing slider. The system now scales the virtual hands to match the player’s tracked hand size automatically—one of those invisible changes that quietly removes discomfort and makes everything feel more ‘you.’ A small change, but a big comfort win. It became a shorthand lesson internally: the best UX improvements are often the ones nobody notices, because nothing feels wrong anymore.
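As a rough sketch of what that invisible scaling might look like (the reference measurement, clamp range, and function names here are illustrative assumptions, not Owlchemy’s actual code):

```python
# Hypothetical sketch: scale the virtual hand model to the player's
# tracked hand size. Reference length and clamp range are assumptions.

REFERENCE_HAND_LENGTH_M = 0.18  # size the virtual hand model was authored at

def hand_scale_factor(tracked_hand_length_m: float) -> float:
    """Return a render scale so the virtual hand matches the real one.

    tracked_hand_length_m: wrist-to-middle-fingertip distance reported
    by the hand-tracking runtime.
    """
    scale = tracked_hand_length_m / REFERENCE_HAND_LENGTH_M
    # Clamp so noisy tracking can't produce a giant or vanishing hand.
    return max(0.7, min(1.3, scale))

# A player whose hand measures 16 cm gets a slightly smaller virtual hand.
print(hand_scale_factor(0.16))  # ~0.89
```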
Sometimes, the obstacles weren’t ergonomic but systemic. “We realized that a sock-puppet prototype conflicted with Meta’s hand-tracking system,” Covert recalled.
The problem was less technical than behavioral. The first thing most people do with a puppet is talk to themselves. In practice, that motion turned out to be identical to Meta’s system menu gesture. Instead of animating a character, players were repeatedly summoning the OS.
There was no clever workaround. The gesture belonged to the platform. So the team didn’t redesign the interaction — they cut puppets.
Huet’s early experimentation echoed those constraints. “At first, our grab detection was binary—you were either grabbing or not,” he said. “It worked for Vacation Simulator, but it didn’t feel alive. We wanted to see your fingers react, your hands exist in the world.”
The team’s first step toward that goal was rebuilding their “grab logic” from the ground up. Huet described the process as discovering just how unpredictable real human motion is: “There are so many valid ways to grab something—we had to decide which of those looked right in VR.”

Image via Owlchemy Labs/Google
2. Natural vs. Intuitive
Through hundreds of hours of playtesting, the team learned that natural and intuitive don’t always align.
“Every person does it differently,” Covert said. “Some pinch, some grab. The closer we get to how people actually use their hands, the less we have to explain.”
Huet’s research helped formalize that variability. Drawing from human-factors research on everyday grasping—such as a 2016 IEEE study by robotics researchers Tomáš Feix and colleagues that categorizes common human grips based on form and intent—Huet built a library of grab techniques for VR. This includes grips like corner, clump, cylinder, hilt, and wand—each mapped to real-world use.
“We looked at how people hold cups, phones, and tools,” he explained. “Every grip has a story—a context—and we wanted that reflected in VR.” He also coined internal terms like “closedness,” a 0-to-1 measure of how open or closed the player’s hand is, used to define when a grab occurs.
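Read that way, “closedness” can be sketched as normalized finger curl. The weighting and threshold below are illustrative guesses, not the studio’s actual values:

```python
# Illustrative sketch of a "closedness" metric: a 0-to-1 value derived
# from per-finger curl (0 = fully open, 1 = fully closed). The averaging
# and the grab threshold are assumptions.

def closedness(finger_curls: list[float]) -> float:
    """Average normalized curl across the fingers."""
    return sum(finger_curls) / len(finger_curls)

GRAB_THRESHOLD = 0.6  # assumed: the hand counts as grabbing past this

def is_grabbing(finger_curls: list[float]) -> bool:
    return closedness(finger_curls) >= GRAB_THRESHOLD

# A real grip library would weight fingers per grip type: a pinch cares
# mostly about thumb and index, a cylinder grip about the whole hand.
print(is_grabbing([0.9, 0.9, 0.1, 0.1, 0.1]))  # False: closedness 0.42
print(is_grabbing([0.9, 0.9, 0.8, 0.8, 0.7]))  # True: closedness 0.82
```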

Image via Owlchemy Labs/Google
“Some players over-grab, some pose-match,” Huet said. “You have to balance both so things don’t feel sticky or slippery. It’s subtle math that ends up shaping how people play.”
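One common way to strike that balance is hysteresis: a grab engages at a higher closedness than it releases at, so jitter near the boundary never flickers between grabbed and dropped. A minimal sketch, with assumed thresholds:

```python
# Assumed sketch: hysteresis on the grab signal. Separate engage and
# release thresholds keep tracking jitter from toggling the grab state.

ENGAGE_AT = 0.65   # closedness must rise above this to start a grab
RELEASE_AT = 0.45  # ...and fall below this to end it (assumed values)

class GrabState:
    def __init__(self) -> None:
        self.grabbing = False

    def update(self, closedness: float) -> bool:
        if not self.grabbing and closedness > ENGAGE_AT:
            self.grabbing = True
        elif self.grabbing and closedness < RELEASE_AT:
            self.grabbing = False
        return self.grabbing

state = GrabState()
for sample in [0.5, 0.7, 0.6, 0.5, 0.4]:  # noisy closedness samples
    print(sample, state.update(sample))
# The grab survives 0.6 and 0.5; only dropping to 0.4 releases it.
```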
This systematization allowed the team to move beyond imitation toward consistency. “It’s fascinating research,” Covert noted. “Marc showed pictures of different hand poses people use when grabbing things — holding a phone versus holding a cup. It grounded our approach.”

Image via Owlchemy Labs/Google
The Approach
1. Designing for Accessibility Through Simplicity
Removing controllers didn’t just change the interface — it opened the door for more inclusive play.
“One of our big considerations is one-handed players,” said Covert. “At one point, we had a two-handed twist-style pepper shaker, but that created accessibility concerns. So we added a shake alternative.”
This philosophy carried through every mechanic. DDS introduced “snap points” for setting items down easily, freeing a player’s hand mid-task.
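A minimal sketch of how a snap point might capture a released item (the capture radius and names are assumptions, not the shipped system):

```python
# Hypothetical snap-point sketch: a released item settles onto the
# nearest designated point within range instead of dropping.

import math

SNAP_RADIUS_M = 0.12  # assumed capture radius around each snap point

def nearest_snap_point(item_pos, snap_points):
    """Return the closest snap point within range, or None."""
    best, best_dist = None, SNAP_RADIUS_M
    for point in snap_points:
        d = math.dist(item_pos, point)
        if d <= best_dist:
            best, best_dist = point, d
    return best

counter = [(0.0, 1.0, 0.5), (0.3, 1.0, 0.5)]  # snap points along a counter
print(nearest_snap_point((0.05, 1.02, 0.48), counter))  # settles on the first
```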
“In the diner, the squeeze bottles have a threshold for how hard you need to squeeze,” said Emma Atkinson, Technical Designer. “But if that’s difficult, you can just tilt them and they’ll pour out.”
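That “either input works” pattern reduces to a simple OR-gate over two thresholds. A sketch with assumed values:

```python
# Illustrative squeeze-or-tilt bottle: pouring triggers on either a firm
# squeeze or a simple tilt. Both thresholds are assumptions.

SQUEEZE_THRESHOLD = 0.5   # normalized grip pressure, 0..1
TILT_THRESHOLD_DEG = 100  # tilting past this angle pours on its own

def is_pouring(squeeze: float, tilt_deg: float) -> bool:
    """Accessible OR-gate: either input path produces the same result."""
    return squeeze >= SQUEEZE_THRESHOLD or tilt_deg >= TILT_THRESHOLD_DEG

print(is_pouring(squeeze=0.7, tilt_deg=20))   # True: squeezed hard enough
print(is_pouring(squeeze=0.1, tilt_deg=120))  # True: tilting alone works
```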
Moving to hand tracking also meant removing unnecessary instruction. Instead of teaching players new button metaphors or gesture rules, the team focused on subtracting friction and trusting existing human intuition. Because people already know how to use their hands, many interactions could be learned simply by doing—without prompts, overlays, or step-by-step tutorials.
Huet extended that philosophy to invisible design systems. “Accessibility isn’t always about menus or toggles,” he said. “Sometimes it’s about thresholds—if a player’s range of motion is limited, we can scale the sensitivity so a smaller gesture still counts. The goal is to let everyone feel capable in VR.”
That sensitivity scaling, originally built for testing, became a subtle yet powerful accessibility feature. “You don’t have to announce accessibility,” Huet added. “You can just design it in.”
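A sketch of how such threshold scaling might work, using the shake gesture from earlier as the example (the speed value and sensitivity knob are assumptions):

```python
# Assumed sketch of gesture-sensitivity scaling: the same threshold test,
# but the threshold shrinks with a per-player factor so smaller motions
# still register.

BASE_SHAKE_SPEED = 1.2  # m/s, assumed default for the shake gesture

def shake_registered(hand_speed: float, sensitivity: float = 1.0) -> bool:
    """sensitivity 1.0 = default; 2.0 = half the motion is required."""
    return hand_speed >= BASE_SHAKE_SPEED / sensitivity

print(shake_registered(0.7))                   # False at the default
print(shake_registered(0.7, sensitivity=2.0))  # True once scaled
```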
2. Creating Self-Haptics
Without vibration motors or triggers, the team reimagined tactile feedback from the ground up.
“The squishable UI buttons and keyboards give tactile feedback because your fingers touch each other,” Covert explained. “That’s what we call self-haptics. You feel yourself performing the action instead of relying on vibration.”
Atkinson added, “You don’t want your hand to become the object—you want to hold it. We try to minimize cognitive dissonance between what you see and what you feel.”
Huet built on that idea technically, integrating collision physics that made contact feel grounded. “The raw hand data comes straight from the device,” he explained. “We then run physics updates to see where your virtual hand should be after impact. There’s a little tolerance—enough to give you that ‘hit’ feeling before it breaks through.”
He described it as “letting physics fake haptics.” “You’re not feeling a vibration,” he said, “but when the virtual hand resists or stops, your brain fills in the feedback.”
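In one dimension, that pattern might look like the sketch below: the rendered hand stops at a surface while the tracked hand keeps going, up to a tolerance, then breaks through and rejoins the tracked pose. The tolerance value is an assumption:

```python
# Assumed 1-D sketch of "physics faking haptics": the visible hand
# resists a surface briefly before breaking through to rejoin tracking.

BREAKTHROUGH_TOLERANCE_M = 0.08  # assumed max hand/surface penetration

def visible_hand_pos(tracked_x: float, surface_x: float) -> float:
    """Where to render the hand, with a wall at surface_x approached
    from below."""
    if tracked_x <= surface_x:
        return tracked_x  # no contact: render exactly where tracked
    penetration = tracked_x - surface_x
    if penetration <= BREAKTHROUGH_TOLERANCE_M:
        return surface_x  # resist: the visual stop is the "hit" cue
    return tracked_x      # past tolerance: break through

for x in [0.95, 1.00, 1.05, 1.12]:
    print(x, visible_hand_pos(x, surface_x=1.0))
```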
That philosophy—turning embodiment itself into feedback—became core to Owlchemy’s design lexicon, alongside “bubble pass,” the intuitive hand-to-hand object transfer system.
3. Designing for Shared Gestures
Early builds tried to make passing explicit: both players had to perform a synchronized “handoff” gesture. The team replaced it with what became the “bubble pass”—an idea from Tim Winsky, implemented by Huet—where a thrown item hovers in front of the other player long enough to be grabbed. The result is more playful, and it teaches itself.
Even during internal playtests, people discovered the new pass system and had that moment of joy when they figured it out on their own.
The change was data-driven. “We added analytics to see how often players used the old passing system and found that almost no one was using it,” Covert said. “That told us it was time to rethink the design.”
Huet’s grab system played an unseen role here too. “When you toss something, timing matters; milliseconds affect how real it feels,” he said. “So we built in what we call ‘sticky’ and ‘slippery’ release thresholds. It’s how you tell the difference between handing off a mug and throwing a beach ball.”
His experiments revealed that velocity and openness need to respond to intent. “If you open your hand slowly, the object should linger. If you open fast, it should fly,” Huet explained. “Those little cues make shared gestures feel believable.” It also learns what ‘I’m ready’ looks like: hand up, waiting. When it sees that, it helps the object meet the hand—so catching feels intentional instead of accidental.
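Taken together, those cues suggest logic along these lines; the damping factor, opening-speed cutoff, and catch-assist gain are all illustrative assumptions:

```python
# Illustrative sketch of intent-aware release and catch assist. How fast
# the hand opens decides linger vs. fly; an open, waiting hand pulls a
# hovering object toward it. All constants are assumed.

SLOW_OPEN = 0.5  # closedness units per second; below this is a handoff

def release_velocity(hand_velocity: float, opening_speed: float) -> float:
    """Slow open = 'sticky' (object lingers); fast open = it flies."""
    if opening_speed < SLOW_OPEN:
        return hand_velocity * 0.3  # damped: reads as a handoff
    return hand_velocity            # full send: reads as a throw

def catch_assist(object_pos, hand_pos, hand_closedness):
    """Nudge a hovering object toward a hand posed 'ready to catch'."""
    if hand_closedness < 0.2:  # open hand, waiting (assumed cutoff)
        # Move 20% of the way toward the hand each step (assumed gain).
        return tuple(o + 0.2 * (h - o) for o, h in zip(object_pos, hand_pos))
    return object_pos

print(release_velocity(2.0, opening_speed=0.2))  # 0.6: the mug lingers
print(catch_assist((0.0, 1.2, 0.5), (0.1, 1.3, 0.4), hand_closedness=0.1))
```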
4. Designing Around Technical Limits
Hardware realities still present creative challenges. “When you point away from yourself with the sprayer, your hand can block the headset cameras, so it doesn’t always detect the motion,” Covert explained.
Instead of seeing such limitations as setbacks, the team treats them as opportunities for innovation. “We’ve even talked about adding aerosol spray cans where you press the top with your index finger,” Covert added. “We don’t have support for that yet, but it’s on the wishlist.”
Huet elaborated on those constraints: “Hand occlusion, lighting, tracking speed—those are constant battles,” he said. “If a hand moves too fast or leaves the camera’s field of view, tracking breaks. So we design around that by keeping actions close, within this invisible box in front of you.”
Environmental conditions also shape interaction design. Hand tracking behaves differently depending on lighting, camera visibility, and the underlying capabilities of the headset itself. In low-light or high-occlusion situations, tracking data can become noisy, and not all systems compensate for that noise in the same way: some headsets include infrared illuminators to support hand tracking in low light, while others degrade faster as lighting drops.
Rather than treating those moments as failure states, the team designed interactions to be forgiving by default: dropped objects recover, missed grabs self-correct, and small errors never cascade into frustration. The guiding principle was simple: Players should never feel punished by physics or hardware constraints.
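A tiny sketch of that forgiveness rule: if an item ends up on the floor and out of reach, it quietly returns to its last snap point. The reach radius and floor test are assumptions:

```python
# Assumed recovery rule: a dropped item that lands out of reach returns
# to its last snap point instead of being lost under the counter.

import math

FLOOR_Y = 0.0
REACH_RADIUS_M = 1.0  # assumed comfortable reach around the player

def recover_if_lost(item_pos, last_snap_point, player_pos):
    """Return where the item should be after a drop settles."""
    on_floor = item_pos[1] <= FLOOR_Y + 0.02
    out_of_reach = math.dist(item_pos, player_pos) > REACH_RADIUS_M
    if on_floor and out_of_reach:
        return last_snap_point  # quiet self-correction, no penalty
    return item_pos

print(recover_if_lost((2.0, 0.01, 0.0), (0.2, 1.0, 0.4), (0.0, 1.6, 0.0)))
# -> (0.2, 1.0, 0.4): the item pops back to its snap point
```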
The Results
Hand tracking didn’t just expand accessibility. It made play feel human.
Players naturally gestured, waved, and improvised together. They learned by doing, not by reading.
“We almost don’t need to design new ways of doing things,” Covert said. “The closer we get to how people actually use their hands, the more natural the game feels.”
Huet added, “Our success is when people stop noticing the system. When your virtual hand feels like your hand—when you don’t think about grabbing, you just grab—that’s when VR disappears and embodiment takes over.”
Across development, the team built up its own iterative design language. Terms like self-haptics and bubble pass became shorthand for a culture of experimentation and discovery. “You could almost make a dictionary of the words we’ve made up,” Atkinson laughed. “It’s our own language.”
For Owlchemy Labs, hand tracking reaffirmed its core philosophy: interactions should be instinctive, inclusive, and joyful.
“We’re still improving our authoring tools,” Huet said. “But every iteration teaches us something new about how humans move—and how to make VR move with them.”

Image via Owlchemy Labs/Google
Key Takeaways
Hand tracking is both interface and embodiment. Designing gestures means understanding how people feel feedback, not just how they perform it.
Authoring meets procedural motion. Strong hand-tracking systems let developers lock poses when precision matters (e.g., picking up a cup one way) while still allowing motion to adapt to how players actually reach, slide, and settle into a grab. That mix of authorship and procedurality is a key aspect of the system that many developers miss; they try too hard to do only one or the other.
Accessibility improves when interaction simplifies. Removing hardware often clarifies rather than complicates the experience.
Iteration is the constant. Every friction point (hand size, gesture mismatch, or camera occlusion) drives better design.
Analytics close the loop. Observation, data, and playtesting inform every refinement.
VR’s future is hands-on. Interaction and embodiment are no longer separate disciplines. They’re one and the same.