
Episode 16 - Building Intuitive Interactions in VR: Interaction SDK and Presence Platform

Navyata Bawa, a Developer Advocate at Meta Open Source, chats with Interaction SDK Software Engineer Danny Yaroslavski about how to add immersive VR interactions using Interaction SDK.

Summary

In this episode of The Diff podcast, host Navyata Bawa talks to Interaction SDK Software Engineer Danny Yaroslavski about how to add immersive VR interactions using Interaction SDK. They talk about how the First Hand showcase app and the Interaction SDK Samples app let developers get hands-on experience with hand features, the challenges that come with creating a seamless user experience, and how pose and gesture detection tools work together.

Danny Yaroslavski


Episode Transcript

Building Intuitive Interactions in VR: Interaction SDK and Presence Platform

[00:00:00] Navyata Bawa: Hi everyone. This is Navyata, and thank you so much for joining us today. In this episode of The Diff, we'll discuss how to build intuitive interactions in VR using the Interaction SDK, and learn more about what Presence Platform offers. We will learn about the Interaction SDK features, the First Hand showcase, the open source First Hand sample, and other resources available to you to help you get started with Presence Platform and Interaction SDK.

[00:00:33] And with me today is Danny, who's a software engineer on the Interaction SDK team, here to tell us more about Presence Platform and how we can use Interaction SDK to add natural interactions to our own VR apps. Hi, Danny. It's great to have you here with us. Could you please tell us a little about yourself and your work?

[00:00:50] Danny Yaroslavski: Yeah, it's great to be here. So I'm Danny. I'm a software engineer here on the Input Frameworks team at Meta. My team works on building [00:01:00] interaction models and interaction systems for spatial mediums, be that VR or MR, both for third parties and for our internal teams here at Meta. This includes interaction models for both controllers and hands.

[00:01:14] Things like interacting with rays, direct touch, poking at buttons, grabbing, as well as several showcases that demonstrate interactions like these, like the Interaction SDK Samples app and the First Hand showcase app.

[00:01:32] Navyata Bawa: Thanks for the introduction, Danny. You mentioned the First Hand showcase app.

[00:01:36] Let's talk a little bit about that. Could you tell us what this showcase app is about and how it works?

[00:01:40] Danny Yaroslavski: Yeah. So First Hand is a showcase app that lets you experience some of the magic of hands and hand interactions in a really fun way. In First Hand, you get to play with a few switches and levers and a holographic tablet, all while you're solving puzzles in this kind of virtual [00:02:00] world.

[00:02:00] In this app, you get to build yourself a pair of robotic gloves and kind of play with the superpowers that come with them. Along the way, you get to test all of these different interactions and interaction paradigms that you might find in the Interaction SDK, such as direct touch and grab.

[00:02:20] On top of First Hand, we also have the Interaction SDK Samples app, which similarly lets you test these kinds of interactions, but demonstrates them in the context of the same samples that you might find in the Interaction SDK. That's kind of an easy way to try several of them for yourself without even having to open Unity, and really just test the entire set of things that we offer with the Interaction SDK.

[00:02:55] Navyata Bawa: That is awesome. It's great to see a real-world example showcasing how all these [00:03:00] interactions work and how developers can use these in their own games. So the First Hand sample that you mentioned, how do developers get access to it?

[00:03:07] Danny Yaroslavski: Absolutely. Both First Hand and the Interaction SDK Samples app are available for free to download on App Lab.

[00:03:14] When it comes to the code itself, there are two resources you can follow. For the Interaction SDK, you can download it as part of the Oculus All-in-One integration package. And for the First Hand showcase, we actually have a slice of some of the interactions that we've built in that app available to download on GitHub.

[00:03:37] Really, our goal with Interaction SDK is to empower developers. We want developers who are building spatial experiences not only to not have to start from scratch, but also to get a bunch of the best practices and best designs that we've built out internally at Meta with researchers, designers, and engineers, so that you can incorporate them right into your [00:04:00] experiences with really little work on your end.

[00:04:05] This kind of allows you to provide your users with the best experience right off the bat. And that's just the start. Interaction SDK is really part of this overall larger effort called Presence Platform, and is really one of the tools in your toolset that developers can dig into.

[00:04:24] Navyata Bawa: That sounds great. Having a sample that people can refer to when they build their own apps can come in really handy. So you mentioned that Interaction SDK is part of Presence Platform. What other SDKs does Presence Platform consist of?

[00:04:37] Danny Yaroslavski: Yeah. Presence Platform really focuses on providing developers the building blocks that they would need for both mixed reality and virtual reality.

[00:04:47] Again, all of this is to enable developers to build their own experiences. So in addition to Interaction SDK for natural interactions with hands and controllers, Presence Platform includes several APIs, like [00:05:00] the Passthrough and Scene APIs for mixed reality experiences, as well as the Voice SDK for voice-driven inputs.

[00:05:07] You can really imagine that when you have all of these capabilities together and you mix and match them, you can create some really, really cool stuff. So, like First Hand, there are other apps available on App Lab that you can download today, and you can see how these SDKs can really come together to form a cohesive whole.

[00:05:26] One of these apps is called World Beyond. This one actually mixes several of the capabilities of Presence Platform to really give you this out-of-this-world experience.

[00:05:37] Navyata Bawa: Wow. Interaction SDK can make VR experiences so much more immersive. What are some of the challenges that arise when developing hand interactions in VR, and how does Interaction SDK help resolve some of them?

[00:05:50] Danny Yaroslavski: Yeah. Hands are really this kind of incredible input. They're both something familiar [00:06:00] but not quite, because when you start putting hands into virtual experiences, there are some things that you miss from real life, or from a physical interaction, that you kind of have to make up for when you get into the virtual world.

[00:06:17] There are some ways that you make up for these differences. For instance, for not being able to physically touch something when you're doing direct interaction, there are ways that you can change the visuals or the heuristics to still make it feel compelling and still feel realistic in a virtual environment.

[00:06:39] So, like the one I just mentioned for direct touch: when we interact with a virtual button, our hands will normally not stop at a physical surface. There is no physical surface to stop at when you're in virtual reality. So in Interaction SDK, one of the things that we provide is something called [00:07:00] touch limiting.

[00:07:00] And this is purely a visual affordance that makes it seem like your virtual hand stops at this virtual surface, even if your tracked hand goes beyond it. Touch limiting is one such example. Another example is in hand grab. When you grab and pick up a virtual object, your fingers will probably go through the virtual object, but this doesn't match the physical world, and really, if I'm picking up something like a vase,

[00:07:38] I kind of want my fingers to conform around the object. And so my tracked hand in this case will likely close and feel natural, have that self-haptic response, but the hand grab visual will reposition the visual hand joints to be around the vase itself. So these are [00:08:00] kind of the first step in bringing hands into a virtual environment, really making up for this lack of physical objects with the virtual visuals that you need to provide.
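
To make the touch limiting idea concrete, here is a minimal sketch of clamping a visual hand to a button's surface while the tracked hand passes through. This is not Interaction SDK code; the class and field names below are invented for illustration, and the SDK ships this behavior as a configurable feature rather than something you write yourself.

```csharp
using UnityEngine;

// Hypothetical illustration of the "touch limiting" idea described above:
// the rendered (visual) hand is not allowed to sink below a button's surface,
// even though the tracked hand keeps moving through it.
public class TouchLimitingSketch : MonoBehaviour
{
    public Transform trackedHand;   // raw hand-tracking pose
    public Transform visualHand;    // the hand mesh the user actually sees
    public Transform buttonSurface; // surface plane; its up vector is the surface normal

    void LateUpdate()
    {
        Vector3 toHand = trackedHand.position - buttonSurface.position;
        float depth = Vector3.Dot(toHand, buttonSurface.up);

        // If the tracked hand has pushed past the surface, project the visual
        // hand back onto the surface so it appears to stop at the button.
        Vector3 clamped = depth < 0f
            ? trackedHand.position - buttonSurface.up * depth
            : trackedHand.position;

        visualHand.SetPositionAndRotation(clamped, trackedHand.rotation);
    }
}
```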

[00:08:12] But on top of that, there are other considerations that you really have to keep in mind. You have to take into account latency. You have to take into account the fact that when you are doing interactions with an input device that's effectively driven by computer vision, there is going to be some amount of delay between when you do an action and when you get a response back.

[00:08:38] On top of which, when you do have computer vision, you also introduce things like noise and the potential for jittering and having things not feel quite right. It's not ideal to put your hands in front of you and feel like you've had one too many cups of coffee, right? So with [00:09:00] Interaction SDK, there are lots of different options that you as a developer have that are included to mitigate some of these things.

[00:09:10] Many of these revolve around taking the hands input and maybe modifying it in some way. Let's say if you have input that's moving too much, you can add a filter on top of the hand joints and smooth out that motion, knowing that there's a trade-off: now you're a little more latent as well. If you're interacting with things and grabbing them and there's jitter on those objects,

[00:09:34] having that kind of filter applied on your hand joints will make the object move more smoothly over time. One other kind of modification for hand joints has to do with when you're moving your hands out of the view frustum of the camera. When that kind of thing happens, if you're holding an object, the last thing you want is for that object to fall or get lost just because your hand is out of view.

[00:09:57] And so this is another case where, [00:10:00] automatically, if you're using the interactions that we have in Interaction SDK, as you move your hands out of view, an ongoing selection may continue until you bring your hand back in from out of view, and then the object that you had been carrying previously is still going to be attached to your hand until you actually trigger a release.
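
As a rough illustration of the smoothing trade-off described a moment ago, here is a minimal sketch of an exponential moving average applied to hand-joint positions: more smoothing means less jitter but more latency. The class and names are hypothetical; Interaction SDK provides its own filtering options.

```csharp
using UnityEngine;

// Hypothetical sketch of joint smoothing: an exponential moving average over
// raw hand-joint positions. A higher smoothing factor removes more jitter but
// adds more latency, which is exactly the trade-off discussed above.
public class JointSmoothingSketch
{
    private readonly Vector3[] smoothed;
    private readonly float smoothing; // 0 = raw input, closer to 1 = heavier smoothing
    private bool initialized;

    public JointSmoothingSketch(int jointCount, float smoothing = 0.6f)
    {
        smoothed = new Vector3[jointCount];
        this.smoothing = Mathf.Clamp01(smoothing);
    }

    // Call once per frame with the raw joint positions; returns filtered positions.
    public Vector3[] Apply(Vector3[] rawJoints)
    {
        for (int i = 0; i < rawJoints.Length; i++)
        {
            smoothed[i] = initialized
                ? Vector3.Lerp(rawJoints[i], smoothed[i], smoothing) // blend toward history
                : rawJoints[i];                                      // first frame: no history yet
        }
        initialized = true;
        return smoothed;
    }
}
```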

[00:10:22] So I just talked about latency, tracking issues, and virtual-physical differences. But really these are the kinds of things that are uniquely different with hands and really make it both more challenging and more interesting to develop for them. Hands are really different, and there are a lot of expectations for how users want to use their hands.

[00:10:50] And so really, when you're designing for hands input, it's not enough to just transfer knowledge from [00:11:00] controllers over to hands. There are a lot of expectations that users have of their hands because they've been growing up with them, they've been using them all their life. So really it's about making that gap between virtual hands and physical hands smaller in all the ways that you can, whether that's removing jitter or making virtual interactions look more natural, really making the experience of using hands feel like users

[00:11:28] are present, like they have the presence of their actual fingertips, as opposed to having the hands feel like a controller.

[00:11:40] Navyata Bawa: Yeah. I mean, adding hands into your experiences just makes it so natural, because we are all used to using hands to pick things up and touch things. So it makes sense to have that as a natural way of interaction.

[00:11:53] I do have a question for you: since it's so natural for us to just use our hands to interact with things, [00:12:00] how do you make sure that people don't just have unlimited degrees of freedom? Now that you have hands, you can do whatever you want, right? How do you limit that?

[00:12:09] Danny Yaroslavski: Yeah. I think designing for hands really tests

[00:12:16] where you want to draw the line between making something physical and making something virtual, and really it's up to the developers to understand what the main draw of hands is for their title. We sometimes see that developers like to start with a slate of saying, hey, hands are physical in real life, and so we're going to add

[00:12:40] joints to hands right off the bat. But really, when you think about it, when you interact in real life, whether it's picking up little pieces, there's a lot of response from physical objects that needs to happen that helps inform you when you actually [00:13:00] do these things in real life. And that might not map over really nicely when you just apply a physics system to virtual physical objects.

[00:13:08] And so a good thing to really start to think about is: do I even need physics to begin with? Does this kind of interaction work in a pseudo-physical way, or does it work all the way at the other end of the spectrum, where we really do want a type of experience that is very physics-based?

[00:13:31] And so questions like these are things where, really, it's up to you as a developer to get a sense for what the real core value proposition is for your experience, and adjust where you land on that sliding scale from realism to pseudo-realism when designing with hands in mind.

[00:13:57] Navyata Bawa: That makes sense.

[00:13:58] Yeah. But [00:14:00] now that you have hands in your VR experiences, you can have gestures, you can have poses. So how does Interaction SDK help users who want to integrate gestures and poses into their apps? How does it make it easier for them to get started?

[00:14:15] Danny Yaroslavski: Yeah. So on top of all of the interactions that we've already discussed, Interaction SDK natively includes a lot of pose detection and gesture detection tools.

[00:14:25] And what this means is that on top of the hands APIs that you get with the Oculus All-in-One integration, which can give you things like joint rotations and positions, Interaction SDK allows you to add a layer on top of that. And so if you want to detect a certain kind of pose, you can configure a pose recognizer to check for the orientation and finger features of a hand, as well as the orientation and transform of the wrist, and you [00:15:00] can AND a bunch of these features together to get to something like pose detection.

[00:15:03] A really good example of that is to think of something like a thumbs up. If you want to detect a thumbs up in Interaction SDK, you can combine two things: a shape recognizer and a transform recognizer. The shape recognizer will be something you can configure and say, hey, I want the thumb to be extended and I want all my other fingers to be curled.

[00:15:27] And when this is true, that's part of the thumbs up. The other part is to make sure that the hand is actually in the correct orientation; a thumb pointing the wrong way is definitely not a thumbs up. And so we take a transform recognizer on top of that, which is something that will check the orientation of the wrist.

[00:15:45] We can configure it to say, hey, we want the wrist to be facing towards this direction, and anything outside of that we won't count as recognizing that wrist for the thumbs up. [00:16:00] Once we have what is effectively a boolean for the wrist and a boolean for the fingers of your hand, we can AND those two things together and we get pose detection for a hand.
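
Here is a hedged sketch of that idea in plain Unity C#: one boolean for the finger shape, one for the wrist orientation, ANDed into a thumbs-up signal. The types and names below are invented for illustration and are not the Interaction SDK's actual recognizer components, which are configured in the editor rather than hand-coded like this.

```csharp
using UnityEngine;

// Hypothetical sketch of pose detection as described above: a shape check
// (thumb extended, other fingers curled) ANDed with a transform check
// (wrist facing a configured direction) yields a thumbs-up "pose" boolean.
public class ThumbsUpDetectorSketch
{
    // Assumed inputs; in practice these would come from the hand-tracking API.
    public struct HandState
    {
        public bool ThumbExtended;
        public bool IndexCurled, MiddleCurled, RingCurled, PinkyCurled;
        public Vector3 WristUp; // direction the wrist's "up" axis is facing
    }

    private readonly Vector3 requiredDirection;
    private readonly float maxAngleDegrees;

    public ThumbsUpDetectorSketch(Vector3 requiredDirection, float maxAngleDegrees = 30f)
    {
        this.requiredDirection = requiredDirection.normalized;
        this.maxAngleDegrees = maxAngleDegrees;
    }

    // "Shape recognizer": is the hand making a thumbs-up shape at all?
    private bool ShapeMatches(HandState hand) =>
        hand.ThumbExtended && hand.IndexCurled && hand.MiddleCurled &&
        hand.RingCurled && hand.PinkyCurled;

    // "Transform recognizer": is the wrist oriented the way we asked for?
    private bool OrientationMatches(HandState hand) =>
        Vector3.Angle(hand.WristUp, requiredDirection) <= maxAngleDegrees;

    // AND the two booleans together to get the pose signal.
    public bool IsThumbsUp(HandState hand) =>
        ShapeMatches(hand) && OrientationMatches(hand);
}
```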

[00:16:11] Now, on top of pose detection, the next logical step is gesture detection. In much the same way that we have these components that you can configure to detect shape and orientation, you can also have components that detect poses over time. That may be a component that checks for velocity in a certain direction over time, or it may be a component that combines several poses and says, as long as you move from pose A to pose B to pose C over a certain span of time, then signal that that's a gesture.

[00:16:46] So in Interaction SDK, there's a whole set of components that are specifically built for pose and gesture detection, and they're really focused around enabling you to define and configure your own setup to do one-handed [00:17:00] and two-handed poses as well as gestures.
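
And a similar sketch of gesture detection as a sequence of poses over a time window: the gesture fires only if pose A, then B, then C are observed in order before the window runs out. Again, the class below is hypothetical and only illustrates the concept; the SDK's own components handle this declaratively.

```csharp
using UnityEngine;

// Hypothetical sketch of gesture detection as a sequence of poses over time:
// the gesture fires only if each pose in the sequence is seen, in order,
// within a configured time window.
public class PoseSequenceGestureSketch
{
    private readonly string[] sequence;   // e.g. { "PoseA", "PoseB", "PoseC" }
    private readonly float windowSeconds; // whole sequence must complete in this time
    private int nextIndex;
    private float sequenceStartTime;

    public PoseSequenceGestureSketch(string[] sequence, float windowSeconds = 1.5f)
    {
        this.sequence = sequence;
        this.windowSeconds = windowSeconds;
    }

    // Call once per frame with whichever pose (if any) is currently detected.
    // Returns true on the frame the full gesture is recognized.
    public bool Update(string currentPose)
    {
        // Time out and restart if the window has elapsed mid-sequence.
        if (nextIndex > 0 && Time.time - sequenceStartTime > windowSeconds)
            nextIndex = 0;

        if (currentPose == sequence[nextIndex])
        {
            if (nextIndex == 0) sequenceStartTime = Time.time;
            nextIndex++;

            if (nextIndex == sequence.Length)
            {
                nextIndex = 0;
                return true; // full A -> B -> C sequence observed in time
            }
        }
        return false;
    }
}
```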

[00:17:04] Navyata Bawa: That sounds great. I was wondering, with the Interaction SDK Samples app that you mentioned, are people able to try all of these out in that samples app?

[00:17:14] Danny Yaroslavski: Yeah, so in the Interaction SDK Samples app there are lots and lots of different scenes that focus on every type of interaction, as well as pose detection and gestures. Each scene really tries to showcase the variety of ways that you can incorporate these different features into your app, and is also a good starting point.

[00:17:38] So as you run through these samples and you're building out your experience, you can kind of pick and choose what fits most closely to what you're building and use that as a starting point for your own scenes as you develop.

[00:17:52] Navyata Bawa: Well, okay. That's amazing, because when people are getting started with hands, it's so new that it's great to have examples to look back to [00:18:00] and see how it's been implemented and how they can use something similar for their own apps.

[00:18:06] And so for our developers who are just getting started out with hand tracking and interactions in their VR apps, can you recommend some resources or learning material where they can learn more about the samples and Interaction SDK as a whole, and just get started?

[00:18:19] Danny Yaroslavski: Yeah, absolutely. The first and best source that you can really go through right away is the Oculus documentation page, and I highly recommend that.

[00:18:31] From there, there's an overview page about Interaction SDK. There are a few links on how to get started, how to set up your Unity project, how to walk through setting up your scene to begin with, as well as several tutorials on bits and pieces of Interaction SDK, depending on what you want to use.

[00:18:52] There's a wide breadth of capabilities that Interaction SDK enables, and so it's really up to you to see which [00:19:00] things really fit the mold of what you are looking to build out. Some of these tutorials will include things like how to set up direct touch, which we talked about, or how to record custom hand grab poses,

[00:19:13] as well as things like how to set up pose detection, or how to do something like add curved canvases on top of your existing user interfaces and how to make those interact with both rays and poke. So I highly recommend looking at the Oculus documentation pages to start. And then from there, there are several other sources that we can recommend.

[00:19:39] There's a whole section on best practices for hands and controller design, and really just designing spatial environments, in the developer docs as well. Plus, we also have the First Hand sample that we already mentioned, which you can download on GitHub [00:20:00] to see more ways to configure things, as well as Connect talks where we've talked about hands in the past, both in 2021 and 2022, where you can get a high-level sense of getting started with direct touch and the capabilities of Interaction SDK.

[00:20:19] Navyata Bawa: Great. I think these are great resources. You mentioned the First Hand sample GitHub project, so I wanted to ask you: what's the difference between the First Hand showcase, if people try that out on App Lab, and then they want to try out the GitHub project? What are the things they can learn from the GitHub project?

[00:20:36] How is it different from the showcase app?

[00:20:38] Danny Yaroslavski: Yeah. So really, these are just two different sets of samples. In First Hand, the main goal was to be able to see these interactions in a real-world example. And so whether you start with the GitHub samples or the Interaction SDK samples, really [00:21:00] each one will just show different ways of configuring these interactions.

[00:21:03] And so while in First Hand you may have a two-handed interaction for a steering wheel, you may or may not find that in the other samples app. And really, this is why I just encourage trying all these different samples and seeing what most closely resonates with what you're trying to build out. Both, again, are built off of Interaction SDK, and it just happens to be that the First Hand one also takes visual elements from the First Hand experience and is built into that GitHub project.

[00:21:38] Navyata Bawa: Yeah, I mean, it's really great that it's on GitHub so that people can actually just download it, set it up, and try it out locally on their own machines, which is great, because open source is such a great place to learn and grow and create a healthy community around a project. So, cool. Before we end this episode, what are some of the other projects that you or your team are working on?

[00:21:59] Danny Yaroslavski: [00:22:00] Oh, there are a lot of exciting things coming down the pike. There are a few that I'm personally super excited about. One is hands-first locomotion. We've heard from developers who've been building experiences where they want to move around the space. They want to understand what the best way is to adapt locomotion to hands.

[00:22:24] Really, we've been doing a lot of research internally on how to get into a locomotion state and move between teleportation and maybe turning, snap turning or smooth turning. And I'm excited to see the sample go out in the wild to you developers as part of Interaction SDK in a future build,

[00:22:49] one that really shows how you can incorporate locomotion into a hands-first experience. On top of that, there are a few other features as well. We have snap samples coming up [00:23:00] soon, which will show how to take virtual objects and have them snap to your environment, to different surfaces in your environment, as well as samples that really make the most of hand tracking improvements that we've been making at our core tracking layer.

[00:23:16] So with Hands 2.0 and 2.1, we've really been making a lot of improvements on hand-over-hand tracking. Where in the past we may have had losses in tracking on a hand when it's occluded by the other hand, nowadays there are far fewer moments when you completely lose tracking when your hands go over one another, and this enables new types of interfaces, like putting UIs right in the palm of your hand.

[00:23:46] Imagine putting buttons or sliders or things like that there. And so in Interaction SDK, we're also planning on releasing a sample that makes the most of hand-over-hand interactions. There are a lot of really cool [00:24:00] things happening in interactions in general. There are new input capabilities from some of our Pro line of devices as well, and so we are also investing in controller-centric interactions and samples.

[00:24:15] Especially for the Touch Pro controllers, there are now new features that have to do with pressure, and how you might want to remap that kind of pressure value into an interaction is something that we're also going to be shipping and giving developers to try out. For instance, being able to use that pressure sensor to drive a squeezing motion, or to break an object that you're holding, really gives way to

[00:24:42] new experiences that you can provide with controllers, especially when you try to remap those controllers to virtual hands. So I'm super excited about all these kinds of samples we're planning on sharing, and many more beyond that. Yeah, it's [00:25:00] a fun time.
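
As a loose illustration of that pressure-remapping idea, and not an actual Touch Pro API, here is a sketch that maps a normalized pressure reading onto a squeeze of a held object; where the pressure value comes from is left as a placeholder.

```csharp
using UnityEngine;

// Hypothetical sketch: remap a normalized controller pressure value (0..1)
// into a squeeze on a held object by blending its scale toward a "crushed"
// scale. The pressure source here is a placeholder, not a real API call.
public class PressureSqueezeSketch : MonoBehaviour
{
    public Transform heldObject;
    public Vector3 restScale = Vector3.one;
    public Vector3 crushedScale = new Vector3(1.2f, 0.4f, 1.2f);

    [Range(0f, 1f)]
    public float pressure; // fed from the controller's pressure sensor elsewhere

    void Update()
    {
        // An easing curve makes light presses feel gentle and hard presses decisive.
        float squeeze = pressure * pressure;
        heldObject.localScale = Vector3.Lerp(restScale, crushedScale, squeeze);
    }
}
```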

[00:25:01] Navyata Bawa: Yeah, that's so many exciting things.

[00:25:03] I can't wait to try these things out myself. And you also mentioned locomotion. I was just thinking, having hands also makes it more accessible, right? If someone wants to go somewhere, they can point to where they want to go, or you can use gestures for sign language to communicate.

[00:25:21] There are so many things you can do with hands; I mean, there's no limit. And then also controllers, like you mentioned, with the pressure sensor. There are so many fun apps I can imagine you could make with that. Wow. Okay, cool. And so, you mentioned there are these new samples that are going to come out.

[00:25:44] Are they all going to be on GitHub or App Lab? Where can people try these out, and where can they keep up with these upcoming releases? How do they keep up with new versions of the SDKs and new samples?

[00:25:58] Danny Yaroslavski: Yeah, absolutely. [00:26:00] So at Meta we try to ship fast, right? We want to get this kind of stuff over to you as soon as possible.

[00:26:07] And really, if you keep up with the Meta blog, that's probably the best communication channel you can follow to see anything new that's coming your way. Normally we'll have releases fairly frequently throughout the year that introduce new samples. Specifically for Interaction SDK, many of the samples that we have coming down the path are going to be landing right in the All-in-One integration with the Interaction SDK package.

[00:26:37] And so you'll be able to find those samples right there. As well, over time, the App Lab app for the Interaction SDK samples is going to be updated with some of this new functionality. One other exciting thing is First Hand: as we're building out these samples and these capabilities, we're also iterating on the experience there.

[00:26:58] And so be sure to look out for [00:27:00] some more cool things coming your way in the First Hand app as well.

[00:27:04] Navyata Bawa: Wow, I can't wait to try these new features out. Thank you so much for the helpful insights, Danny, and thank you for all the amazing work that you and your team do. And with that, we'd like to conclude this episode of The Diff.

[00:27:17] Thank you, Danny, for providing us with such great insights into Presence Platform, Interaction SDK features, and all the super useful resources to help our developers get started with adding interactions and hand tracking in VR. Presence Platform truly empowers developers to build compelling mixed reality and natural interaction based experiences, and we can't wait to experience all the amazing apps that our developers create.

[00:27:40] Thank you so much for all the amazing work that you and your team do. It was great having you here. That's it for today. Thank you so much, Danny, for joining us, and have a great rest of your day. [00:28:00]