
Podcast Episode - How VR Works

Listen to the full episode on YouTube

[ Transcript ]


With the rise of immersive VR as a method of communication and entertainment, many have become excited about the prospect of immersing oneself in a virtual world. Imagine putting on a headset and, in a moment, finding oneself transported to a gym, a snowy forest, or even outer space. How is such a phenomenon possible? What are the technological mechanisms behind this experience?


Liza Tsyvinsky:  I'm Liza Tsyvinsky, and today I'm here with the co-founders of Immergo Labs to help tackle this question. Mike, why don't you introduce yourself?


Michael Powell: Yeah. So, I'm Michael Powell. I'm the CEO at Immergo; my background is in biomechanics. I finished my PhD at UCSC, where I met Aviv and Ash. And, you know, we'll dive into that a little bit more later.


Aviv Elor: Hi, my name is Aviv Elor. I'm another co-founder at Immergo; I'm the Chief Research Officer. I focus on our VR platform development, our user research, and our innovation strategy. My background is in Robotics Engineering at UC Santa Cruz – it's where I met Mike and Ash. Along the way, I went from some wacky internships to full-time roles with Warner Bros, Walt Disney Imagineering, Meta, Google, and the National Institutes of Health. Immergo and VR are both very near and dear to me, because I got into this after rupturing my own triceps tendon in a national judo match and using VR to help recover. So, I'm excited to chat about VR!


Ash Robbins: Yeah, I'm Ash Robbins, I'm the CTO. And, I guess to keep it kind of short, I work a lot with our biomechanics, our machine learning, and the AI that helps us feel embodied in our avatars.


Liza Tsyvinsky: Thank you so much for introducing yourselves. It's great meeting all of you today. I would like to start off with our first question. So, VR is popular for its ability to simulate interacting with a 3D environment. I'd like to know if you have any insights on how that works on the technical level.


Aviv Elor: VR, or virtual reality, is the concept that you are able to enter this fully virtual world. You can look around and see these different 3D objects. You can act upon the world, you can touch and influence objects, you can hear different things. And I've got to say, the history of VR is actually quite interesting. If you look at science fiction, it goes way back. And one of the first devices – double-check me on this – was, I think back in the 1960s, Morton Heilig's Sensorama, this giant machine that you would put your head into and be able to see 3D images and even smell different things. And it's had quite the evolution. If you look back around the 2000s, virtual reality was often defined as, you know, 3D video games – being able to look into a virtual world on a 2D screen. With today's technologies, it's become a whole lot more immersive, right? And a lot of this comes back to that ability to bring in a 3D virtual world that you can interact with and potentially move around in. It's able to understand and track your position – both where you're at, but also your rotation. And today there are tons of these goggles, or glasses, that you can put on. These devices have cameras on them, and they're able to figure out your position and rotation. And then from that, through each of the lenses on these goggles, they're simulating all of these 3D virtual objects for you to walk around and influence.
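To make one small piece of this concrete: once the headset knows the head's position and rotation, it renders the scene once per eye, with each viewpoint offset by half the interpupillary distance. The sketch below is a toy illustration – the coordinate conventions and the `eye_positions` helper are assumptions for this example, not how any particular headset runtime actually works:

```python
import math

def eye_positions(head_pos, yaw_radians, ipd=0.063):
    """Offset each eye half the IPD along the head's local right axis.

    head_pos: (x, y, z) head position in meters.
    yaw_radians: head rotation about the vertical (y) axis.
    ipd: interpupillary distance in meters (~63 mm is a common average).
    """
    # Local "right" direction of a head yawed about +y (right-handed, y-up).
    right = (math.cos(yaw_radians), 0.0, -math.sin(yaw_radians))
    half = ipd / 2.0
    left_eye = tuple(p - half * r for p, r in zip(head_pos, right))
    right_eye = tuple(p + half * r for p, r in zip(head_pos, right))
    return left_eye, right_eye

# A head standing 1.6 m tall, facing straight ahead: the two eye
# viewpoints land 63 mm apart, symmetric about the head position.
left, right = eye_positions((0.0, 1.6, 0.0), yaw_radians=0.0)
```

Rendering the scene from these two slightly different viewpoints is what produces the stereo depth cue when each image is shown through its own lens.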


Aviv Elor: And this could be a fully virtual world with virtual reality. This could be some mixed reality overlaying the physical world and digital elements onto it. Or, it could be completely augmented reality and just overlaying elements onto the world without necessarily reacting to the surfaces that are there.


Liza Tsyvinsky: Thank you so much for that. On the topic of simulation, a lot of people have been referring to this term called embodiment in VR, which is the feeling of controlling your in-game avatar as though it is your own body. How is embodiment achieved in virtual reality?


Ash Robbins: Yeah, I guess I'll answer this one. Embodiment is this really cool thing where, as Aviv was saying with VR, we can track things and put you in a virtual world. Now, generally what happens is you have the controllers and the headset, so we know we can put an avatar's head and controllers in those areas.


Ash Robbins: So, we can kind of draw hands where the hands are and do that. And this thing happens where, when your hands and arms of the virtual avatar you have align and you look down at them and it feels like your body, you get this feeling of embodiment. And, it only really happens when you're actually aligned pretty well or you identify the avatar. Different people can feel embodied in different ways. And one of the most interesting things is, like, as I've done a lot of testing with our biomechanics and changing things, there are times where you'll look down, you'll see this virtual arm. It'll feel and look exactly where your real arm is and it's just like, wow, this feels and looks like me – now you can kind of tweak things and move around in it and it really helps you feel a lot more immersed in this whole world. So, it's a really cool feeling that just really changes and makes it feel like the world you're in is the virtual world; it's the world you're actually in now. Yeah.  


Liza Tsyvinsky: To follow up on that, what have been the best ways for you to improve the sense of immersion that a user experiences in VR?


Michael Powell: So, kind of going off what Ash just described, we've been working hard on the biomechanics side of this to be able to get full body tracking. There are different groups achieving this in different ways. Our goal is to really get this in a very natural, easy-to-use sort of method – you know, just using your web camera or your phone to provide input, and then get that full body tracking as close to one-to-one movement as possible. And this is what really enables the telehealth side of things: if you're getting accurate body capture, you can provide those sorts of metrics and quantify them for a provider, so they can understand how you're moving and how best to help and support you.
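As an example of the kind of metric accurate body capture enables, a joint angle – say, knee flexion – can be computed from three tracked keypoints. This is a generic sketch of the geometry, not Immergo's pipeline, and the keypoint values are made up:

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by points a-b-c,
    e.g. hip-knee-ankle for knee flexion."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

# Hypothetical hip, knee, ankle keypoints from a pose estimator (meters).
hip, knee, ankle = (0.0, 1.0, 0.0), (0.0, 0.5, 0.0), (0.4, 0.5, 0.0)
angle = joint_angle(hip, knee, ankle)  # 90.0 for this right-angle pose
```

Tracking a number like this across a session is the sort of quantified signal a provider could use to follow a patient's range of motion over time.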


Liza Tsyvinsky: Thank you, all of you. Now that we've got this intro to what makes VR so special, I'd like to go into how you create VR apps. So, how does one go about creating an app in VR?


Aviv Elor: It's been interesting, again, just looking at the history of immersive virtual reality in the application space. It used to be quite a hodgepodge of custom systems – people creating a variety of headsets with a bunch of onboard sensors. These sensors are trying to figure out your position and your rotation. They're taking a variety of your inputs, whether that's, you know, you moving your hand, clicking a button, or even tracking your hands. And so, you need to work with some sort of hardware platform. These days, that could be an off-the-shelf virtual reality headset: Meta Quest, Pico, HTC Vive; there are a lot of commercially available headsets.


Aviv Elor: And as the VR industry has matured, a lot of academics and industry groups have gotten together and started creating standards. One of these standards that we use with our app is OpenXR – which I believe stands for Open Extended Reality. It really defines: okay, we have all these different headsets out there, all these different hardware devices that present immersive virtual worlds, whether it's VR headsets or augmented reality glasses, right? How do we take this data and format it in a way that can feed into an application? And on the application side, we use the Unity game engine. There are a lot of game engines out there, and these engines simulate physics. So, as you walk around the world, as you touch objects, how do you know that you're colliding with or acting upon an object? How do you manage all of the different objects or virtual elements in the world? And then how do you communicate with other subsystems and manage those?
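To make the physics-and-collision idea concrete, here is a toy Python sketch – not Unity code, and far simpler than how a real engine is structured – of a fixed-timestep update with a sphere overlap test, which is conceptually what fires when your virtual hand touches an object:

```python
def spheres_collide(p1, r1, p2, r2):
    """Two spheres overlap when the distance between their centers is
    less than the sum of their radii (compared squared, to avoid a sqrt)."""
    d2 = sum((a - b) ** 2 for a, b in zip(p1, p2))
    return d2 < (r1 + r2) ** 2

def step_world(hand_pos, hand_vel, dt):
    """Advance the hand by one fixed physics step."""
    return tuple(p + v * dt for p, v in zip(hand_pos, hand_vel))

# A hand moving toward a virtual ball at 1 m/s, stepped at 90 Hz.
hand, ball = (0.0, 1.0, 0.5), (0.2, 1.0, 0.5)
vel, dt = (1.0, 0.0, 0.0), 1.0 / 90.0
for _ in range(90):  # simulate one second
    hand = step_world(hand, vel, dt)
    if spheres_collide(hand, 0.03, ball, 0.05):
        break  # a real engine would now dispatch a collision event
```

A real engine does this over thousands of objects with spatial partitioning, richer collider shapes, and a solver for the resulting contacts – but the per-frame loop of "step, then test for overlaps" is the same shape.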


Aviv Elor: And so, Unity is a common game engine. Unreal is another big one. Some companies even create their own custom game engines, which is a very big effort. And so the way that we've worked, at least with our VR app, is, you know, we use gold standard game engines to create our software, to use our machine learning models, and then we use OpenXR to put that into any off-the-shelf virtual reality headset or augmented reality glasses that are compatible with OpenXR.


Liza Tsyvinsky: Yeah. So, Aviv, you highlighted machine learning – and from there, I was wondering how you utilize AI and machine learning in your product.


Aviv Elor: Ash, do you wanna tackle this one?


Ash Robbins: Yeah, I'll jump in here. So, we have all these models that we've been working on – some of which were highly inspired by some of our work way earlier, during our PhDs – but advanced, using some of those ideas and creating all these new ways of learning and understanding how to build models with machine learning, in order to get the best understanding of the human body that we can.


Ash Robbins: And so, we have a lot of partners and we actually work with biomechanics labs right now to specifically gather a lot of data and improve our biomechanical models so we can make sure they're really, kind of like, valid and they're moving properly – and we get, just, this really good and tuned understanding of the human body. We also take data on people of all sorts of body shapes and sizes to make sure that our models can, kind of, adapt to any types of body shapes. We have all sorts of methods there that learn off of this data and just really align this biomechanical data. And then we use all these machine learning things, and we have really optimized algorithms because they have to run really fast. So, they're running in real time right now. So, they can actually do all the work to get the best understanding of the body as possible and use that to kind of wrap back into the embodiment we were talking about earlier. So the whole point is, like, how do you go from these few little data points we're getting from the VR headset and these other ways, and fusing that into this really nice and accurate and precise biomechanical model?  


Ash Robbins: And so that's what we use a lot of machine learning and AI methods for to kind of align all those together.


Liza Tsyvinsky: Thank you for your detailed answer. So with that, what are some common challenges that you run into when developing in VR?


Michael Powell: I want to leave this one to Aviv and Ash because y'all are – y'all are tackling those problems daily.


Aviv Elor: So, some common challenges: like Ash mentioned, we have a lot of data. We have a lot of subsystems going on. Being able to, you know, just take your head and your hands to figure out where the rest of your upper body is. Or, you know, fuse our companion app with your smartphone camera to figure out your whole body, right? There are a lot of subsystems going on, and you need to manage all of these while handling the way that a person's avatar is visualized – and people are really diverse, right?
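The "head and hands to the rest of the body" problem has a classical geometric core, even though Immergo's approach is learned. As an illustration only – using shoulder and wrist rather than head and hand, and assuming known arm segment lengths – the elbow bend follows directly from the law of cosines once you know how far the wrist is from the shoulder:

```python
import math

def elbow_angle(upper_len, forearm_len, shoulder_to_wrist):
    """Interior elbow angle in degrees, via the law of cosines.

    Given only the shoulder and wrist positions, the bend of the elbow
    is determined by the segment lengths and the shoulder-wrist distance;
    where the elbow points around that axis is a separate choice.
    """
    d = min(shoulder_to_wrist, upper_len + forearm_len)  # clamp: arm can't overreach
    cos_e = (upper_len**2 + forearm_len**2 - d**2) / (2 * upper_len * forearm_len)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_e))))

# 30 cm upper arm and forearm, wrist 42.4 cm from the shoulder:
# the elbow must be bent to roughly a right angle.
angle = elbow_angle(0.30, 0.30, 0.424)
```

Classical inverse kinematics like this gives a plausible pose; the appeal of learned models is filling in the parts geometry alone leaves ambiguous, such as which way the elbow actually points.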


Aviv Elor: People come in all shapes and sizes. So, you need to be able to take in all this data to calibrate it around their context. You need to manage it while you're visualizing all these different objects. And ultimately, these headsets, take a MetaQuest for example, these are basically Android phones, right, so we don't have a lot of computer to work with, so we're managing these really complex systems and we are optimizing, rigorously optimizing, to make sure that it can run fast. Because if it doesn't run fast, if it's not updating quick enough to what you're used to in real life, it can make you feel cyber sick, right? You can wear the headset and if it's lagging, if it's not moving with you, that can lead to nausea. So, we have all these complex subsystems. We need to make sure that they're running efficiently, that they're working as expected, that they're accurate, and we need to do it in a way that, is comfortable for users to use.  


Aviv Elor: So, it's a fun challenge. It's a lot of engineering optimization, and we partner directly with physical therapists and patients through our panel, and our therapists that are using it in the field. We get weekly feedback, we have bi-weekly play tests. It's really important that we're engaging directly with our users to stress test these systems to make sure that one, they're usable, they're comfortable, people can feel confident in using them, and that our users feel included in the product that we're building.


Ash Robbins: Yeah, and I'd like to add on a little bit more here as well. Aviv tackled a lot of the big challenges and how we work through them. But on the actual building side of VR, you can imagine it's like, okay, I'm coding something. I'm going to make a little change to see if that makes our biomechanics feel a little bit better. But what that actually looks like is, you run the change, and it takes a little while to build.


Ash Robbins: You put on a headset. You wait for 20, 30 seconds. You go move your body in the exact right way or try out this whole complicated thing. It doesn't work. Then you do that, make a little another tweak, you do that again and again and again. And it's like, okay, each time, you're just changing a little thing, which can be a minute, or two minutes, of the whole process. So, how to build a lot of tools to make that easier and a lot of complicated little things that just allow the development process to be easier, but then also, those kind of help as well for like, the users, or turn into tools that we want to show people. So, if we're trying to debug and see skeletons and see how it's moving when you're moving around in certain directions, we can actually show that to the users and that's actually a really helpful useful thing to be able to, like, look at your skeleton from a third person perspective or something along those lines.  


Ash Robbins: So the development process is a little bit different than a lot of other ways you just develop because you can't just test out these biomechanics as well with just simulators. You actually have to go try it out. And for the VR, you, for a lot of cases, have to test out the whole thing. Like, if you're seeing how interesting the design is for doing movements that you're trying to be guided towards, it's like you can't really just run through a thing on a screen with that. You actually have to go and experience that. So the development has this, kind of, whole layer where you really have to get into it. Yeah.


Michael Powell: I'll just add on one more bit to this, because I think even if you zoom out from just the VR, they're also designing and building a lot to work with, you know, a web app to be able to display and get this data. And, you know, we're working with patient data and capturing the full body, so we have to be HIPAA compliant. So there are all these other pieces that they're designing around too that, you know, involve the VR, but they're also making sure it fits into what our users need.


Michael Powell: So I think there's, you know, a lot of complications that go on. You know, if they were just making a game, things might be a little simpler, but with all these other components that they're connecting, including, you know, computer vision piece to get the full body tracking, I think it adds a lot more complexity for them and what they're designing.


Liza Tsyvinsky: That is all for our questions. Thank you so much for your time. Would any of you have any final thoughts that you'd like to share with our listeners?


Michael Powell: I would say, you know, the biggest thing right now is we have opened up our wait list and we are actively bringing on alpha users to help us design, test and iterate. So, you know, we are looking for PTs or physical therapists who are interested in, you know, the next kind of evolution in telehealth and would like to try it out with us. We're looking for patients who like using technology or are interested in trying – maybe you already have a VR headset and want to hop in and test it out.  


Michael Powell: We'd love to chat with you. Feel free to sign for our wait list, and we'd love to connect. We're still in the early stages and we'd really like to get that feedback and help us design it right to what people need. Aviv, Ash, anything you want to add?


Aviv Elor: I honestly think you hit everything. Yeah. Come try out Immergo! Help break things, help exercise with us; we're excited to work with you.

