Engineering Emotions and Energy with Justin Wenck, Ph.D.

Will Human Connection Be a Thing of The Past? Apple Vision Pro and Meta Quest Headsets

March 12, 2024 Justin Wenck

Are we standing on the precipice of a revolution in human connection, or teetering on the edge of its demise?

We tackle this question head-on by looking at the Apple Vision Pro and Meta's Quest headsets as our guides. Picture a world where walking down the street or attending a meeting could be fundamentally altered by the technology we wear. This episode is not just about the headsets themselves, but also the philosophies that drive them—the spatial computing of Apple and the virtual realms of Meta—each with profound implications for our interactions, professions, and the very fabric of our relationships.

As our conversation unfolds, we'll contemplate the Star Trek holodeck's science fiction becoming our reality, where sensors and artificial intelligence morph our world in real time. 

Imagine the potential transformation in fields like architecture and medicine, where augmented overlays could provide invaluable insights. Yet, with this power comes the question of intrusion: will we choose to intensify our reality with digital interference, or will we command technology to step back, allowing us to savor genuine human experiences? 

Join us for a thought-provoking session that probes the delicate dance between embracing innovation and maintaining our humanity, all while considering how we might direct this technological symphony to enrich, rather than diminish, the essence of our daily lives.


Watch the full video episode at Justin Wenck, Ph.D. YouTube Channel!

Check out my best-selling book "Engineered to Love: Going Beyond Success to Fulfillment" also available on Audiobook on all streaming platforms! Go to https://www.engineeredtolove.com/ to learn more!

Got a question or comment about the show? E-mail me at podcast@justinwenck.com.

Remember to subscribe so you don't miss the next episode! Connect with me:
JustinWenck.com
Facebook
Instagram
LinkedIn
YouTube

Disclaimer: No copyright infringement intended, music and pics belong to the rightful owners.

=====================================================

Justin Wenck Ph.D.:

Are you ready to live a life with enough time, money, and energy, and have relationships and connections that delight you? Are you ready for the extraordinary life you know you've been missing? If so, then this is the place for you. I'm a bestselling author, coach, consultant, and speaker who's worked in technology for over two decades. I'm a leader at transforming people and organizations from operating in fear, obligation, and guilt to running off joy, ease, and love. It's time for Engineering Emotions and Energy with me, Justin Wenck, PhD.

Justin Wenck Ph.D.:

Today I'm going to be diving into the question: will human connection be a thing of the past? Why would I ask this? I recently had a chance to try out the brand-new Apple Vision Pro headset, and I've also had a chance to try out one of the Meta Quest headsets. It really does make me question where we are heading as humans living together. Do we even need to be together, or can we just get every experience, everything we need, from these devices? To dive into that, I'm going to talk first about the components of the Apple Vision Pro: what is it allowing, and what's going on here to make such an immersive experience? Because that immersion really was my big takeaway. Then I want to compare that with the Meta Quest, not so much on the hardware side, but more on the different philosophies between what Meta is setting out to do and what Apple is setting out to do. Meta is very happy to call theirs a virtual reality or augmented reality headset, but Apple is very much like: no, no, no, this is a spatial computing device. After using it and seeing what's going on, I think that is a very valid distinction, because it really does speak to the different philosophies of these two companies, both in hardware and in the experience. Finally, I'm going to end by talking about the implications of all that.

Justin Wenck Ph.D.:

If I look at what the Apple Vision Pro is now and where I see this going, what does that actually mean for you, me, and everyone else, and how do we actually connect? Do we get to connect, or is it just even worse than our phones? Stay tuned. This is going to be a really great show, a little bit different from other shows I've done, but I think this is going to really play into technology, our lives, and whether we're living fulfilling lives where we feel connected and able to have an impact in the world.

Justin Wenck Ph.D.:

To start out, what exactly is going on in the Apple Vision Pro? One of the big things, when you put this thing on, is the display. There are two of them, one for each eye, and the specs are really good. Basically, it looks like you're looking out on the real world; you almost can't tell that you're seeing the outside world through a display. That's why you may have already seen pictures and videos of people walking around, even driving, with them on: because it is that good and that real. That display is effectively fed by cameras taking pictures of the outside world and showing them to you with low enough latency that you can't tell the difference, which is really, really incredible. There's a lot of processing behind that, so it's got some of the newest chips.

Justin Wenck Ph.D.:

I'm not going to get into that, because the other thing that's pretty incredible is the cameras. At first, when hearing about people walking around with them, I thought: that sounds ridiculous; this is meant to be stationary, or used in a room or something. But then I realized: oh, this thing has cameras on it that can take stereoscopic 3D pictures and spatial video with spatial audio, all that stuff. So there might be a reason for someone to wear this thing: if they're capturing content to then put on YouTube or wherever, for people who have these devices to consume. It would make sense that you'd wear it so you could say, I want to show what it's like to walk around the city, the same way we see people on Instagram and TikTok posting videos of hikes and things like that. Again, there are other dedicated cameras for this, but if you're just getting into this and playing around, it might be your only 3D camera, so you're maybe going to use it.

Justin Wenck Ph.D.:

Anyway, it's got amazing cameras for that, but those aren't the only cameras. There are also two high-resolution cameras, basically one for each eye, since each eye needs to see something; I believe those feed the passthrough view. But then there are also six other world-tracking cameras, because one of the main ways you interface with the Apple Vision Pro is just with your movements. So you see something and you pinch your fingers together, or to zoom, you move your fingers in a certain way. They literally don't have to be right in front of your face. When I was using this thing, I would just sit with my hands resting in my lap, very relaxed, very calm, just looking around, and when I saw something I wanted, I'd tap my two fingers together and boom, it got it. So it's taking in movement all over the place, and I'll have more to say about that when we get to the implications for what this is like in the world.

Justin Wenck Ph.D.:

So then, how is it knowing what you want to select? It has four eye-tracking cameras, so it can see exactly what you are looking at. And the fact that it can see anywhere you're looking is going to have some big implications for understanding human behavior and what people pay attention to. So this thing is taking in a lot of data, both on the human using it and on the external world. Arguably, more data is being consumed by this device than by any consumer device ever shipped at this scale.
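To give a flavor of how gaze-plus-pinch selection can work in principle, here is a toy sketch. This is my own illustration, not Apple's actual algorithm; the target names and direction vectors are made up. The idea is simply that whichever on-screen target lies closest in angle to the gaze ray is the one a pinch would activate:

```python
import math

def angular_distance(a, b):
    """Angle in radians between two 3D direction vectors (assumed ~unit length)."""
    dot = sum(x * y for x, y in zip(a, b))
    return math.acos(max(-1.0, min(1.0, dot)))  # clamp guards float rounding

def select_target(gaze_dir, targets):
    """Return the name of the target whose direction is closest to the gaze ray."""
    return min(targets, key=lambda name: angular_distance(gaze_dir, targets[name]))

# Hypothetical UI targets, given as unit direction vectors from the eye.
targets = {
    "photos": (0.0, 0.0, -1.0),        # straight ahead
    "music": (0.7071, 0.0, -0.7071),   # 45 degrees to the right
}

# A gaze slightly right of center is still closest to "photos",
# so a pinch gesture at this moment would select it.
chosen = select_target((0.1, 0.0, -0.995), targets)
```

A real system layers a lot on top of this (calibration, smoothing of gaze jitter, hit-testing against actual UI geometry), but the nearest-angle idea is the core of it.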

Justin Wenck Ph.D.:

There are also some additional scanners, like a LiDAR scanner. If you're familiar at all with automotive systems that are trying to do self-driving, they often have LiDAR. Have you ever seen one of those Waymo Jaguars going around, with things spinning on top? I believe that's often LiDAR, which sends out little laser pulses to know how far away stuff is; very, very helpful. Teslas don't use those; as far as I know, they just use visual cameras, without a LiDAR sensor.

Justin Wenck Ph.D.:

iPhones actually already have LiDAR sensors, so this isn't totally new, though Apple might be using it in new ways. In the iPhone it's used for depth sensing, to help autofocus really, really quickly. There are also sensors detecting motion, position, and acceleration, using something called inertial measurement units. These are gyroscope-type things, so as your head moves, the device knows how it has moved in space.

Justin Wenck Ph.D.:

So again, there's a lot of spatial data happening, and what's really cool is that all these inputs of data are cross-referenced, so it ends up working very much like a human does. We don't just know where we are by seeing that, oh, I'm walking and things are level. We also have the vestibular system in our inner ear (and I could be totally misremembering my biology from high school): you can close your eyes and still know whether you're in balance or out of balance. That's why, when people play virtual reality games that present them with all this movement while they're not actually moving, they often get sick. So there's apparently been a lot of research and tricks to prevent that.
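As a flavor of how that cross-referencing of sensors can work, here is a purely illustrative sketch; headset makers use far more sophisticated fusion than this. A classic textbook trick is the complementary filter, which blends fast-but-drifting gyroscope integration with noisy-but-drift-free accelerometer tilt:

```python
def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Estimate head tilt (radians) by blending two imperfect sensors:
    - gyro_rate: angular velocity (rad/s), precise short-term but drifts over time
    - accel_angle: tilt inferred from gravity, noisy but stable long-term
    alpha controls how much we trust the gyro over the accelerometer."""
    gyro_estimate = angle_prev + gyro_rate * dt  # integrate the gyro reading
    return alpha * gyro_estimate + (1 - alpha) * accel_angle

# Start with a wrong estimate (1 rad of accumulated drift). With the head
# held still (gyro reads 0, accelerometer says level), the filter gradually
# pulls the estimate back toward the true angle of 0.
angle = 1.0
for _ in range(200):
    angle = complementary_filter(angle, gyro_rate=0.0, accel_angle=0.0, dt=0.005)
```

The same blend-two-imperfect-signals idea generalizes: cameras correct IMU drift, the IMU fills in between camera frames, and so on.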

Justin Wenck Ph.D.:

Then let's get to the audio. It basically has speakers pointed at your ears from the headset, but it's able to direct the audio so that it sounds like it's coming from anywhere. It really is completely immersive, like being in one of the best Dolby or IMAX theaters, but all you have is this headset. It also apparently has six microphones so it can tell exactly where sound is coming from. So again, this thing is taking in massive amounts of information about the environment. And probably one of the biggest things is how you interact with it: eyes, voice, hands. You can apparently also use keyboards, trackpads, things like that.

Justin Wenck Ph.D.:

One of the big downsides for now is, for sure, the weight. This thing is about 600 grams, around 21 ounces, which is well over a pound, and the human head is already somewhere around 10 pounds, I believe. So when you're adding roughly 10 percent more weight just to the front, after 20 minutes I was like, okay, my face hurts from having this heavy thing on. And that doesn't even include the battery. All of this adds up: there's a ton of technology here, and it's $3,500. I've already seen a report that the bill of materials, which is just the components, is around $1,500. So these are not cheap components, and that's just to buy the parts; then you have to assemble and test the thing, and that doesn't even include all the research and development. So $3,500 is a lot of money, yet Apple is very likely still losing money on this at this point.

Justin Wenck Ph.D.:

This is not a consumer product; this is a research project, and if you're buying it, you're likely wanting to research what to do with this thing. But Apple is also researching a lot about how you're using it, how you are as a human, all of these things. It's able to basically detect your entire environment, and then it's also able to put you into a completely different environment.

Justin Wenck Ph.D.:

One of the things it has is one little dial, and I find this very interesting, being into yoga and chakras. It's called the crown controller (Apple's Digital Crown), and basically it lets you dial the pass-through in and out. Pass-through is how much of the outside world you can see, versus how much you're just in your own immersed world where you can't see anything. So I do find it a little interesting that in yogic traditions the crown chakra is at the top of the head: that's the connection to spirit, to what's out there, and here this crown dials your connection to the tech world, the computational world, in and out. It sits between the human environment and the tech environment, and I'll get a little later into what that means for how we live our lives, for society, all of that.

Justin Wenck Ph.D.:

So I thought, okay, I've seen the Apple Vision Pro; let me compare it to the Meta Quest. A friend of mine has the Meta Quest 2, and I'm not even going to go over the specifications, because this is a $500 device from a couple of years ago. What's important to me is what it's like to interact with. The Meta Quest 2 is all about getting you immersed in the environment and completely cut off from the outside world. There was a little bit of pass-through, but I believe it was limited to black and white, and it was pretty grainy.

Justin Wenck Ph.D.:

So again, a newer version might be better; it might be color. But it seemed like the main thing was to get into a game, get into immersive video, and basically get away from anything to do with the real world. It also required the use of handheld controllers, so there wasn't really the gesture stuff. To me, that's just more of what I would expect from Meta, whose main products (we're talking Facebook, Instagram, and WhatsApp) are basically about getting you completely sucked in and putting all of your time and attention into their products, so that your attention can then effectively be sold. On this podcast I've done other shows about how your most valuable thing is your attention. So when I look at the philosophy, it's basically: how can we capture more of the user's attention and have them consume more sensory input, more attention, more time? That seems to be the main goal, the main philosophy, of what the Meta Quest is trying to do.

Justin Wenck Ph.D.:

That said, at some point I may try a newer one, and if I learn things are different, I'll definitely let you know and correct myself; and if somebody else has had a different experience, I'd love to hear it. But compare that to the Apple Vision Pro, which, as I mentioned, has that crown control (not crowd control, crown dial) that lets you choose: do I want to see more of the outside world, or do I want to be more immersed in a virtual environment? Already that's saying: hey, this was important enough to get a button. Because, remember, there's effectively that one button and, I think, a volume rocker as well. Otherwise, for Apple and most device makers the trend is to get rid of as many buttons as possible; buttons are expensive. So if there's a button for it, they must believe it's absolutely essential to the experience they want you to have.

Justin Wenck Ph.D.:

The other thing is the fact that there are so many sensors, so many cameras, so many various ways to input, that to me this is not about having a headset at all. This really is about completely changing how we interact with our compute devices. If we look back to how computers worked long ago (and it actually isn't even all that long ago; we're talking the 1970s and earlier), interacting with a computer was text-based at best. Then came the Mac OS, then Windows, which brought in the graphical user interface, and that's really how we've been using computing ever since. It shifted a little when we started having phones and it became a more personal thing, with this visual way to interface and maybe a little bit of voice. Yet this concept of spatial computing, I think, is really kind of cool, in that it allows technology to be aware of 3D space and allows you to use 3D space to interact with it.

Justin Wenck Ph.D.:

To me, actually, the headset is the least interesting part of the Apple Vision Pro. It's all about getting the algorithms, the operating system, and the applications that can work within this new realm of interaction between humans and compute. The headset just constrains the problem so that it's solvable at this point, because it puts a lot of things into a fixed position while still allowing you to move your hands, your head, and your eyes, which is a ton of stuff happening all at once. The human system is freaking incredible, and for a computing device to pay attention to all of that and extract meaning from it takes a lot of effort. By putting everything into a headset, the device knows where everything is and can make more sense of it. Yet imagine a world where these sensors (the cameras, the eye tracking, the body tracking, the spatial audio, both speakers and microphones) are placed throughout an environment, and then you have compute devices and ways to visualize things: monitors, keyboards, projectors, other screens, other input devices.

Justin Wenck Ph.D.:

All of a sudden, to me, this is effectively making way for the holodeck from Star Trek. I used to watch Star Trek when I was a kid, and eventually started watching the Star Wars stuff too. I still find Star Wars to be kind of okay; actually, it's probably gotten worse over the past couple of years. But Star Trek had some really amazing concepts and ideas from so long ago, and the holodeck was basically this room you would go into. You're on a spaceship in the middle of space, but you could say, hey, I want to experience what it's like to be a detective solving a case at a winery in California, and boom, it's like you're there. The Apple Vision Pro and visionOS, to me, really do enable the possibility of that type of immersive experience.

Justin Wenck Ph.D.:

Yet you would not necessarily need a headset; you would just need to go into a room that can provide this. And I think what's even more cool is, again, that crown aspect, where you can say: hey, I don't want that, I want the whole computing environment to go away, I just want to be in the real world. Now, I think there will be some things where the headset is beneficial, because it might be the way to get truly immersed in what's being visualized. I think of designing buildings, or a surgeon wanting to really visualize how the human body is interconnected; for things like that, having the headset on and creating in that space might be the best way.

Justin Wenck Ph.D.:

Yet for many things, having screens and monitors and whatnot is going to be more than enough for most of us, and we might just want to dial that crown down: I don't want any computing, I don't want any monitors, I don't want any audio or speakers, I don't want to be bothered, I just want to be in the real world. To be clear, there's nothing that prevents the Apple Vision Pro and devices like it from being used exactly the same way as the Meta Quest. In fact, I'm sure whatever's available on the Meta Quest will also eventually be available on any Apple Vision Pro-type headset in the future, just like almost any app you can get on a Google Android phone you can get on the iPhone. So there's always going to be the ability to get sucked into another world that just takes you away from your humanity, using your dopamine hits so that you keep doing the things it wants you to do.

Justin Wenck Ph.D.:

Yet the Apple Vision Pro and visionOS do allow a way for us as humans to choose how we want to be benefited by technology, by saying: yep, I want this; nope, I don't want that; and here's how I want to work with it. I can imagine something very cool: you have your phone, and you walk into a room you've never been in before, with a monitor and a keyboard you've never used. But the room and the devices are aware that you are there, and aware of your phone and that it's yours. You could set the phone down, point at it, point at a monitor, and it would bring up your setup as an interface on that monitor; that's it, and then you can use the keyboard.

Justin Wenck Ph.D.:

If you want to use the keyboard, or any other accessories in the room, you can use those, based on your level of trust, your abilities, things like that. It ends up allowing compute to be more like another person: something that knows who you are and acts based on that. And if you're like, hey, I don't want to use any of this stuff, please go away, then, like a polite person, it could go: all right, see you later. Just enjoy the sunset, or just enjoy being in this room in quiet solitude. I understand you don't want any of these devices on, you don't want to be bothered at all; you just want to sit with the real human who's right next to you and have a conversation, like people have been having for thousands and thousands of years, without the benefit of technology.

Justin Wenck Ph.D.:

But then perhaps there's a moment where it's like, hey, we need the technology. So boom: dial it in, put it up on the projector, and now we can collaborate by moving stuff around, because again it can know who we are and what we're doing. There's a possibility this could be a real enhancer of what it is to be human, of what it is to interact with technology, and have technology actually benefit our lives, instead of how it's really been, to be honest, where it feels like we're more and more slaves to the technology: it is in charge, and we're just trying to figure out how to work with it and get by. I feel like there's maybe a 20 percent chance that the Apple Vision Pro, and the spatial computing philosophy behind it, could actually help bring about a very utopian way of working, living, and connecting with compute technology, artificial intelligence, and all of that. Whereas when I look at the Meta Quest, I feel like that's totally on the dystopian side of things: there is very little chance it ends up being an overall positive for how we live, work, interact, connect, and create. I think it's going to be more of the same; some people will be able to control themselves, but that will be because of their own amazing ability, not by design of the Meta products. The Apple products, meanwhile, are looking toward the very human essence of what matters to us, and maybe toward what the purpose of technology is for humans, beyond just maximizing profits for corporations. So with that, we've gone over what's inside the Apple Vision Pro and what experience it enables.

Justin Wenck Ph.D.:

One other thing: with these experiences being so realistic, someone has to be there with you during the Apple Vision Pro demo to take you through it. There's a part where they say: this upcoming scene, this video, is going to have realistic animals, realistic heights, things like that, and if you're not okay with that, you can skip that part. I understand why, because it feels like you really are high up; it feels like these animals are coming at you.

Justin Wenck Ph.D.:

I think there are definitely some ethical considerations here, because I believe most VR games so far (the most common type being the shoot-'em-up) are still very cartoony. But what if it's not cartoony? What if it's really, really real? There are some things we've got to be careful with: how we use them, who uses them, when they're used. We've already seen how hard it is on the people who have to do content moderation on Twitter and Facebook, who see the worst of what humanity has to offer.

Justin Wenck Ph.D.:

And that's just in a 2D environment. Imagine what that might do when it's ratcheted up into three dimensions, with totally immersive sight and sound. We're already getting to the point where we don't know if a video is real or not, because AI video generation is starting to become as good as a video camera. When you can experience something in a way that makes it seem real, and the lines between human reality and virtual reality start to blur, there are going to be some interesting things to work out. Yet what I do like about visionOS is that it seems to be building toward a future where the headset isn't even needed to create a fantastic experience in which technology can help and benefit us.

Justin Wenck Ph.D.:

So, with that, I'd love to know what you think. You can send me an email at podcast@justinwenck.com, or send me a DM on social media, to let me know: did I miss something? What do you think? Is one of these better than the other? Am I totally out of line in thinking there's hope this could actually be a benefit for humanity? Or am I still not optimistic enough, and these can all be doing great things? I'd love to hear from you. Thanks so much, take care, and good day. Thanks for tuning in to Engineering Emotions and Energy with Justin Wenck, PhD. If today's episode resonated with you, please subscribe and leave a five-star review. Your feedback not only supports the show, but also helps others find us and start their journey of emotional and energetic mastery. You can also help by sharing this podcast with someone you think will love it just as much as you do. Together, we're engineering more amazing lives.
