The Third Angle

Actronika: Touching the metaverse

PTC Season 1 Episode 5

“The biggest organ in the human body is skin. Not addressing it is totally impossible in any type of metaverse, or parallel world, that you would like to emulate.”

It’s not enough that we can see and hear the virtual world; to become truly immersed, we need to feel it, to interact with all our senses. And that’s exactly what Actronika is doing with its haptic vest Skinetic. This incredible bit of kit recreates the sense of touch on your skin, using advanced technology to make your experience ultra-realistic.

In this episode we meet Actronika’s CEO Gilles Meyer and CTO Rafal Pijewski. Learn about the three parts of touch that are required to create these high-definition sensations. Hear as our producer is plunged into a virtual world, bombarded with the sensations of wind, hail and fireballs. And find out what the future holds for the technology and the industries it’s set to transform.

Also hear from Jon Hirschtick, who heads up PTC’s Onshape division. He explains the role Onshape is playing in bringing the vision of Skinetic to life.

Find out more about Actronika here, and Skinetic here.

Find out more about Onshape here.

Your host is Paul Haimes from industrial software company PTC.

Episodes are released bi-weekly. Follow us on LinkedIn and Twitter for updates.

This is an 18Sixty production for PTC. Executive producer is Jacqui Cook. Sound design and editing by Ollie Guillou. Location recording by Rebecca Rosman. Music by Rowan Bishop.

Welcome to Third Angle. Today, we find out how to feel in a virtual world. 

I’m your host, Paul Haimes from industrial software company PTC. In this podcast we share the moments where digital transforms physical, and meet the brilliant minds behind some of the most innovative products around the world – each powered by PTC technology.

From meetings on Zoom to the promise of the metaverse, the time we spend in virtual worlds is only set to increase. But to feel truly immersed in that world, we need more than sight and sound: we need to feel. Actronika is a haptics company, creating technology that recreates the sense of touch on the skin so we can literally feel what’s happening on our TV and on our computer screens. Today, we’re learning about a vest they’ve developed that’s set to change the virtual reality experience forever. It’s called Skinetic. And our producer Rebecca Rosman visited Actronika’s HQ in Paris to learn more and test it out. She met up with the company’s CEO Gilles Meyer and CTO Rafal Pijewski. Here’s Gilles.

The biggest organ in the human body is skin. You have seven different receptors in your skin, and, depending on the person, it weighs about seven kilos. So essentially, it’s huge; it’s bigger than your liver. And it’s always on. Not addressing it is totally impossible in any type of metaverse or parallel world that you would like to emulate. So this is what we’re trying to do. The torso is essentially the largest part of your body, so this is where you have the most skin. So this is a must-have for any immersive experience.

Haptics is all related to the sense of touch. If you think about the sense of touch, you have to think about the pressure applied to your skin, then the temperature. And then there’s a third part, which we consider the most important one: the vibrotactile. Whenever you touch an object, you can tell if it is warm or cold, and you can tell if it’s hard or soft, but the fine quality of an object is complex; it’s actually impossible to perceive unless you start sliding your fingers over the surface. That’s when your fingerprints create small vibrations that give you information about the quality of the surface, and you will immediately be able to tell if it’s wood, metal, plastic, or something else. And that’s what we try to emulate with vibration, because we think this part is one of the most important ones.
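To make the sliding idea concrete, here is a back-of-the-envelope sketch from us, not Actronika’s model: the dominant vibration frequency a sliding fingertip feels is roughly the sliding speed divided by the spacing of the surface features, which is why texture only becomes perceptible once the finger moves.

```python
# A minimal sketch (illustrative, not Actronika's model): the vibration
# frequency produced when a fingertip slides over a textured surface.
# At rest the spatial pattern produces no temporal signal, which is why
# fine texture is imperceptible without sliding.

def sliding_vibration_hz(speed_m_s: float, texture_period_m: float) -> float:
    """Dominant vibration frequency: sliding speed divided by the
    ridge-to-ridge spacing of the surface texture."""
    return speed_m_s / texture_period_m

# Sliding at 10 cm/s over ridges spaced 0.5 mm apart yields ~200 Hz,
# well inside the skin's vibrotactile sensitivity range.
print(sliding_vibration_hz(0.10, 0.0005))  # 200.0
```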

In the next 10 years, most interfaces are going to be enriched by some kind of touch feeling, and then smell, taste, and so on. You’re going to be totally immersed. This is essentially the direction of history. Things are moving faster, and we’re now able to give a sense of touch that is pretty close to reality. Until now, we had a problem because computation speeds were not optimal. Touch is actually the fastest sense in your body, so you need response times under 10 milliseconds. Now we are able to do that, so we can incorporate the sense of touch into interfaces.

This is what we call a technological demonstration. Basically, you are in VR and some avatars circulate around you, shooting at you and throwing fireballs at you, so that you can get some different sensations and feel how they differ. That is the biggest selling point of Skinetic: you can actually tell the difference between gunshots, fireballs and lasers, which is not necessarily the case with what is on the market right now.

It’s also important to know that it doesn’t hurt. This is how we imagine a fireball would feel, but eventually a game developer can think of something else. We also strongly believe that eventually there will be people specialising in this: just as there are audio designers, there will be haptic designers working alongside them. They will design the haptic experience. It will become a part of the design process.

In the real world, everything is synchronised. Whenever you grab an object, you touch the object; you most probably feel, see and hear it at the same time. If you want to provide that immersive experience in the virtual world, you have to be able to do the same, because your brain has been trained its whole life outside of VR. If you expect the brain to be tricked into thinking this is a new reality, you need to respect the same laws, and the law here is the synchronisation of visuals, sound and haptics.
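Here is a minimal sketch of that law in code, with hypothetical function names rather than Skinetic’s actual SDK: all three modalities are dispatched together for each event, and the haptic path is checked against the roughly 10-millisecond budget Gilles mentioned earlier.

```python
# A minimal sketch (hypothetical API, not Skinetic's SDK) of keeping
# haptics in lockstep with sound and visuals, within the ~10 ms
# response budget mentioned earlier in the episode.

import time

HAPTIC_BUDGET_S = 0.010  # haptics must land within ~10 ms of the event

def dispatch_event(play_sound, show_visual, fire_haptic):
    """Trigger all three modalities together and flag a late haptic."""
    start = time.monotonic()
    play_sound()
    show_visual()
    fire_haptic()
    latency_ms = (time.monotonic() - start) * 1000
    if latency_ms > HAPTIC_BUDGET_S * 1000:
        # Too late to feel simultaneous; surface it for tuning.
        print(f"haptic dispatch missed budget: {latency_ms:.1f} ms")

# Example with stand-in callbacks:
dispatch_event(lambda: None, lambda: None, lambda: None)
```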

Regarding the library of effects, we design them in two ways. One is designing audio signals that sit within the haptic range, which is very low-frequency content. If you think about audio, the main difference between regular audio and haptics is that the waveforms are the same, but the frequency content sits at the lower end. The other way is synthesisers. We create synthesisers that are meant to produce very specific effects, say, fire. We are focusing on environmental interactions, so this is not something you touch; it’s something that is touching you. If you think about rain or wind, we have created synthesisers with all the parameters you can think of. For rain, that’s the raindrop size, the intensity, the kind of rain (light rain, heavy rain, that sort of thing), and we give you access to these parameters so you can control them independently. That’s what creates the haptic vibration. And if you think about rain, it will drop on your shoulders when you stand straight, but when you bend, it will drop on your back. This is the spatialisation we provide, which gives the experience its extra immersive part, as this is exactly what would happen in the real world.
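To illustrate the synthesiser idea, here is a small sketch in the spirit of what Rafal describes. The parameter names and actuator-zone routing are our illustration, not Actronika’s actual tooling: each raindrop becomes a short decaying low-frequency burst, intensity sets the drop rate, drop size sets amplitude and pitch, and posture decides which part of the vest receives it.

```python
# A minimal sketch of a parametric "rain" haptic synthesiser
# (illustrative parameters, not Actronika's tooling).

import math
import random

SAMPLE_RATE = 1000  # Hz; haptic content lives at low frequencies

def raindrop_burst(drop_size: float, duration_s: float = 0.05) -> list[float]:
    """One drop: a decaying sine. Bigger drops hit harder and lower."""
    freq = 80.0 / (0.5 + drop_size)  # bigger drop -> lower frequency
    n = int(duration_s * SAMPLE_RATE)
    return [drop_size * math.exp(-8 * t / n)
            * math.sin(2 * math.pi * freq * t / SAMPLE_RATE)
            for t in range(n)]

def rain(drops_per_s: float, drop_size: float, duration_s: float,
         leaning_forward: bool) -> dict[str, list[list[float]]]:
    """Scatter drops over time; route them to shoulder actuators when
    upright, back actuators when bent forward (the spatialisation
    described above)."""
    zone = "back" if leaning_forward else "shoulders"
    n_drops = int(drops_per_s * duration_s)
    bursts = [raindrop_burst(drop_size * random.uniform(0.7, 1.3))
              for _ in range(n_drops)]
    return {zone: bursts}

# Light rain on the shoulders of an upright wearer:
light_rain = rain(drops_per_s=5, drop_size=0.3, duration_s=2.0,
                  leaning_forward=False)
```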

The most obvious place haptics has been used so far is the mobile industry. Everyone has a phone with haptics in it, so everyone already has a device that is haptically enabled; it’s just that the quality of haptics it provides is not up to the task. The understanding of haptics was different from what it is now, and the market will take time to catch up with high-definition haptics, as we call it. On our end, as a company we’ve been working a lot with automotive, and this is a market that we feel is pursuing a good approach. We also work with people from the entertainment market. Haptics can bring a lot here too, because today going to the cinema is no longer the most obvious entertainment. There’s Netflix and all these platforms that want you to stay home and watch movies on your TV, so cinemas have to reinvent the whole experience to be something more than just turning on a TV and watching a movie.

For the cinema-like experience, you don’t need a headset, but you will be able to experience an enhanced form of movie-watching. The haptic content has been designed in terms of both the vibration and the spatialisation. And we provide software that helps designers add a haptic component in the most seamless and efficient way. There are two concepts you can exploit as a designer: a third-person view, meaning that you are a spectator, or being a character.

Overusing haptics is also not the best idea, so we try to use haptics in an intelligent way. We have haptic silences, where there are no haptics because there’s no good reason to hapticise anything. If you got a constant vibration through the whole movie, it would be too much. So our approach is to design haptics specifically for the experience, not just to filter the audio; that gives you freedom. It is a decision to hapticise something, not just a consequence of the audio being the way it is.
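A tiny sketch of that editorial approach (our illustration, not Actronika’s software): haptic cues are attached only to the events a designer chose to hapticise, so everything else is deliberate haptic silence rather than the constant rumble you would get by filtering the whole audio track.

```python
# A minimal sketch (illustrative) of event-driven hapticisation with
# deliberate silences, as opposed to low-pass-filtering all the audio.

HAPTIC_CUES = {
    "explosion": "big_impact",
    "heartbeat": "soft_pulse",
    # Dialogue, music, ambience: deliberately absent -> haptic silence.
}

def haptic_track(timeline: list[tuple[float, str]]) -> list[tuple[float, str]]:
    """Keep only the events a designer chose to hapticise."""
    return [(t, HAPTIC_CUES[name]) for t, name in timeline if name in HAPTIC_CUES]

scene = [(1.2, "dialogue"), (3.5, "explosion"), (4.0, "music"), (7.8, "heartbeat")]
print(haptic_track(scene))  # [(3.5, 'big_impact'), (7.8, 'soft_pulse')]
```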

To design Skinetic and bring the vision to life, Actronika has been using Onshape from PTC, our cloud-based computer-aided design and product data management platform. Actronika needs communication and collaboration with its external experts and suppliers to be seamless. Here’s Jon Hirschtick.

Onshape helps Actronika collaborate with suppliers and experts, their extended team, in a really unique way through several key aspects of Onshape, maybe three aspects, I’d say: access, sharing the data, and then management of change. First, access. Everyone, anywhere on Earth, can immediately access the whole system. Onshape is the only system like this in the world: it runs in a web browser. The first thing about collaborating with someone within a system is that they have to be able to access it, and with Onshape everyone gets direct access. Point two: collaboration in real time. There’s no copying of files; we all share the same master data. Models, drawings, assemblies, parts, all of that, anyone on the team can see, and they see changes happen instantly, no matter where they are on Earth. Number three: change management. PDM has been a really disliked system for many years. With a new generation of PDM, you can manage changes, new versions, revisions and so forth, without any copying and without any locking. Those three things make Onshape perfect for this kind of extended group of suppliers, experts and so forth.

Before using Onshape, Actronika’s teams faced many problems related to data management, such as unknowingly working on an outdated version of a file. So how has Onshape fixed this?

Onshape has fixed the PDM problem in a big way. We have a new generation of PDM with Onshape. Yes, it does familiar things like versioning and release management; that’s really good. But we do it with none of the old problems and a whole lot more power than you had with old PDM. The old problems meaning you can’t trace activities, you can’t trace individual edits; you just have a copy of a file. Well, what happened? I don’t know, maybe there are a few words of comment. With Onshape, you can trace every operation performed. Versioning and release management, we have that, and in some ways it’s more powerful, because you can make custom workflows very easily. You can go back and version a state from the past; you usually can’t do that in old PDM, where if a state wasn’t saved as a file at the time, you’ve lost it. So you’ve got all the things you need in PDM, much more power, and none of the hassles of the old systems. And I think part of the reason Actronika has been able to save what they report as 15 to 20% of their time is our new-generation PDM built into Onshape. I’m personally really excited about what Actronika is doing, because I’m a big fan of augmented reality, AR. I think Actronika’s products are going to be great as part of augmented reality. I’m going to be excited to see people using Actronika in augmented reality applications. I can’t wait to try that myself.
