Scratchwerk ^EDU

Werk Week News Update - Smart Health, Smart Meetings, Smart Glasses

Scratchwerk Tech

Tech giants are pushing boundaries with AI in healthcare, meeting participation, and augmented reality wearables. These advancements signal a fundamental shift in how we interact with technology in our daily lives.

• Apple developing Project Mulberry, an AI health companion aimed at replicating primary care physician functions through iPhones and Apple Watches
• Otter.ai has launched a voice assistant that actively participates in meetings by speaking and responding to questions in real-time
• Meta revealed details about upcoming $1,000 smart glasses featuring voice control, real-time translation, and AR overlays

Please follow and like us on Spotify, Apple Podcasts, or wherever you listen to your podcasts. Until next time, keep up the Scratchwerk, keep building.

Speaker 1:

Welcome back, architects. Today is April the 2nd and this is your Werk Week News Update. Today we're going to discuss Apple's push into AI-powered healthcare, Otter.ai's new voice-enabled meeting assistant, which is interesting, and Meta's upgraded smart glasses. So let's get started. Up first, Apple has announced that it is working on Project Mulberry, an AI assistant, or kind of a health companion, for users. This project is a somewhat secretive health initiative aimed at personalizing healthcare through your iPhones and your Apple Watches. The project reportedly seeks to replicate some of the basic functions of a primary care physician, such as monitoring symptoms, offering advice, tracking long-term health patterns, and so on and so forth. And Apple is planning to use LLMs, we say this all the time, large language models, which is a type of AI model. Apple is planning to run these LLMs on their devices to ensure privacy and speed, and to offer this kind of health service in a similar way to Siri. You know, if you say "Hey Siri" and think of all the things that it can do and tell you, it's looking to have the same type of functionality, basically a private healthcare physician at your fingertips. And this really aligns with Apple's broader health ambitions over the last several years. They've gotten into EKG readings, fall detection, sleep tracking, and so on. I mean, they have really jumped into the health space beyond just your activity rings and your steps and your heart rate, all those types of things. So this is really going to position them as a major player in the digital healthcare ecosystem.
So if they're successful, and I believe that they will be, this will disrupt the $5 trillion healthcare industry that exists today by empowering users, in my opinion, to detect and act on medical issues a lot earlier than they currently do. So next up, Otter.ai has launched a voice assistant that talks back. We mentioned this a while back on a previous episode, and it has finally been released. So now Otter.ai is not just transcribing meetings, it is actively participating in them: speaking up, summarizing, answering questions in real time.

Speaker 1:

I felt the need to bring this particular topic up again because I had a chance to try it out. It is about as freaky as you would think it would be. You can sit in a meeting and say, "Hey Otter, what have we been talking about? Tell me a little bit more about my company. Give me some ideas based on everything that you're hearing," and so on, and it will literally spit it back to you, no different than Siri or Alexa or anything like that. I can see how it can be helpful, but this definitely takes us another step in a direction we talked about even a year ago. We will, I believe, get to a point where the actual human, the person that's supposed to be attending the meetings, might not necessarily be attending them. I am going to send my AI bot to the meeting with all of the functionality, all of the memory, all of the data that you would need to answer questions right there on the spot. So this is a major evolution, in my mind, from the passive note-taking tools we've had in meetings. This has really transformed into a much more active AI collaborator in the hybrid and remote work environment that we live in. And when AI can jump into meetings in real time, provide insights, take notes, create calendar invites, do all the different things in response to what is being said in a meeting, that gets us very, very close to not needing a human assistant to do all those things while we are in the meeting. So Otter's push really reflects a broader trend toward intelligent agents that perform the tasks we used to need a human for.

Speaker 1:

And that's not just Otter. This is going to be Zoom, this is going to be Microsoft Teams, this is going to be any one of the tools that we use to meet with people online. You're going to start seeing them move more and more in this direction. Microsoft Copilot is another one. So just be on the lookout for this. The next time you're on a Zoom call, you'll see it, it's almost like a little flashing blue bubble. And again, if you say "Hey Otter" and talk to it, it will absolutely talk back. Try it out when you get a chance.

Speaker 1:

And last but not least, Meta has revealed new, updated details about its upcoming $1,000 smart glasses. So anybody that owns the Ray-Bans, I think those are about $350, but these are new, upcoming $1,000 smart glasses from Meta, and they showcased some of the enhanced features these glasses will offer for everyday use. Some of these features include voice-controlled search and real-time translation. Here we go, we're getting into "do I even need to read?" Anybody that has listened to previous podcasts knows what I mean: that ability to look at foreign languages, or look at other things, and have them translated on the spot. We're getting closer and closer to that point. There are also augmented reality overlays for directions, so you have your glasses on and overlays on what you're seeing tell you to turn right or turn left. You can see your messages and contextual information, so if you're looking at a building, it can maybe tell you some more about that particular building. All of this is powered by Meta's AI, so these glasses will also integrate with Meta's large language model, Llama 3.

Speaker 1:

And this is again just another major leap toward these kind of conversational wearables. When you think about your AirPods, for those that have those, and you think about these smart glasses, any of these things will start allowing us to get closer and closer to having a different kind of communication, right, a different kind of interaction as we are moving about the world. People aren't going to even be seeing the same things. We can be sitting in a restaurant together, one person has glasses on, the other one does not, and they are seeing things differently, they are reading things differently. I think we are going to get increasingly closer to an environment where it's half reality, half augmented reality, literally.

Speaker 1:

And Meta is actually positioning these glasses as kind of like that bridge between smartphones and future full augmented reality headsets. And for those that don't quite know what we mean when we say augmented reality: you have virtual reality, which is something that goes over your eyes and replaces everything that you are looking at with some kind of virtual environment. That's virtual reality. Augmented reality is, basically, you see what's in front of you, but there are other things augmented on top of that. Take the example with the directions: you can actually be walking down the street and see an arrow that you know is not physically there, but you see it there. That would be augmented reality.

Speaker 1:

And so they're really positioning these glasses to be that bridge between smartphones and these full augmented reality headsets, again blending the social ecosystem with the utility and productivity functions that we need on a day-to-day basis. This is still kind of pricey at $1,000 for a pair of glasses, but it really signals Meta's vision for practical, socially acceptable augmented reality, because that's another thing, right? Nobody wants to walk around with a huge headset that looks like headgear on your head all the time; the goal is an actual, acceptable pair of augmented reality glasses. It's really a direct challenge to Apple's Vision Pro, and to the previous version of Meta's Ray-Ban glasses as well. And that is really it for this week's Werk Week News Update. Please don't forget to follow and like us on Spotify, Apple Podcasts, or wherever you listen to your podcasts. And until next time, keep up the Scratchwerk, keep building. Bye.