Heliox: Where Evidence Meets Empathy 🇨🇦

🍎 Apple WWDC 2025: How Apple Just Rewrote the Rules of Personal Computing

β€’ by SC Zoomers β€’ Season 4 β€’ Episode 52

Send us a text

Please see our episode substack for a more detailed breakdown and a comic.

Something profound happened at Apple's WWDC 2025, and most people missed it entirely. While the tech press got distracted by shiny new features and incremental updates, Apple quietly orchestrated what might be the most significant shift in personal computing since the original iPhone. This wasn't just another product announcementβ€”it was a declaration of independence from the surveillance capitalism that has defined our digital age.

Let me tell you what really happened, because the implications are staggering.

WWDC 2025 - June 9 Keynote

This is Heliox: Where Evidence Meets Empathy

Independent, moderated, timely, deep, gentle, clinical, global, and community conversations about things that matter.  Breathe Easy, we go deep and lightly surface the big ideas.

Thanks for listening today!

Four recurring narratives underlie every episode: boundary dissolution, adaptive complexity, embodied knowledge, and quantum-like uncertainty. These aren’t just philosophical musings but frameworks for understanding our modern world. 

We hope you continue exploring our other podcasts, responding to the content, and checking out our related articles on the Heliox Podcast on Substack

Support the show

About SCZoomers:

https://www.facebook.com/groups/1632045180447285
https://x.com/SCZoomers
https://mstdn.ca/@SCZoomers
https://bsky.app/profile/safety.bsky.app


Spoken word, short and sweet, with rhythm and a catchy beat.
http://tinyurl.com/stonefolksongs

Curated, independent, moderated, timely, deep, gentle, evidence-based, clinical and community information regarding COVID-19. Running since 2017, and focused on COVID since February 2020, with multiple stories per day, hence a large searchable base of stories to date: more than 4,000 stories on COVID-19 alone, and hundreds of stories on climate change.

Zoomers of the Sunshine Coast is a news organization with the advantages of deeply rooted connections within our local community, combined with a provincial, national and global following and exposure. In written form, audio, and video, we provide evidence-based and referenced stories interspersed with curated commentary, satire and humour. We reference where our stories come from and who wrote, published, and even inspired them. Using a social media platform means we have a much higher degree of interaction with our readers than conventional media, which provides a significant positive amplification effect. We expect the same courtesy of other media referencing our stories.


Welcome to the Deep Dive. We've got quite the source stack today. It's all centered around one big thing: the full transcript from Apple's WWDC 2025 keynote. Yeah, and our mission really is to dig into it properly. We're going to cut through all the keynote speak, the quick news hits, and pull out what actually matters, the real nuggets of knowledge, the insights. Think of it as like your shortcut to really getting what's coming next for the phone, the watch, the Mac, and... the devices you use every single day. Exactly. We'll hit the big themes for sure. But also, you know, pull out those surprising little details, things that might have flown under the radar otherwise. So whether you're prepping for a meeting, just trying to keep up, or you're just curious about how this stuff is changing, we're here to walk you through it. Updates for iOS, watchOS, tvOS, macOS, visionOS, and iPadOS, all landing this fall. It was absolutely packed. But honestly, right at the top, there were these two foundational things that everything else seemed to build on. Okay, foundational. Let's unpack those first then. What were the two big pillars? Well, first, they're really building on what they started last year with Apple Intelligence. A major expansion of Apple Intelligence itself. Okay, Apple Intelligence expansion. What's actually new there, and what does it really mean? Right. So, okay, beyond just the basics, like expanding to more languages, making the underlying AI models, the generative models, more capable, more efficient, and putting it in more places, you know, iPhone, Watch, Vision Pro, Mac, iPad. The really big news, and this is where it gets super interesting, I think, is for developers. Oh, okay. Tell me about that. What did they announce specifically? So for the very first time, they are opening up direct access to the on-device large language model. You know, the AI that's at the core of Apple Intelligence. Direct access. How?
Through a new framework they're calling the Foundation Models framework. Okay, Foundation Models framework. Break that down. What does that actually let developers do? What does it mean for you listening and the apps you use? This is potentially huge. It means developers can now directly tap into that really powerful, fast, privacy-focused intelligence right there on your device, even if you're offline. Offline, wow. Exactly. And this is crucial: without racking up cloud API costs for using it on the device. Oh, okay. This capability, baked right into apps, could genuinely spark like a whole new wave of intelligent features, stuff that feels really integrated and, importantly, private. Can you give us an example, something they mentioned that makes that concrete? Sure. They talked about an app like Kahoot. Imagine it creating a personalized quiz for you right from your study notes, like instantly on your iPad. Or AllTrails. You're camping totally off-grid. It could use these on-device models to understand what kind of hike you're looking for and suggest trails right there. No signal needed. Yeah, that really hits home the offline and privacy angle. Okay, so that's a massive, powerful on-device AI for developers. Now, the other big piece: this new universal design language. What's the thinking behind that? The goal seemed to be creating a much more harmonious feel as you jump between your iPhone, your Mac, your iPad, but still letting each platform feel unique. They said they were really inspired by the sort of depth and richness they built for visionOS. Right. And wanted to bring some of that feeling, that sense of things being natural and alive, to digital elements everywhere else. And that led to, like, a new material. Exactly. A totally new visual material they're calling Liquid Glass. Think of it like it has the optical properties of glass. It's translucent. It refracts light. Okay. But it also has this fluidity.
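To make the Foundation Models framework point concrete, here's a minimal Swift sketch. It follows the API surface Apple announced at WWDC25 (a `LanguageModelSession` you prompt asynchronously); the names are pre-release and may change, and the trail-suggestion scenario is our own illustration, not Apple's sample code.

```swift
import FoundationModels

// Sketch: asking the on-device model for a trail suggestion, in the
// spirit of the AllTrails example from the keynote. Runs entirely on
// the device's Neural Engine: works offline, no cloud API cost.
func suggestTrail(from preferences: String) async throws -> String {
    let session = LanguageModelSession(
        instructions: "Suggest a hiking trail based on the user's stated preferences."
    )
    let response = try await session.respond(to: preferences)
    return response.content
}
```

Because the call never leaves the device, it keeps working with no signal and incurs no per-request cost, which is exactly the trade-off the hosts highlight.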
It can sort of transform and react dynamically depending on your content or what you're doing. Liquid Glass. Interesting name. How does it actually, you know, work? How does it improve things? Well, it's designed to bring clarity, focus. It subtly refracts light. It reacts when you move your device or touch the screen with these little highlights. Uh huh. It can morph when you need more options or when you're moving between views. Elements are redesigned to fit, like, concentrically with the rounded hardware corners. Oh, right. Which actually frees up a bit of visual space. It's always translucent. So it's kind of informed by your content underneath. And it adapts really nicely between light and dark modes. Yeah, it looks beautiful. But it's also functional. It helps guide your eye, helps you understand the interface better. So not just pretty, it actually does something. Where do we see it? What are some examples? They showed it refining core things like alerts. They now sort of flow out from where you tapped. Okay. Context menus expand with this glassy effect. Tab bars, like in Safari, they shrink down when you scroll, give you more screen. Yeah, I hate when they block stuff. Exactly. And then they fluidly reappear when you scroll back up. It's all about adding this sense of depth and responsiveness to everyday interactions. And connecting back to that universal idea, they mentioned unifying the version numbers, too. Yeah. Another little signal of this cohesive push: the big software updates coming this fall, iOS, macOS, watchOS, tvOS, visionOS, and iPadOS, are all going to be version 26. Version 26 across the board. Okay, it definitely signals a unified direction. All right, so with those big themes, the expanded AI, this new design, how does it all actually show up on the devices we use? Let's dive into the platforms. Starting with, well, the iPhone. Right, iOS 26.
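For developers, adopting the new material is meant to be mostly declarative. A hedged sketch, assuming the SwiftUI modifier Apple announced for this (`glassEffect`), whose final signature may differ from what ships:

```swift
import SwiftUI

// Sketch: a floating playback control cluster rendered in the new
// material. The modifier name follows Apple's WWDC25 announcement
// and should be treated as provisional until the final docs ship.
struct FloatingControls: View {
    var body: some View {
        HStack(spacing: 24) {
            Image(systemName: "backward.fill")
            Image(systemName: "play.fill")
            Image(systemName: "forward.fill")
        }
        .padding()
        .glassEffect(.regular, in: .capsule)
    }
}
```

The system then handles the refraction, highlights, and light/dark adaptation described above, rather than each app drawing them itself.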
This is where you'll really see Apple Intelligence and that new Liquid Glass design woven throughout everything, plus some really practical, useful improvements. What jumps out first on the lock screen and home screen? Well, the visuals, definitely. The time and the controls on the lock screen use Liquid Glass. It gives them depth. When you swipe over to the home screen, there's this subtle glass edge that follows your finger. Even the app icons are getting tweaked. They're crafted from Liquid Glass. They adapt dynamically in dark mode. And there are new styles, like an all-clear look, that really lets your wallpaper shine through. And the dynamic wallpaper sounded interesting. You mentioned something about 3D. Yeah, they're leaning into depth. On Photo Shuffle wallpapers, the time actually adapts its size and position to fit nicely into your photo, using the San Francisco font but scaling dynamically. Okay. And this is really neat. Using the Neural Engine's computer vision, it can take your regular 2D photos and generate this subtle 3D effect. Oh, wow. So as you tilt your phone, the picture shifts slightly, like you're looking into it. It makes memories feel more alive. The Now Playing screen also gets a really nice makeover, with the album art interacting with those glass controls. What about the Camera app? It can feel a bit... busy sometimes. They've redesigned it, aiming for more intuitive controls. Photo and video modes are clear. Swipe left or right for things like cinematic or portrait. Swipe up for settings like aspect ratio, timers. Okay, simpler. Yeah. And a single tap brings up format options, like 4K. The Photos app itself also gets the new design. And importantly, they're bringing back separate Library and Collections tabs. Oh, good. Yeah, better organization. And that 3D photo effect, it's in the Photos app too. Did Safari and FaceTime get touched by this new design? Absolutely. Safari pages feel more edge to edge now, more expansive.
The tab bar floats at the bottom and shrinks down when you scroll, like we mentioned. FaceTime controls float and kind of recede when you don't need them. And the main landing page is redesigned, puts more emphasis on your contacts with bigger contact posters and previews of video messages. How does all this carry over into the car with CarPlay? CarPlay gets the new look, too, for consistency, including the icons. And they've added some useful features for staying connected safely, like a more compact design for incoming calls, support for Tapbacks, and pinned conversations in Messages. And you can now see widgets and Live Activities right on the CarPlay screen for quick glances at info. These are the same widgets developers already built for the iPhone. And then there's CarPlay Ultra. For cars that support it, it integrates across all the driver screens. In the Phone app, there's this new optional unified layout. It puts favorites, recents, and voicemails all in one view. Optional. Yeah. Favorites stay right at the top. Easy access. Recents and voicemails are in a single list below that, and Apple Intelligence jumps in to provide summaries for your voicemails. Voicemail summaries. Yes, please. But what about all those spam calls, unknown numbers? They're tackling that directly with a new feature called Call Screening. Call Screening. How does it work? It automatically answers calls from unknown numbers silently in the background. It prompts the caller to say who they are and why they're calling. And then it rings your phone, showing you a real-time transcript of what they said, so you can see who it is and decide right there whether to pick up or just ignore it. Oh, that sounds incredibly useful. What about being stuck on hold? The dreaded hold music. Yes, they actually address that too, with Hold Assist. Seriously? Yeah. The phone app can basically keep your place in the queue for you while you wait for a human agent.
It detects the hold music, asks if you want it to wait. It stops the music playing on your end. The call's still connected, mind you, and then it rings you back when an agent actually picks up. It even tells the agent you'll be right there. Think about calling airlines, utilities. Yeah. Getting that time back. That is a game changer. Okay. Messages. Group chats can be, well, chaos sometimes, especially planning things. Totally. So Messages gets some visual fun with backgrounds for your chats, dynamic ones, your photos, even AI-generated ones via Image Playground. But for group chats, the big thing is polls. You can easily create polls to see what everyone wants, like choosing a dinner spot or movie time. And Apple Intelligence can even suggest creating a poll if it sees you discussing plans. Oh, smart. Yeah. Anyone can add options. Votes come in live. Also, you can now request, send, and receive Apple Cash right within group chats. And finally, typing indicators for group chats. Finally. Okay. What about cleaning up the main Messages list itself? It gets cluttered. On-device spam detection is still there, getting better. But now you can also screen new senders. Unknown people messaging you land in this separate area. Ah, like a filter. Kind of. You can quickly decide: mark them as known, ask who they are, or just delete the conversation. They stay silenced, no notifications, until you explicitly accept them. Important stuff like 2FA codes still get through, though. More ways to get creative with AI. Emoji images. Yep. You can now mix two emoji together, or an emoji and a text description, to create custom Genmoji. Genmoji, huh? Yeah. And Image Playground gets more control for tweaking AI images, like changing someone's expression or hairstyle in a generated picture. It also integrates ChatGPT now. Interesting. Offering styles like oil painting or an "any style" option for more realistic stuff.
And they emphasize controls, making sure nothing gets shared with ChatGPT without you explicitly okaying it. There's an API for developers too. Let's talk about live translation. That sounds like a really powerful feature. It really does. It's coming to Messages, FaceTime, and the Phone app. And critically, it's powered by Apple's own on-device AI model, so it's private. How does it work in practice? In Messages, text can just automatically translate as you type and as you receive it. In FaceTime, you get live translated captions on screen while still hearing the person's original voice. Wow. And on actual phone calls, your words get translated and spoken out loud to the other person, and you hear a spoken translation of what they're saying, even if they don't have an iPhone. That's incredible. Even if they don't have an iPhone? Yeah. And there's an API, so other communication apps can potentially use this too. That could really break down language barriers. Okay, quick hits on service updates. Music, Maps. Apple Music adds lyrics translation and pronunciation. Helpful. And AutoMix for smooth transitions, like a DJ. Music pins let you stick favorite artists or playlists to the top of your library. Handy. Maps? Maps gets smarter about your usual routes, including stops, and offers them up. It checks your commute time, and notifications will warn you about big delays on your route even before you start navigation. And something about remembering places. Yeah, Visited Places. Your iPhone can kind of log places you go, like restaurants or shops, and you can see them in your Maps library. It's all end-to-end encrypted. They stress the privacy. Wallet always gets updates. Keys, IDs, payments. Car key support keeps growing. Lots of brands now. Digital ID for U.S. passports is launching this fall for domestic TSA and some other verification spots. But again, not a replacement for your physical passport. Okay, important distinction. Boarding passes look refreshed.
Linked airport maps. Find My bag tracking. Shareable live flight status. Apple Pay lets you use loyalty points or pay in installments in person now. And with Apple Intelligence, Wallet can find and summarize order tracking info from your emails, even if you didn't use Apple Pay. Gaming on iPhone seems to be getting more serious. Yeah, they're launching a dedicated Games app. A central place to manage your games, find new ones, see what friends are playing via the Play Together tab, and invite them. Okay. And a new feature called Challenges lets you compete on scores in supported games using Game Center leaderboards. Turn single-player games into mini-competitions. Lastly, for iOS, visual intelligence. It's moving beyond just the camera view. Exactly. Visual intelligence now works on anything displayed on your iPhone screen. Anything? How? Use the screenshot button combo. It lets you search or take action based on whatever you're looking at in any app. You can image search something on Google or Etsy, highlight text to search, pull event details, date, time, location, right off the screen into your calendar. Or even ask ChatGPT about something on screen without leaving the app you're in. Developers can plug into this with App Intents. It basically unlocks the context of your whole screen. That's a ton for the iPhone. Okay, shifting from the pocket to the wrist, what's the story with watchOS 26? watchOS gets the Liquid Glass look in places like the Smart Stack and Control Center. But the headline feature is definitely Workout Buddy. Workout Buddy. Tell me more. It uses Apple Intelligence plus all your past fitness data to give you personalized encouragement and insights during your workout in real time. So like a personal coach whispering in your ear. Kind of, yeah. Delivered with this dynamic AI-generated voice based on a Fitness Plus trainer.
It gives you little pep talks, points out when you hit your fastest mile, celebrates milestones. Feels very personal, very private. The Workout app itself gets a cleaner layout, easier access to custom workouts, race routes, and new media suggestions, playlists or podcasts, based on your activity type. Any other neat watchOS tricks? The Smart Stack gets smarter with predictions. Yes. It uses on-device learning about your routines to suggest widgets, like offering to start your workout when you get to the gym, or suggesting Backtrack if you're hiking somewhere remote. Oh, proactive. Nice. Notifications adjust their volume based on ambient noise. And there's a new wrist flick gesture. It lets you dismiss notifications, mute calls, silence timers, or close the Smart Stack without touching the screen. Super handy if your hands are full. Yeah, definitely. Messages on the watch gets live translation, backgrounds, quick actions like sharing location. And finally, the Notes app is coming to Apple Watch. Quick notes on the wrist. Makes sense. Okay, let's move to the living room. tvOS 26. tvOS gets the Liquid Glass treatment too. App icons look more vibrant. Playback controls refract the video playing behind them. The main Apple TV app has a bolder look, more focused on big movie poster art. Profiles can show up right when it wakes, so everyone can jump straight into their stuff faster. And a new automatic sign-in API should make setting up apps on a new Apple TV way easier, linking logins to your Apple account. And something fun for Apple Music Sing. Yeah, it gets even more interactive. Your iPhone acts as the mic and your voice gets amplified through the TV speakers with visual effects. Friends can join in with their iPhones, add songs to the queue, send emoji reactions, take turns leading the song. Sounds like karaoke night just leveled up. It'd be fun. All right, shifting gears to the desktop. macOS Tahoe. Right, macOS Tahoe.
It picks up a lot of the iOS goodness. Messages gets backgrounds and screening; Image Playground, Genmoji, and live translation are all there. The new design touches things like widgets, the Dock, app icons with Liquid Glass. The menu bar goes transparent now, makes the screen feel bigger. You can customize controls right in the menu bar and Control Center, even add controls for mirrored iPhone apps. Personalization and organization seem like a focus too. Yeah, definitely. You can now change folder colors, add symbols or emoji to folder icons. Oh, finally. Right. And it syncs across your devices. Great for organizing projects visually. Combine that with wallpapers and themes. It makes it easier to personalize your Mac's look and feel. Continuity is always huge for Mac users. What's new on that front? Two big things. First, Live Activities come to the Mac. They show up in the menu bar. Think tracking a food delivery order. You click it, and it opens the iPhone app via mirroring right there on your Mac screen so you can interact with it. Seamless. And the second? The Phone app is coming to Mac. You get your synced recents, contacts, voicemail summaries. You can make and receive calls, see the nice big contact posters for incoming calls, and use all those new phone features like Hold Assist, Call Screening, and live translation directly on your Mac. That's really bringing the phone experience over. And Shortcuts gets smarter too. You can directly use AI models. Yeah. Example they gave: compare an audio lecture recording to your notes and have it automatically add points you missed. And you can choose. Use the on-device models for privacy. Use Apple's Private Cloud Compute for more power, securely. Or tap into ChatGPT for its broad knowledge. Yeah. And Spotlight. They called it the biggest update ever, and it looks like it. It's not just for finding stuff anymore.
It's for doing stuff. Oh? So it gives intelligent suggestions, shows all your apps, including mirrored iPhone apps, and, the huge thing, system and app actions right from the Spotlight bar. You can perform hundreds of actions without opening the app. Fill out parameters right there, like compose an email, subject, body, recipient, all in Spotlight. Access menu bar commands from anywhere. Wow, that's powerful. Plus quick keys, short character combos to trigger actions, like typing esm to start sending a message. They showed a workflow that tied this all together, right? Yeah, it was pretty slick. Showed searching documents in Spotlight, accessing clipboard history, images, text, links, pasting an image, then running an action like remove background from Spotlight by accessing an app's menu command. Then using a shortcut, also from Spotlight, leveraging AI and the document's context to generate taglines, and finally adding the doc to a Freeform board with another Spotlight action. It really shows how App Intents lets developers embed their app's functions deep into the OS. Makes Spotlight way more central. What about gaming on the Mac? macOS Tahoe gets that new Games app too, and a new Game Overlay. You summon it with a controller click. Lets you tweak settings, chat, invite friends without leaving the game. Nice. And Metal 4, their graphics tech, adds things like frame interpolation, denoising for better visuals and performance. They spotlighted some upcoming games really leveraging Apple Silicon, MetalFX, and the Neural Engine. Okay, let's pivot to the spatial computer. visionOS 26, what are the highlights? visionOS gets an expansive update. It gets the AI stuff like Image Playground, more languages. But a big visual change is widgets become spatial. Spatial widgets, how does that work? They integrate right into your space persistently. So your clock, weather, music widgets just float wherever you put them. Beautiful new designs for them too.
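Circling back to the App Intents point from the Spotlight discussion: exposing an app action to the system comes down to declaring an intent. Here's a minimal sketch using the existing App Intents framework; the intent and its parameter are invented for illustration, not taken from the keynote.

```swift
import AppIntents

// Hypothetical example: an app exposing a "create note" action.
// Once declared, the system can surface it in Spotlight, Shortcuts,
// and (per the keynote) visual intelligence, without opening the app.
struct CreateNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Create Note"

    // Parameters become the fields you can fill in right in Spotlight.
    @Parameter(title: "Text")
    var text: String

    func perform() async throws -> some IntentResult {
        // A real app would hand the text off to its model layer here.
        return .result()
    }
}
```

The "fill out parameters right there" behavior the hosts describe maps onto `@Parameter` declarations like the one above.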
Photos widgets can turn panoramas into these immersive backdrops. And apps now remember where you left them in your space. That 3D photo thing from iOS must look amazing in visionOS. Oh yeah. They call it spatial scenes. It uses a new AI algorithm, computational depth, creates multiple viewpoints from a flat 2D photo, makes it feel like you could actually lean into the memory. It's in the Photos app, and a new curated Spatial Gallery app. And get this, it's coming to spatial browsing on the web. Articles can have inline photos that sort of pop out and come alive as you scroll past. Developers like Zillow can use it. Personas always felt a bit uncanny valley. Any changes? A dramatic transformation is what they called it, aiming for much more realistic representation. Better hair, lashes, complexion, still created quickly on device, but looking way more lifelike for virtual interactions. Good to hear. What about sharing the Vision Pro experience with others? In-room sharing is now a thing. You and friends in the same room can watch a movie together, play a spatial game. Also big for enterprise: colleagues visualizing a 3D design together, some local, some remote, using apps like Dassault's 3D Live. And for enterprise specifically, easier device sharing. Users can save their eye and hand data, prescriptions, accessibility settings securely to their iPhone for quick guest user setup on any device. And a new protected content API creates, like, a "for your eyes only" mode for sensitive stuff. Input methods expanding beyond just eyes and hands? Yes. Support for spatial accessories with six degrees of freedom input, meaning they track movement in 3D space. Like what? They showed Logitech Muse for drawing and collaborating in 3D. And Sony is bringing support for PlayStation VR2 Sense controllers. Yeah. That should open up more engaging games, like a pickleball game they showed called PicklePro. More spatial content creation tools? Definitely.
Adobe's launching a new, Premiere-powered app for visionOS, letting you edit and preview spatial video right in Vision Pro. Very meta. Yeah. They're partnering with GoPro, Insta360, and Canon for native playback of 180-degree, 360-degree, and wide-FOV video. Relive action shots immersively. Web developers can now embed 3D objects directly in web pages. Combine that with spatial browsing, feels like the start of the spatial web. Oh, and cool new environments, like being near Jupiter and watching the storms swirl. Okay, saving a potentially huge one for last: iPadOS 26. iPad often feels like the platform with the most untapped potential. What's the story? They called iPadOS 26 a giant release. And honestly, it looks like they might live up to it. It gets the new design, obviously. Apple Intelligence integration, live translation, smart Shortcuts actions, many iOS and macOS features like the Messages updates, the Phone app, the Games app. But the real focus, the game changer, is around multitasking, files, and creative workflows. Multitasking. Okay, this is the big one everyone asks about. What did they announce? A new windowing system on iPad, finally. It's designed for iPad, touch first, but delivers the flexibility people wanted. Apps still open full screen by default, keeping it simple. Okay, so not totally different. Right. But now apps have this little grab handle. You can just fluidly resize them into floating windows. They remember their size, their position. Works great with touch or a trackpad. The trackpad pointer is getting more precise too. New buttons appear to easily close or minimize windows. Can you tile them? Snap them side by side easily? Yes. You can flick windows towards the screen edges to tile them. A grabber appears between them to resize both at once. You can open multiple windows, swipe home, you peek at the home screen, tap an app icon, it brings back all its windows where you left them. Exposé?
Exposé, the swipe-up-and-hold gesture, now spreads out all your open windows across all apps and spaces. Makes switching super fast. Yeah. Swipe up again, it minimizes everything, gets you back to that clean single-app view if you want. And did I hear right, a menu bar on iPad? You did. A proper menu bar, always available on iPad, arranged like the Mac's: familiar File, Edit, View menus with clear labels. Makes finding features way faster than digging through popovers. Wow. Okay, this new windowing system. It really does sound like the biggest change to iPad multitasking ever. That's how they positioned it. And it really could transform how people use iPad for productivity. Importantly, they said it works on every iPad that can run iPadOS 26, even the base iPad and mini. And it works with Stage Manager and external displays too. That's huge. Okay, file management improvements? The Files app gets a boost. Updated list view, resizable columns, collapsible folders. It gets that same folder customization, colors, symbols, emoji, as the Mac, syncing across. You can finally choose a default app for opening specific file types. And this is great. You can put folders right in the Dock. Tap it, the folder fans out showing the files inside. Perfect for quick access to downloads or project folders. And a beloved Mac app made the jump. Yes. Preview comes to iPadOS as its own dedicated app. Excellent. Huge for anyone working with documents. Full PDF viewing and editing, Apple Pencil markup, AutoFill support, plus tools for editing and exporting images within PDFs. What about creators' audio-video workflows on iPad? Several things there. Better control over audio inputs, a system-wide selector, even works for web apps. System-wide voice isolation to cut out background noise for clear recordings. Very useful. You can use AirPods for studio-quality vocal recording now. They mentioned improved tone and timbre. And you can use AirPods to remotely start and stop video recording.
Oh, handy for solo creators. Much better for editing interviews. That's really smart. What about long background tasks? Exports, renders? That's been a pain point. But now, thanks to Apple Silicon, background tasks are fully enabled on iPad. Yeah. Long-running stuff, heavy computation like video exports or 3D renders, can keep running in the background even if you switch apps. They show up as Live Activities so you can monitor them. Developers get an updated API. Apps like Shapr3D and DaVinci Resolve are already using it. Any other quick iPad mentions? Yeah. Advanced 3D graphing, Math Notes features, a new calligraphy reed pen for Apple Pencil, and the Journal app is coming to iPad with Pencil support too. Man, it was clearly a massive year for developers underpinning all of this. Absolutely. That Foundation Models framework for on-device AI is, well, foundational. App Intents are crucial, letting apps plug their content and actions into Spotlight, Shortcuts, visual intelligence. Makes everything feel more connected. Right. And the new Liquid Glass design gives them a whole new visual toolkit to play with. What about the tools Apple gives them to build this stuff? They highlighted things like Icon Composer to help create those new layered icons. Xcode, the main development tool, gets more AI help, better code completion predictions, and expanded Swift Assist. Swift Assist? Yeah. Lets developers basically chat with their code using natural language, ask questions, get suggestions. Now it has built-in support for advanced AI models, including from ChatGPT, right inside Xcode. And, of course, tons more under the hood. Swift language improvements, new APIs in SwiftUI for building those modern UIs across all the platforms. They have a whole week of sessions digging into the details for developers. Tim Cook wrapped it all up at the end, right? Tying it all together.
Yeah, he came back out, emphasized the progress on each platform, the thread of Apple Intelligence running through it all, and how the new design aims for that seamless, enjoyable experience across the board. And the release schedule? Developer betas are out now. Public betas coming next month, July. And then the final versions roll out to everyone in the fall. Standard playbook there. And he thanked the developers, ended with some fun App Store reviews. Yeah, gave a big shout-out to the developers, showed some funny, real user reviews. Lighthearted finish. Wow. Okay. That was a lot. A seriously packed deep dive into WWDC 2025. It really feels like more than just an incremental year. Major shifts. Agreed. You've got the universal Liquid Glass design bringing this new visual feel and function. You've got Apple Intelligence properly woven through everything now and open to developers. And then those huge productivity leaps, especially iPad windowing, the new Spotlight and Shortcuts on Mac. Plus the really personal stuff like Workout Buddy. The powerful communication tools like live translation and Call Screening. And visionOS just keeps expanding, becoming more human. It really does feel like a new chapter starting. Yeah, a lot of potential for things to genuinely feel different, better. So here's maybe something to think about as you go about your day. With all this new stuff, this deeply integrated AI, this more fluid, context-aware design... How might the way you actually use your Apple devices change? Your iPhone, watch, Mac, iPad, maybe Vision Pro. How might it shift how you work or create or just connect with people over the next year? Think about those little friction points, the clunky bits, and how maybe some of this smooths things out or opens up totally new ways of doing things. Definitely a lot to look forward to exploring. Thanks for joining us for this deep dive. We hope you feel a lot more clued in about what's just over the horizon.

Podcasts we love

Check out these other fine podcasts recommended by us, not an algorithm.