The Macro AI Podcast
Welcome to "The Macro AI Podcast" - we are your guides through the transformative world of artificial intelligence.
In each episode - we'll explore how AI is reshaping the business landscape, from startups to Fortune 500 companies. Whether you're a seasoned executive, an entrepreneur, or just curious about how AI can supercharge your business, you'll discover actionable insights, hear from industry pioneers, service providers, and learn practical strategies to stay ahead of the curve.
Why Apple Picked Google for AI
Episode Description:
In this episode of The Macro AI Podcast, Gary and Scott unpack why Apple chose Google’s Gemini to power the next-generation Siri — and why the move makes perfect sense when viewed through history. The hosts trace Google’s 20-year journey in artificial intelligence: from Google Brain’s “cat-video” experiment to DeepMind’s AlphaGo and the 2017 Transformer breakthrough by Google Research. They spotlight the engineers, hardware, and research culture that made Google the quiet giant of AI. The conversation then turns to Apple’s strategy — speed, scale, and privacy — and what this partnership means for the future of AI ecosystems.
Keywords: Apple AI partnership, Google Gemini, Siri upgrade, DeepMind, Transformer architecture, Google Research 2017, TPU Trillium, word2vec, Google Brain, Jeff Dean, Demis Hassabis, Macro AI Podcast
Send a Text to the AI Guides on the show!
About your AI Guides
Gary Sloper
https://www.linkedin.com/in/gsloper/
Scott Bryan
https://www.linkedin.com/in/scottjbryan/
Macro AI Website:
https://www.macroaipodcast.com/
Macro AI LinkedIn Page:
https://www.linkedin.com/company/macro-ai-podcast/
Gary's Free AI Readiness Assessment:
https://macronetservices.com/events/the-comprehensive-guide-to-ai-readiness
Scott's Content & Blog
https://www.macronomics.ai/blog
00:00
Welcome to the Macro AI Podcast, where your expert guides Gary Sloper and Scott Bryan navigate the ever-evolving world of artificial intelligence. Step into the future with us as we uncover how AI is revolutionizing the global business landscape from nimble startups to Fortune 500 giants. Whether you're a seasoned executive, an ambitious entrepreneur,
00:27
or simply eager to harness AI's potential, we've got you covered. Expect actionable insights, conversations with industry trailblazers and service providers, and proven strategies to keep you ahead in a world being shaped rapidly by innovation. Gary and Scott are here to decode the complexities of AI and to bring forward ideas that can transform cutting-edge technology into real-world business success.
00:57
So join us, let's explore, learn and lead together. Welcome back to the Macro AI Podcast. We break down the trends shaping business, technology and the future of intelligence itself. I'm Gary. And I'm Scott. Today we're digging into why Apple turned to Google to power the next generation of Siri and why, if you know the history of artificial intelligence, that move isn't just smart.
01:23
It's almost inevitable, given Google's history with AI and the tremendous amount of investment Apple would need to make just to get to parity with where the technology is today.
01:40
Yeah, really from the early days of deep learning to the rise of transformers and TPUs, Google has spent 20 years building the foundation of modern AI. And that long game is exactly what made this partnership make sense for both Apple and Google. So let's jump in and talk more about this one. Sure. Yeah. Let's just jump right into the headline. I think everybody's probably seen this,
02:07
that Apple is finalizing a roughly $1 billion a year deal with Google to use a custom version of Google Gemini. That's Google's flagship large language model. And Apple's going to use it to power a smarter, more capable Siri across the Apple product set. And this isn't just about swapping search defaults or ads. It's strictly about artificial intelligence capability. Models will run
02:35
inside Apple's new Private Cloud Compute, which keeps Apple's privacy promises intact for its users. Yeah. And Apple did take their time and shop around. They spoke with OpenAI and Anthropic and probably some others, but they ultimately landed on Google. And when you trace Google's two-decade run in AI, like we talked about, that decision makes a lot of sense for Apple.
03:02
Yeah. And if we were to rewind, you know, Google's long road to AI leadership: picture Mountain View, California in the early 2000s. It's the dot-com era. You and I were both there. Had good times and bad times. Search is really booming, but it's in its infancy compared to where it is today. Gmail is brand new, and a few engineers inside Google start asking the question: what if we could teach computers to learn on their own? Yeah.
03:31
Yeah. And then, you know, fast forward a little bit. By 2011, that curiosity morphed into Google Brain, which was a small experimental group led by Andrew Ng, Jeff Dean and Greg Corrado. And their first big project was focused on YouTube video. You might remember Google acquired YouTube around 2006 for about $1.65 billion in stock. And what they did was
04:01
feed 10 million YouTube thumbnails into a neural network. And they wanted simply to see what it learned on its own. You know, no labels, no instructions. They just put it in there. And what did it discover? Cats, without being told to identify cats. So that goofy cat video experiment everyone's familiar with became proof that machines could learn patterns, not just rules.
04:27
And it really marked the start of modern deep learning for Google. Yeah. And the Google team obviously started to focus on that and drill into it a little bit. And they made a quiet but pretty major historic acquisition in DNNresearch. And that was a tiny Toronto lab run by Geoff Hinton and some of his students, actually, Alex Krizhevsky and Ilya Sutskever.
04:54
And they were the team behind AlexNet. That was the deep learning model that really kind of shocked the world by blowing away the ImageNet competition the previous year. Yeah. And that deal brought the world's top neural network talent straight into Google's ecosystem, if you recall. So really, almost overnight, Google became the global hub for deep learning research before anybody else was really focused there. Yeah. And that's where...
05:22
Right then is when things really started to accelerate, right? Just after that, in 2013, Google released Word2Vec, which was a deceptively simple idea that changed the whole trajectory of machine learning. Yeah. And Word2Vec basically taught computers, you know, that words carry meaning, not just letters. It could tell that
05:48
king and queen are related the same way man and woman are. And in that shift, really understanding relationships, not just words and spelling, became the backbone of every modern language model, which is pretty cool. Yeah. And then right after that, we'll touch on how Google used that capability to power RankBrain, which was the first deep learning system in search. So all of a sudden, Google could interpret
06:18
what people meant, not just what they typed. So RankBrain was the first bridge between traditional search ranking and machine learning. And it turned Google Search from a keyword matcher into a system that actually reasons about language. The same lineage that leads straight into what Gemini is today. Yeah. And then came GNMT, the Google Neural Machine Translation system. I think it was back in 2016.
06:48
So really overnight, Google Translate stopped sounding robotic. It even learned to translate between languages it had never seen paired, essentially inventing its own interlingua. And interlingua is an internal shared language that a translation system invents for itself to represent meaning, not specific words. So instead of directly translating, for example, Spanish to English,
07:15
the model translates Spanish to interlingua, then interlingua to English. So when Google launched GNMT, researchers discovered something truly remarkable. They had trained the model on several language pairs. For example, we'll say English to Japanese and then English to Korean, but never directly on Japanese to Korean. And yet the system could translate between Japanese and Korean anyway, which is...
07:43
pretty mind-boggling if you think about it. Humans don't learn that way, right? So that meant it had to learn an internal "language of meaning," using my air quotes, which is a representation of concepts and relationships that worked across all the languages it knew. So it really helped propel that early modeling, just based on that particular example with the languages. Yeah, that was pretty amazing.
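For listeners who like to see the mechanics, the "relationships as directions" idea behind Word2Vec can be sketched in a few lines. This is a toy illustration with made-up 3-D vectors, not real word2vec embeddings, which have hundreds of dimensions learned from billions of words of text:

```python
# Toy sketch of the word2vec analogy trick: king - man + woman ~ queen.
# The 3-D vectors below are invented for illustration only.
import numpy as np

vectors = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "queen": np.array([0.9, 0.1, 0.8]),
    "man":   np.array([0.1, 0.9, 0.1]),
    "woman": np.array([0.1, 0.2, 0.8]),
}

def nearest(v, exclude):
    # Find the stored word whose vector points in the most similar
    # direction (cosine similarity), skipping the words we combined.
    def cos(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return max((w for w in vectors if w not in exclude),
               key=lambda w: cos(vectors[w], v))

# Vector arithmetic: take "king", subtract the "man" direction,
# add the "woman" direction, and see where we land.
result = nearest(vectors["king"] - vectors["man"] + vectors["woman"],
                 exclude={"king", "man", "woman"})
print(result)  # queen
```

With real embeddings you would load a trained model (for example via the gensim library) rather than hand-writing vectors, but the arithmetic is the same: meaning becomes geometry, and relationships become directions in the vector space.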
08:10
And right around that same stretch, obviously, there's a lot going on inside Google, and they're starting to realize what artificial intelligence could do. And they went out and bought DeepMind in 2014. And DeepMind at the time was a London-based lab founded by Demis Hassabis. People have probably heard his name by now in the AI world. He was a chess prodigy, but also a neuroscientist.
08:39
And DeepMind would become Google's crown jewel in advanced research, encompassing Gemini, the Gemma open-weight models, and now Veo, the video generation models. And back in 2016, DeepMind really stunned the world with AlphaGo beating the world champion Lee Sedol in a Go match. And that was watched by hundreds of millions of people. It wasn't just a win. It was
09:05
the moment reinforcement learning stepped onto the global stage, and it just shocked so many people watching. Yeah, and Go is obviously, you know, pretty closely related to chess in the complex world of games. So AlphaGo led to AlphaZero, which mastered chess, Go, and then shogi, entirely from scratch.
09:34
And then came AlphaFold, which solved the 50-year protein folding problem. That was obviously a huge breakthrough that changed biology and ultimately earned a Nobel Prize. Yeah. I remember that. And then, a year after AlphaGo, came the big bang of modern artificial intelligence: the paper entitled "Attention Is All You Need." It was written by the Google research team. It introduced the transformer architecture, which is the blueprint that powers every major
10:04
large language model today. For listeners... Yeah, I was going to say, I think that was 2017, so now things are starting to advance, just within essentially 12 months. So for any listeners who might not know, a GPT is a generative pre-trained transformer. And generally, all LLMs trace their core architecture back to this paper. So a lot of the companies that you're probably partnered with today
10:32
owe that back to this original paper. Yeah, exactly. Yeah, that "Attention Is All You Need" was definitely a big one. And so now Google is really starting to make some huge advances, and a lot of companies might have stopped there, but Google decided to really double down at that time. They didn't just publish the math, they started building the machines, with the knowledge that processing and energy is a major bottleneck. So right around that time
11:01
is where, you know, TPUs came onto the scene. Those are tensor processing units, custom chips designed by Google specifically for machine learning and to power its own AI stack. Obviously competing with the GPUs that Nvidia sells to everyone today. Yeah. I think that's an important fact, because by controlling the algorithms and the hardware, Google created an end-to-end AI stack, if you think about it that way.
11:31
Their latest version, the sixth-generation TPU, Trillium, powers Gemini and sets efficiency records off the chart. So having that in the ecosystem, and kind of building that from their own IP originally, it's pretty amazing. Yeah, they knew that efficiency was going to be key as things scaled up. So while at this time, OpenAI
11:56
started to grab the headlines when they launched ChatGPT in November of 2022, Google already quietly held all the fundamentals. They had the research, they had the compute, they had the global scale of distributed data centers, and obviously the track record and the legacy in artificial intelligence and machine learning. And that's really what Apple is buying into in this partnership. Well, exactly. When you need a partner that's been proving
12:25
artificial intelligence at production scale for 20 years, there's really only one logical call to make there. Unless you decide to then spend the next, you know, maybe 10 years, because things have gotten a little bit faster, trying to play catch-up. Or you could partner. So that's really what you're seeing in this new relationship. Yeah. And I think a big part of it was just, you know, the legacy that Google had and the people behind all of those breakthroughs. Let's just jump into that a little bit and talk about the people who
12:54
made this happen for Google, because Google's AI story is also a story of incredible talent, a lot of which is now really distributed across a lot of great AI startups out there that we're hearing about. Yeah. I mean, start with Jeff Dean, one of Google's earliest engineers. I mean, he built the distributed systems that power Google Search, and then turned his focus to machine learning. He's kind of the
13:20
connective tissue between research and real-world deployment across the platforms. Yeah, and then, as I mentioned, Andrew Ng. He co-founded Google Brain in 2011, then later led Baidu AI, and then he taught millions by co-founding Coursera in 2012. So some of you have probably taken Coursera courses. And so he was really evangelizing deep learning when it was kind of still a fringe computer science topic.
13:51
Then there's Geoff Hinton. It was 2013 when they acquired his lab, DNNresearch. His work with students like Alex Krizhevsky, mentioned earlier, and Ilya Sutskever literally kicked off the deep learning era. He stayed a decade, really was there shaping Google's neural network research, then left in 2023 to speak more openly about where artificial intelligence might be headed. Some of you have probably seen some of his interviews. And basically the risks that
14:20
could potentially come with it. It's amazing to see where Hinton's students ended up. Sutskever went on to co-found OpenAI, which many of you listening use, and helped create the GPT models that changed everything, while Alex Krizhevsky stepped back from the spotlight but remains really a legend if you think about his work. His AlexNet model was really the spark that lit the deep learning fire for
14:50
the industry and many people that are, you know, kind of associated with it today. So two really good pupils of Hinton's original work. It's pretty interesting. Yeah, and then just back to the DeepMind side, Demis Hassabis is really still the visionary leader. So he's blending neuroscience, gaming, and AI. And alongside him you have David Silver, who pioneered reinforcement learning, and Oriol Vinyals,
15:20
who helped really push it into Gemini's DNA. So both David and Oriol are still at DeepMind, and David continues to lead reinforcement learning research while Vinyals now heads up work on the Gemini models themselves. So the same minds that built AlphaGo are really still there, driving Google's next generation of AI. Yeah, I'd also mention Fei-Fei Li, who co-founded ImageNet and later served as
15:47
Google Cloud's chief scientist for artificial intelligence. Fei-Fei is now back at Stanford, leading the Human-Centered AI Institute. She's a powerful voice reminding the field that artificial intelligence should serve the people, not just the data. So, you know, really a cultural icon in terms of what she's done and what she's still, you know, out there talking about. Yeah, yeah. She's been pretty vocal lately.
16:17
And then don't forget the Google team behind the 2017 transformer paper. So you have Ashish Vaswani, Jakob Uszkoreit, and Noam Shazeer. They came up with the attention mechanism. And that was a simple idea that let computers focus on the right information at the right time. And it really ended up transforming the entire field of AI. And really, what's wild is where those engineers all ended up.
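For listeners curious about what "focusing on the right information at the right time" looks like mathematically, here is a toy numpy sketch of scaled dot-product attention, the core operation from that 2017 paper. This is an illustration, not Google's implementation; the token count and dimensions are arbitrary:

```python
# Minimal sketch of scaled dot-product attention: each position (query)
# scores every other position (key) and returns a weighted blend of values.
import numpy as np

def attention(Q, K, V):
    d_k = K.shape[-1]
    # Relevance score of every key for every query, scaled for stability.
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax turns scores into weights that sum to 1 per query.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted mix of the value vectors.
    return weights @ V

rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))  # 3 tokens, 4-dimensional representations
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = attention(Q, K, V)
print(out.shape)  # (3, 4): one blended vector per token
```

In a real transformer this runs many times in parallel (multi-head attention) over learned projections of the input, but the few lines above are the whole trick the hosts are describing.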
16:46
So Ashish co-founded Essential AI, building reasoning systems for business, for large enterprise. Jakob launched Inceptive, which is using transformer tech to design RNA medicines. And Noam Shazeer co-founded Character.AI, which is the chat companion platform that's been hugely popular recently. You know, each of them carried that 2017 breakthrough into really
17:14
completely new domains outside of Google. Yeah, I mean, it's really an all-star lineup if you think about it. It's really staggering. Google has been the graduate school for an entire generation of AI innovators, many of whom went on to build OpenAI, which I just mentioned, also Anthropic, and really the broader ecosystem. So they've done an excellent job of just,
17:39
you know, retaining the right talent, but also enabling them to then spread their wings and help the overall ecosystem around artificial intelligence. Yeah. And that's what this episode is really about. That's what Apple is tapping into. They're tapping into two decades of institutional knowledge, hardware mastery, just, you know, proven global scaling. So this isn't just about borrowing a model. It's about partnering with the company that really invented the whole playbook. Right.
18:10
And so, you know, I think for Apple, this deal is really about capability and time to market. So Gemini brings advanced reasoning and just incredible multimodal skills that smaller on-device models can't yet match. So instead of waiting years and spending billions of dollars, and maybe making mistakes, most likely making mistakes along the way, Apple can move right now.
18:39
Well, they can, and it's efficient. I mean, Google's transformer IP, the TPUs you talked about a little earlier, and the infrastructure mean better performance per dollar. If you think about it, Apple gets cutting-edge AI while preserving privacy by using its own Private Cloud Compute. So it's really a win for Apple, and it's a huge win for Google, because they've been in such a competitive landscape for so many years. I think it's also a win for anybody in
19:09
the artificial intelligence community, because you're not seeing silos. You're seeing a global partnership here with two behemoths. Right. Yeah. And there's a lot of headlines right now about the spend in AI, and they can avoid a lot of that and just make this partnership. But it also allows Apple to keep its options open. Instead of going out and building all this, they can still integrate their own models later, or even swap suppliers if they don't want to continue to work with Google. So...
19:39
Well, I think that this is really a smart bridge strategy, and it's not a permanent dependency on Google. But I think you need to keep in mind that for years, Google has been paying Apple billions of dollars just to be the default search engine on iPhones. Right. Somewhere between 15 and 20 billion dollars a year. So it's really one of the most profitable deals in tech history. And what this
20:06
announcement is about is that this relationship is now evolving. As we mentioned earlier, Apple is reportedly paying Google about a billion a year for access to the Gemini AI models. So after 20 years, like we mentioned, two decades, the money is now flowing both ways. It's proof that even fierce competitors can innovate. They just can't innovate on their own in isolation. So Gary, you want to touch on what might come next?
20:36
Yeah. I mean, I think what's interesting is you're kind of seeing the paradigm shift, you know, how you just kind of articulated the Apple-Google relationship. And that's really interesting. So I think we could potentially see that within the ecosystem and, you know, just with other partners. I think specifically for these organizations: the quality of Siri's new capabilities when it launches, can it finally deliver on Apple's promises?
21:06
The economics: will Google's Trillium TPUs make large-scale artificial intelligence affordable for Apple? So those are a couple of questions that I would think of. And then the relationship: will this spark deeper collaboration or just short-term alignment? It could also show smaller competitors out there that two large organizations can be essentially passing billions of dollars between each other so they can continue to grow into this next generation. So sometimes leading by example could spark,
21:36
you know, some smaller innovation across the ecosystem. Yeah, exactly. But I think really, either way, this partnership will redefine how AI ecosystems connect. So it's no longer about, you know, who builds what. It's about who can deliver intelligence at scale to users, and do it safely and do it reliably. And I think that's what Apple is trying to do right now. Yeah, you're absolutely spot on. I mean, Apple chose speed, scale, and proven infrastructure as part of their
22:06
go-to-market. Yeah. And Google, like we talked about through this episode, they earned that position by doing the hard work for 20 years: pioneering the algorithms, building the silicon, building that internal heritage of people, and scaling their global infrastructure. Yeah. You're absolutely spot on again. I think this is a good comment to end on.
22:32
I want to thank everyone for listening today to the Macro AI Podcast. Please subscribe, share it with your network, and join us next time as we keep exploring the real-world examples that are impacting the world of artificial intelligence.