The Fractional CMO Show
The Fractional CMO Show explores the evolving world of marketing leadership through the lens of fractional Chief Marketing Officers. Hosted by the experts at RiseOpp, this podcast dives into strategies, success stories, and practical insights that help growing companies scale effectively without the full-time executive overhead. Whether you're a startup founder, a marketing leader, or a business owner looking for high-impact marketing guidance, this show will equip you with the tools and mindset to thrive.
How to Optimize Content for Google AI Overviews and Boost Visibility
This episode explores why Google's AI Overviews matter, how to optimize content for them, and how to stay competitive in the evolving search landscape.
In this podcast, we break down how modern algorithms like BERT and MUM prioritize content that answers user intent, making structured data, schema markup, and readability essential for ranking well in AI-generated summaries.
Whether you're a content creator, marketer, or SEO professional, you'll learn how to adapt your content strategies, maintain freshness, and optimize for mobile to increase visibility in Google's AI-powered results.
👉 Read the full guide: How to Optimize Content for Google AI Overviews: A Complete Guide
Imagine realizing that 1.5 billion people are reading the answers your business provides every single month.
SPEAKER_00Which is just a staggering number to even wrap your head around.
SPEAKER_01Right. It's massive. But here's the catch: not a single one of them is actually clicking on your website.
SPEAKER_00Yeah. That is the harsh reality of search right now. It really is.
SPEAKER_01So welcome to the deep dive. Today we are looking at Google's AI overviews and how the fundamental architecture of search has been, you know, quietly but permanently rewired.
SPEAKER_00Oh, absolutely, permanently. There's no going back.
SPEAKER_01Right. So we are unpacking a really comprehensive March 2026 analysis today. This includes some fascinating data from the digital marketing agency RiseOpp. And our mission for this deep dive is to shortcut your learning curve on how to get noticed by the internet's new AI gatekeepers.
SPEAKER_00Because the era of optimizing for a traditional index like we used to do is just over.
SPEAKER_01Totally over. Keyword stuffing has been dead for a while, obviously. But this goes like way beyond that. You can't just sprinkle a phrase 50 times and hope the engine stumbles over it anymore.
SPEAKER_00Right, because we are no longer trying to convince a crawler to rank a link. We're trying to convince a neural network to actually use our data as the foundation for its own generated answers.
SPEAKER_01We're dealing with machines that literally read, comprehend, and summarize. Okay, let's unpack this. Before we strategize on how to optimize our content for you guys listening, we need to look under the hood. How did the algorithm transition from a basic librarian fetching books to an active reader synthesizing information?
SPEAKER_00Well, it was a monumental shift in computational linguistics.
SPEAKER_01Yeah.
SPEAKER_00And it's driven primarily by two specific models, BERT and MUM.
SPEAKER_01BERT and MUM. Sounds like an old sitcom couple.
SPEAKER_00Yeah, right. But BERT, which stands for Bidirectional Encoder Representations from Transformers, that one fundamentally changed the mechanism of how search engines process text.
SPEAKER_01So how did it work before BERT?
SPEAKER_00So before BERT, natural language processing models generally read a search query linearly, like word by word, left to right.
SPEAKER_01Wait, so if it's reading linearly, it's essentially discarding the nuance of how humans actually communicate, right? Exactly. Like if I say the word bank, a linear model doesn't know if I mean a river bank or a financial institution until it gets further down the sentence.
SPEAKER_00And by then it's already prioritized the wrong context entirely. It's too late. But BERT solves this by reading bidirectionally. It analyzes the entire sentence simultaneously.
SPEAKER_01Wait, really? It looks at the whole thing at once.
SPEAKER_00Yeah, it applies mathematical weights to every word based on the words coming before and after it.
SPEAKER_01So suddenly the search engine isn't just matching strings of characters, it's evaluating the complex contextual intent behind a conversational query.
SPEAKER_00Precisely. It's understanding context.
SPEAKER_01It's like the difference between a bad game of Go Fish and a high-end concierge.
SPEAKER_00Oh, that's a great way to put it.
SPEAKER_01Right. With old SEO, you ask the engine, you know, do you have any cards with the phrase best blue running shoes? And the engine checks its hand and says, yes, here are 10 links containing that exact string.
SPEAKER_00And half those links might be absolute spam.
SPEAKER_01Exactly. But with BERT, the concierge recognizes that because you also typed arch support and marathon prep in that same query, your actual intent requires a highly specific subset of biomechanical data, not just a shoe catalog.
SPEAKER_00Right. And what's fascinating here is how MUM, the multitask unified model, takes that concierge capability and just scales it exponentially.
SPEAKER_01Okay, so MUM is the step up from BERT.
SPEAKER_00A massive step up. BERT is brilliant at language context, sure, but MUM is a thousand times more powerful because it operates across multiple languages and media types simultaneously.
SPEAKER_01Wait, so it's not just looking at text?
SPEAKER_00Nope. It doesn't rely on literal translation either. It maps concepts to this sort of language agnostic neural space.
SPEAKER_01Meaning if I ask a highly complex question in English, MUM might pull the foundational data from, say, a Spanish research paper.
SPEAKER_00Yep, and then extract visual context from a Japanese video tutorial.
SPEAKER_01Oh wow. And then it synthesizes those disparate media formats and generates a cohesive English summary for me.
SPEAKER_00That is exactly what it does. It just shatters the barrier of language and format. The AI is no longer restricted to text to text matching at all.
SPEAKER_01That is wild.
SPEAKER_00And because of this, the entire strategy of SEO has shifted to aligning with three distinct categories of user intent. You've got informational, navigational, and transactional.
SPEAKER_01Okay, so learning, finding a place, and buying something.
SPEAKER_00Exactly. The algorithm predicts whether the user is trying to learn, trying to locate a specific digital destination, or preparing to execute a purchase. And then it dynamically alters the format of the AI overview to match that psychological state.
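[Editor's note: to make the three intent categories concrete, here is a deliberately naive sketch. Google's models infer intent from full sentence context, not keyword lookups; the keyword sets below are purely illustrative assumptions.]

```python
# Illustrative only: a crude keyword heuristic for the three intent
# categories (informational, navigational, transactional). Real NLP
# models like BERT weigh the whole query's context instead.

TRANSACTIONAL = {"buy", "price", "order", "discount", "cheap"}
NAVIGATIONAL = {"login", "homepage", "website", "app"}

def classify_intent(query: str) -> str:
    words = set(query.lower().split())
    if words & TRANSACTIONAL:
        return "transactional"
    if words & NAVIGATIONAL:
        return "navigational"
    # Default: the user is trying to learn something.
    return "informational"

print(classify_intent("buy blue running shoes"))   # transactional
print(classify_intent("what is schema markup"))    # informational
```

The point of the taxonomy is the last step the hosts describe: once intent is predicted, the format of the AI Overview itself changes to match it.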
SPEAKER_01So if the AI is this hyper-intelligent, multimodal concierge perfectly predicting intent, we really have to look at the collateral damage to traditional web traffic. I mean, going back to our intro stats.
SPEAKER_00Yeah, the numbers are kind of scary for traditional marketers.
SPEAKER_01A 2025 Semrush dataset we reviewed shows these AI overviews triggering on 16% of all queries. And a recent report from The Verge confirms those overviews are reaching over 1.5 billion users every single month.
SPEAKER_00It is a staggering volume of human queries being intercepted, just answered directly on the search results page.
SPEAKER_01Which brings up a massive operational paradox, right? I'll play devil's advocate here.
SPEAKER_00Go for it.
SPEAKER_01If the AI is doing the reading, synthesizing the video from Japan and the text from Spain, and giving the user the final answer, we get zero clicks. The user never leaves Google. So if I'm a business, why would I invest resources into optimizing my architecture just so an AI can scrape my data and give it away for free?
SPEAKER_00It's a fair question. But if we connect this to the bigger picture, the dynamic isn't just extraction, it's citation.
SPEAKER_01Citation. Okay, explain that.
SPEAKER_00AI overviews do not hallucinate answers out of a vacuum. They construct their summaries by pulling from high-quality, authoritative sources and they embed links to those specific sources within the generated text.
SPEAKER_01Ah, so you're still in there.
SPEAKER_00Right. The modern SEO objective is to be the foundational citation the AI relies on.
SPEAKER_01Yeah.
SPEAKER_00If Google's neural network trusts your data enough to use it as the bedrock of an overview, your link is placed directly in front of those 1.5 billion users as the definitive source.
SPEAKER_01But how does the AI mathematically determine trust though? I mean, it's not just counting backlinks anymore, right?
SPEAKER_00Oh, definitely not. It relies heavily on user engagement signals as a real-time feedback loop.
SPEAKER_01Engagement signals. Like what?
SPEAKER_00Well, the AI basically uses human behavior to validate its own algorithmic hypotheses. It is obsessively tracking metrics like click-through rate, dwell time, and bounce rate. And it's using them not just as ranking factors, but as actual training data.
SPEAKER_01Okay, so if a user clicks my citation in an AI overview and spends seven minutes scrolling through my methodology, my dwell time acts as a validation signal.
SPEAKER_00Exactly.
SPEAKER_01The human is basically telling the machine, yes, your AI summary accurately matched my intent, and this source you provided was deeply valuable.
SPEAKER_00You nailed it. And conversely, a high bounce rate tells the AI it made a huge mistake.
SPEAKER_01Right. If users hit your page and immediately retreat to the search results, the AI learns that your content failed to satisfy the query's intent.
SPEAKER_00Regardless of how many keywords you optimized for.
SPEAKER_01Wow. So the machine will actively demote your content in future AI overviews because you broke the concierge illusion.
SPEAKER_00Exactly. You made it look bad.
SPEAKER_01So if the AI is ruthlessly evaluating engagement to decide who gets cited, it means the bot has to be able to actually parse your page's structure effortlessly. Yes. Because if your site architecture is a labyrinth, the AI won't even bother trying to extract your brilliant insights. It will just move on to a competitor.
SPEAKER_00That's the technical reality of it. The physical formatting of your content dictates whether the AI can even utilize it. Natural language processing models are designed to seek immediate relevance resolution.
SPEAKER_01Immediate relevance resolution. So they don't want a long intro story.
SPEAKER_00No, they want the answer cleanly separated from the exposition. This is why answer first formatting has become the absolute gold standard now.
SPEAKER_01Answer first formatting. So you essentially have to flip the traditional narrative structure. Instead of building up to a conclusion at the bottom of the page, you drop the definitive answer right at the top.
SPEAKER_00Right. The core data point or the exact definition directly beneath the heading.
SPEAKER_01And then you use the rest of the section to elaborate on the why and the how.
SPEAKER_00Precisely. And you also have to structure your headings as explicit questions because NLP models are processing conversational queries.
SPEAKER_01Right. Like when a user asks their phone, how do I optimize for Google AI? The machine looks for a corresponding H2 heading that perfectly mirrors that interrogative syntax.
SPEAKER_00Exactly. And once it finds that matching question, it expects short paragraphs, bulleted lists, and structured comparison tables immediately following it.
SPEAKER_01Ah, so it wants the data chewed up and ready to swallow.
SPEAKER_00Basically, yeah. Arrays and tables are incredibly high value targets for machine extraction because the data is already mapped logically.
SPEAKER_01Well, we actually have a statistic from the RiseOpp data that perfectly illustrates the ROI on this kind of technical formatting. Pages that utilize specific FAQ schema markup are cited three times more often in AI overviews than pages without it.
SPEAKER_00Three times more often. That's a huge advantage.
SPEAKER_01A threefold advantage simply for altering the hidden code behind the text. And here's where it gets really interesting. Because schema isn't just highlighting text, it's essentially slapping a universal machine-readable barcode on your concept.
SPEAKER_00A barcode? That's a great analogy. A human reader just sees a standard paragraph explaining, you know, how to change a tire. But the bot scanning the JSON-LD schema doesn't have to spend computational power parsing the English language syntax.
SPEAKER_01Right. It just scans the how-to barcode. And that barcode tells the bot this is a procedural tutorial. Here is the array of tools required. Here is step one, step two, and step three.
SPEAKER_00You are entirely removing the friction of interpretation.
SPEAKER_01Wow.
SPEAKER_00And the same applies to FAQ page schema for question and answer formats, or standard article schema for establishing the broader semantic context of a piece.
SPEAKER_01You are speaking directly to the AI in its native structured data format.
SPEAKER_00Exactly. By mapping out the relational structure of your information, you make it infinitely easier for MUM to extract your data and incorporate it into a multimodal summary.
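[Editor's note: the FAQ schema discussed above is the schema.org FAQPage vocabulary, embedded as JSON-LD. Here is a minimal sketch that generates it; the question text and helper name are illustrative, not from the episode.]

```python
import json

def faq_schema(pairs: list[tuple[str, str]]) -> str:
    """Build FAQPage JSON-LD from (question, answer) pairs using the
    schema.org vocabulary: FAQPage -> mainEntity -> Question -> Answer."""
    data = {
        "@context": "https://schema.org",
        "@type": "FAQPage",
        "mainEntity": [
            {
                "@type": "Question",
                "name": question,
                "acceptedAnswer": {"@type": "Answer", "text": answer},
            }
            for question, answer in pairs
        ],
    }
    return json.dumps(data, indent=2)

markup = faq_schema([
    ("What is an AI Overview?",
     "An AI-generated summary shown at the top of Google search results."),
])
# Embed on the page inside <script type="application/ld+json"> ... </script>
print(markup)
```

This is the "barcode": the bot reads the Question/Answer pairs directly from the structured block instead of parsing your prose.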
SPEAKER_01Okay, so let's recap where we are. We've mapped user intent, we've implemented answer first formatting, and we've built the perfect JSON-LD barcodes across our entire domain.
SPEAKER_00Sounds perfect.
SPEAKER_01But data on the internet rots quickly. The AI's demand for accuracy means a beautifully structured page from 2023 is virtually useless to an overview in 2026.
SPEAKER_00Yeah, stale data degrades the AI's trust score. This introduces the absolute necessity of continual maintenance. And that begins with content freshness.
SPEAKER_01Freshness is key.
SPEAKER_00Oh, massive. The algorithm places a huge premium on up-to-date information, particularly for volatile topics like finance, health, or technology.
SPEAKER_01And the recommendation from the source is a strict quarterly review cycle, right? You can't just go in and change the published date to today.
SPEAKER_00No, no, no. The AI is smarter than that. It checks if the underlying facts, the external outbound links, and the statistics have actually been refreshed.
SPEAKER_01Right. So if you are citing a study from four years ago, the AI assumes your entire premise is potentially compromised.
SPEAKER_00Exactly. And that maintenance extends deep into technical SEO, too. You could have the freshest, most perfectly schema-marked content in the world. But if the AI sends a human user to your site and the page takes six seconds to load, it destroys that user experience. Completely.
SPEAKER_01Which goes back to the engagement signals acting as training data, a slow load time triggers a massive bounce rate.
SPEAKER_00Precisely. Passing Google's core web vitals, which are metrics measuring visual stability, interactivity, and load speed, that is non-negotiable now.
SPEAKER_01And what about mobile?
SPEAKER_00The AI operates on a mobile first indexing paradigm. Yeah. Always. If your mobile architecture is clumsy or elements shift around during loading, or your robots.txt file is inadvertently throttling crawl budgets, the AI simply disqualifies you from the citation pool.
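[Editor's note: a quick way to verify your robots.txt isn't locking crawlers out is Python's standard-library robots.txt parser. The rules below are a hypothetical example; a real audit would fetch the live file from your domain via `set_url()` and `read()`.]

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content that accidentally blocks a section.
rules = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Check whether Googlebot may fetch specific paths.
print(rp.can_fetch("Googlebot", "/private/report.html"))  # False
print(rp.can_fetch("Googlebot", "/blog/ai-overviews"))    # True
```

If a page you want cited comes back `False` here, no amount of schema markup will get it into the citation pool.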
SPEAKER_01It just kicks you out. And the mobile first requirement ties directly into the explosion of voice search and local SEO, right?
SPEAKER_00Oh, absolutely.
SPEAKER_01Because models like MUM process conversational syntax so efficiently, the way humans physically search has changed. Typing requires brevity: we type best Chicago pizza. But speaking allows for long-tail interrogative queries: we ask our smartwatches, where's the best place to get deep dish pizza open near me right now?
SPEAKER_00And the NLP models parse that spoken sentence, identify the geographic urgency, and immediately prioritize highly accurate location data.
SPEAKER_01So if your business hasn't maintained localized schema markup or, you know, perfectly synchronized your Google My Business coordinates, what happens?
SPEAKER_00The AI won't even consider you a viable answer for that user-specific physical context.
SPEAKER_01Yeah.
SPEAKER_00You're invisible.
SPEAKER_01Wow. Navigating this ecosystem requires an entirely different operational cadence. I mean, agencies like RiseOpp are utilizing a methodology they call Heavy SEO to track these highly specific AI interactions. They aren't just logging into Google Analytics to look at raw session volume anymore.
SPEAKER_00Because raw sessions just don't tell the story of AI visibility. Advanced practitioners are diving into Search Console's specific performance reporting for AI overviews.
SPEAKER_01Looking for what, exactly?
SPEAKER_00They isolate AI overview impressions, which is the exact number of times their data was pulled into a generated summary, and they track the AI-driven CTR.
SPEAKER_01Meaning they are analyzing how often users actually click their citation within that summary versus a traditional blue link?
SPEAKER_00Exactly. They cross-reference those AI clicks with dwell time and scroll depth, and then relentlessly A-B test different H2 syntaxes and schema configurations.
SPEAKER_01Just to see what specific structures trigger the AI to pull their data. It's an incredibly granular scientific approach to visibility.
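[Editor's note: the metric the hosts keep returning to, clicks on your citation divided by AI Overview impressions, reduces to a simple ratio. The numbers below are made up for illustration; real figures would come from Search Console's performance reports.]

```python
def citation_ctr(clicks: int, impressions: int) -> float:
    """AI-driven CTR: clicks on your citation divided by the number
    of times your data was pulled into a generated summary."""
    return clicks / impressions if impressions else 0.0

# Hypothetical A/B test of two H2 phrasings with equal exposure:
variant_a = citation_ctr(clicks=90, impressions=3000)   # 0.03
variant_b = citation_ctr(clicks=180, impressions=3000)  # 0.06
print(f"A: {variant_a:.1%}  B: {variant_b:.1%}")
```

Cross-referencing that ratio with dwell time and scroll depth is what turns the A/B test into the engagement feedback loop described earlier.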
SPEAKER_00It really is. And this raises an important question. If the algorithm is constantly evolving, learning from real-time engagement data and requiring this level of structural perfection, how do human teams maintain agility without burning out?
SPEAKER_01You use the machine to talk to the machine.
SPEAKER_00Exactly. You can't outrun a neural network manually. The strategic consensus right now is to heavily integrate AI tools into your own workflow to automate the architectural heavy lifting.
SPEAKER_01Right. So you deploy AI to analyze semantic intent clusters, generate the complex JSON-LD schema, and audit your on-page structure to ensure your H2 headings perfectly match predicted NLP queries.
SPEAKER_00Yeah, let the AI handle the bot stuff. You reserve human bandwidth for the actual subject matter expertise, the unique insights, the proprietary data, the perspective that the AI cannot generate on its own.
SPEAKER_01So what does this all mean if we synthesize this entirely new landscape for you listening? First, SEO is a dynamic, continuous feedback loop with a machine that evaluates human engagement to validate its choices. Absolutely. Second, you have to adopt the concierge mindset. Anticipate intent, don't just match strings of text. Third, your physical formatting dictates your crawlability. Use answer-first paragraphs, question-based headings, and deploy schema markup as a universal barcode.
SPEAKER_00And finally, protect your trust signals by maintaining relentless technical hygiene and content freshness.
SPEAKER_01Relentless hygiene.
SPEAKER_00The objective has fundamentally shifted from manipulating an index to partnering with a generative engine. You provide the immaculate structure and the authoritative human insight, and the AI provides the distribution.
SPEAKER_01Which leaves us with a rather profound paradox to consider as we close out this deep dive today. We are adopting AI tools to analyze intent, generate schema, and physically format our insights so they are perfectly legible to a bot. Right. Then Google's AI reads that highly optimized architecture, extracts the data, and synthesizes it into a new summary for the end user. If we are utilizing AI to write and structure the data and another AI is reading and summarizing that same data on the other end, are we rapidly approaching an internet where machines are simply communicating with other machines?
SPEAKER_00It definitely feels that way sometimes.
SPEAKER_01And in a fully automated information loop, where does the spark of human discovery actually happen? Definitely something for you to think about. Thanks for joining us on this deep dive.