Search as a Channel

Search Volume Is Dying

MarketerFirst LLC Season 1 Episode 14


Between 65% and 85% of what buyers now ask AI has no matching keyword in traditional databases. If your strategy still starts with keyword reports, you are using a shrinking map to describe a growing territory, and that is a dangerous place to be in a board meeting.

In this episode, we break down why search volume is becoming a lagging indicator, how to spot the demand your tools cannot see, and what the smartest CMOs and agency owners are doing instead, from mining sales call language to tracking AI citations and the "ghost citation" problem.

If your traffic is flat but your pipeline questions are getting sharper, this is the conversation that explains why, and what to do next.

SPEAKER_01

If an AI uses your company's content today to, you know, answer a customer's question, about 62% of the time, it won't even mention your name.

SPEAKER_02

Right. Which is just wild.

SPEAKER_01

It really is. I mean, you do all the heavy lifting, you supply the proprietary data, and the AI just serves it up perfectly to a potential buyer, but you are, well, entirely invisible to them.

SPEAKER_02

Exactly.

SPEAKER_01

So welcome to this deep dive. If you're listening to this right now, whether you're a CMO trying to defend next quarter's organic marketing budget, or maybe an agency owner preparing for a pretty brutal client review.

SPEAKER_02

Oh, we've all been there.

SPEAKER_01

Right. Or even just a business decision maker trying to figure out why your inbound pipeline looks so um wildly unpredictable lately. This conversation is tailored specifically for you.

SPEAKER_02

Yeah, we've got a lot to get through today.

SPEAKER_01

We really do. We are unpacking a massive stack of recent research. We have new data from Kevin Indig's Growth Memo, and that's backed by this sprawling global SEMrush analysis.

SPEAKER_02

And the mission here is really to prove why the traditional models of search and SEO are, well, structurally breaking down. And more importantly, what your marketing team actually needs to do about it on Monday morning.

SPEAKER_01

Because the stakes here are incredibly high, aren't they?

SPEAKER_02

Oh, absolutely. Yeah. I mean, bringing a traditional keyword ranking report into your next executive board meeting without the context we're about to discuss is just, it's a massive liability.

SPEAKER_01

It's like bringing a map from 10 years ago.

SPEAKER_02

Yeah, exactly. You'd be describing a territory that has completely reformed itself. It's a fundamental market visibility issue that goes way, way beyond a simple software glitch.

SPEAKER_01

Well, I want to challenge that right out the gate, though, because I mean, if I'm a marketing director, right, and I look at my dashboard and see that we rank number one for a keyword that gets, say, 50,000 searches a month. Right. I'm still capturing that traffic. So how can we call search volume a useless metric if it's still, you know, tracking physical clicks to my website?

SPEAKER_02

Well, it's not that those specific clicks have vanished. It's that search volume has become a lagging indicator for a really rapidly shrinking slice of total buyer intent. Okay. The SEMrush data we're looking at drops a fascinating stat to illustrate this. Between 65% and 85% of the prompts that people type into ChatGPT have absolutely no matching keyword in traditional SEO databases.

SPEAKER_01

Wait, up to 85%. So that demand is just off the radar entirely.

SPEAKER_02

Entirely.

SPEAKER_01

It's not showing up in tools like Ahrefs or traditional search planners at all.

SPEAKER_02

Not at all. And the mechanism behind why that happens is what actually matters here. You see, search volume was always just a crude proxy for human intent.

SPEAKER_01

Because we had to type like robots.

SPEAKER_02

Yes, exactly. It forced users to compress their complex problems into these short, stilted phrases just so a database could understand them. So a traditional query was something like best CRM for small law firms.

SPEAKER_01

Which fits neatly into a row on a spreadsheet.

SPEAKER_02

Right. It has measurable historical volume.

SPEAKER_01

Right.

SPEAKER_02

But buyer behavior has fundamentally shifted from searching to conversing.

SPEAKER_01

Okay, so a user who previously would have typed that short phrase into Google is now opening an AI interface and typing what exactly? Give me an example.

SPEAKER_02

Well, they're outlining their specific business constraints. So inside an AI interface, that same buyer's query sounds more like, uh, we are a small law firm growing fast, our client intake process is disorganized, we have a strict budget of $2,000 a month, and my team hates switching tools. What should we implement first?

SPEAKER_01

Oh wow, yeah, that's a whole paragraph.

SPEAKER_02

Right. And that long, messy, highly specific prompt does not exist in any keyword database because it's entirely unique. But from a business perspective, that conversational prompt is vastly more valuable.

SPEAKER_01

Because it's a buyer ready to actually buy something.

SPEAKER_02

Exactly. They are ready to make a highly specific purchasing decision.

SPEAKER_01

I see. It's the difference between measuring the ripples on the surface of the water versus measuring the deep undercurrents.

SPEAKER_02

That's a great way to put it.

SPEAKER_01

The keyword tools only measure the ripples. It really sounds like looking for your lost keys under a streetlight just because the light is good there.

SPEAKER_02

Yeah, exactly.

SPEAKER_01

So if your agency strategy still starts and ends with a keyword export, you are optimizing for the wrong language. You're optimizing for an outdated syntax while your actual buyers are having these rich, context-heavy discovery conversations with AI models.

SPEAKER_02

And that leads to a very dangerous assumption in the marketing world right now.

SPEAKER_01

Let me guess. If all of this high-intent demand is shifting to AI platforms, the natural assumption for a CMO is that Google is dying.

SPEAKER_00

Yep.

SPEAKER_01

And so we need to divert our entire organic budget away from traditional search and pour it into AI optimization.

SPEAKER_02

That is the assumption, yeah. But the data tells a completely different story. It completely shatters that narrative that AI is a one-to-one replacement for traditional search engines. Interesting. Instead, what we're actually seeing is the evolution of a connected discovery loop.

SPEAKER_01

Okay, let's pull the numbers from the Growth Memo to ground this because I think this is crucial. The research shows that ChatGPT's outbound referral traffic, which is the traffic it sends out to external websites, grew 206% year over year.

SPEAKER_02

Which is massive.

SPEAKER_01

Massive. But the critical data point is where that traffic is going. The share of ChatGPT's referral traffic going directly back to Google actually climbed from 14% to over 21%.

SPEAKER_02

Wait, so users aren't abandoning search for AI. They're treating AI like the rough draft of their research and then using Google as the fact checker.

SPEAKER_01

Exactly. They are just repositioning search within their workflow. Think about the underlying mechanics of how these large language models or LLMs actually function.

SPEAKER_02

They're brilliant at synthesizing complex ideas and giving you a baseline understanding.

SPEAKER_01

But they lack real-time authority. They hallucinate, you know? They rarely have up-to-the-minute pricing or nuanced product reviews.

SPEAKER_02

It's like using Wikipedia in college.

SPEAKER_01

Yes.

SPEAKER_02

You read the Wikipedia page to grasp the broad concepts of a topic, but you absolutely cannot cite it in your final paper. You have to pivot, find the primary source listed at the bottom of the page, and reference that to pass the class.

SPEAKER_01

That captures the modern buyer journey perfectly. A business decision maker starts their exploration inside an AI. They just dump all their messy context and constraints into the model to get a short list of software options.

SPEAKER_02

Right. But before they pull out a corporate credit card, they experience hallucination anxiety.

SPEAKER_01

Oh, that's a good term. Hallucination anxiety.

SPEAKER_02

Yeah, they need to trust the recommendation.

SPEAKER_01

Okay.

SPEAKER_02

So they take that short list and pivot back to traditional search to cross-reference the AI's claims, read human reviews, and validate the pricing.

SPEAKER_01

So as an agency owner, the mandate is incredibly complex. You can't just pick one battlefield.

SPEAKER_02

No, you really can't.

SPEAKER_01

You have to win the initial recommendation inside the AI, and you simultaneously have to maintain absolute visibility in traditional search to catch the user when they bounce back for validation.

SPEAKER_02

Yes. And proving to a client that you successfully initiated that journey inside the AI requires us to look at how visibility is actually measured. Which brings us to what the research calls the ghost citation problem.

SPEAKER_01

Oh, this part blew my mind. Let's lay out the methodology from the SEMrush analysis so we understand the scale of this problem.

SPEAKER_02

Yeah, it wasn't a small sample size.

SPEAKER_01

Not at all. They analyzed 3,981 domains. They ran 115 different prompts across 14 countries, testing them against four major AI search engines.

SPEAKER_02

And they were looking at two distinct outcomes.

SPEAKER_01

First was the domain cited, meaning the AI provided a clickable source hyperlink in a footnote. Second was the domain mentioned, meaning the actual brand name was written out in the AI's conversational text response.

SPEAKER_02

And the gap between those two outcomes is frankly the single biggest crisis of value facing the digital marketing industry today.

SPEAKER_01

The numbers are brutal. Almost 75% of domains got cited. They successfully got the footnote link.

SPEAKER_00

Okay.

SPEAKER_01

But only 38.3% actually got mentioned in the text, and a mere 13.2% got both.

SPEAKER_00

Wow.

SPEAKER_01

This leaves a massive statistical hole. 61.7% of all citations are ghost citations.
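The arithmetic behind that 61.7% figure is worth making explicit. A quick sketch, where the 74.9% cited share is an assumption consistent with the "almost 75%" quoted above:

```python
# Percentages quoted in the episode (SEMrush domain analysis).
cited_pct = 74.9      # assumed precise value behind "almost 75% got cited"
mentioned_pct = 38.3  # brand named in the response text
both_pct = 13.2       # cited and mentioned

# "Ghost citation" share: the complement of the mention rate.
ghost_pct = 100.0 - mentioned_pct
print(f"{ghost_pct:.1f}% of domains are never named in the text")  # 61.7%

# Sanity check: domains cited in a footnote but never named works out
# to the same figure under the assumed cited share.
cited_not_mentioned = cited_pct - both_pct
print(f"{cited_not_mentioned:.1f}% cited but not mentioned")  # 61.7%
```

Either way you slice it, roughly three of every five domains that feed the AI get nothing back in brand terms.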

SPEAKER_02

To put that in practical terms, the AI reads your company's deeply researched article. It extracts your proprietary expertise, regurgitates it directly to the user.

SPEAKER_01

Slaps a tiny little numerical footnote at the bottom.

SPEAKER_02

Exactly. Slaps a tiny footnote at the bottom of the paragraph and gives your brand absolutely zero name recognition in the text itself.

SPEAKER_01

It's literally like being a ghostwriter for a celebrity. You pour your soul into the manuscript, you do all the heavy lifting, the book becomes a massive international bestseller, but your name is nowhere on the dust jacket.

SPEAKER_02

Right. Nobody knows you wrote it. And for two decades, the entire SEO industry has operated on the belief that a hyperlink is the ultimate currency of the internet.

SPEAKER_01

Because a link meant traffic.

SPEAKER_02

A link meant traffic and a link meant authority. But the user interface of an AI platform fundamentally alters that math. In an AI-driven world, retrieval is not the same as recognition.

SPEAKER_01

Because nobody clicks the little footnote links. I mean, the AI has already answered the question. The user has no incentive to click a tiny gray number to visit your website.

SPEAKER_02

Precisely. And if your agency's reporting dashboard shows a client that they are getting cited by AI and you package that as a victory for brand awareness, you are failing the client.

SPEAKER_01

You're feeding the machine their intellectual property and securing absolutely no brand equity in return.

SPEAKER_02

Exactly. And to make it worse, we throw around the acronym LLM, large language model, as if it's one monolithic brain. We say the AI does this or the AI does that.

SPEAKER_01

Yeah, but my understanding of this research is that these engines don't even agree with each other on how to handle your brand.

SPEAKER_02

Oh, they exhibit entirely different behavioral profiles. The SEMrush data isolated four distinct engines: Gemini, ChatGPT, Google AI Overviews, and Google AI Mode. And the variance in how they process information is structural. It really comes down to how they are engineered.

SPEAKER_01

Let's look at the two extremes here. You have Gemini on one side and ChatGPT on the other.

SPEAKER_02

Right.

SPEAKER_01

So Gemini names brands in 83.7% of its responses, but it only generates a citation link 21.4% of the time.

SPEAKER_02

It drops brand names constantly, but almost never hands out a source link.

SPEAKER_01

Yeah. And ChatGPT is the exact inverse. It cites its sources obsessively 87% of the time, but it only mentions the brand name in the actual text in 20.7% of its answers.

SPEAKER_02

The reason for this split is deeply technical, but it's crucial for marketers to understand. Gemini is heavily tuned to act like a conversationalist. Okay. It leans on its parametric memory, which is the vast web of associations that formed during its initial training, to construct fluid human-sounding sentences. And in a natural conversation, humans use brand names. We don't speak in footnotes.

SPEAKER_01

Oh, that makes sense. So ChatGPT is different.

SPEAKER_02

Yeah, ChatGPT relies much more heavily on a mechanism called RAG, or retrieval-augmented generation. It operates like a strict academic researcher. Its primary directive is to pull real-time data from external sources and prove where it got the data.

SPEAKER_01

Which results in heavy footnoting, but a very clinical, brand-agnostic tone in the text itself.

SPEAKER_02

Exactly.

SPEAKER_01

That perfectly explains why the data showed that 22% of the time, the LLMs fundamentally disagreed on whether to mention a brand at all for the exact same prompt.

SPEAKER_02

Aaron Powell Yeah. The Facebook example in the research was wild.

SPEAKER_01

It was. Gemini was given a prompt and named Facebook in the text three out of three times. Google AI was given the exact same prompt, cited Facebook nine times with footnote links, but only actually wrote the word Facebook once.

SPEAKER_02

Think about the nightmare this creates for an agency trying to build a unified reporting dashboard.

SPEAKER_01

You literally cannot create a single AI visibility report for a client anymore.

SPEAKER_02

No, because an aggregate metric across these platforms is just a mathematical lie now. If you average Gemini's high mention rate with ChatGPT's low mention rate, you present a number to your client that doesn't represent reality on either platform.
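To make that averaging problem concrete, here is a quick sketch using the two extreme mention rates quoted in this episode:

```python
# Per-engine brand mention rates quoted above (% of answers naming the brand).
mention_rates = {"Gemini": 83.7, "ChatGPT": 20.7}

# A blended "AI visibility" score averages the two extremes away.
avg = sum(mention_rates.values()) / len(mention_rates)
print(f"Blended mention rate: {avg:.1f}%")  # 52.2% — true on neither engine
```

The blended 52.2% is a number no single platform ever produces, which is exactly why per-engine reporting matters.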

SPEAKER_01

It's incredibly frustrating. You can't just run an AI visibility campaign. It has to be LLM specific.

SPEAKER_02

It does.

SPEAKER_01

But beyond the engineering of the platforms, there's also a clear bias in who the AI chooses to name and who it ghosts. The research highlighted a massive divide between strong consumer brands and digital publishers.

SPEAKER_02

Yes, because the AI evaluates the conceptual weight of the brand itself. Strong consumer brands. Yeah. You know, companies with a massive multi-channel public identity get mentioned in the output near 100% of the time.

SPEAKER_01

The AI just recognizes them as core entities.

SPEAKER_02

Exactly. It doesn't feel the need to link to them to prove they exist. It just discusses them as facts of life.

SPEAKER_01

But the content aggregators and the media publishers get completely slaughtered here. I mean, Medium.com was cited 16 times across three engines for the same set of prompts and named exactly zero times.

SPEAKER_02

Zero.

SPEAKER_01

Wikipedia was cited 27 times and only named twice. Wired, ScienceDirect, Harvard, they all fit the exact same pattern. Massive citation rates, zero brand recognition.

SPEAKER_02

The mechanics of this are tied to a concept we can call the public utility threshold. The AI treats these publishers as anonymous data containers. Ouch. It's true though. It doesn't feel the need to attribute the information to Wikipedia because the AI considers the information itself to be common unowned knowledge. It extracts the raw facts and discards the vessel.

SPEAKER_01

Which is terrifying. If you're a business whose entire economic model relies on establishing information authority to drive display ad impressions, this is a structural crisis.

SPEAKER_02

It really is.

SPEAKER_01

The research even showed this variance extending to geography, controlling for the specific AI model. India and Sweden showed massive brand mention rates hovering around 50%.

SPEAKER_02

Yeah, that was fascinating.

SPEAKER_01

But Italy, Brazil, and the Netherlands had incredibly low mention rates down around 20%, while their citation rates were up in the 80s and 90s.

SPEAKER_02

And while the data doesn't explicitly state why, the underlying mechanism is almost certainly tied to cultural prompt structures. Users in Sweden might be asking more brand forward conversational questions, prompting the AI to respond in kind.

SPEAKER_00

Ah.

SPEAKER_02

And then users in the Netherlands might structure their queries in a more academic, comparative format, triggering the LLM's strict retrieval mechanisms, resulting in heavier footnoting.

SPEAKER_01

So we have diagnosed a landscape that is fundamentally chaotic. Search volume is blind to the most valuable conversational queries. The buyer journey is a complex loop between AI exploration and Google validation. Yep. Winning the AI battle usually results in a ghost citation, and the platforms evaluate your brand differently based on their architecture, your industry, and even the language of the prompt.

SPEAKER_02

It's a lot to take in.

SPEAKER_01

It is. So if I'm a CMO or an agency leader looking at my team right now, what is the actual playbook? How do we adapt our strategy on Monday morning?

SPEAKER_02

The overarching shift is moving your operation from commodity SEO to business intelligence.

SPEAKER_00

Okay, unpack that.

SPEAKER_02

Well, commodity SEO is churning out generic content to capture high-volume keywords. Business intelligence is structuring content to intercept real buying decisions. There are really four immediate strategic shifts required to do this. First, as we established, your targeting must be LLM specific. Right.

SPEAKER_01

You have to map which topics you want to appear in and test specifically which phrasing patterns produce mentions on ChatGPT versus Gemini.

SPEAKER_02

Exactly. The second shift is completely rethinking content type. The research draws a hard line here. Informational content feeds the machine anonymously. It generates ghost citations.

SPEAKER_00

So if you want to force the AI to name your brand, you have to create comparative and evaluative content.

SPEAKER_02

Let me explain why that happens. If a software company writes a blog post called What Is a CRM, they're publishing commodity information. The LLM will ingest it, use it to train its baseline understanding, and never mention the company.

SPEAKER_00

Because what is a CRM is just common knowledge.

SPEAKER_02

Exactly. However, if that same company publishes a deep subjective evaluation titled The Critical Trade-Offs Between Enterprise and SMB CRMs when scaling a sales team past 50 reps, they've created something entirely different.

SPEAKER_01

They've created an opinionated framework.

SPEAKER_02

Yes. LLMs summarize facts anonymously, but they generally attribute opinions and evaluations to the authority making them. You must shift your resources toward the bottom of the funnel, the stage where buyers are actively comparing solutions and seeking expert guidance.

SPEAKER_01

Okay, that makes total sense. So the third shift involves expanding our input sources. I mean, if we know that the traditional keyword databases are missing up to 85% of these complex conversational prompts, we have to stop scraping Ahrefs and SEMrush for our content ideas.

SPEAKER_02

You have to mine the channels where your customers are actually speaking in paragraphs rather than short phrases. You need your marketing team analyzing sales call transcripts.

SPEAKER_01

Oh, that's smart.

SPEAKER_02

You need them reviewing customer success calls. You must dig into Zendesk support tickets and industry community forums. That is where the messy, constraint-laden questions live. Those transcripts mirror the exact syntax your buyers are typing into ChatGPT.

SPEAKER_01

So we need to stop optimizing for the database and start optimizing for the consultant. The winning piece of content isn't the one that repeats the keyword the most cleanly anymore. Not at all. It's the one that deeply, you know, empathetically answers the panicked, nuanced question the buyer is actually asking on a Tuesday afternoon.

SPEAKER_02

Exactly. And the fourth shift is fundamentally changing how you measure success. Traffic volume alone is going to dramatically understate your actual business impact moving forward. Marketing leaders need to build a new intelligence stack.

SPEAKER_01

Meaning you must track your AI referral visits separately from traditional organic traffic.

SPEAKER_02

Right. You must utilize emerging tools to monitor brand mentions in AI outputs rather than just tracking footnote citations.

SPEAKER_01

And most importantly, you need to measure the conversion quality of AI influence sessions, right? Because a user arriving after an AI recommendation is entering your site with a completely different level of intent than someone just clicking a standard Google link.
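As a rough illustration of that split-out tracking, a session classifier might bucket traffic by referrer before comparing conversion rates. The domain lists here are hypothetical examples, not a complete registry of AI or search referrers:

```python
# Illustrative referrer buckets (assumed domain lists, not exhaustive).
AI_REFERRERS = {"chat.openai.com", "chatgpt.com", "gemini.google.com", "perplexity.ai"}
SEARCH_REFERRERS = {"www.google.com", "www.bing.com", "duckduckgo.com"}

def classify_session(referrer_host: str) -> str:
    """Bucket a session by its referrer hostname."""
    if referrer_host in AI_REFERRERS:
        return "ai_referral"
    if referrer_host in SEARCH_REFERRERS:
        return "organic_search"
    return "other"

# Conversion quality can then be compared per bucket rather than blended.
print(classify_session("chatgpt.com"))      # ai_referral
print(classify_session("www.google.com"))   # organic_search
```

Keeping the buckets separate is what lets you show a client that AI-influenced sessions convert differently, rather than burying them inside a single organic line.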

SPEAKER_02

They are primed to buy. Let's pull all of this together. The dashboard instruments we've relied on for years are no longer reflecting reality. Search volume is just a tiny visible corner of a massive territory of buyer demand.

SPEAKER_01

Right. AI is not killing Google. It's creating a fragmented discovery loop where exploration happens in the chat interface and validation happens in the search engine.

SPEAKER_02

And avoiding the dreaded ghost citation means abandoning generic informational content and building strong, opinionated, comparative frameworks that force the LLM to recognize your brand's authority.

SPEAKER_01

All of this points to a really clear mandate for the modern enterprise, but it also raises a structural question about the future of the internet that we haven't touched on yet.

SPEAKER_02

Yeah, it's something for every executive to deeply consider as they plan for the next five years. Well, the data clearly shows that pure informational content only feeds the AI anonymously, offering zero brand equity. Right. If every smart publisher, agency, and enterprise brand realizes this, they will inevitably shift their entire production pipeline toward comparative bottom-of-the-funnel content to secure those brand mentions. But if that happens, what happens to the AI engines themselves?

SPEAKER_01

Wait, they lose their food source.

SPEAKER_02

Exactly. Large language models rely entirely on continuously scraping the open internet for fresh knowledge. If the primary producers of pure baseline information stop publishing it because there's no longer any measurable return on investment to do so, these AI models will eventually starve for new facts. If the Internet stops producing unopinionated data because it's bad for business, we might be witnessing the beginning of an entirely new economic model for how human knowledge is funded and distributed online.

SPEAKER_01

That concept changes the entire perspective. So the next time you're staring at a flat keyword trend line on your marketing dashboard, remember that you aren't just looking at broken instruments. The fundamental economics of how the internet creates and rewards information are shifting beneath your feet.

SPEAKER_00

Absolutely.

SPEAKER_01

Take these insights into your next strategic planning cycle. Audit your demand intelligence stack, get out of the keyword databases, and ensure your agency or your marketing team is optimizing for how your buyers actually make decisions today, not just what the legacy tools are capable of measuring. Thanks for taking the deep dive with us.