Search as a Channel

Portability Is the Metric Search Leaders Need Now

MarketerFirst LLC Season 1 Episode 16


Most teams measure AI visibility as one blended score. New data across 3.7 million citations from ChatGPT, Perplexity, and Google AI Overviews shows why that's a problem: only 2.37% of cited URLs appeared across all three engines, while more than 91% showed up in just one.

In this episode, we unpack why presence is no longer enough and why portability, the ability of your content to travel across AI retrieval systems, is the metric search leaders should actually care about. We cover why guides and tutorials outperform homepages in cross-engine citations, why grounding answers is a different job than ranking pages, and a three-layer measurement model covering presence, portability, and concentration that you can bring into your next leadership meeting.

If your AI visibility looks healthy but rests on one engine's habits, you may be renting visibility rather than building it.

SPEAKER_01

Imagine um waking up, pouring your morning coffee, checking your website analytics, and realizing that like half of your traffic just vanished overnight.

SPEAKER_00

Right. Which is terrifying.

SPEAKER_01

Yeah, exactly. And for millions of publishers and business owners in 2026, that isn't some weird glitch. It's well, it's the harsh everyday reality of the new Internet. We're looking at this invisible earthquake that has completely fractured how information is discovered.

SPEAKER_00

It really has. The rules we've all relied on for the past two decades, you know, the basic mechanics of how someone searches for a question and lands on your website, they've fundamentally broken down.

SPEAKER_01

Completely broken down. Yeah.

SPEAKER_00

And if you don't understand the new architecture taking its place, you're basically optimizing your business for a game that simply does not exist anymore.

SPEAKER_01

Which is exactly why today we are taking a deep dive into the underlying engineering and really the hidden data of this massive shift.

SPEAKER_00

Oh, the data is wild.

SPEAKER_01

It is. We're pulling from uh three really distinct sources today. We have a deeply technical engineering blog post straight from the Microsoft Bing team, a pretty hard-hitting piece from Search Engine Journal about Google's struggle for publisher data, and a highly revealing internal strategy memo on AI search metrics.

SPEAKER_00

It's a great mix of perspectives.

SPEAKER_01

It really gives us the full picture. Our mission today is to decode what AI search actually means under the hood, you know, why chasing clicks is just a dead end now, and how you can figure out if your content or your brand will actually survive this transition.

SPEAKER_00

Because to really grasp why the internet feels so chaotic to use right now, we have to look past the new user interfaces.

SPEAKER_01

Right, the shiny new chat boxes.

SPEAKER_00

Exactly. Chat boxes are just the surface. Underneath, the foundational infrastructure of search has entirely split in two. The engineers at Bing laid this out in detail in their post. We are now dealing with two completely different optimization goals living inside the very same system.

SPEAKER_01

So let's unpack that split because the Bing team makes this critical distinction between traditional search and what they call grounding for AI.

SPEAKER_00

Yes, that's the core of it.

SPEAKER_01

Traditional search asks one very specific question, right? It asks, which pages should a user visit?

SPEAKER_00

And that one question dictates everything about how the old system was built. The goal of traditional search is breadth and recall. It evaluates the internet by looking at the document as the core unit of value. I mean the whole web page.

SPEAKER_01

Just like indexing millions of pages.

SPEAKER_00

Exactly. And it also relies heavily on this massive assumption, which is that a human being is sitting at the screen to act as a safety net.

SPEAKER_01

A human in the loop to basically do the final quality check.

SPEAKER_00

Precisely. If a traditional search engine gives you 10 blue links, and the first one is say slightly outdated or maybe irrelevant, it's not a catastrophic failure.

SPEAKER_01

You just click the back button.

SPEAKER_00

Right. You, the human, click it, realize it's unhelpful, hit back, and try the next link. The system tolerates imperfect rankings because human users self-correct. Stale content might slowly degrade in ranking, but it doesn't break the entire user experience.

SPEAKER_01

But the moment we introduce generative AI into the mix, that foundational question changes completely. It goes from which pages should the user visit to what information can an AI responsibly use to construct an answer.

SPEAKER_00

And that shifts the core unit of value away from the whole page.

SPEAKER_01

Entirely. It shifts to what the engineers call groundable information. We are no longer indexing entire documents just to rank them. We're extracting discrete facts with clear provenance. The system isn't trying to point you to a destination anymore. It's slicing up a web page, verifying a specific claim, and feeding that data chunk into a language model to build a response from scratch.

SPEAKER_00

It's a completely different paradigm.

SPEAKER_01

It makes me think of the difference between going to a library versus hiring a professional researcher.

SPEAKER_00

Oh, I like that.

SPEAKER_01

Yeah. So traditional search is the librarian. You ask a question, they hand you a stack of five books, and essentially say, good luck, the answer is in there somewhere.

SPEAKER_00

You still have to do all the heavy lifting.

SPEAKER_01

Exactly. You do the reading. But grounding for AI is like handing those five books to a research assistant. They read the material, synthesize a one-page brief, and hand it to you with citations.

SPEAKER_00

Which sounds great, right?

SPEAKER_01

It sounds amazing. But here is the massive vulnerability in that setup. If your assistant misreads a crucial statistic on page 10, your final report isn't just a little off. The entire premise is ruined.

SPEAKER_00

That is the perfect way to understand the stakes here.

SPEAKER_01

Yeah.

SPEAKER_00

Because in an AI system, errors in grounding compound exponentially.

SPEAKER_01

Exponentially, yeah.

SPEAKER_00

An AI doesn't just read a fact and paste it, it uses that extracted fact as a premise for its next logical jump. So if the retrieval system pulls a stale piece of data at step one, the AI uses that bad data to draw a conclusion in step two.

SPEAKER_01

And then by step three, it sounds so authoritative while giving you terrible advice.

SPEAKER_00

Which is exactly why the underlying index has to measure completely different things now. You can't just look at, you know, how many people clicked a link previously or how fresh the published date is. The index must measure factual fidelity.

SPEAKER_01

Meaning what exactly?

SPEAKER_00

Meaning: when the system chopped your article up into a data chunk, did it preserve your original context? It has to measure source attribution quality. And most importantly, it has to be able to detect contradictions in real time.

SPEAKER_01

Okay, I want to pause on that contradiction detection for a second. If two authority websites disagree on a fact, how does the AI handle that?

SPEAKER_00

That's the million-dollar question.

SPEAKER_01

Because if it just silently picks the one it likes better without telling the user, then the AI is basically arbitrating truth behind closed doors.

SPEAKER_00

That is the massive danger here. And the Bing engineers actually explicitly state that in the grounding process, knowing when to abstain is a critical feature, not a bug.

SPEAKER_01

Abstain? Like just refuse to answer?

SPEAKER_00

Yes. If the evidence is insufficient, or if top sources directly contradict each other, the AI must be engineered to abstain from answering entirely. It's a phenomenally high bar for a piece of content to clear. It has to be factual, contextually pure, and undisputed.
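[Editor's note: the abstention behavior discussed here can be sketched in a few lines of code. Everything below, the function name, the two-source threshold, and the exact-match contradiction check, is a hypothetical illustration of the idea, not Bing's actual implementation.]

```python
def ground_answer(claims, min_sources=2):
    """Return a grounded claim only when evidence is sufficient and consistent.

    `claims` is a list of (fact_text, source_url) tuples retrieved for a query.
    Returns None to signal abstention.
    """
    if len(claims) < min_sources:
        return None  # insufficient evidence: abstain rather than guess

    distinct_facts = {fact for fact, _ in claims}
    if len(distinct_facts) > 1:
        return None  # top sources contradict each other: abstain

    # Evidence agrees: emit the fact with its provenance attached.
    return {"fact": distinct_facts.pop(),
            "citations": [url for _, url in claims]}
```

The point of the sketch is the control flow: returning None (abstaining) is a first-class outcome alongside answering, which is exactly the "feature, not a bug" framing.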

SPEAKER_01

Wow. So if the search engine is successfully acting as this flawless research assistant, synthesizing the perfect answer so you never actually have to click a link to read the source, what happens to the person who spent three days writing that source material?

SPEAKER_00

They get nothing.

SPEAKER_01

Right. And this leads us straight into the crisis detailed in the Search Engine Journal report. We are witnessing a devastating click collapse across the web.

SPEAKER_00

The economic impact on creators and businesses is just profound. The data compiled in that report is staggering.

SPEAKER_01

It's hard to read, honestly.

SPEAKER_00

Yeah. A recent Pew study tracked 68,000 search queries and found a massive behavioral shift. When AI overviews appear at the top of a search, users click on the underlying results only 8% of the time.

SPEAKER_01

8%.

SPEAKER_00

Yeah. And when you remove the AI overview, that number jumps to 15%.

SPEAKER_01

So the click-through rate is essentially getting chopped in half just by the presence of the AI box.

SPEAKER_00

Exactly. And it's even more severe for certain segments. Ahrefs conducted an analysis of 300,000 keywords and found that top-ranking pages suffer a 58% lower click-through rate when an AI summary is present.

SPEAKER_01

58% lower. That's devastating.

SPEAKER_00

And it gets worse. Chartbeat data shows that small to medium publishers have seen a 60% drop in search referral traffic over the last two years. The traffic isn't shifting to competitors. I mean, it is simply vanishing into a data black hole.

SPEAKER_01

Which is an existential threat if you run a business that relies on top of funnel web traffic to survive.

SPEAKER_00

Absolutely.

SPEAKER_01

Watching how Google has managed the public relations side of this has been, well, fascinating. The article traces their narrative evolution. Initially, when publishers began sounding the alarm about plummeting traffic, Google's executives just deflected.

SPEAKER_00

Right. They claimed they had, quote, no data to share on how AI overviews impacted clicks.

SPEAKER_01

Which is hard to believe. Then, as independent third-party data made the drop impossible to deny, their defense shifted. The narrative became yes, clicks are down, but the remaining clicks are higher quality. Users who click through are more engaged.

SPEAKER_00

And that evolution brings us to their most recent defense, which is really something. Google VP Liz Reid recently gave an interview where she coined the term bounce clicks.

SPEAKER_01

Bounce clicks.

SPEAKER_00

Yeah. She argued that the traffic publishers are losing were just users who would have visited a page, realized it didn't have the quick answer they wanted, and immediately bounced back to the search results anyway.

SPEAKER_01

So the implication is almost hey, we're doing you a favor by filtering out these low-value visitors.

SPEAKER_00

That's exactly the spin. It's a very convenient framing for a platform that wants to keep users inside its own walled garden. Seriously. Yeah.

SPEAKER_01

But the Search Engine Journal piece highlights a randomized field experiment that completely undermines that whole bounce-click defense. Oh, that study was brilliant. Right. Researchers took a subset of queries and entirely turned off the AI overviews. Organic clicks immediately jumped by 38%. And the most crucial part of that study: user satisfaction did not drop at all.

SPEAKER_00

One bit.

SPEAKER_01

People were just as happy clicking links and finding answers themselves.

SPEAKER_00

So we have an independent study proving users are completely satisfied without the AI, and publishers get 38% more traffic when it's off.

SPEAKER_01

Yeah.

SPEAKER_00

It strongly suggests this bounce-click argument is just a convenient cover story for hoarding the audience.

SPEAKER_01

The incentives are undeniably skewed. I mean, a search engine's primary financial imperative is to retain attention, but now Google has felt the pressure from publishers, and they have introduced several new features designed to send traffic outward.

SPEAKER_00

Well, designed to look like they're sending traffic outward.

SPEAKER_01

Fair point. The article mentions a few of these new link surfaces. They've started putting inline links directly inside the AI-generated text. They've added hover previews. So if you put your mouse over a citation, you actually see the publisher's name and logo.

SPEAKER_00

Which is a nice touch, I suppose.

SPEAKER_01

Sure. There's a new "explore new angles" module. They are even pulling quotes directly from forums like Reddit to highlight human perspectives and adding these subscribed labels to boost visibility for publications a user already pays for.

SPEAKER_00

On the surface, those look like olive branches to the publishing community. But they completely fail to address the core foundational issue, which is measurement.

SPEAKER_01

Measurement, yes.

SPEAKER_00

Google Search Console is the primary dashboard every webmaster on Earth uses to track their performance. As of right now, that tool still does not separate an AI overview click from a traditional Blue Link search click.

SPEAKER_01

Wait, really? You just get one blended number?

SPEAKER_00

Just one blended number. You can see your total traffic, but you have zero visibility into whether a user found you through a traditional search or if they clicked one of those new inline AI citations.

SPEAKER_01

So you can't even run an A-B test to see if that shiny new subscribe label is actually driving incremental clicks.

SPEAKER_00

Exactly. You are flying completely blind. And that lack of telemetry is what makes this era so difficult for brands. You are being forced to optimize your entire digital strategy for a machine learning system, but the platform refuses to give you the specific data required to know if your optimizations are actually working. Clicks are vanishing, and the diagnostics to understand why are deliberately obscured.

SPEAKER_01

So if you're flying blind and traditional clicks are either vanishing or impossible to track accurately, how on earth do you measure success?

SPEAKER_00

That's what everyone is trying to figure out.

SPEAKER_01

Because human nature says we just want to look at a dashboard, see our company name mentioned by the AI, and declare victory. We want to track our presence.

SPEAKER_00

Right, visibility.

SPEAKER_01

Yeah, but the internal strategy memo we reviewed completely tears that idea apart.

SPEAKER_00

It calls the measurement of presence a dangerous illusion.

SPEAKER_01

A dangerous illusion. Strong words.

SPEAKER_00

But accurate. Most leadership teams right now are treating AI visibility as a single blended scoreboard. They buy third-party tracking software, they see their brand cited a few hundred times across various AI tools, and they assume their digital strategy is secure.

SPEAKER_01

But the data in this strategy memo reveals a deeply uncomfortable reality. They ran this massive analysis looking at 3.7 million citations across ChatGPT, Perplexity, and Google's AI Overviews.

SPEAKER_00

Huge data set.

SPEAKER_01

Massive. And out of millions of citations, only 2.37% of the URLs appeared across all three engines for the exact same prompt.

SPEAKER_00

2.37%.

SPEAKER_01

Basically nothing.

SPEAKER_00

It is. That means the absolute vast majority of the time, these highly advanced systems completely disagree on what the best source of information is.

SPEAKER_01

And the flip side of that statistic is what really blew my mind. Over 91% of the URLs appeared in only one engine.

SPEAKER_00

Which means if you are just looking at a blended visibility score, you are missing massive concentration risk.

SPEAKER_01

Right.

SPEAKER_00

Why do they disagree so drastically? Because these systems have different fetching mechanics, different fine-tuning weights, and different thresholds for what they consider a reliable, groundable fact.

SPEAKER_01

So they all have their own unique definitions of truth, essentially.

SPEAKER_00

Exactly. If your brand only shows up in ChatGPT, but you are completely invisible to Google and Perplexity, you haven't built durable authority. You are just temporarily benefiting from one algorithm's current habits.

SPEAKER_01

It's like um putting your entire retirement fund into one highly volatile altcoin, looking at your portfolio balance on a good day, and thinking you are safely diversified.

SPEAKER_00

You aren't?

SPEAKER_01

No, you're really not. If perplexity updates its fetching logic tomorrow or Google changes its contradiction detection threshold, your entire digital footprint could be wiped out instantly.

SPEAKER_00

Because presence only tells you if you're seen today. It tells you absolutely nothing about how defensible that visibility is tomorrow.

SPEAKER_01

Okay. So if presence is essentially a vanity metric hiding massive vulnerability and traditional clicks are a dying currency, what is the actual metric we should be focusing on?

SPEAKER_00

We need a new North Star.

SPEAKER_01

Right. How do you measure resilience when the internet is fragmenting into all these competing answer engines?

SPEAKER_00

Well, the strategy memo introduces the concept of portability. And this is really the blueprint for surviving the shift.

SPEAKER_01

Portability. Walk me through what that actually looks like in practice.

SPEAKER_00

Portability is the percentage of your content that successfully appears across multiple AI retrieval systems simultaneously.

SPEAKER_01

Okay.

SPEAKER_00

It answers a very demanding question, which is when one AI engine decides your content is useful and factual, do the other engines independently agree?

SPEAKER_01

So it's a measurement of algorithmic consensus. Exactly. If someone asks a complex question about, I don't know, supply chain logistics, and ChatGPT, Bing, and Perplexity all independently crawl the web, synthesize the data, and all three cite your article as the source of truth, then your content is highly portable.

SPEAKER_00

Yes. Portability separates a temporary algorithmic fluke from undeniable utility. The memo lays out a new 2026 measurement model that teams need to adopt.

SPEAKER_01

And it has three parts, right?

SPEAKER_00

Yep. You must track three dimensions simultaneously. First, presence, where do you appear at all? Second, portability. Do you appear across multiple engines for the same query? And third, concentration. How heavily reliant is your overall visibility on just one specific engine?
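[Editor's note: the three dimensions can be computed directly from per-engine citation data. The sketch below is illustrative, the input shape and field names are assumptions, not part of the memo.]

```python
from collections import Counter

def visibility_report(citations):
    """`citations` maps engine name -> set of your URLs that engine cited.

    Returns the three-layer model: presence, portability, concentration.
    """
    all_urls = set().union(*citations.values())
    engines_per_url = Counter()
    for urls in citations.values():
        for url in urls:
            engines_per_url[url] += 1

    # Presence: how many of your URLs appear in any engine at all.
    presence = len(all_urls)
    # Portability: share of cited URLs that appear in 2+ engines.
    portable = sum(1 for url in all_urls if engines_per_url[url] >= 2)
    portability = portable / presence if presence else 0.0
    # Concentration: share of total citations owed to your biggest engine.
    totals = {engine: len(urls) for engine, urls in citations.items()}
    concentration = max(totals.values()) / sum(totals.values())

    return {"presence": presence,
            "portability": round(portability, 3),
            "concentration": round(concentration, 3)}
```

For example, if ChatGPT cites your guide and blog post, Perplexity cites only the guide, and Google AI Overviews cites only your homepage, presence is 3 URLs, portability is 1/3 (only the guide travels), and concentration is 0.5 (half of all citations come from one engine).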

SPEAKER_01

Okay, if portability is the new holy grail, it begs the question: what kind of content actually travels well? Because clearly, based on that 91% stat, most of the web is failing this test. Right.

SPEAKER_00

The analysis in the memo breaks down portability by content type, and the hierarchy is incredibly revealing. Guides and tutorials have the absolute highest cross-engine overlap at 2.3%.

SPEAKER_01

Makes sense.

SPEAKER_00

Blogs and editorial pieces follow at 1.8%. Category pages and product listings just drop off a cliff. And sitting dead last with only a 1.1% portability rate.

SPEAKER_01

Home pages. Home pages are dead last.

SPEAKER_00

Dead last.

SPEAKER_01

That is going to cause a lot of panic for web designers. I mean, why is a tutorial twice as likely to be cited across multiple AIs than a company's main homepage?

SPEAKER_00

It comes down to how large language models actually parse data. A tutorial is inherently structured. It uses chronological steps, clear headers, bullet points, and definitive statements.

SPEAKER_01

It's organized.

SPEAKER_00

Exactly. It is incredibly easy for an AI to extract a factual claim from a tutorial without hallucinating. A homepage, on the other hand, is usually a nightmare for an AI. It's full of dynamic carousels, vague marketing fluff, and statements like, you know, we provide synergistic solutions for the modern enterprise.

SPEAKER_01

Ah, right. An AI cannot ground a fact based on synergistic solutions. It means nothing.

SPEAKER_00

Nothing at all. There's no groundable information there. So the AI just skips the homepage and looks for a page that actually explains a concept clearly.

SPEAKER_01

So what does a business owner listening to this do with that information? Do you just abandon your shiny, expensive homepage entirely? No, definitely not. Like do you stop writing marketing copy and just churn out dense, highly structured tutorials?

SPEAKER_00

You have to separate your human audience from your AI audience. The homepage is certainly not dead for human beings. When a person already knows your brand, they type your name into the browser and they want that clean, branded, emotional experience.

SPEAKER_01

They want to see what you're about.

SPEAKER_00

Exactly. You still need that. But for AI discoverability, utility is the absolute king. The asset that travels best is the one that best explains, compares, or teaches.

SPEAKER_01

Because the AI isn't holding a credit card. It's not trying to buy your product or vibe with your brand aesthetic. It is trying to fulfill a user's prompt with verifiable data.

SPEAKER_00

Which means content creation is no longer just an editorial function. It is your core infrastructure for discoverability.

SPEAKER_01

That's a huge mindset shift.

SPEAKER_00

It is. To build portable assets, you have to look at your website and ask a hard question. Which of these pages would still be intensely useful to a reader even if they never click through to buy anything from us?

SPEAKER_01

And those are the ones that win.

SPEAKER_00

Those are the pages the AI engines will agree on. Those are the pages that will survive.

SPEAKER_01

Man, this entire deep dive really forces a complete unwiring of how we think about the web.

SPEAKER_00

It really does.

SPEAKER_01

We've traced this massive architectural shift from traditional search, which, you know, just pointed humans to documents and let them figure it out, to the high-stakes world of AI grounding, where machines synthesize facts and can compound their own errors.

SPEAKER_00

This whole new world.

SPEAKER_01

It is. And we've seen how that shift has triggered a click collapse, starving creators of traffic while platforms like Google obscure the actual data with these blended dashboards. And we've learned that surviving this fragmentation means abandoning vanity metrics like presence.

SPEAKER_00

Can't rely on it.

SPEAKER_01

No. You can't just be seen by one algorithm. You have to build highly structured, objective content that achieves portability across multiple AI systems.

SPEAKER_00

It is a phenomenal transition from optimizing for clicks to optimizing for undeniable utility.

SPEAKER_01

Yeah.

SPEAKER_00

But you know, looking at the types of content that achieve that portability, the heavily structured guides, the objective tutorials, it brings me to something that really concerns me about the future of this space.

SPEAKER_01

Oh, what's that?

SPEAKER_00

Well, if the undisputed data shows that AI engines increasingly prioritize and agree on highly objective explanatory content over anything brand centric, what happens to the concept of voice on the Internet? Oh, wow. Right. The only way to achieve portability and reliably get cited by these massive AI systems is to structure your writing like a neutral, hyper-efficient textbook. Do we risk losing the personality, the strong opinions, and the wonderful messy weirdness of the web entirely?

SPEAKER_01

That is a really unsettling thought.

SPEAKER_00

If we are forced to write strictly for the machines to survive, what exactly is left for the humans?

SPEAKER_01

That is an incredibly heavy thought to leave off on, but it is exactly what you all need to be pondering as you navigate this new digital landscape. You might still be looking at what appears to be the exact same search bar you've used for 20 years, but the machinery underneath has completely changed. Thanks for taking this deep dive with us.