Found in AI: AI Search Visibility, SEO, & GEO
Found in AI is a podcast for marketers, founders, and content strategists who want to understand—and win—AI search visibility in the new era of search.
Hosted by Cassie Clark, fractional content strategist and AI search optimization expert, the show explores how platforms like ChatGPT, Perplexity, Gemini, and Google’s AI-powered search experiences discover, select, and surface content.
Each episode breaks down real-world experiments, SEO, GEO / AEO, and content marketing strategies designed to help brands get found in AI-generated answers, not just traditional search results.
You’ll learn how to:
- Optimize content for AI-driven search and answer engines
- Blend traditional SEO with AI search optimization
- Build entity authority across search, social, and AI platforms
- Drive traffic, leads, and trust as search behavior continues to evolve
If you’re trying to future-proof your content strategy and understand how AI is reshaping discovery, Found in AI gives you the frameworks, insights, and tactics to stay visible—wherever search happens next.
LinkedIn Is Now the #2 Source for AI Answers (and What That Means for Your Strategy)
📬 Love the podcast? You’ll love the newsletter.
Get the weekly 3-2-1 on AI search + marketing: Subscribe
Semrush just dropped a massive study analyzing 89,000 unique LinkedIn URLs cited by ChatGPT Search, Google AI Mode, and Perplexity—and the findings have major implications for how brands think about LinkedIn content, AI visibility, and distribution strategy.
In this episode, Cassie breaks down the key findings and what they mean through the lens of the FSA Framework (Freshness, Structure, Authority).
What we cover:
- Why LinkedIn is now the second most cited domain across major AI engines — ahead of Wikipedia, YouTube, and every major news publisher
- What semantic similarity scores reveal about how much influence your LinkedIn content has on AI-generated answers
- The content formats, lengths, and intent types that get cited most (and what gets ignored)
- Why consistency and expertise matter more than follower count or viral reach
- How Company Pages and individual creators get treated differently across ChatGPT Search, Perplexity, and Google AI Mode
- What all of this means through the FSA Framework, and how to apply it to your LinkedIn strategy now
Resources:
Semrush's LinkedIn AI Visibility Study
Let’s connect:
LinkedIn → Cassie Clark | Fractional Content Strategist
Website → https://cassieclarkmarketing.com
Download Freshness, Structure, Authority: The Framework for AI Search Visibility:
P.S. Is your brand losing its "Answer Authority"?
Most series A/B and enterprise brands are being "nudged" out of AI search results because of entity gaps and "stale" content. I am opening a limited number of specialized audit slots to help you reclaim your Share of Voice using the FSA Framework (Freshness, Structure, Authority).
Request your 7-Day AI Search Visibility Audit: https://cassieclarkmarketing.com/ai-search-visibility-audit/
Hey, welcome back to Found in AI. I'm Cassie Clark, a fractional content strategist, AI search optimization expert, aka a nerd, and your host for all things GEO, AEO, and AI visibility strategy. Today is Thursday, March 12th, and we're covering a study that dropped this week that I think every marketer, founder, and content strategist needs to pay attention to. Semrush just published research where they analyzed 89,000 unique LinkedIn URLs that were cited by ChatGPT Search, Google AI Mode, and Perplexity. And the findings are incredibly validating if you've been following along with the FSA Framework, but also full of practical takeaways that you can act on immediately. So let's get into it.

Okay, so first, the headline number. LinkedIn is the second most cited domain across ChatGPT Search, Google AI Mode, and Perplexity. Second, ahead of Wikipedia, ahead of YouTube, ahead of every major news publisher. On average, about 11% of AI-generated responses reference LinkedIn content. And it varies by engine. For example, Perplexity cites LinkedIn in roughly 5% of responses, Google AI Mode is closer to 13 or 14%, and ChatGPT Search is the highest at around 14%. Now, if you're a B2B brand or a founder building thought leadership, especially if you're in tech, business services, or finance, this should change how you think about LinkedIn. It's not just a social platform for networking, although it is super helpful for that, it's also a source that these AI engines are actively pulling from when they generate answers. So if your brand isn't publishing consistently on LinkedIn, someone else's content is filling that space. AI engines are still going to answer the questions, they're just going to answer them with someone else's words.

Now, the next finding in this study is maybe one of the most important, and I think it might get overlooked a little, so I want to call it out. Semrush measured something called semantic similarity.
Basically, how closely the AI-generated response mirrors the meaning of the original LinkedIn content. LinkedIn scored between 0.57 and 0.60, which is higher than what we've been seeing with Reddit or Quora content in previous studies. I always feel like I say Quora wrong. Anyway, so what does that mean in plain English? Well, when AI cites your LinkedIn content, it's not just linking to it, it's echoing it back. So the meaning of your original post or article actually shapes how the AI explains the topic. Your positioning, your terminology, how you're framing a problem, it's being picked up and reflected back to buyers in AI-generated answers. Which means the words you use on LinkedIn aren't just for your followers. They're potentially becoming the default explanation AI gives to anyone asking about your category. This is why I'm constantly telling my audience and my clients: be intentional with your language, define your terms clearly, state your core message in those first few lines, because AI is reading it and repeating it.

Okay, so let's talk about what types of LinkedIn content AI engines actually pick up, because this is where it gets practical and helpful. Long-form LinkedIn articles dominate AI citations across all three models. They account for somewhere between 50 and 66% of all cited LinkedIn content, depending on the engine. Feed posts, the shorter ones, make up about 15 to 28%. And the sweet spot for article length? Anywhere between 500 and 2,000 words. Long enough to be comprehensive, short enough to stay focused. For feed posts, mid-length, like 50 to 299 words, gets cited the most. Now, here's a stat that I really loved and appreciated. Approximately 95% of cited posts are original content. Reshares barely register, just about 5%. So if your strategy has been relying on resharing other people's content and adding a quick take to it, that's not what these AI engines are picking up.
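For readers who want to see what a semantic similarity score is measuring, here's a minimal sketch. Note this is a toy stand-in: a study like the one discussed would use embedding models, not raw word counts, and the example texts below are made up for illustration. The underlying math, cosine similarity between two vectors, is the same idea.

```python
from collections import Counter
import math

def cosine_similarity(text_a: str, text_b: str) -> float:
    """Cosine similarity over simple word-count vectors (a toy stand-in
    for the embedding-based similarity a real study would compute)."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    # Dot product over words both texts share
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Hypothetical LinkedIn post vs. a hypothetical AI-generated answer
post = "Entity authority means AI engines recognize your brand as a reliable source"
answer = "AI engines cite brands they recognize as reliable sources with entity authority"
print(round(cosine_similarity(post, answer), 2))  # → 0.58
```

A score near 1.0 means the answer closely echoes the source's wording and meaning; a score near 0 means little overlap. That's the intuition behind the 0.57 to 0.60 range reported for LinkedIn content.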
They want original thinking, and that lines up with everything we've learned about these AI engines so far. And as it turns out, the intent of the content matters too. Over half, and in some models nearly two-thirds, of the cited content is educational or advice-driven. It's knowledge sharing, practical guidance, explaining how things actually work. Product promotion gets some citations, but significantly fewer. So these AI engines are acting like editors: they're surfacing the content that's most helpful to the person asking the question, not the content that's trying the hardest to sell something.

Now, for everyone who's looked at LinkedIn and thought, "I don't have a big enough audience for this to matter. Why am I posting here?" Well, the study found that about 75% of cited LinkedIn post authors are frequent posters, people who publish at least five posts in a four-week period. The occasional contributors get cited far less often. And here's where it gets even more interesting. Nearly half of cited authors have over 2,000 followers, which, yeah, I'm sure that helps. But creators with fewer than 500 followers are just as likely to be cited as those with more than 500. So you don't need a massive audience, you just need consistency and expertise. The engagement numbers back this up too. The median cited LinkedIn post has about 15 to 25 reactions and no more than one comment. These are not the viral posts you see around. They're relevant posts. AI visibility doesn't care about your like count, but it does care whether your content answers the question clearly. This connects directly with the Authority pillar of the FSA Framework. Authority isn't about being famous, it's about showing up consistently with credible, useful expertise across channels, and doing it often enough that AI engines recognize you as a reliable source.
So there's one more finding worth calling out here because it affects how you structure your LinkedIn strategy. Not all AI models treat company content and individual content the same way. Perplexity overwhelmingly favors Company Pages; they make up about 59% of its LinkedIn citations. But on the flip side, ChatGPT Search and Google AI Mode really favor individual creators, who account for 59% of citations on both. So the takeaway here isn't "pick one," you need both. You need to invest in your Company Page, keep it active with your current positioning, and publish regularly. And you need to be intentional about building out employee thought leadership. Encourage your subject matter experts to post, and give them editorial support, templates, topic ownership, whatever they need to be consistent with their posting. Because depending on which AI engine a buyer is using, they might encounter your brand through your company content or through an individual who works there. The brands that cover both bases are going to have a significant advantage here.

Okay, so let's tie this all together with the FSA Framework, because this study feels like it validates every pillar. For Freshness, we now know that frequent posters get cited more often. Consistent publishing signals that you're active and that your content is current, and that's the F of the FSA Framework. For Structure, those long-form articles, which are inherently more structured with clear headlines, logical flow, and direct answers, are dominating citations. AI engines can parse and extract from structured content more easily than from a wall of text, and that's the S. And Authority: original content, educational intent, consistent publishing, cross-platform presence between Company Pages and individual creators.
All of that compounds into entity authority, which is exactly what those AI engines use to decide whether your brand is trustworthy enough to cite. That's the A of FSA. And the semantic similarity finding? Well, that tells us something even deeper. It's not just about whether you get cited, it's about how much influence your actual words have on the answer. When your content is fresh, structured, and authoritative, AI doesn't just reference you, it reflects you. That's the whole game.

And actually, a quick aside here: I am testing something right now that's directly related to this study. I just launched a LinkedIn newsletter called Found in AI, LinkedIn Edition. Very clever, I know. The idea is super simple. I'm repurposing my blog content, publishing it as a LinkedIn newsletter, and tracking whether it gets pulled into AI citations, basically running this experiment in real time. So if you're on LinkedIn, connect with me over there. I'd love for you to follow along and see what happens. I'm curious. This has been on the to-do list for two or three months now, but life got in the way, so I'm trying it now.

So if you take nothing else from today's episode, let it be this: LinkedIn is no longer just a social platform in your marketing mix. It's a source layer for AI-generated answers. And the content that gets cited isn't viral, so we don't need to worry about that. It's also not super promotional. It's just consistent, original, educational, and well structured. If you've been putting off a LinkedIn content strategy for yourself or for your brand, take this study as a wake-up call. Now is the time to get on it. I'll drop the link to the full Semrush study in the show notes below so you can dig into the data yourself. And if you're trying to figure out where your brand stands in AI search right now, head over to cassieclarkmarketing.com to get started with that AI visibility audit. That's it for today's update. I will see you next week.
Until then, stay visible.