In this week’s episode of the Niche Pursuits podcast, Sergey Lucktinov and I discuss how Semantic SEO is evolving in response to the rise of AI-powered search engines and large language models (LLMs). We explore how AI content discovery and ranking systems differ from traditional SEO, and what website owners need to change in their approach to content creation, site structure, and technical performance to stay ahead.
While the interview delves into some deep technical insights, the bottom line is this: AI is changing the game, and content creators must adapt or be left behind.
Watch Full Episode
From Traditional SEO to Semantic SEO: The Evolution
Sergey has been working in SEO for over 15 years, first in-house, then in agencies, and finally managing his own affiliate sites. For most of his career, he never really had a problem with Google updates until a few years ago, when one hit him hard.
This prompted a deeper look into Semantic SEO, specifically the approach developed by Koray Tuğberk Gübür, which emphasizes building relevant authority and semantic structuring of websites rather than relying heavily on backlinks.
Semantic SEO is based on:
- Macro semantics: How your website is organized by categories and topics.
- Micro semantics: How individual pages are designed and written.
- Topical authority: Comprehensive coverage of a topic to build trust with search engines.
- Topical maps: Organizing content into macro (broad), seed (middle level), and node (specific) pages.
After studying the AI infrastructure, Sergey realized that about 90% of Koray’s system reflected AI engineering principles.
How LLMs Access and Sort Content
The biggest change Sergey highlighted is how LLMs like ChatGPT access data differently than Google’s traditional search engine.
- Search engines are deterministic; they return the same results for the same query under the same conditions.
- LLMs use probabilities. They extract information from various sources and synthesize an answer using the “cheapest” and clearest content available.
This change in methodology means that your content must now serve two masters: the deterministic ranking of search engines and the probabilistic logic of LLMs. What LLMs look for when sourcing content:
- Clarity and confidence in language.
- A dense, well-defined structure that reflects their internal knowledge systems.
- Semantic coherence, including related entities and topics.
- Delivery speed, because slow websites are immediately excluded from consideration.
According to Sergey, even a five-second delay can cause your page to lose consideration in the search process.
What is Semantic Search Optimization (SRO)?
Sergey introduces his concept of Semantic Search Optimization (SRO), an evolution of Semantic SEO, tailored specifically to how LLMs process and extract content.
SRO is about shaping your content and website structure to adapt to how AI systems acquire, evaluate and aggregate responses. The main components of SRO:
Structure of the website
- Macro pages cover a wide range of topics and link to seed pages.
- Seed pages cover narrower topics and link to node pages.
- Node pages address long-tail queries and link back up the chain.
Strict hierarchical relationship
- Macro → Seed → Node.
- Node → Seed → Macro.
- Never skip levels when linking.
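The macro → seed → node hierarchy can be sketched as a simple graph check. This is a minimal illustration, not a real tool: the page names, links, and the `skipped_links` helper are all invented to show how the "never skip levels" rule could be verified.

```python
# Hypothetical sketch: a macro/seed/node site hierarchy, plus a check
# that no internal link jumps more than one level (e.g. macro -> node).
# All page names and links are invented for illustration.

LEVELS = {"macro": 0, "seed": 1, "node": 2}

site = {
    "coffee-guide":        {"level": "macro", "links_to": ["espresso", "pour-over"]},
    "espresso":            {"level": "seed",  "links_to": ["coffee-guide", "espresso-grind-size"]},
    "espresso-grind-size": {"level": "node",  "links_to": ["espresso"]},
    "pour-over":           {"level": "seed",  "links_to": ["coffee-guide"]},
}

def skipped_links(site):
    """Return (source, target) pairs whose pages are more than one level apart."""
    bad = []
    for page, info in site.items():
        for target in info["links_to"]:
            gap = abs(LEVELS[info["level"]] - LEVELS[site[target]["level"]])
            if gap > 1:
                bad.append((page, target))
    return bad

print(skipped_links(site))  # an empty list means no level was skipped
```

A macro page linking directly to a node page would show up in the returned list, which is exactly the kind of level-skipping the hierarchy rule forbids.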
Content Clarity
- Each page and section should focus on one idea.
- Use H2 and H3 as discrete “chunks” of information.
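The idea of H2/H3 sections as discrete "chunks" can be made concrete with a small script. This is a hypothetical sketch: the sample article text and the `split_into_chunks` helper are invented, and real retrieval systems chunk content in more sophisticated ways.

```python
# Hypothetical sketch: splitting a markdown article into the discrete
# "chunks" that H2/H3 headings create, so each section can be reviewed
# for single-idea focus. The sample article text is invented.
import re

article = """## What is an espresso?
Espresso is a concentrated coffee brewed under pressure.

### Grind size
Espresso requires a fine grind.

## How to brew
Use 9 bars of pressure for 25-30 seconds.
"""

def split_into_chunks(markdown):
    """Return (heading, body) pairs, one per H2/H3 section."""
    parts = re.split(r"^(#{2,3} .+)$", markdown, flags=re.MULTILINE)
    headings, bodies = parts[1::2], parts[2::2]
    return [(h.strip(), b.strip()) for h, b in zip(headings, bodies)]

for heading, body in split_into_chunks(article):
    print(heading, "->", len(body.split()), "words")
```

If a chunk turns out to be very long or to mix several ideas, that is a signal to split it under a new heading.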
Micro Semantics: Writing for LLMs
Once your site structure is assembled, your content should match. This is where micro semantics come into play.
What makes content “high quality” for LLMs:
- The use of semantic triplets: Simple, clear sentence structures like “X is Y” that help AI understand relationships.
- Short, focused paragraphs: LLMs divide content into chunks, and each chunk should cover a single topic.
- Factual accuracy: LLMs will penalize content that is inconsistent with their prior training or contradicts known facts.
- Entity-rich writing: Pages should mention closely related entities and topics (e.g., “Paris” when discussing “the Eiffel Tower”).
According to Sergey, the ideal article is readable, accurate, and structured so that it is “cheap” for an LLM to process. This balance is key to ranking in AI-driven environments.
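The "X is Y" semantic-triplet shape can be approximated with a crude pattern check. This is not a real NLP parser, and the regex and example sentences are invented; it only illustrates what "simple, clear sentence structure" means in practice.

```python
# Hypothetical sketch: a crude check for sentences that follow the
# simple "X is Y" semantic-triplet shape. A real system would use a
# dependency parser; the sentences below are invented for illustration.
import re

TRIPLET = re.compile(r"^(The )?[A-Z][\w' ]* (is|are) \w")

def looks_like_triplet(sentence):
    """True when the sentence opens with a subject followed by is/are."""
    return bool(TRIPLET.match(sentence))

print(looks_like_triplet("The Eiffel Tower is a landmark in Paris."))
print(looks_like_triplet("Having considered many options, we went there."))
```

The first sentence states a clear subject–predicate relationship up front; the second buries it behind a subordinate clause, which is the kind of structure Sergey suggests avoiding.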
The Role of Technical SEO in an AI-Driven World
Speed is everything. While structure and semantics are important, Sergey emphasizes that technical SEO is still the foundation. He estimates that fixing speed, structure, and micro-semantics covers 80%-90% of what SRO requires.
Technical factors to be prioritized:
- Page speed: If your content doesn’t load fast enough, it won’t be considered for search.
- Site structure: Clean navigation helps LLMs understand how content relates.
- Clean code and layout: AI systems value well-structured, properly marked-up content.
Why Speed Matters:
- LLMs can generate 200+ results per query.
- Sites are immediately disqualified if they are too slow.
- The first layer of filtering is based entirely on speed.
Sergey describes the AI search process as a multi-stage filtering system:
- LLMs generate multiple search queries based on user input.
- They extract the best results from search engines.
- Only the fastest, clearest and most reliable content makes it to the final answer.
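The speed-based first filter described above can be sketched as a simple cutoff over measured latencies. The URLs, latency figures, and the `speed_filter` helper are invented; a real pipeline would time actual fetches rather than use canned numbers.

```python
# Hypothetical sketch of the first-stage speed filter: candidates above
# a latency cutoff are dropped before any content is even read.
# The URLs and latencies below are invented for illustration.

CUTOFF_SECONDS = 5.0  # per the five-second figure mentioned earlier

candidates = [
    {"url": "https://fast-site.example", "latency_s": 0.8},
    {"url": "https://ok-site.example",   "latency_s": 2.4},
    {"url": "https://slow-site.example", "latency_s": 7.1},
]

def speed_filter(candidates, cutoff=CUTOFF_SECONDS):
    """Discard any candidate slower than the cutoff."""
    return [c for c in candidates if c["latency_s"] <= cutoff]

print([c["url"] for c in speed_filter(candidates)])
```

Only after this cheap filter would clarity and reliability be weighed, which is why a slow page never gets the chance to compete on content quality.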
Injecting new data: the right way
LLMs rely on their pretraining for general knowledge, but for recent or niche information they depend on web content. Introducing new information is therefore a great way to stand out, if you do it right.
Strategies for introducing new content:
- Support claims with logic or cited case studies.
- Avoid wild, unsupported statements.
- When citing data, mention research institutes or studies, even without outbound links.
LLMs don’t necessarily follow links, but they do value context and perceived authority.
Keywords Are Dead: Long Live Meaning
One of Sergey’s most effective recommendations is to move away from keyword-first thinking and start optimizing for meaning and intent.
How to shift your thinking:
- Start with your customer journey, not keywords.
- Identify pain points and tailor content to their questions.
- Think about what the customer needs at each stage and write content that solves it.
Keyword volume isn’t as relevant anymore. AI doesn’t care whether your phrase gets 1,000 searches per month; it cares about the depth and clarity of your answers.
Tools and Tactics: What You Can Use
Sergey is building a custom SaaS tool to support this methodology, but for now he uses:
- Custom GPTs trained on Semantic SEO principles.
- Surfer SEO for basic on-page optimization (though it is not LLM-oriented).
- Manual content audits to cut unnecessary or conflicting pages.
He notes that while entity graph tools exist, many are difficult to use and do not offer actionable insights.
Final Thoughts
Semantic SEO has moved beyond search engine rankings. With the rise of LLMs, Sergey Lucktinov’s Semantic Search Optimization offers a way to future-proof your content strategy. His insights, from the roughly 90% alignment between AI engineering principles and Semantic SEO to speed-based disqualification and penalties for factual inconsistencies, highlight just how different this new era of optimization is.
Here’s how to stay competitive:
- Structure your site to reflect macro, seed, and node logic.
- Write for clarity, consistency, and semantic accuracy.
- Prioritize page speed and factual accuracy.
- Focus on meaning over keywords.
- Train writers to use semantic triplets and micro-semantic tactics.
The future of content isn’t just SEO-friendly. It’s AI-ready.