SEO in 2026: what has really changed

Nearly 60% of searches are zero‑click. Learn how to adapt your SEO, AEO, GEO, and llms.txt to continue generating qualified leads in 2026.
Salomé

Lead Creative Webdesigner

January 2026

Summary of the article

In 2026, SEO is not dead, but the way to approach it has changed with the explosion of zero‑click searches and generative AIs. The article explains the difference between SEO, AEO, and GEO, and shows why raw traffic is falling while the remaining clicks are more qualified. It details how to adapt your content strategy: combine “classic” keywords and conversational queries, write in the real language of your prospects, and accept that certain answers will be given directly in AI Overviews. It also clarifies what llms.txt does (and doesn't do yet) and describes the We‑R method for analyzing results on Google, Perplexity, and ChatGPT, then adjusting content to remain a credible and useful option in this new environment.

In 2026, SEO is far from dead, even if some “experts” love to say the opposite. What has changed isn't the usefulness of SEO; it's the smart way to go about it.
AIs like Perplexity, ChatGPT, or Gemini have become a reflex for many people, where for years Google was the obligatory starting point.

At the same time, several recent studies show that around 55 to 60% of Google searches do not generate any clicks to a site. Users find their answer directly in the results (snippets, AI Overviews, knowledge panels) or via an AI... and stop there.
Seen purely in terms of traffic, this is not reassuring. But the story is more nuanced: if, after reading a summary in an AI Overview, a prospect still decides to click through to your site, we can assume they are further along in their thinking and have a stronger intention to act (contact you, ask for a quote, book a call, etc.).

The objective of this article is therefore not to give you “the definitive truth about SEO in 2026”. Nobody has it, and those who claim to are probably oversimplifying a topic that isn't simple.
The idea is rather to take stock: what we observe, what we test, what we don't yet know, and how to keep moving forward without getting lost.

SEO, AEO, GEO: let's set things straight

We hear a lot of terms: SEO, AEO, GEO, AIO... We will stick to the three main ones and explain them simply.

SEO: the base

“Classic” SEO means:

  • Having a technically clean site (fast, accessible, well structured).
  • Having content that meets real needs (articles, service pages, FAQs...).
  • Having authority signals (links, mentions, reviews...).

SEO refers to the optimizations put in place for search engines (Google, Bing...), not, at first glance, for AIs.

In short: the aim is for your website to appear in the results with a specific page.

AEO: helping engines respond

AEO (Answer Engine Optimization) is the idea of optimizing your content so that it can serve as a “direct answer” to a question.
In concrete terms, this means:

  • Pages that focus on a specific question (“How do I choose a B2B SEO agency?”).
  • Clear and visible answers (introduction that answers quickly, conclusion that summarizes).
  • FAQ sections that include questions that your prospects really have.

This is what helps you appear in snippets and be included in answer blocks in Google AI Overviews, Perplexity, ChatGPT, etc.

In short: the aim is for content from your website to be extracted and quoted by an AI (with a link to your page).
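
One concrete, widely used way to make such FAQ sections machine‑readable is FAQPage structured data from schema.org, embedded as JSON‑LD in the page. A minimal sketch, reusing the example question above; the answer text is illustrative, and whether any given engine actually uses this markup varies:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "FAQPage",
      "mainEntity": [{
        "@type": "Question",
        "name": "How do I choose a B2B SEO agency?",
        "acceptedAnswer": {
          "@type": "Answer",
          "text": "Look at the agency's own visibility, its concrete cases, and whether it adapts its strategy to how your prospects actually search."
        }
      }]
    }
    </script>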

GEO: being cited by AIs

GEO (Generative Engine Optimization) targets AIs like Perplexity or ChatGPT that respond by synthesizing several sources.
Your challenge here is:

  • That your content is clear and credible enough to be used as a source.
  • That the model judges your site trustworthy on a given subject.

What matters:

  • The quality of information.
  • The coherence of the site on a theme (thematic authority).
  • Expertise signals (concrete cases, data, reviews, external mentions).

In short: here, neither your site nor an excerpt of your content is displayed as such. Your content is used so that the AI can build its own response. You can, at best, appear among the cited sources.

In practice, SEO, AEO, and GEO overlap enormously. You don't need to launch three separate strategies with three independent action plans. What really matters is not the label you put on it (“Global SEO”, “SEO + GEO”, “AI SEO”, etc.), but how your content adapts to new ways of searching. As long as you (or your SEO agency) keep up with changing search behaviors and incorporate them into your editorial strategy, you're headed in the right direction.

What has really changed: what we see and what we don't know

1. Zero-clicks: less traffic, but more qualified clicks

Recent reports say that about 58% of Google searches don't result in any click.
It's logical: Google's AI Overviews or AI Mode are often more than enough to cover users' questions.

What you may have noticed in the last few months:

  • Declining traffic curves.
  • Stable or slightly lower conversions, but with more qualified prospects than before.

So it's pretty good news if the objective of your SEO strategy is to generate leads.

2. The way to search has changed

This is something we are certain of, simply because it makes sense:

On Google, we tend to type keywords (short query: “Brussels SEO agency”, or a question: “How much does an SEO agency cost?”).

In ChatGPT, for example, we tend to be more conversational: we give context about our situation (“I have a company in [sector] in Brussels”), the result we are looking for (“I want more customers through my website but I don't know how to get them”), and we ask it for advice on the solutions that exist (“how do I get more leads and how much does it cost?”), etc.

In short: Google = 1 request, AI = 1 conversation.

How does this impact your SEO strategy?

To put it simply, you should neither produce only “keyword” content, nor only “long‑tail” content aimed at more conversational queries.
The idea is to be present in both places: keep including the keywords that matter to you in your content, while also working on more natural formulations.

For example, I know that the terms “SEO agency” and “SEO strategy” are highly searched by my prospects, so I naturally continue to create articles that target these keywords (like this article, for example).
I also know that some leads are less familiar with the strategies that can make their site generate customers, and that they would rather discuss this problem with an AI. So I also create more specific content, targeting longer queries.

Concretely, this results in blog articles in which we find sentences like:
“How to get customers thanks to your website? [...] Among these strategies, we find SEO. An SEO agency can help you get more leads via your website.”
And above all, content in which I try to use the language of my prospects as much as possible, to get as close as possible to the sentences they would use themselves.

So, your content has more impact when it:

  • Answers specific questions.
  • Uses everyday language, similar to your customers'.
  • Goes into real detail, instead of staying generic.
  • Continues to include your main keywords.

llms.txt: what we do, what we know, what we don't know

We talk a lot about llms.txt as an “AI sitemap”.

What it is, very concretely

  • It's a text file at the root of your site.
  • It is designed for AI models (LLMs): it can list the areas to explore, avoid, or prioritize.

On paper, the idea is good:

  • Give some structure to how AIs read your site.
  • Keep them from focusing on useless areas (technical pages, admin, etc.).
  • Give them guidelines on when to cite your site and on what they are not allowed to do with your content.

What is the problem?

The problem is that for now:

  • Serious studies do not show a clear correlation between “having an llms.txt” and “being cited more by AIs”.
  • Google has said that llms.txt is not used to decide who appears in its AI Overviews, and that traditional SEO remains the main criterion.
  • Large-scale tests sometimes show a neutral impact, sometimes a slight degradation if the file is misconfigured and blocks useful pages.

So honestly:

  • No, llms.txt is not a “magic lever” for your AI visibility today.
  • On the other hand, yes, it is a potential opportunity: AIs that ignore it today may take it into account tomorrow.

My advice: add an llms.txt file to your site that is as comprehensive as possible (checking its content carefully), as sketched below. You have nothing to lose and everything to gain.
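
For reference, the llms.txt proposal (llmstxt.org) describes a markdown file served at the root of the site: an H1 with the site name, a short blockquote summary, then sections listing the pages worth reading. A minimal sketch, with placeholder URLs and descriptions (not this site's actual file):

    # We-R

    > Webflow and SEO agency in Brussels. Key pages for understanding our services and approach.

    ## Services

    - [SEO strategy](https://example.com/services/seo): how we build B2B SEO strategies
    - [Webflow sites](https://example.com/services/webflow): design and migration of marketing sites

    ## Blog

    - [SEO in 2026](https://example.com/blog/seo-2026): what changed with zero-click searches and AI

Note that, unlike robots.txt, this proposal suggests what to read rather than enforcing permissions; actually blocking crawlers still goes through robots.txt.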

How we concretely work with AIs and search at We-R

We decided to cut through all this fog with something simple. Forget the hacks; no matter how the algorithms work, the objective of the AIs and of Google is the same: to surface the content that best meets their users' expectations. So that becomes our only objective as well.

So, we always start with the same simple question:

If I were my prospect, what would I do?

For a client, our approach looks like this:

1. We put ourselves in the shoes of the prospect

  1. We list the questions they could ask themselves: “How to choose...”, “How much does it cost...”, “What is the best...”, “What is the difference between...”.
  2. We go from the broadest to the most precise.

2. We test these questions on several tools

  1. Perplexity.
  2. ChatGPT.
  3. Google (with and without AI Overview).

3. We look at what comes out

  1. What types of responses come out first.
  2. Which sites are cited most often.
  3. Which questions come up again and again.
  4. How machines interpret these questions and what types of answers are offered.

4. We adapt the content strategy

  1. We create or improve pages that answer the questions that come up the most.
  2. We add FAQs at the bottom of the page, with short questions and clear answers.
  3. We make sure that each piece of content has a very clear main idea, instead of trying to cover everything on the surface.

5. We follow the results

  1. Evolution of traffic and time spent on the page.
  2. Distribution between branded and non-branded queries (see the sketch after this list).
  3. Appearance of the site in the sources cited by the AIs (by repeating steps 1 and 2 and seeing whether the results have changed).
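
As an illustration of point 2, here is a minimal Python sketch that splits queries exported from Google Search Console into branded and non-branded buckets. The file name, column names, and brand terms are assumptions for the example; adapt them to your actual export:

    import csv

    # Assumed brand terms; replace with the spellings of your own brand.
    BRAND_TERMS = ("we-r", "we r agency")

    def is_branded(query: str) -> bool:
        """Return True if the query contains one of the brand terms."""
        q = query.lower()
        return any(term in q for term in BRAND_TERMS)

    branded = 0
    non_branded = 0

    # "queries.csv" is a hypothetical Search Console export assumed to have
    # "Query" and "Clicks" columns; rename them to match your real file.
    with open("queries.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            clicks = int(row["Clicks"])
            if is_branded(row["Query"]):
                branded += clicks
            else:
                non_branded += clicks

    total = branded + non_branded
    if total:
        print(f"Branded clicks: {branded} ({branded / total:.0%})")
        print(f"Non-branded clicks: {non_branded} ({non_branded / total:.0%})")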

Does it really work?

I'll let you answer that question. If you are reading this article, it is probably because you found it on Google. And if you have read this far, then yes, we do create content that matches what our targets are searching for :)

To close: we don't have all the answers, and that's okay

What to accept in 2026:

  • It's unclear exactly how AIs choose their sources.
  • It is also unclear how Google's requirements will evolve.
  • We don't know whether certain “trendy” practices (like llms.txt) will carry real weight in two years.
  • On the other hand, we know that search continues to change very quickly.

In this context, what matters is not being right about everything.
It is to:

  • Have a clear strategy and monitor results.
  • Dare to test different things.
  • Above all, offer the content your prospects really need.

The aim of all this is not to do “SEO for AI” or “SEO to check boxes”.
It's about making sure that when someone is looking for an answer on your subject — whether on Google or on an AI — you are a credible, clear, and useful option.
