Why Prediction Of 25% Search Volume Drop Due to Chatbots Fails Scrutiny

Seven facts show why AI chatbots have no chance of causing a 25% drop in search volume by 2026.

Gartner’s prediction that AI chatbots are the future and will account for a 25% drop in search market share got a lot of attention. What didn’t get attention is that the claim overlooks seven facts that call the accuracy of the prediction into question and demonstrate that it simply does not hold up to scrutiny.

1. AI Search Engines Don’t Actually Exist

The problem with AI technology is that it is currently impossible to use AI infrastructure to maintain a constantly updated search index of web content, let alone the billions of pages of news and social media posts generated in real time. Attempts to build a real-time AI search index fail because the nature of the technology requires retraining the entire language model to update it with new information. That’s why language models like GPT-4 don’t have access to current information.

So-called AI search engines aren’t really AI search engines. In practice, they’re chatbots that are inserted between the searcher and a traditional search engine. When a user asks a question, a traditional search engine finds the answers, and the AI chatbot selects the best ones and summarizes them in a natural language response.

So, when you use a chatbot AI search engine, what’s essentially happening is that you’re asking a chatbot to Google or Bing it for you. This is true for Bing Copilot, Google SGE, and Perplexity. It’s an interesting way to search, but it’s not an actual AI-based search engine; there’s still a traditional search engine behind the chatbot.
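
To make the pattern concrete, here is a minimal sketch of the chatbot-in-front-of-a-search-engine architecture described above. The function names and stubbed-out steps are illustrative assumptions, not the actual internals of Bing Copilot, Google SGE, or Perplexity.

```python
# Minimal sketch of the "chatbot in front of a traditional search engine" pattern.
# Both steps are stubbed out; real products wire these to their own internal systems.

def traditional_search(query: str) -> list[dict]:
    # Stand-in for a call to a conventional, constantly refreshed search index.
    # In a real product this is where Bing/Google-style retrieval happens.
    return [{"title": "Example result", "url": "https://example.com", "snippet": "..."}]

def summarize_with_llm(question: str, results: list[dict]) -> str:
    # Stand-in for the language-model layer that rewrites retrieved snippets
    # as a conversational answer. It does not search anything itself.
    snippets = " ".join(r["snippet"] for r in results)
    return f"Answer to '{question}', summarized from: {snippets}"

def ai_search(question: str) -> str:
    results = traditional_search(question)        # the search engine finds the answers
    return summarize_with_llm(question, results)  # the chatbot only rephrases them

print(ai_search("how do AI search engines work?"))
```

The retrieval step is still a traditional index; the chatbot layer only changes how the answer is presented.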

The time to panic is when the transformer technology goes through a significant change so that it can handle a real-time updated search index (or another technology replaces it). But that time is not here yet, which makes the prediction of a 25% drop in search demand by 2026 appear a bit premature.

2. Generative AI Is Not Ready For Widescale Use

The recent fiasco with Gemini’s image generation underscores the fact that generative AI as a technology is still in its infancy. Microsoft Copilot completely went off the rails in March 2024 by assuming a godlike persona, calling itself “SupremacyAGI,” and demanding to be worshipped under the threat of imprisoning users of the service.

This is the technology that Gartner predicts will take away 25% of market share? Really?

Generative AI is unsafe, and despite attempts to add guardrails, the technology still manages to leap past them with harmful responses. The technology is in its infancy. To assert that it will be ready for widescale use within two years is excessively optimistic about the pace of progress.

3. True AI Search Engines Are Not Economically Viable

AI search engines are exponentially more expensive to operate than traditional search engines. It currently costs $20 per month to subscribe to a generative AI chatbot, and that subscription comes with a limit of 40 queries every 3 hours. The reason for the cap is that generating AI answers is vastly more expensive than generating traditional search engine responses.

Google admitted last year that an AI chat response is ten times more expensive than a regular search engine query. Microsoft’s GitHub Copilot is reported to lose an average of $20 per user every month. The economic realities of AI technology at this time basically rule out an AI search engine as a replacement for traditional search engines.
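
Some back-of-the-envelope arithmetic, using the figures cited above, shows why the math is hard. The baseline per-query cost below is an assumed placeholder (Google has not published that number); only the $20 subscription, the 40-queries-per-3-hours cap, and the ten-times cost multiple come from the reporting mentioned in this section.

```python
# Rough cost comparison built from the figures cited above plus one assumption.
# ASSUMPTION: per_query_cost (the cost of one traditional search query) is a
# placeholder for illustration only; Google has not published that number.

per_query_cost = 0.002            # assumed cost of one traditional query, in USD
ai_multiplier = 10                # Google's reported ~10x cost for an AI chat answer
ai_query_cost = per_query_cost * ai_multiplier

subscription = 20.00              # monthly price of a typical generative AI chatbot
cap_per_3_hours = 40              # query cap cited above
max_queries_per_month = cap_per_3_hours * (24 // 3) * 30   # 9,600 if fully used

print(f"Estimated cost per AI answer:   ${ai_query_cost:.3f}")
print(f"Cost to serve a max-usage user: ${ai_query_cost * max_queries_per_month:,.2f}")
print(f"Subscription revenue per user:  ${subscription:.2f}")
```

Under these assumptions, a heavy user costs far more to serve than the $20 subscription brings in, which is why the caps exist in the first place.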

4. Gartner’s Prediction Of 25% Decrease Assumes Search Engines Will Remain Unchanged

Gartner predicts a 25% decrease in traditional search query volume by 2026, but that prediction assumes traditional search engines will remain the same. The Gartner analysis fails to account for the fact that search engines evolve not just year to year but month to month.

Search engines already integrate AI technologies that increase search relevance in ways that reshape the entire search paradigm. For example, Google makes images tappable so that users can launch an image-based search for answers about the subject in the image.

That’s called multimodal search, a way to search using sound and vision in addition to traditional text-based searching. Gartner’s prediction makes no mention of multimodality in traditional search, a technology that shows how traditional search engines evolve to meet users’ needs.

So-called AI chatbot search engines are in their infancy and offer zero multimodality. How can a technology so comparatively primitive even be considered competitive with traditional search?

5. Why The Claim That AI Chatbots Will Steal Market Share Is Unrealistic

The Gartner report assumes that AI chatbots and virtual agents will become more popular, but that assumption fails to consider Gartner’s own research from June 2023, which shows that users distrust AI chatbots.

Gartner’s own report states:

“Only 8% of customers used a chatbot during their most recent customer service experience, according to a survey by Gartner, Inc. Of those, just 25% said they would use that chatbot again in the future.”

Customers’ lack of trust is especially noticeable in Your Money Or Your Life (YMYL) tasks that involve money.

Gartner reported:

“Just 17% of billing disputes are resolved by customers who used a chatbot at some stage in their journey…”

Gartner’s enthusiastic assumption that users will trust AI chatbots may be unfounded because, according to Gartner’s own research data, users do not trust chatbots for important YMYL tasks.

6. Gartner’s Advice Is To Rethink What?

Gartner’s advice to search marketers is to incorporate more experience, expertise, authoritativeness, and trustworthiness into their content, which betrays a misunderstanding of what EEAT actually is. For example, trustworthiness is not something that is added to content like a feature; it is the sum of the experience, expertise, and authoritativeness that the author brings to an article.

Second, EEAT describes what Google aspires to rank in its search results, but its components are not actual ranking factors; they are concepts.

Third, marketers are already furiously incorporating the concept of EEAT into their search marketing strategies. So the advice to incorporate EEAT as part of a future marketing strategy arrives too late and is a bit bereft of unique insight.

The advice also fails to acknowledge that user interactions and user engagement not only play a role in search engine success today but will likely increase in importance as search engines incorporate AI to improve their relevance and meaningfulness to users.

That means that traditional search marketing will remain effective and in demand for creating awareness and demand.

7. Why Watermarking May Not Have An Impact

Gartner suggests that watermarking and authentication will become increasingly common due to government regulation. But that prediction fails to account for the supporting role that AI can play in content creation.

For example, there are workflows where a human reviews a product, scores it, provides a sentiment rating and insights about which users may enjoy the product, and then submits the review data to an AI that writes the article based on the human insights. Should that be watermarked?
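
Here is a hedged sketch of that kind of workflow, where the scores, sentiment, and insights come from a human and the AI only drafts the prose. The HumanReview structure and the llm_draft() call are hypothetical placeholders, not any publisher’s actual pipeline.

```python
# Sketch of a human-first review workflow in which the AI only drafts the prose.
# HumanReview and llm_draft() are hypothetical placeholders for illustration.

from dataclasses import dataclass

@dataclass
class HumanReview:
    product: str
    score: int             # the human reviewer's rating, 1-10
    sentiment: str         # the human's overall impression
    best_for: list[str]    # audiences the human thinks will enjoy the product
    notes: str             # the human's free-form insights

def llm_draft(review: HumanReview) -> str:
    # Placeholder for an AI call that turns the human's structured insights
    # into polished prose. Every judgment in the output comes from the human.
    return (
        f"{review.product} scored {review.score}/10 with a {review.sentiment} "
        f"overall impression. {review.notes} "
        f"Best suited to: {', '.join(review.best_for)}."
    )

review = HumanReview(
    product="Example Widget",
    score=8,
    sentiment="positive",
    best_for=["hobbyists", "small workshops"],
    notes="Sturdy build, but the setup instructions are thin.",
)
print(llm_draft(review))
```

Whether the resulting article should carry an AI watermark is exactly the question the prediction glosses over.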

Another way that content creators use AI is to dictate their thoughts into a recording, then hand it over to the AI with the instruction to polish it up and turn it into a professional article. Should that be watermarked as AI generated?

AI’s ability to analyze vast amounts of data complements the content production workflow: it can pick out key qualities of the data, such as key concepts and conclusions, which humans can then use to create a document filled with their own insights, bringing their expertise to bear on interpreting the data. Now, what if that human then uses an AI to polish up the document and make it professional? Should that be watermarked?

Gartner’s prediction about watermarking AI content fails to take into account how many publishers actually use AI to create well-written content with human-first insights, which complicates the use of watermarking and calls into question its adoption in the long term, let alone by 2026.

Gartner Predictions Don’t Hold Up To Scrutiny

The Gartner predictions cite actual facts from the real world. But they fail to consider real-world factors that make AI technology an impotent threat to traditional search engines. For example, there is no consideration of the inability of AI to create a fresh search index, or of the fact that AI chatbot search engines aren’t even actual AI search engines.

It is incredible that the analysis failed to cite the fact that Bing Chat experienced no significant increase in users and has failed to peel away search volume from Google. These oversights cast serious doubt on the accuracy of the prediction that search volume will decrease by 25%.

Read Gartner’s press release here:

Gartner Predicts Search Engine Volume Will Drop 25% by 2026, Due to AI Chatbots and Other Virtual Agents

Featured Image by Shutterstock/Renovacio