
GenAI Is Revolutionizing Search

And why you and your company should care

Picture by Agnieszka Boeske, on Unsplash

Advancements in Artificial Intelligence are transforming traditional search engines into answer machines. This shift is being driven by both new and traditional players in the web search space and is changing how people around the world access information.

Who are the main players in GenAI-based search and how are they approaching their solutions? What are the implications for users? How can companies ensure their content remains visible to users in this new search paradigm? And what does that mean for Product Managers, Data Scientists, and anyone working in tech? That’s what we’ll cover in this blog post!

The players

For the first time in years, Google’s dominance in the search engine market faces credible competition. New companies have emerged with GenAI-based search as their core product, such as:

  • Perplexity – "Where knowledge begins": a GenAI-based search engine with added capabilities such as links to sources, related questions, picture display, and image generation.
Perplexity’s UI with "What is a search engine" question example
  • You.com – "Use Smart assistant to ask a question / search the web / help revise an email…": a GenAI-based search engine that lets users choose between different LLMs.
  • Future Search – "Don’t ask a $10,000 question to an AI that will give you a $0.10 answer": focused on deep diving into company-related questions such as market sizing, forecasting, or sales estimations.

Traditional search engines and companies have also been experimenting and iterating on their own products to include GenAI features:

  • Bing (Microsoft) – Moved from a dedicated Copilot chatbot with multimodal capabilities such as image generation and web search, to integrating GenAI responses next to the "traditional" web search results. Recently, they announced further iterations that create dynamic sections and content, extending the LLM text response to other formats.
Microsoft’s Bing UI with "What is a search engine" question example
  • Google – Through Gemini, users can access the knowledge of its LLM, which also has web search capabilities and is able to validate responses and add attribution links to the sources of the information.

Last week, we heard that a new player was also joining this competitive GenAI-based search space: OpenAI! They announced it as a "temporary prototype of new AI search features that give you fast and timely answers with clear and relevant sources", and, similar to Microsoft’s approach, it isn’t limited to a text response but extends to dynamic formats, sections, and displays depending on the question.

Implications for the Users

All these companies and products are reshaping how users interact with search engines and find and access information. Search engines have traditionally provided users with a list of relevant resources for a specific keyword or query, which required users to invest some time going through the different links to obtain the right answer or context. For many years, search engines have improved their retrieval and ranking logic to try to place what the user is looking for in the first results (thus decreasing the time users need to scroll and check resources).

AI-driven search goes one step further, aiming to deliver concise, summarized responses directly to user queries. In that scenario, users no longer need to scroll through countless results and read lengthy pages to find what they are looking for.

Soon, users will need to experiment with these solutions and develop an intuition for when each works best. For certain searches, traditional search might work best; for others, conversational / GenAI search might be preferred because it is faster or more efficient, or because it adds creativity, rephrasing, or personalization to the responses.

There are, though, risks associated with this new approach. Probably the most important one is the potential for misinformation caused by the "hallucination" tendencies of LLMs. This issue recently went viral when a search engine told users to eat rocks, among other erratic answers. To help ensure the accuracy and reliability of AI-generated responses, many solutions are exploring RAG (Retrieval Augmented Generation) and other techniques to decrease hallucination risks while adding attribution links to the resources used to produce the answer, so users can double-check responses and the credibility of the sources. If you want to learn more about RAG and related concepts like prompting, fine-tuning, or agents, you can check my previous blog post about it:

The 4 new trendy AI concepts, and their potential in digital products
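
Coming back to the RAG approach mentioned above, here is a minimal, simplified sketch of that flow: retrieve sources, ground the prompt on them, and return the answer together with attribution links. The `search_web` and `call_llm` functions are hypothetical placeholders for whatever retrieval backend and LLM provider you use.

```python
# Minimal RAG sketch: retrieve documents, ground the LLM on them,
# and keep attribution links so users can verify the answer.
# `search_web` and `call_llm` are hypothetical placeholders.

def search_web(query: str, top_k: int = 3) -> list[dict]:
    """Placeholder: return the top_k documents as {'url': ..., 'text': ...}."""
    raise NotImplementedError

def call_llm(prompt: str) -> str:
    """Placeholder: send the prompt to an LLM and return its text response."""
    raise NotImplementedError

def answer_with_sources(question: str) -> dict:
    documents = search_web(question)

    # Ground the model on the retrieved snippets to reduce hallucination risks.
    context = "\n\n".join(
        f"[{i + 1}] {doc['url']}\n{doc['text']}" for i, doc in enumerate(documents)
    )
    prompt = (
        "Answer the question using only the numbered sources below. "
        "Cite them as [1], [2], ... after each claim.\n\n"
        f"Sources:\n{context}\n\nQuestion: {question}"
    )

    return {
        "answer": call_llm(prompt),
        "sources": [doc["url"] for doc in documents],  # attribution links
    }
```

Production systems add many layers on top (chunking, embeddings, re-ranking, guardrails), but the grounding-plus-attribution pattern is the core idea.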

Getting straightforward responses might also make us lazier, lead us to trust LLM-generated content too much, and erode our critical thinking. On top of that, there are also risks related to the intrinsic bias and discrimination of AI models, which can amplify existing societal biases.

Implications for the Industry

As the usage of GenAI-based search engines expands, companies and digital products need to rethink their strategies for optimizing content visibility. Traditional SEO techniques, which focus on keyword optimization, will need to evolve to account for whole questions and conversational contexts, and to reverse-engineer the behavior of LLMs so that certain content is featured in AI-generated responses. This is a promising field that starts with SEO optimization but continues towards LLM optimization and specific deals (for a deep dive into AI search optimization, check out this great conversation from Reforge’s last Ref:AI event):

  • Keep optimizing traditional SEO: it remains valuable, since GenAI searches usually call the traditional search process to obtain several relevant documents and generate a response from them.
  • LLM Optimization: Ensure content appears in scrapable documents with language that aligns with potential user queries. Emphasize key qualities, benefits, and usage scenarios close to product names to encourage LLMs to expand on these aspects in their responses (such as in the tennis sneakers example below!). A basic crawlability check is sketched right after this list.
Example of brand visibility for a question like "What are the best tennis sneakers I can buy?"
  • Specific agreements, such as the one between OpenAI and The Atlantic, that will make The Atlantic’s articles discoverable through OpenAI’s products and include attribution links to access the full article easily. This is a great example of the growing trend of integrating media content with AI platforms to enhance discoverability.
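
On the LLM optimization point above, a first practical check is whether the crawlers that feed these AI products can reach your content at all. Below is a small sketch using Python’s standard `urllib.robotparser`; the user-agent strings are examples of AI-related crawlers, and the exact list is an assumption you should verify for the providers you care about.

```python
from urllib.robotparser import RobotFileParser

# Example AI-related crawler user agents (verify the current list for the
# providers you care about, as these change over time).
AI_CRAWLERS = ["GPTBot", "PerplexityBot", "Google-Extended"]

def check_ai_crawl_access(site: str, path: str = "/") -> dict[str, bool]:
    """Return whether each crawler is allowed to fetch `path` per robots.txt."""
    parser = RobotFileParser()
    parser.set_url(f"{site.rstrip('/')}/robots.txt")
    parser.read()  # downloads and parses the site's robots.txt
    return {
        agent: parser.can_fetch(agent, f"{site.rstrip('/')}{path}")
        for agent in AI_CRAWLERS
    }

# Hypothetical example domain and path:
# print(check_ai_crawl_access("https://www.example.com", "/products/"))
```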

Implications for PMs, DSs, and other tech roles

Even if you don’t work on a search-related product, interesting initiatives might soon appear to help your company position itself well in the responses of all these new search engines, and therefore in the information users around the world access. This is a new field that will probably require a good intuition for how LLMs work, the definition of new metrics to assess optimizations, and potentially simulations and trial-and-error tests. There will be a need to propose solutions along the lines of "include this type of wording in the documents that mention product X" with the goal to, for example:

  • Increase the % of search answers that contain your brand, and try this for variations of the same question, in different languages, from different users…
  • Improve the quality of your brand’s appearance in the response, by assessing the position, sentiment, or criticism around your brand… (a minimal sketch of these metrics follows this list)
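
As a rough illustration of what measuring this could look like, the sketch below runs variations of a question through a GenAI search product and computes how often, and how early, a brand appears in the generated answers. `ask_genai_search` is a hypothetical placeholder for whichever product or API you are evaluating, and the brand and questions are made up.

```python
# Sketch of simple brand-visibility metrics over GenAI search answers.
# `ask_genai_search` is a hypothetical placeholder for the product under test.

def ask_genai_search(question: str) -> str:
    """Placeholder: return the generated answer text for a question."""
    raise NotImplementedError

def brand_visibility(brand: str, question_variants: list[str]) -> dict:
    answers = [ask_genai_search(q) for q in question_variants]
    positions = [a.lower().find(brand.lower()) for a in answers]  # -1 if absent
    hits = [pos for pos in positions if pos >= 0]
    return {
        # % of answers that mention the brand at all
        "mention_rate": len(hits) / len(answers) if answers else 0.0,
        # average character position of the first mention (lower = more prominent)
        "avg_first_position": sum(hits) / len(hits) if hits else None,
    }

variants = [
    "What are the best tennis sneakers I can buy?",
    "Which tennis sneakers would you recommend?",
    "¿Cuáles son las mejores zapatillas de tenis?",  # same question, other language
]
# print(brand_visibility("AcmeSneakers", variants))  # hypothetical brand name
```

From there, the metrics could be extended towards sentiment around the mention, position relative to competitors, or stability across repeated runs.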

In this scenario, Data Scientists can bring their expertise in Natural Language Processing, their intuition for LLMs and predictive models, and their experience with metric definition and evaluation. Other roles such as Machine Learning Engineers or Back End Developers might also participate in these initiatives to scale queries, collect the data, and deploy the processes.

There may also be internal use cases for applying AI-based search! For online marketplaces or similar companies, much like in web search, there is the opportunity to explore what moving from traditional product search towards AI-based search means. Maybe users value a summary or some conclusions about the list of products displayed in their search, or maybe a degree of personalization in those text displays can make the content more relevant to the user… As big players start including these AI features in search, more companies will see value and competitive advantage in doing so too.
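
For this internal use case, a starting point could be as simple as prompting an LLM to summarize the result list the user already sees. The sketch below only builds the prompt and delegates the call to a hypothetical `call_llm` placeholder; the product fields are illustrative.

```python
# Sketch: summarize a marketplace search-result list for the user.
# `call_llm` is a hypothetical placeholder; the product fields are illustrative.

def call_llm(prompt: str) -> str:
    """Placeholder: send the prompt to an LLM and return its text response."""
    raise NotImplementedError

def summarize_results(query: str, products: list[dict]) -> str:
    lines = "\n".join(
        f"- {p['name']} | {p['price']} | rating {p['rating']}" for p in products
    )
    prompt = (
        f"A user searched for: '{query}'.\n"
        f"These products were returned:\n{lines}\n\n"
        "Write a short, neutral summary of the options, highlighting trade-offs "
        "in price and rating to help the user choose."
    )
    return call_llm(prompt)
```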

Wrapping it up

The future of search is undeniably tied to AI. Most users will soon explore and adapt to this new way of finding and accessing information, and data and tech literacy will be key to decreasing risks around misinformation, bias, discrimination, and diminished critical thinking.

As search engines evolve into answer machines, companies will need to adapt their strategies to stay visible and relevant. There will be a growing need to develop techniques to ensure their content is properly indexed, retrievable, and featured in the answers generated by AI-driven search engines. For companies offering search functionality, there is also the opportunity to explore what added value AI-based search can bring to their users. Either way, Data Scientists and other tech roles will be key to developing those solutions.

The combination of search and GenAI is fascinating, with a huge potential for many companies and cool Data Science initiatives ahead. This is just the beginning!

