
ChatGPT and enterprise search in 2023

Thanks to advances in AI such as neural networks and the large language models (LLMs) behind ChatGPT, enterprise search is entering a new realm of precision and capability.





At the core of these capabilities are large language models (LLMs), specifically the generative LLM that makes ChatGPT possible. LLMs are not new, but their capabilities and scope, and the pace of innovation around them, are evolving at a mind-boggling speed.


The purpose of search is information retrieval: bringing to the surface something that already exists. Generative AI and apps like ChatGPT, by contrast, create something new based on what the LLM has been trained on.


ChatGPT creates an imperfect reflection of the material it already knows

ChatGPT is a bit like search in that you interact with it via conversational questions, but unlike search, ChatGPT doesn't retrieve information or content; instead, it creates an imperfect reflection of the material it already knows (the data it was trained on). At bottom, it is nothing more than a string of words assembled based on probability.
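
To make the "words assembled based on probability" point concrete, here is a minimal, purely illustrative sketch: at every step a generative model assigns a probability to each candidate next word and samples one. The toy vocabulary and probabilities below are invented for demonstration; a real LLM does this over tens of thousands of tokens with a deep neural network rather than a lookup table.

```python
import random

# Toy illustration: generation is repeated sampling of a "next word"
# from a probability distribution. Vocabulary and probabilities are
# made up purely for demonstration.
next_word_probs = {
    "search":  {"engine": 0.6, "results": 0.3, "party": 0.1},
    "engine":  {"indexes": 0.5, "returns": 0.4, "sings": 0.1},
    "results": {"are": 0.7, "arrive": 0.2, "dance": 0.1},
}

def generate(start: str, steps: int = 3) -> str:
    words = [start]
    for _ in range(steps):
        dist = next_word_probs.get(words[-1])
        if dist is None:  # no known continuation for this word
            break
        choices, weights = zip(*dist.items())
        words.append(random.choices(choices, weights=weights)[0])
    return " ".join(words)

print(generate("search"))  # e.g. "search engine returns" -- plausible, not retrieved
```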


Applying LLMs to search used to be an expensive and complicated business. That changed last year, when the first companies began incorporating LLMs into enterprise search. As better LLMs become available, and as existing LLMs are tuned to perform specific tasks, they will do a better job of surfacing the most relevant content, giving us more focused results, and they will do so in natural language. Generative LLMs also hold promise for synthesizing search results into digestible and easy-to-understand summaries.
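
As a rough illustration of how a generative LLM might synthesize search results into a summary, here is a minimal sketch of the common retrieve-then-summarize pattern. The `search_index.query` helper is a hypothetical stand-in for an enterprise search backend, and the model name is an assumption; this is not any specific vendor's API.

```python
from openai import OpenAI  # assumes the openai Python package is installed

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def summarize_results(question: str, search_index) -> str:
    """Retrieve-then-summarize sketch: the search engine finds the passages,
    the generative LLM only rephrases them into a digestible answer."""
    # `search_index.query` is a hypothetical placeholder for whatever
    # search backend returns the top matching passages.
    passages = search_index.query(question, top_k=5)
    context = "\n\n".join(passages)

    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # any chat-capable model would do
        messages=[
            {"role": "system",
             "content": "Summarize the provided passages to answer the question. "
                        "Only use information contained in the passages."},
            {"role": "user",
             "content": f"Question: {question}\n\nPassages:\n{context}"},
        ],
    )
    return response.choices[0].message.content
```

Keeping the search engine responsible for retrieval and the LLM responsible only for rephrasing is what keeps the summary grounded in content that actually exists in the corpus.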


The loss of organizational knowledge is one of the most serious problems companies face today. High employee turnover, whether through voluntary attrition, layoffs, M&A restructuring, or downsizing, often leaves knowledge stranded on islands of information. Intelligent enterprise search bridges these islands and enables organizations to easily find, display, and share the information and corporate knowledge of their best employees.


Neural search is the first significant step in decades toward computers adapting to humans

Smart enterprise search tools work much better, delivering results that are far more relevant than in-app search. Advances in neural search incorporate LLM technology into deep neural networks: neural search is the first significant step in decades toward computers learning to work on human terms, rather than humans adapting to computers.
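
One way to picture what "neural search" means in practice: documents and queries are embedded into vectors by a neural network, and relevance becomes similarity in that vector space rather than keyword overlap. The sketch below assumes the sentence-transformers library and a small public model; the documents are invented examples.

```python
from sentence_transformers import SentenceTransformer, util

# Assumed setup: the sentence-transformers library with a small public model.
model = SentenceTransformer("all-MiniLM-L6-v2")

documents = [
    "Expense reports must be submitted within 30 days of travel.",
    "The VPN client can be installed from the internal software portal.",
    "New hires receive their laptop hardware on their first day.",
]
doc_embeddings = model.encode(documents, convert_to_tensor=True)

query = "how do I claim travel costs"
query_embedding = model.encode(query, convert_to_tensor=True)

# Cosine similarity in embedding space stands in for keyword matching, so
# the expense-report document ranks first despite sharing no keywords.
scores = util.cos_sim(query_embedding, doc_embeddings)[0]
best = scores.argmax().item()
print(documents[best], float(scores[best]))
```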

Neural search also gives question answering a boost: users can extract answers to simple questions when those answers are present in the search corpus. This shortens time to comprehension, allowing an employee to get a quick response and continue with their workflow without getting sidetracked by a lengthy hunt for information.
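
The answer-extraction capability described above resembles extractive question answering: given a question and a retrieved passage, a model points to the span of text that answers it. A minimal sketch with the Hugging Face transformers pipeline (assumed environment, default QA model, invented example passage) might look like this.

```python
from transformers import pipeline

# Assumed setup: the Hugging Face transformers library with its default
# extractive question-answering model.
qa = pipeline("question-answering")

# In a real deployment, the context would be the top passage returned by
# the enterprise search engine for the user's question.
context = (
    "Employees can reset their own passwords through the self-service "
    "portal. If the account is locked, the IT helpdesk must be contacted."
)

result = qa(question="How do I reset my password?", context=context)
print(result["answer"], result["score"])  # span copied from the context, plus a confidence score
```

Because the answer is a span lifted directly from the corpus rather than generated text, this style of question answering avoids the "imperfect reflection" problem described earlier.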


In this way, question-answering capabilities will extend the usefulness and value of intelligent enterprise search, making it easier than ever for employees to find what they need.


Innovation is built on knowledge and its connections. These connections come from the ability to interact with content and with each other, to draw meaning from those interactions, and to create new value. Enterprise search facilitates these connections across information silos and is therefore a key enabler of innovation.


Source: VentureBeat


