Unlike conventional ecommerce search engines that rely on exact keyword matching and continuous manual updates, ATS is designed to understand the meaning and context of the words in a query. This approach eliminates the need for extensive manual configuration, reducing overhead for website owners and delivering an intuitive search experience for customers. Longtail search queries can account for up to 80% of site search traffic, and this is one of the key opportunities that ATS is best placed to address.
“At Particular Audience, we’ve always focused on addressing the root causes of discovery abandonment with applied artificial intelligence,” said CEO James Taylor. “With ATS, we’ve harnessed the power of Large Language Models, paired with our own vertical tuning, to generate the most relevant search results right out of the box, no matter how niche or conversational a search is.”
Adaptive Transformer Search is built on transformer models, which convert sequential long-form text (retailer catalogue and website data) into vectors in a high-dimensional space. The conversion of a sequence of words into a single vector is known as a sentence embedding, a concept popularised by large language models such as Google’s BERT and OpenAI’s GPT. This means ATS is capable of understanding the meaning of a sentence and can, for example, tell the difference between ‘getting a laptop online using a credit card’ and ‘getting a credit card online using a laptop’.
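The word-order point can be made concrete with a minimal, self-contained Python sketch. This is not ATS's or BERT's actual implementation — real sentence embeddings come from trained transformer layers — but a toy illustration of why a bag-of-words (pure keyword) representation cannot distinguish the two example sentences, while even a crude position-aware embedding can:

```python
from collections import Counter

a = "getting a laptop online using a credit card"
b = "getting a credit card online using a laptop"

# A bag-of-words view ignores order: both sentences contain
# exactly the same words, so keyword matching sees them as identical.
bow_a, bow_b = Counter(a.split()), Counter(b.split())
print(bow_a == bow_b)  # True: indistinguishable to keyword search

# A toy order-aware embedding: weight each word by its position,
# a crude stand-in for the positional encodings a transformer uses.
vocab = sorted(set(a.split()) | set(b.split()))

def embed(sentence):
    vec = [0.0] * len(vocab)
    for pos, word in enumerate(sentence.split(), start=1):
        vec[vocab.index(word)] += pos  # position-weighted count
    return vec

print(embed(a) == embed(b))  # False: word order now matters
```

In a production system the embedding function would be a trained transformer, and similarity between a query vector and catalogue vectors would be computed with cosine similarity over the high-dimensional space, but the principle is the same: encoding position lets the model separate sentences that share identical words.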