Small language models poised to have a big impact in retail

As sustainable alternatives to resource-heavy and costly LLMs, SLMs mark a pivotal development in AI.
30 May 2024
Retail-digital
Barry Thomas

Senior Retail Commerce Thought Leader, North America


As businesses navigate the ever-evolving digital landscape, the integration of AI becomes increasingly crucial to compete and win in the marketplace. One significant development in this space is the emergence of small language models (SLMs), which are gaining traction across various sectors, including CPG and retail. These models have the potential not only to enhance operational efficiencies but also to reshape customer interactions.

What is an SLM?

An SLM is a scaled-down version of a large language model (LLM) that is designed to perform specific tasks involving understanding and generating humanlike text. Unlike their larger counterparts, SLMs require less computational power and data to train, making them more accessible and easier to deploy, especially for businesses that do not need or cannot afford the vast capabilities of larger models.

SLMs are generally 5 to 10 times smaller than LLMs, so they consume much less energy. They can also be hosted on a single graphics processing unit (GPU), which is a major benefit given the shortage of these chipsets and the steep computational costs.
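To make the single-GPU point concrete, the short Python sketch below shows one way a retail team might run a publicly available small model locally with the open-source Hugging Face transformers library. The model name is an illustrative stand-in rather than a recommendation, and a recent version of the library is assumed.

# Minimal sketch: running a compact open model on one local GPU (or CPU)
# using the Hugging Face transformers library.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="microsoft/Phi-3-mini-4k-instruct",  # example small model; any compact model could be swapped in
    device_map="auto",  # use the local GPU if one is available
)

prompt = "In one sentence, suggest a gentle cleanser for dry, sensitive skin."
result = generator(prompt, max_new_tokens=60)
print(result[0]["generated_text"])

The same few lines run unchanged on a standard desktop machine, which is what makes rapid experimentation and prototyping with SLMs practical.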

Key benefits of SLMs

These characteristics make SLMs particularly suited for small to medium-sized enterprises or departments within larger corporations that need to implement AI solutions quickly and efficiently:

  • Efficiency: SLMs can be executed on standard desktop machines, enabling companies to quickly develop and deploy natural language solutions for faster experimentation, prototyping, and iteration.
  • Customization and adaptability: Companies can train SLMs to their specific requirements. By leveraging proprietary data sets, SLMs excel in specialized domains, providing accurate and relevant results.
  • Lower cost: While large models are impressive, their expense can burden companies with limited resources. Looking ahead to 2027, 80% of organizations plan to accelerate process automation and to invest in on-the-job learning and training for current employees. SLMs offer a cost-effective alternative, allowing local training and deployment without incurring substantial ongoing costs and licensing fees.

How could retailers go big with SLMs?

Two retail-specific use cases lend themselves well to SLMs:

  • In-store personal assistants: Retailers could integrate SLMs into their in-store kiosks or mobile apps to act as personal shopping assistants. These SLM-powered assistants can quickly process shopper queries, offer personalized shopping advice, and provide product recommendations based on shoppers’ preferences and past purchase history.

    Consider a shopper looking for a skin care product. The SLM assistant could ask about the shopper’s skin type, concerns, and preferences, and then instantly suggest products that address those needs (a simple prompt sketch follows this list).

    This interactive and tailored experience also helps protect privacy, because localized data processing (a key SLM advantage) minimizes the data sent to the cloud.

  • Enhanced customer support: SLMs can handle customer service queries through text- or voice-based interfaces. These models can be embedded directly into retailers’ customer support systems, such as chatbots on websites or customer service apps on smartphones. By processing inquiries locally, the SLMs can quickly respond to questions about product availability, order status, store hours, and more, without the latency associated with cloud-based processing.
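To illustrate the in-store assistant idea from the list above, the sketch below shows how a shopper's stated needs might be folded into a grounded prompt for a locally hosted SLM. build_assistant_prompt is a hypothetical helper; the resulting prompt would be passed to whatever local inference call the retailer deploys (for example, the pipeline sketched earlier).

# Illustrative sketch: composing a grounded prompt for an in-store
# skin care assistant; build_assistant_prompt is a hypothetical helper.
def build_assistant_prompt(skin_type: str, concerns: str, preferences: str) -> str:
    return (
        "You are an in-store skin care assistant. Recommend up to three "
        "products from the store's current catalogue only.\n"
        f"Shopper skin type: {skin_type}\n"
        f"Main concerns: {concerns}\n"
        f"Preferences: {preferences}\n"
        "Keep the answer under 80 words."
    )

prompt = build_assistant_prompt(
    skin_type="dry",
    concerns="redness and flaking",
    preferences="fragrance-free, under $30",
)
# The prompt is then sent to the local model, e.g. generator(prompt, max_new_tokens=120)

Keeping the prompt grounded in the store's own catalogue, and processing it on site, reflects the customization and privacy benefits discussed above.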

What’s next?

The landscape of SLMs has been evolving quickly, with companies including Microsoft, OpenAI, and Hugging Face releasing compact models that offer impressive capabilities.

As sustainable alternatives to resource-heavy and costly LLMs, SLMs mark a pivotal development in AI. With their efficiency, customizability, and lower operational costs, SLMs could help retailers reimagine what they can achieve. Moreover, smaller challenger brands and CPGs are likely to turn to SLMs as a cost-effective way to take advantage of AI.

By integrating these agile, small, and powerful AI models into retail operations, retailers and CPGs can seize the opportunity to lead.

For a more detailed look at AI use cases for enterprise organizations and similar topics, register for Kantar's 2024 Digital Commerce Conference.
