How Bazaarvoice Navigated the Challenges of Deploying an LLM App

Bazaarvoice, a top platform for user-generated content and social commerce, has leveraged AI for much of its history — and now has a pioneering LLM app in production.

Lou Kratz, Principal Research Engineer at Bazaarvoice, leads those efforts from a technical perspective. “The biggest impact AI has at Bazaarvoice is around ensuring the content that we provide our clients — which is generated by users — is of high quality,” he recently said. In addition to leveraging other AI systems, “we used generative AI recently to release what we call a content coach that guides consumers through the process of writing a good review.”

Kratz sees two challenges that teams should not overlook when getting an LLM app into production:

- Data quality for RAG: “You look at something like retrieval augmented generation — it’s really powerful, it can really make things explainable and usable to the general public — but it’s only as good as the data we give it. When it comes to business-specific data, the first challenge is getting that cleaned up.” (A minimal cleaning sketch follows this list.)

- Education: “Almost all of our data scientists and engineers have become mentors…in order to help people understand the specifics about how AI works and if it solves their use case.”
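
To make the data-quality point concrete, here is a minimal sketch of the kind of preprocessing that first challenge implies: normalizing and deduplicating user-generated documents before they are embedded and indexed for retrieval. The function name, thresholds, and sample reviews are illustrative assumptions, not Bazaarvoice's actual pipeline.

```python
import re
from hashlib import sha256

def clean_documents(raw_docs, min_chars=10):
    """Normalize and deduplicate user-generated text before it is
    embedded and indexed for retrieval-augmented generation.
    (Illustrative sketch; thresholds are arbitrary assumptions.)"""
    seen = set()
    cleaned = []
    for doc in raw_docs:
        text = re.sub(r"\s+", " ", doc).strip()   # collapse stray whitespace
        if len(text) < min_chars:                 # drop near-empty snippets
            continue
        fingerprint = sha256(text.lower().encode()).hexdigest()
        if fingerprint in seen:                   # skip exact duplicates
            continue
        seen.add(fingerprint)
        cleaned.append(text)
    return cleaned

# Only the cleaned documents would go on to the embedding/indexing step.
reviews = [
    "Great product!!   ",
    "great product!!",
    "ok",
    "Works as described, and the battery lasts two days.",
]
print(clean_documents(reviews))
# ['Great product!!', 'Works as described, and the battery lasts two days.']
```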

We caught up with Kratz at Arize:Observe this year to talk about Bazaarvoice’s generative AI use cases and how the team overcomes challenges in production.