Kore.ai Releases “Lighter, Less Expensive” LLMs for Customer Experience Use Cases – CX Today


Kore.ai has shared the latest release wave for its coveted conversational AI offering: the XO Platform.


That includes new XO GPT Models, which are tailored large language models (LLMs) that the Kore.ai team has developed to be “lighter, faster, and less expensive”.

In other words, they require less computing power, lowering costs for Kore.ai and – consequently – its customers.

The move is an important step toward the LLMs of the future, which are likely to be highly customized, fine-tuned, and hosted in the vendor’s own data center.

Indeed, Kore.ai has already fine-tuned the XO GPT Models for customer experience conversational use cases.

Moreover, they support paraphrasing within automated customer responses, enabling more human-like conversations.

Yet, Kore.ai does not tie its customers to these models. It enables enterprises to deploy and orchestrate various LLMs for different use cases to optimize their performance.

Also, customers can get into the nuts and bolts, altering the pre-configured prompt and business rules behind each use case to further test and optimize their deployments.

Five9 released a similar capability in March for CCaaS use cases – but this is the first time a pureplay conversational AI provider has brought such a feature to market.

More “Industry-First” Features from the Release

Elsewhere, Kore.ai has pulled its portfolio closer together, allowing users to access its Automation, Contact Center, Search, and Agent AI within one unified interface.

Derek Top, Senior Analyst at Opus Research, sees significant value within this single interface – alongside Kore.ai’s LLM advancements.

“Kore appears to be fulfilling a market requirement by providing Generative AI capabilities into existing and emerging workflows,” he told CX Today.

“Offering a multi-LLM approach and single interface provides flexibility for enterprises looking to start small and expand use cases in order to fit business needs.

“Also, given the various rates of AI adoption, many companies are leaning on solution providers as a way to get comfortable with Generative AI and LLMs.” With this announcement, Kore.ai shows it recognizes that trend.

The vendor has also released an agentless dialer, which enables businesses to send high-volume, personalized, and automated voice notifications.

Lastly, Kore.ai has launched new agent-assist features that surface guided instructions and relevant playbooks to service reps in real-time as they engage with customers.

All in all, these enhancements will allow businesses to put AI to work “10x faster” and generate “optimal conversational experiences”, as per Kore.ai.

A Word from Kore.ai Founder & CEO Raj Koneru

Having the final say, Raj Koneru, Founder & CEO of Kore.ai, celebrated new GenAI features within the release and their potential to reduce the costs and enhance the efficiency of bot deployments.

“Coming off the heels of a successful $150 million round, this next wave of innovation reinforces how Kore.ai is leading the way with generative AI,” he said.

“We have reimagined what it takes for our platform users to bring new AI-automated solutions to market faster and deliver value with speed, accuracy, and cost.

“XO V11.0 will take the complexities and time out of building and managing AI and focus on putting it to work to drive value at scale.”

In doing so, Kore.ai hopes to strengthen its leadership position in the enterprise conversational AI space, as reaffirmed in the most recent Gartner Magic Quadrant.

Meanwhile, Kore.ai has also expanded its routes to market and CX partnerships.

Its recently cemented relationships with CCaaS leader Genesys and enterprise communications high-flyer Zoom exemplify this – and the vendor’s fast-growing footprint across the CX space.

