We don’t have an abundance of crystal balls at The CAI Company, but things are so exciting in the world of conversational AI right now that we thought we’d take ourselves out of our comfort zone and make some predictions. Overall, 2025 is shaping up to be a pivotal year for conversational AI. Businesses that embrace the technology (strategically) and address the associated challenges will be well-positioned to reap the rewards. Here’s what we think might be on the cards. Check back in at the end of the year and see how we did!

1. Conversational AI will be widely piloted for customer-facing applications.

This is quite a safe prediction to start with as it’s backed up by a recent Gartner report that suggested 85% of customer service and support leaders plan to explore or pilot conversational generative AI solutions in 2025.

2. There will be a new focus on delivering tangible value from AI.

Companies will begin to move away from adopting AI simply to “keep up” and will instead focus on driving measurable value from AI investments. In the early days of chatbots, we saw companies incorporating them as the shiny new thing, before realising the importance of having a use case, a plan and a way to demonstrate effectiveness. That cycle is playing out again with generative AI bots, but we predict that the early-adopter companies that experimented this year will be doubling down on effectiveness and ROI in 2025.

3. We’ll see more LLM-based bot platforms emerging.

At The CAI Company we’ve enjoyed integrating LLMs into virtual assistants, and indeed into our workflows. For most of our clients, a hybrid approach makes the most sense for now, but early testing of platforms like iostack shows how sophisticated functionality (and guardrails) are being built into the new wave of products to make Gen AI-first bots that are both excellent at conversation and at staying on topic.

4. We’ll continue to see a shift in responsibility for AI from IT teams to CX teams.

Customer service departments will continue to grow in influence in the AI sphere. Traditionally, IT teams have been responsible for AI strategy. However, research indicates that customer service leaders are now spearheading it. For instance, nearly half of customer service leaders surveyed stated that their department is responsible for identifying AI opportunities, while only 19% believe that IT should be responsible.

5. Voice is in the ascendancy.

We’ll speak to more bots in customer service and elsewhere. We’ll also begin to see conversational AI more in the Internet of Things: Could this be the year that we’ll ask our fridges for menu suggestions based on what’s in them? Or that we take driving tips from our cars?

6. Improved AI-powered search will disrupt the status quo.

Traditional search will be disrupted by conversational AI. AI-powered tools are already showing how they can pull precise answers from various data sources. For example, the Reddit Answers feature uses generative AI to respond to user queries, offering summaries and linking to relevant communities. Google’s NotebookLM shows the power of curating data sources and using AI to query them and reformat their content into digestible information – you can even chat with the AI about your data and have it turned into a podcast style experience.

7. Companies will need to improve their knowledge libraries for conversational AI to function properly.

61% of customer service leaders surveyed reported backlogs in editing articles (Gartner). Over a third admitted to lacking formal processes for revising outdated materials. Using poor-quality data to train external-facing bots presents a risk to companies: garbage in = garbage out. Tooling that helps companies tag their content or organise it ontologically will gain traction. In a recent (excellent) webinar from VUX.world, the panel talked extensively about the role of content strategy in LLM applications. While we would love ‘ontology’ to be the word of 2025, we’ll settle for a renewed interest in content systems as a trend to watch for.

8. We’ll hear a lot more about ‘agentic AI’ but we predict evolution rather than revolution.

As experienced conversational AI practitioners, we have been designing and building bots that can complete tasks for the user for some years, but many chatbots are still only capable of simple information exchange.

Agentic AI refers to artificial intelligence systems with enhanced autonomy, decision-making capabilities, and adaptability. AI agents can pursue complex goals and execute workflow tasks with limited direct human supervision, so their uses include, but also go beyond, acting as customer agents. In short, “agentic AI” will eventually replace “chatbots” and will mean it’s no longer embarrassing for CAI experts to tell people what we do at dinner parties.

9. We’ll see some new analytics tooling on the block.

Bot and CX analytics are rapidly changing. This year we’ve seen contact centre software growing into the CAI space, and we’ve seen consolidation of tooling and rebranding, e.g. Calabrio/Wysdom. We predict more consolidation in the omni-channel analytics space, but also that new contenders will emerge to cater for agentic AI and bot teams. As more bots (with less standardised answers) are deployed, there will be increased demand for specialist, high-quality analytics that can make sense of both the structured and unstructured data from automated conversations.

10. Companies will look to use Gen AI more efficiently.

Rather than processing 100% of data through an LLM or Gen AI process, companies will look to reduce their costs, computational resources and environmental impact by making use of caching and memory. We’ve seen clients roll out LLM-based machine translation for their CAI solutions, but with a cache to store previous translations. It proved to be a canny decision, with 90% of utterances going to the cache, rather than through the LLM.
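The caching pattern described above can be sketched in a few lines. This is a minimal illustration, not our client’s actual implementation: the `translate_with_llm` function is a hypothetical stand-in for a real LLM call, and the normalisation and keying choices are assumptions.

```python
import hashlib


def translate_with_llm(text: str, target_lang: str) -> str:
    """Hypothetical stand-in for a real (and costly) LLM translation call."""
    return f"[{target_lang}] {text}"


class TranslationCache:
    """Serve repeated utterances from a cache so they never reach the LLM."""

    def __init__(self) -> None:
        self._store: dict[str, str] = {}
        self.hits = 0
        self.misses = 0

    def _key(self, text: str, target_lang: str) -> str:
        # Normalise case and whitespace so trivial variants of the same
        # utterance share a single cache entry.
        normalised = " ".join(text.lower().split())
        return hashlib.sha256(f"{target_lang}:{normalised}".encode()).hexdigest()

    def translate(self, text: str, target_lang: str) -> str:
        key = self._key(text, target_lang)
        if key in self._store:
            self.hits += 1
            return self._store[key]
        self.misses += 1
        result = translate_with_llm(text, target_lang)
        self._store[key] = result
        return result
```

With a hit rate like the 90% we saw in practice, nine out of ten utterances cost a dictionary lookup instead of an LLM round trip, which is where the cost and latency savings come from.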

11. We’ll shift from one-size-fits-all Gen AI solutions to more selective choices.

While many in 2024 went all-in with LLMs, next year businesses will prioritize using curated AI models for specific tasks to enhance control and user experience. Tim Willers from Nibble gave an excellent talk at Unparsed 2024 about how they are doing this with ‘atomised copy’.

As AI steps become more use-case specific, the need for platforms that can seamlessly integrate different AI tools will grow. For instance, creating a precise refund experience will demand more time and resources than a generic fallback, and the AI tools required will likely differ for both.
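The routing idea above can be sketched as follows. This is a hedged illustration under stated assumptions: the handlers stand in for whatever specialised AI tools a team actually chooses, and keyword matching stands in for real intent classification.

```python
from typing import Callable


def handle_refund(query: str) -> str:
    """Stand-in for a curated, tightly controlled refund flow."""
    return "refund-flow"


def handle_fallback(query: str) -> str:
    """Stand-in for a generic Gen AI fallback."""
    return "generic-fallback"


# Each route pairs trigger keywords with the specialised tool for that
# use case. Real systems would use an intent classifier instead.
ROUTES: list[tuple[tuple[str, ...], Callable[[str], str]]] = [
    (("refund", "money back", "return"), handle_refund),
]


def route(query: str) -> str:
    q = query.lower()
    for keywords, handler in ROUTES:
        if any(k in q for k in keywords):
            return handler(query)
    return handle_fallback(query)
```

The point is the shape, not the matching logic: high-value experiences get their own curated path, and everything else falls through to a cheaper generic one.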

12. Data privacy, regulations and ethics will contribute to scaled down use cases.

European companies are subject to GDPR, which means that companies have to be extremely careful about where their user data is stored or transmitted. This means that the geolocation of LLM servers has to be as much of a factor in decision-making as cost or accuracy. We’ve seen this impacting use cases in banks and other companies where data security is critical, and it will continue to put a brake on decisions.

 

So that’s our take. Whether we hit all the marks or not, 2025 promises to be an exciting year for conversational AI, and we’re buckling up for a fun ride.

What do you think? Did we miss anything?