Generative AI to answer customer inquiries

ChatGPT, a chatbot based on a Large Language Model (LLM), continues to cause a stir. Now that Microsoft's Copilot service makes it even easier to apply ChatGPT in your own business environment, many people feel this is the moment to develop a chatbot for customers or employees. With a chatbot, they no longer have to call the contact centre or a service desk with more complex questions, but can find the answer themselves. It's a good idea, but it takes a fair bit of work in practice, says Bastiaan Sjardin, CTO and AI Strategist at Future Facts Conclusion.

April 25th, 2024   |   Blog   |   By: Conclusion


Bastiaan Sjardin, CTO and AI Strategist at Future Facts Conclusion, discusses the use of generative AI in customer contact: the benefits, the complexity, and practical applications.

What do we mean by generative AI for customer contact? 

“Chatbots have been around for a long time,” he says. “They are mainly used for the simple questions that are easy to script. In customer contact, these are questions like: ‘Where can I find my invoice?’ And at a service desk, you can expect questions like: ‘How do I reset my password?’ Some companies have tried to integrate AI into these kinds of chatbots, but it is very complex.

This is because AI can give fictitious answers to questions but present them as facts, which is known as hallucination. The fact that AI can generate new ideas is a great strength of the technology, but in customer communication it is actually a weakness: you need to be sure that the chatbot’s answer is correct. For this reason, organisations often use chatbots in practice only for straightforward questions that can be scripted from A to Z. The more complex inquiries, and certainly those that involve giving advice, are left to specialists such as contact centre agents or service desk staff.”

With the labour market being as tight as it currently is, it is no surprise that companies are eagerly looking at ChatGPT’s capabilities. In particular, they are exploring how to apply it to their own organisation’s documents, such as an internal knowledge base. Then, instead of searching the knowledge base manually, employees only have to type in a question, and the LLM uses Natural Language Processing (NLP) to formulate a clear answer.
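The core of this knowledge-base approach is to ground the chatbot in the organisation's own documents rather than letting the model answer freely. As a minimal sketch, assuming a small in-memory knowledge base and a simple word-overlap retriever (all names and data here are illustrative, not a real product API; a production system would use semantic search and an actual LLM call):

```python
# Minimal sketch of grounding chatbot answers in a knowledge base: retrieve
# the best-matching article for a question, so an LLM can then be instructed
# to answer from that article only. Knowledge base, scoring, and names are
# illustrative assumptions.
import re

def tokens(text: str) -> set[str]:
    """Lowercase word set, ignoring punctuation."""
    return set(re.findall(r"[a-z]+", text.lower()))

def retrieve(question: str, knowledge_base: dict[str, str]) -> str:
    """Return the title of the article sharing the most words with the question."""
    return max(knowledge_base,
               key=lambda title: len(tokens(question) & tokens(knowledge_base[title])))

knowledge_base = {
    "Invoices": "You can find your invoice under Billing in your account settings.",
    "Passwords": "To reset your password, click Forgot password on the login page.",
}

print(retrieve("How do I reset my password?", knowledge_base))  # Passwords
```

The retrieved article would then be passed to the LLM as the only allowed source, which is what keeps the answer anchored to correct information.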

ChatGPT has made a huge leap over previous chatbots due to the richness of the underlying LLM.

Bastiaan Sjardin

What are the advantages of generative AI?

ChatGPT has made a huge leap over previous chatbots due to the richness of the underlying LLM. The bot can generate easy-to-read texts at the six language levels distinguished by the European reference framework, CEFR. You can actually specify the level at which the text should be written. You can also easily specify which documents can be used as sources. This way, you maintain control of the quality of the answers because you can be sure the chatbot will not use incorrect source information. 
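The two controls mentioned here, the CEFR language level and the allowed source documents, are typically set in the prompt itself. A hedged sketch of what such a prompt builder could look like (the exact wording and function names are hypothetical assumptions, not a documented API):

```python
# Illustrative sketch: build a prompt that pins the answer to a CEFR language
# level and restricts the model to approved source documents. The prompt
# wording is an assumption; sending it to an actual LLM is out of scope here.

CEFR_LEVELS = {"A1", "A2", "B1", "B2", "C1", "C2"}

def build_prompt(question: str, sources: list[str], level: str = "B1") -> str:
    """Compose an instruction that fixes language level and source material."""
    if level not in CEFR_LEVELS:
        raise ValueError(f"Unknown CEFR level: {level}")
    source_block = "\n".join(f"- {s}" for s in sources)
    return (
        f"Answer at CEFR level {level}, using ONLY the sources below.\n"
        f"If the sources do not contain the answer, say you do not know.\n"
        f"Sources:\n{source_block}\n"
        f"Question: {question}"
    )

print(build_prompt("Where can I find my invoice?",
                   ["Invoices are listed under Billing in account settings."]))
```

The instruction to admit ignorance when the sources are silent is a common guard against hallucination, since it gives the model an explicit alternative to inventing an answer.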

Why is generative AI so complex? 

Don’t be fooled into thinking that every answer will match what you expect. After all, the chatbot can still hallucinate: it might use information in the wrong context, or the form of the output might not match what you asked for.

The success of generative AI in enterprise environments therefore depends on good prompt engineering. The prompt has to take into account the desired tone of the answer and the level of accuracy required. To make sure the bot consistently gives the desired answer to a question, you will have to create a great many prompts; this is the only way to guarantee the relevance and quality of the answers. You will then need to validate the generated output with both specialists and end users.

People’s responses will probably vary, and after a validation session like this, you’ll end up with a lot of input to improve the prompts. It is a process you will have to keep repeating. Bastiaan: “You need to implement a system for continuous learning where you use feedback from end users and specialists to update the AI models constantly. That way, you create an adaptive system that performs better and better the longer you use it.” 
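The continuous-learning loop Bastiaan describes can be sketched as a simple feedback store: collect ratings from end users and specialists per prompt, and flag prompts whose average score falls below a threshold so they get another prompt-engineering pass. All names, the 1–5 scale, and the threshold below are illustrative assumptions:

```python
# Sketch of the feedback loop described above: record ratings per prompt and
# surface the prompts that need revision in the next iteration. The class,
# rating scale, and threshold are hypothetical illustrations.
from collections import defaultdict

class FeedbackLoop:
    def __init__(self, threshold: float = 3.5):
        self.threshold = threshold
        self.ratings = defaultdict(list)  # prompt_id -> list of 1-5 scores

    def record(self, prompt_id: str, score: int) -> None:
        self.ratings[prompt_id].append(score)

    def prompts_to_revise(self) -> list[str]:
        """Prompts whose average rating falls below the threshold."""
        return [pid for pid, scores in self.ratings.items()
                if sum(scores) / len(scores) < self.threshold]

loop = FeedbackLoop()
loop.record("invoice_question", 5)
loop.record("invoice_question", 4)
loop.record("password_reset", 2)
loop.record("password_reset", 3)
print(loop.prompts_to_revise())  # ['password_reset']
```

Running this loop after every validation session is what turns the chatbot into the adaptive system the article describes: low-scoring prompts are rewritten, re-validated, and rated again.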

Emerging Technologies Trend Report 2024


The Conclusion companies continuously monitor emerging IT trends and discover and explore new technologies that may be of interest to our clients. In our semi-annual trend report, we highlight several technologies that, in our opinion, deserve attention. Will you explore with us?

Generative AI in practice at APG 

Pension administrator APG has been using chatbots in its contact centre for years. Based on the new possibilities with ChatGPT, the company would like to take it a step further, and asked Future Facts Conclusion to help think about this. Bastiaan: “Firstly, we took part in the Ideation phase. As many as 20 to 25 ideas emerged on how to use generative AI to improve customer contact.

Those ideas were prioritised on criteria such as feasibility, positive impact, and required investment. We developed a generic framework that we can use for any question about generative AI. That takes a bit more time to develop up front, but it means we can move towards an architecture and organisation that is set up to deliver generative AI responsibly.

For example, with the framework approach we can reuse prompts once they’ve been developed. In the near future, this will not only save us time, but also ensure higher quality. This is especially important for APG because, as we’ve said, AI tends to hallucinate. Of course APG can’t afford to make mistakes; they’re under strict supervision. With the framework, we guarantee that we take the lessons from one project into the next.”

The success of generative AI in enterprise environments therefore depends on good prompt engineering.

Bastiaan Sjardin

CTO and AI Strategist at Future Facts Conclusion

