QnABot on AWS is an open-source chatbot solution powered by Amazon Lex. It is designed to provide conversational AI capabilities across multiple channels and languages, allowing businesses to deploy self-service chatbots in their contact centers, websites, and social media platforms. By leveraging large language models (LLMs), QnABot can generate accurate answers from existing company documents and knowledge bases, making the chatbot more conversational and improving the customer experience.
With the latest QnABot releases (v5.4.0+), users can now take advantage of generative AI features. These include the ability to disambiguate customer questions by considering conversational context, and to dynamically generate answers from relevant FAQs or from Amazon Kendra search results and document passages. The generated answers also provide attribution and transparency by displaying links to the reference documents and context passages the LLM used to construct them.
When deploying QnABot, users have the option to automatically deploy a state-of-the-art LLM (Falcon-40B-Instruct) on an Amazon SageMaker endpoint. Alternatively, QnABot can integrate with other LLMs through an AWS Lambda function provided by the user. To facilitate this integration, sample Lambda functions (plugins) have been released that connect QnABot to leading LLM providers such as Amazon Bedrock, Anthropic, and AI21.
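The plugin contract is straightforward: QnABot invokes the Lambda function with a prompt and model parameters, and expects the generated text back. The sketch below is a minimal illustrative handler, assuming event fields named `prompt` and `parameters` and a response key `generated_text` (verify these against the custom-LLM interface documented for your QnABot version); `call_model` is a hypothetical placeholder for a real provider SDK call such as Amazon Bedrock or Anthropic.

```python
def call_model(prompt: str, parameters: dict) -> str:
    # Hypothetical placeholder: a production plugin would call the
    # provider's SDK here (e.g. boto3 bedrock-runtime, Anthropic, AI21).
    return f"(model answer for: {prompt[:40]})"


def lambda_handler(event, context):
    # QnABot passes the prompt and model parameters in the event.
    # Field names are assumptions based on the documented plugin
    # interface -- confirm them for your deployed QnABot version.
    prompt = event["prompt"]
    parameters = event.get("parameters", {})
    answer = call_model(prompt, parameters)
    # QnABot reads the generated answer from this response key.
    return {"generated_text": answer}
```

A real plugin would also handle provider errors and map QnABot's model parameters (temperature, max tokens) onto the provider's API.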
The new generative AI features of QnABot offer several advantages. First, they reduce the number of FAQs that must be maintained and imported into QnABot, since concise answers can be synthesized on the fly from existing documents. Generated answers can also be tailored to each channel: short, concise answers for voice-based contact center bots, and more detailed information for website or text-based bots. The features are compatible with QnABot's multi-language support, so users can ask questions in their preferred language and receive answers in that same language. Finally, generated answers include links to the reference documents and context passages used, providing transparency into how each answer was constructed.
The new features also enable QnABot to disambiguate follow-up questions using the preceding conversation as context. Drawing on this conversation memory, QnABot generates a clear, standalone search query to find the FAQs or document passages relevant to the user's question.
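The disambiguation step can be pictured as a prompt that asks the LLM to rewrite the latest question in self-contained form. The snippet below is an illustrative sketch of that idea, not QnABot's actual internal prompt template; the function name and prompt wording are inventions for this example.

```python
def build_disambiguation_prompt(chat_history, followup_question):
    """Build a prompt asking an LLM to rewrite a follow-up question
    into a standalone search query (illustrative, not QnABot's template).

    chat_history: list of (speaker, utterance) tuples from prior turns.
    """
    history = "\n".join(f"{speaker}: {text}" for speaker, text in chat_history)
    return (
        "Given the conversation below, rewrite the final question as a "
        "self-contained search query that preserves all needed context.\n\n"
        f"{history}\n"
        f"User: {followup_question}\n\n"
        "Standalone query:"
    )


# Example: the pronoun "its" only resolves with the earlier turns in view,
# so the LLM is given the history to produce something like
# "Amazon Kendra pricing".
prompt = build_disambiguation_prompt(
    [("User", "What is Amazon Kendra?"),
     ("Bot", "Amazon Kendra is an intelligent search service.")],
    "What does its pricing look like?",
)
```

The rewritten query is then used to search the curated FAQs or the Kendra index, so retrieval works even when the user's wording alone is ambiguous.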
To get started with the new generative AI features, users can follow a tutorial provided by AWS. The tutorial covers the steps to deploy the latest version of QnABot, create and populate an Amazon Kendra index, choose and deploy an LLM plugin (optional), configure QnABot, access the QnABot web client, and start experimenting with the generative AI capabilities. The tutorial also provides guidance on customizing the behavior of QnABot and adding curated Q&As and text passages to the knowledge base.
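Configuration happens through settings in QnABot's Content Designer. The dict below is only a convenient way to show a few of them together; the setting names reflect those used in recent QnABot releases (e.g. `LLM_API`, `LLM_QA_ENABLE`), but they and the placeholder values should be verified against your deployed version.

```python
# Illustrative QnABot Content Designer settings for enabling
# generative answers. Names and values are assumptions based on
# recent QnABot releases -- verify against your deployment.
qnabot_settings = {
    "LLM_API": "LAMBDA",                  # or "SAGEMAKER" for the built-in endpoint
    "LLM_GENERATE_QUERY_ENABLE": "true",  # disambiguate follow-up questions
    "LLM_QA_ENABLE": "true",              # generate answers from retrieved passages
    "ALT_SEARCH_KENDRA_INDEXES": "<your-kendra-index-id>",  # placeholder
}

for name, value in qnabot_settings.items():
    print(f"{name} = {value}")
```

With these settings in place, questions that miss a curated Q&A fall through to Kendra retrieval plus LLM answer generation, which is the behavior the tutorial walks through.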
Overall, the new generative AI features of QnABot make it more conversational and dynamic, allowing businesses to provide better customer experiences and generate responses based on their knowledge base.