The purpose of this article is to explore some best practices for monitoring and improving chatbot performance and responding to user feedback. To help you navigate the article we have broken it down into the following sections:
- What is post-launch monitoring?
- Who has access to complete post-launch monitoring?
- Reviewing IDK Interactions
- Reviewing Search Interactions
- Reviewing User Feedback (Optional feature)
What is post-launch monitoring?
Regularly reviewing interactions post-launch and making adjustments based on user feedback is crucial to ensure that your chatbot is performing optimally and continuing to meet the needs of your users. We recommend spending 30 to 60 minutes at the end of each month reviewing your chatbot's performance.
Who has access to complete post-launch monitoring?
Completing post-launch monitoring requires specific user permissions.
For more information on user permissions, review the User Roles & Permissions article.
Reviewing IDK Interactions
IDKs (I Don't Knows) are instances in which your chatbot was unable to provide any response to a user's input.
As a result, reviewing and addressing IDK interactions is one of the most effective ways you can quickly improve your chatbot's performance. Patterns in IDK responses may indicate a gap in content about a topic that is important to your users. Adding relevant questions can improve your chatbot's performance immediately and reduce the number of IDK responses in the future.
- To review IDK interactions, go to the Chatbot>Interactions page.
- Select the Filter dropdown, and from the Response Type dropdown, select IDK.
- If you have more than one page of interactions to review, consider exporting the results as a .csv file that can be further filtered, sorted, and organized in Excel or another spreadsheet application. You can export the interactions by selecting the Save Disc icon next to the filter icon.
- Select the chat icon to view the interaction.
- For any question you would like your chatbot to answer if asked in the future, select Add A Response.
- In the Response field, craft your response, and select Save.
For more information on crafting your response, review the Creating a Knowledge Base Response article.
- After you craft and save your response, it will be sent to the Moderation Queue for review by Ocelot Staff before being published to your live knowledge base.
If Automatic Content Generation provided the IDK response, the Add A Response button will not be available; instead, create a new question to add the content to the knowledge base.
For more information about how to create a new knowledge base question, review the Creating and Editing a Knowledge Base Question article.
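If your exported .csv is too large to triage comfortably in a spreadsheet, a short script can group repeated IDK inputs so the most common content gaps surface first. The sketch below uses only Python's standard library; the column names (`User Input`, `Response Type`) are assumptions for illustration — match them to the headers in your actual export.

```python
import csv
from collections import Counter
from io import StringIO

# Sample rows standing in for an exported interactions .csv file.
# The column names here are assumptions; check your export's actual headers.
SAMPLE_CSV = """User Input,Response Type
how do i apply for housing,IDK
housing application deadline,IDK
when is orientation,Knowledge Base
how do i apply for housing,IDK
"""

def top_idk_inputs(csv_text, n=5):
    """Count IDK interactions by user input, most frequent first."""
    reader = csv.DictReader(StringIO(csv_text))
    counts = Counter(
        row["User Input"].strip().lower()
        for row in reader
        if row["Response Type"] == "IDK"
    )
    return counts.most_common(n)

print(top_idk_inputs(SAMPLE_CSV))
# -> [('how do i apply for housing', 2), ('housing application deadline', 1)]
```

The inputs that repeat most often indicate the knowledge base gaps worth addressing first.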
Best Practice: You do not need to add every IDK interaction to your chatbot's knowledge base. Some IDK interactions may be overly vague or complex, outside of your contracted libraries, or altogether outside the scope of a chatbot designed to answer questions about higher education. For more information about what makes an effective question, review the Editing the Knowledge Base Best Practices article.
Regularly monitoring your chatbot's IDKs and adding relevant content will reduce the number of IDK interactions in your chatbot over time.
If you notice a pattern of questions outside of your contracted departments or interactions in a language other than English in your IDK responses, reach out to your CSM. Ocelot offers content libraries for many campus offices and translations for several additional languages.
Reviewing Search Interactions
Search responses are instances in which your chatbot was unable to provide a response from your knowledge base and instead used your spider to offer a list of relevant links from your indexed webpages.
Patterns in search responses can also provide valuable insight into what your users are searching for and whether your bot is providing helpful suggestions.
- To review Search interactions, go to the Chatbot>Interactions page.
- Select the Filter dropdown, and from the Response Type dropdown, select Search.
- If you have more than one page of interactions to review, consider exporting the results as a .csv file that can be further filtered, sorted, and organized in Excel or another spreadsheet application. You can export the interactions by selecting the Save Disc icon next to the filter icon.
- Select the chat icon to view the interaction.
- For any question you would like your chatbot to answer if asked in the future, select Add A Response.
- In the Response field, craft your response, and select Save.
For more information on crafting your response, review the Creating a Knowledge Base Response article.
- After you craft and save your response, it will be sent to the Moderation Queue for review by Ocelot Staff before being published to your live knowledge base.
If Automatic Content Generation provided the search response, the Add A Response button will not be available; instead, create a new question to add the content to the knowledge base.
For more information about how to create a new knowledge base question, review the Creating and Editing a Knowledge Base Question article.
Note: If you are seeing a high number of interactions that resulted in a search response, consider whether your institution has recently made changes to its webpages.
Administrative-level users can also adjust spidered pages, force-run a spider outside of its regularly scheduled interval, or adjust when and how often the spider runs to ensure search results are up to date.
For more information about spiders, review the Navigating the Admin: Spiders article.
Reviewing User Feedback (Optional feature)
User feedback enables chatbot users to provide positive (thumbs up) or negative (thumbs down) feedback about general library questions, custom questions, clarifying questions, and integrated questions.
Negative Feedback with Comments
Users who provide negative feedback on a chatbot response can also submit a comment with additional information about their choice. Reviewing negative user feedback can help identify opportunities to improve your chatbot's responses to meet your users' needs.
- To review user feedback with comments, go to the Chatbot>User Feedback Analytics page.
By default, the analytics are filtered to the last month; select the timeframe dropdown to change the parameters as needed.
- The Interactions Feedback chart displays the percentages of positive and negative feedback and the number of negative feedback responses with comments. Under Comments, select View All. This will take you to the pre-filtered Interactions page showing all negative user feedback with comments.
- If you have more than one page of interactions to review, consider exporting the results as a .csv file that can be further filtered, sorted, and organized in Excel or another spreadsheet application. You can export the interactions by selecting the Save Disc icon next to the filter icon.
- Select the chat icon to view the interaction.
- To modify the response, select the Edit Response button.
- Edit the response and select Save.
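If you exported the feedback interactions as a .csv in the steps above, a short script can pull out just the negative feedback that includes a comment, grouped with the question it concerns. This is a minimal sketch using Python's standard library; the column names (`Question`, `Feedback`, `Comment`) are assumptions for illustration — match them to the headers in your actual export.

```python
import csv
from io import StringIO

# Sample rows standing in for an exported user-feedback .csv file.
# The column names here are assumptions; check your export's actual headers.
SAMPLE_CSV = """Question,Feedback,Comment
When is tuition due?,Negative,Please add the fall deadline
When is tuition due?,Negative,Needs a link to the payment portal
Where is the library?,Positive,
"""

def negative_comments(csv_text):
    """Return (question, comment) pairs for negative feedback that includes a comment."""
    reader = csv.DictReader(StringIO(csv_text))
    return [
        (row["Question"], row["Comment"])
        for row in reader
        if row["Feedback"] == "Negative" and row["Comment"].strip()
    ]

for question, comment in negative_comments(SAMPLE_CSV):
    print(f"{question} -> {comment}")
```

Reading the comments side by side with the question they concern makes it easier to spot which responses need more detail, links, or contact information.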
When reviewing user feedback to make adjustments to chatbot content, consider why the user provided negative feedback and what, if any, changes they have suggested.
- Did the response answer the user's question, or point them to the appropriate resource to have it answered?
- Did the response provide enough information about a complex process?
- Did the response include enough information for users from different populations, if relevant (e.g., prospective students, current students, alumni, etc.)?
- Is the user asking for additional information or details about a complex task?
- Is the user asking for additional contact information or links?
When a user's feedback requests additional details or contact information, that is a valuable suggestion you can use to improve the response for all users.
If Automatic Content Generation created the response, the Edit Response button will not be available; instead, create a new question to add the content to the knowledge base.
For more information about how to create a new knowledge base question, review the Creating and Editing a Knowledge Base Question article.
Best Practice: You may not need to change your chatbot's responses based on every piece of user feedback. Some feedback may be too vague or too hyper-specific to be actionable or applicable to all users. Additionally, some feedback may express disappointment with the substance of the answer itself rather than with the response's length, level of detail, or written quality.
Lowest Performing Responses
Reviewing and addressing your chatbot's lowest-performing responses based on user feedback is another effective way to quickly improve your chatbot's performance.
- To review the lowest-performing responses, go to the Chatbot>User Feedback Analytics page.
- Scroll to the bottom of the page to find the Lowest Performing Responses chart. Questions are arranged in descending order, with those receiving the most negative feedback at the top of the list.
- To review feedback for a specific question, select the question from the list.
- Select the dropdown arrow next to the question, then select Edit or View.
- Select the Analytics tab to review all user feedback and comments for that question.
- If a question has multiple negative feedback indicators but no comments, you should still review the response for accuracy and add any additional detail that may be helpful to your users.