Case Study | Testing & Automation

Automated Chatbot Utterance and Integration Testing to Improve Reliability 

A global firm deployed a Microsoft Teams–integrated chatbot to support employee requests such as ticketing, reminders, and scheduling. As the number of supported utterances and third-party integrations grew, manual validation became time-intensive and inconsistent. CES automated end-to-end chatbot testing using its in-house Zyna framework, validating responses, integrations, and UI flows, and enabling scheduled Azure-based regressions – reducing manual testing and eliminating framework setup effort. 

The Challenge

High-volume utterance coverage

Third-party integration response validation

Manual testing time and repeatability

the client

Investment Banking & Finance (BFSI)

United States

Technology Stack

  • Selenium
  • C#
  • Unit Testing Framework
  • Azure
  • Azure Pipelines
  • Microsoft Teams

Solution Area

  • Quality Engineering | Chatbot Testing Automation

the impact

400 Hours Setup Saved

Manual Testing Cut (20 Days to 8 Hours)

Weekly Scheduled Regression Runs

Broader Automation Reuse Across Web Apps

how we did it

The shift was automation-led.

The result: faster validation, repeatable regressions.

The Need

The firm’s Microsoft Teams–integrated chatbot handled a wide range of employee requests and pulled data from multiple applications. As utterances and integrations grew, the team needed a reliable way to validate functional behavior and third-party responses without spending weeks on repetitive manual testing.

Challenges

  • Utterance volume and test coverage: The chatbot supported many utterances, and validating each flow with the right data sets took significant time.
  • Integration and response validation across applications: The chatbot depended on responses from multiple applications, making manual verification slow and error-prone.
  • Regression readiness without long manual cycles: The team needed repeatable regression runs to confirm stability as changes were introduced.

The CES Solution

CES automated chatbot testing end to end using its in-house framework and CI execution.

  • Zyna-based automation foundation: Used CES’s ready-to-use in-house framework (Zyna) to accelerate automation startup and avoid building a framework from scratch.
  • Chatbot web interface automation: Automated the complete chatbot web interface to validate functional flows tied to user utterances (a test sketch follows this list).
  • Integration + response validation across utterances: Validated integrations and responses for a comprehensive list of utterances, including third-party application outputs.
  • Azure Pipelines scheduled regressions: Hosted the solution on Azure Pipelines and ran weekly scheduled regression tests for consistent validation (a sample schedule follows the test sketch below).
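
As an illustration of how such an utterance check can be structured, here is a minimal data-driven sketch using Selenium WebDriver in C#. It assumes MSTest as the unit testing framework; the chatbot URL, CSS selectors, utterances, and expected reply fragments are placeholders and do not reflect the actual Zyna framework API.

```csharp
// Minimal sketch only: the URL, selectors, and test data are illustrative placeholders,
// not the actual Zyna framework API or the client's chatbot.
using System;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;
using OpenQA.Selenium.Support.UI;

[TestClass]
public class ChatbotUtteranceTests
{
    private IWebDriver _driver;
    private WebDriverWait _wait;

    [TestInitialize]
    public void SetUp()
    {
        _driver = new ChromeDriver();
        _wait = new WebDriverWait(_driver, TimeSpan.FromSeconds(30));
        _driver.Navigate().GoToUrl("https://chatbot.example.com"); // placeholder chatbot web UI
    }

    // Data-driven coverage: each row pairs an utterance with a fragment expected in the bot's reply.
    [DataTestMethod]
    [DataRow("Create a ticket for a laptop issue", "ticket has been created")]
    [DataRow("Remind me about the 3 PM stand-up", "reminder is set")]
    public void Utterance_ReturnsExpectedResponse(string utterance, string expectedFragment)
    {
        // Type the utterance into the chatbot's input box and submit it.
        var input = _wait.Until(d => d.FindElement(By.CssSelector("[data-testid='chat-input']")));
        input.SendKeys(utterance);
        input.SendKeys(Keys.Enter);

        // Wait for the latest bot reply, which may include data returned by a third-party
        // integration, and assert that it contains the expected text.
        var reply = _wait.Until(d => d.FindElement(By.CssSelector("[data-testid='bot-reply']:last-child")));
        StringAssert.Contains(reply.Text.ToLowerInvariant(), expectedFragment.ToLowerInvariant());
    }

    [TestCleanup]
    public void TearDown() => _driver?.Quit();
}
```

Because each utterance is just another DataRow, coverage can grow with the bot without changing the test logic, which is what makes frequent regression runs practical.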
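
The weekly cadence itself can be driven by an Azure Pipelines cron schedule along the lines below; the pool image, .NET version, project path, and task choices are illustrative assumptions, not the client's actual pipeline definition.

```yaml
# Illustrative pipeline snippet; names, versions, and paths are placeholders.
schedules:
  - cron: "0 2 * * 1"            # every Monday at 02:00 UTC
    displayName: Weekly chatbot regression
    branches:
      include:
        - main
    always: true                 # run even if nothing has changed since the last run

pool:
  vmImage: windows-latest

steps:
  - task: UseDotNet@2
    inputs:
      packageType: sdk
      version: '8.x'

  - script: dotnet test ChatbotTests/ChatbotTests.csproj --logger trx
    displayName: Run chatbot utterance and integration regression suite

  - task: PublishTestResults@2
    inputs:
      testResultsFormat: VSTest
      testResultsFiles: '**/*.trx'
```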

Results & Business Impact

  • Saved ~400 hours by avoiding framework build and initial setup effort
  • Reduced manual testing from 20 days to 8 hours through automation
  • Implemented weekly scheduled regressions via Azure Pipelines to keep coverage consistent release to release
  • Built a framework approach that can integrate with multiple web applications for future automation runs

A manual cycle automated. A reliable chatbot experience delivered. This Zyna‑based solution automated utterance and integration testing – cutting validation from 20 days to 8 hours and enabling weekly Azure‑driven regressions.