Agent Troubleshooting Tool
The Agent Troubleshooting Tool is designed to help agents diagnose and resolve system, network, and hardware issues that disrupt call quality. By providing a simple and efficient diagnostic mechanism, the tool improves agent productivity, ensures seamless customer interactions, and minimizes downtime.
The tool empowers agents and support teams to perform quick tests, identify root causes of issues, and take corrective actions promptly, thus enhancing overall operational efficiency and customer satisfaction.
Issues the Tool Can Help Diagnose
Voice quality issues
Microphone access issues
Firewall whitelisting issues
Citrix media offloading setup
MQTT connectivity issues
Agent View
The Agent Troubleshooting interface is designed to be simple and easy to navigate:
Test results are displayed clearly with pass/fail indicators
Agents can run customized test sequences based on their provider or network configuration
The system automatically detects whether the agent is using Citrix VDI or Azure VDI and runs the appropriate tests
Prerequisites
Before running the Agent Troubleshooting Tool, ensure the following conditions are met:
At least one Voice Application is shared with the agent
Platform Type is set to Enterprise
Headless voice is disabled
Agent Readiness or Nailed Up is enabled for the agent
The user type is VOIP
The agent has permission for Voice Troubleshoot
Agent readiness can be configured in Unified Routing for the agent
Important: Ensure the agent’s status is set to an unproductive state (such as Break or Lunch) before starting troubleshooting.
If the agent is in a Productive (Available) status, the troubleshooting option is disabled and the system prompts the agent to change their status.
Accessing the Agent Troubleshooting Tool
Agents can access the troubleshooting option from their profile:
Log in to the Sprinklr platform.
On the Home screen, navigate to your profile.
Click Troubleshoot from the drop‑down menu. The Agent Troubleshooting window opens, displaying available diagnostic tests.


Configure Settings
Before You Start
Before running tests, review the settings:
In the Agent Troubleshooting window, click the gear icon on the top right corner to open Settings.
Verify the following fields:
Microphone - Select the correct microphone input.
Speaker - Play a sample sound to verify speaker functionality.
Ringtone Output Device - Select the device where incoming call ringtones play.
The last‑used inbound or outbound Voice Application is selected by default.
If unavailable, the first shared Voice Application is used as a fallback.
You can select a different Voice Application from the drop‑down.

Note: If you do not manually select values, the system automatically uses the last‑used or first available Voice Application.
Start the Test
You can run various diagnostic tests to identify network problems by following these steps:
Ensure you are not on a call and your status is not Productive.
Click Start Tests.

Clicking Start Tests runs all tests listed in the Agent Troubleshooting window simultaneously.
Note: Follow on‑screen instructions during testing (for example, avoid making outbound calls).
Tip: Use the arrow icon next to any test to view detailed results.

Understanding Test Results
After tests complete:
Each test shows a Pass or Fail status
Failed tests include a Retry option
Click Maximize to view detailed diagnostic information
Available Diagnostic Tests
The following tests may appear in the Agent Troubleshooting window, depending on your environment and Voice Application.
1. Microphone Connectivity Test
Verifies that the browser or system has granted the necessary microphone permissions and that a microphone is connected.
Microphone Connection Time: What’s Good versus Problematic
Excellent: If the time taken to access the microphone is under 200 ms, the microphone connection is performing optimally.
Problematic: If the microphone takes more than 500 ms to connect, this typically indicates driver or hardware issues.
Recommended action: Switch to a good‑quality, certified headset or update the device drivers to improve microphone performance. For more information, refer to the Configuring Microphone Settings article.
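The thresholds above can be expressed as a small classifier. This is an illustrative sketch, not part of the tool; the "acceptable" label for the 200–500 ms band is an assumption, since the tool only defines the under-200 ms and over-500 ms bands.

```python
def classify_mic_connect_time(ms: float) -> str:
    """Classify microphone connection time using the documented thresholds.

    Under 200 ms is excellent; over 500 ms typically indicates driver
    or hardware issues. The in-between "acceptable" band is an assumption
    made for this example.
    """
    if ms < 200:
        return "excellent"
    if ms > 500:
        return "problematic"
    return "acceptable"
```

For example, a 150 ms connection time classifies as excellent, while 600 ms would flag a likely driver or hardware problem.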
2. MQTT Connectivity Test
Checks whether the browser can receive real‑time updates from Sprinklr’s backend services. These updates include alerts, status changes, heartbeats, and other real-time events used across the Sprinklr platform.
Only the services required for your user appear.
Metrics
Time to Connect - Time required to complete the handshake and establish a connection with the service.
Optimal: under 1 second
Unacceptable: over 4 seconds
Round‑Trip Time (RTT) - Average time for a packet to travel to the service and back.
Optimal: under 150 ms
Unacceptable: over 300 ms
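The two MQTT metrics can be graded against the documented thresholds as follows. This is a sketch for illustration; the "borderline" label for values between the optimal and unacceptable cutoffs is an assumption, not terminology the tool uses.

```python
def grade_mqtt(connect_ms: float, rtt_ms: float) -> dict:
    """Grade MQTT test metrics against the documented thresholds.

    Time to Connect: optimal under 1 s, unacceptable over 4 s.
    RTT: optimal under 150 ms, unacceptable over 300 ms.
    The "borderline" band is an assumption for this example.
    """
    def grade(value: float, optimal: float, unacceptable: float) -> str:
        if value < optimal:
            return "optimal"
        if value > unacceptable:
            return "unacceptable"
        return "borderline"

    return {
        "time_to_connect": grade(connect_ms, 1000, 4000),
        "round_trip_time": grade(rtt_ms, 150, 300),
    }
```

For instance, a connection established in 800 ms with a 120 ms RTT grades as optimal on both metrics.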
3. TURN Connectivity Test
Checks whether TURN (Traversal Using Relays around NAT) servers are reachable. TURN servers are used in restricted networks and may not be required in all environments.
Failures usually indicate firewall or whitelisting issues
This test may not appear for all voice applications
4. Throughput Connectivity Test
This test measures the network throughput available to the agent.
Throughput refers to the amount of data your network can reliably send or receive over time. Higher throughput allows audio to stream smoothly during calls, while low throughput can result in choppy audio or call dropouts.
This test runs only when ICE (Interactive Connectivity Establishment) servers are configured for the selected Voice Application. As a result, you may not see this test for all voice applications.
Low throughput values typically indicate network‑related issues and can negatively impact overall call quality.
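Throughput is typically reported in megabits per second (Mbps). The following sketch shows only the unit conversion from a measured transfer; the tool does not document its exact measurement method, so this is an assumption for illustration.

```python
def throughput_mbps(bytes_transferred: int, seconds: float) -> float:
    """Convert a measured data transfer into megabits per second.

    bytes_transferred: total bytes sent or received during the sample.
    seconds: duration of the sample window.
    """
    if seconds <= 0:
        raise ValueError("duration must be positive")
    # 8 bits per byte; 1 Mbps = 1,000,000 bits per second.
    return bytes_transferred * 8 / seconds / 1_000_000
```

For example, transferring 1,250,000 bytes in one second corresponds to 10 Mbps.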
5. Media Offloading Setup Tests
This test verifies whether media offloading is correctly set up in the agent’s environment. Media offloading is supported on virtual desktop platforms such as Citrix and Azure Virtual Desktop and helps ensure optimal call quality by processing media on the local device instead of within the virtual session.
This test appears only when media offloading is enabled for the agent.
If the test detects a failure, it identifies the likely point of failure and provides actionable diagnostic guidance to help troubleshoot and resolve the issue.
6. Test Call
The Test Call runs a short echo call to validate call setup and audio quality. Anything you say during the call is echoed back, allowing you to confirm that audio is clear and functioning as expected.
A Pass status indicates that the test executed successfully. It does not guarantee that every call scenario will be issue‑free.
Test Call Results
Expand Test Call to view the result banner:
Success (green): Call setup completed without issues
Warning (yellow): Call setup completed with minor issues
Failure (red): Call setup failed
The result banner displays a preliminary cause and includes a link to a relevant Help Center article with recommended next steps. For more information, refer to the ICE Server Test Troubleshooting Tips article.
Additional Details (Expanded View)
When you expand the Test Call, additional sections are displayed:
Configured ICE Servers
Lists the ICE (Interactive Connectivity Establishment) servers configured for the selected Voice Application.

Call Quality Metrics
High values (highlighted in red) for the following metrics can negatively impact voice quality:
Jitter – Variation in packet arrival times
Packet Loss – Percentage of lost data packets
Round‑Trip Time – Time taken for audio packets to travel to the destination and back

Mean Opinion Score (MOS)
Indicates overall call quality on a scale of 1 to 5, where higher values indicate better quality:
Good: Greater than 4 (shown in green)
Poor: Less than 3 (shown in red)
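MOS can be estimated from the call quality metrics above (RTT, jitter, packet loss). The sketch below uses a common simplification of the ITU-T E-model; it is not Sprinklr's exact formula, only an illustration of how these metrics combine into a 1–5 score.

```python
def estimate_mos(rtt_ms: float, jitter_ms: float, loss_pct: float) -> float:
    """Estimate MOS using a widely used simplification of the
    ITU-T G.107 E-model (an assumption, not the tool's formula).
    """
    # Effective latency adds a jitter penalty and a small fixed codec delay.
    effective_latency = rtt_ms + 2 * jitter_ms + 10
    if effective_latency < 160:
        r = 93.2 - effective_latency / 40
    else:
        r = 93.2 - (effective_latency - 120) / 10
    r -= 2.5 * loss_pct          # each 1% packet loss costs ~2.5 R-factor points
    r = max(0.0, min(100.0, r))  # clamp R-factor to its valid range
    # Map the R-factor (0-100) onto the 1-5 MOS scale.
    return 1 + 0.035 * r + 7e-6 * r * (r - 15) * (94 - r)
```

Under this approximation, a healthy call (50 ms RTT, 5 ms jitter, no loss) scores above 4 (green), while a degraded one (400 ms RTT, 50 ms jitter, 5% loss) falls below 3 (red), matching the bands above.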
Download WebRTC Dump
A Download WebRTC Dump button appears at the bottom of the Test Call section after you expand it. Clicking this button downloads a log file containing detailed call metrics. You can share this file with IT, developers, or Support teams to help diagnose issues identified during the test call.

After the Tests Complete
Once the tests finish, you can:
Restart All Tests
Runs the full test suite again. If only a specific test failed, there is no need to rerun the entire suite; use the Retry option on that test instead.
Download Report
Downloads a detailed troubleshooting report that can be shared with IT or support teams for further investigation.
You cannot exit troubleshooting while tests are in progress.