Performing spot checks, or reviewing your reviewers' grades, lets you monitor your reviewers' completed work and find any discrepancies in it. There are a few ways to review the reviewer in Klaus.
Using our Calibration feature
First, ensure the following setting is enabled in your calibration settings:
Now create a calibration session for each reviewer you wish to grade.
Next, filter your conversation list for “Reviewed by - is - ‘name of reviewer’” and combine this with “Reviewed date - last 7 days” to keep it relevant.
Add as many reviewed conversations as you wish to spot check to the calibration session you have created for that reviewer.
Head to the calibration session and submit your own review for each conversation. Once you have submitted your score, mark it as the baseline score.
You can now view the discrepancies in the reviewer's scores in the calibration dashboard.
You are scoring on the same scorecard as your reviewers, so you can see how accurately your reviewers are grading per category. Because these are calibration grades, they do not impact the agent view or the general dashboard at all.
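The comparison behind this step can be sketched as follows. This is an illustrative sketch only, not Klaus's internal implementation: the category names and scores are hypothetical, and the idea is simply the per-category difference between your baseline review and the reviewer's original review.

```python
# Hypothetical per-category scores for one conversation.
baseline = {"Tone": 5, "Solution": 3, "Grammar": 4}   # your review, marked as baseline
reviewer = {"Tone": 5, "Solution": 5, "Grammar": 4}   # the reviewer's original review

def discrepancies(baseline, reviewer):
    """Per-category difference between a reviewer's scores and the baseline.

    Positive values mean the reviewer scored higher than the baseline,
    negative values mean they scored lower.
    """
    return {cat: reviewer[cat] - baseline[cat] for cat in baseline}

print(discrepancies(baseline, reviewer))
# {'Tone': 0, 'Solution': 2, 'Grammar': 0}  → the reviewer over-scored "Solution"
```

A non-zero value for a category is exactly the kind of discrepancy the calibration dashboard surfaces for you.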
Using a unique Scorecard
First, create a custom scorecard that you will use to review your reviewers. Example:
You can add root causes to the negative ratings. This allows you to track in your dashboards where reviewers are going wrong.
Example of root causes for “Review Accuracy”:
Here, “Category A/B/C” etc. would actually be named after the categories in your regular scorecard.
Now you can simply filter your conversation list for conversations that have been reviewed in the past week.
You can now leave a review for the Reviewer, using the new scorecard. (*Agents will not see this score in their view)
You can filter your dashboard specifically for this scorecard to visualise how your reviewers are performing.
You can use the ‘Categories’ dashboard to dive deeper into where the reviewer is misaligned (i.e., scoring too high or too low for a specific category).
This gives your reviewers their own IQS that is traceable in the dashboard. It's easy to spot check whenever you like (no need for calibration sessions).
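The IQS this produces can be sketched as below. This is an illustrative sketch under a common definition (the average of a reviewer's received review scores, as a percentage), not Klaus's exact internal formula; the scores are hypothetical.

```python
# Hypothetical review scores (%) a single reviewer received on the
# "review the reviewer" scorecard over some period.
reviews = [80.0, 100.0, 90.0]

def iqs(scores):
    """Internal Quality Score: average received review score as a percentage.

    Returns None when there are no reviews yet, so the dashboard can show
    an empty state instead of a misleading 0.
    """
    return sum(scores) / len(scores) if scores else None

print(iqs(reviews))  # 90.0
```

Filtering the dashboard to the reviewer scorecard then simply shows this average per reviewer, per period.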
Using the Extension & Separate Workspace & Scorecard
Create a separate workspace in your account. This is where the reviews for the reviewers will go.
Create a unique scorecard for reviewing your reviewers (same as the previous method).
Install the browser extension.
Filter your main conversation list for conversations that have had a review in the last 7 days (in your regular workspace).
Open up the browser extension, and ensure it is set to your new separate reviewers workspace.
Select the reviewer in the extension to leave the review for (not the agent).
Complete the review.
Now you can easily filter your dashboard by workspace, so none of your ‘review the reviewers’ scores will appear on your main workspace dashboard.
Reviewers' scores live in their own workspace, so results do not get mixed. This gives your reviewers an IQS that can be tracked in that workspace's dashboard.