Keywords Ins/Outs Case Study - STAT Search Analytics
Through strategic planning, STAT Search Analytics had identified several problems that led to a list of company goals, including:
- To increase customer satisfaction and retention
- To reduce support burden
- To have a new feature fleshed out and validated, ready for the dev team to pick up when they need work
- To increase engagement in our app
- To increase adoption of our app
- To release features that provide unique value compared to our competitors
Initial User Feedback
Throughout our research projects, we noticed a recurring piece of unsolicited feedback: our clients wanted to know where their keywords were going when they changed rankings.
For example, if a client had 50 keywords in position 3 on Monday and only 40 in position 3 on Tuesday, where did those 10 go? What positions are they in now?
I surfaced this to our VP of Product, and we agreed that we needed more information and that it warranted its own research project.
We conducted a User Research Question Brainstorming Session (URQBS) with a few of the people who would be working on this feature: a few devs, some CS members, and the product owner.
This gave us a list of questions about our users that we wanted answered. We edited and prioritized them, then determined the best way to get each answer (e.g., some required research interviews, others data pulls).
The goal was to figure out what we could answer in a reasonable amount of time, and how.
From there we set about gathering a mix of qualitative and quantitative data. We conducted research interviews and did a deeper dive into our support system to see exactly what was being said.
One of the interesting takeaways from our research interviews was that even though our participants wanted to see this in an interactive dashboard, they would still be happy if this information was only available in a report.
From this, and the rest of the takeaways, we were able to create a list of requirements for v1 of this feature.
Proving the ROI
With the requirements created, I could plug in some data to compare costs against benefits, and found that this feature should pay for itself within its first year of release.
And that’s counting only the costs saved in CS. The calculation did not include indirect benefits such as:
- Increased retention and reduced churn
- Unique value added to the platform
- An extra day a week that CS can now spend on other work
This was enough to prioritize the work and proceed.
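To give a sense of the kind of payback math involved, here is a minimal sketch. Every figure below is a hypothetical placeholder for illustration only, not STAT's actual costs or savings:

```python
# Hypothetical payback-period sketch for a cost-vs-benefit check.
# None of these figures are STAT's real numbers; they only illustrate the math.

def payback_period_months(build_cost: float, monthly_savings: float) -> float:
    """Months until cumulative savings cover the one-time build cost."""
    if monthly_savings <= 0:
        raise ValueError("savings must be positive for the cost to be recouped")
    return build_cost / monthly_savings

# Assumed inputs (placeholders):
dev_cost = 12_000            # one-time design + development cost, in dollars
cs_hours_saved_weekly = 8.8  # e.g. ~9.3 hours/week reduced to ~0.5
cs_hourly_cost = 35          # fully loaded hourly cost of a CS rep

# Convert weekly hours saved into a monthly dollar figure (52 weeks / 12 months)
monthly_savings = cs_hours_saved_weekly * cs_hourly_cost * 52 / 12

months = payback_period_months(dev_cost, monthly_savings)
print(f"Payback in roughly {months:.1f} months")  # under 12 here, i.e. within the first year
```

With these placeholder inputs the payback lands under a year on CS savings alone; the indirect benefits listed above would only shorten that.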
The Adapted Strategy
Now that we knew that the outcome was going to be a report, we decided that we only needed wireframes, a prototype, and one round of usability tests in order to be confident in our direction. After that, it would be ready for dev as long as there were no game-changing insights from our tests.
We started with the wireframes, which were pretty simple since we only needed to add an extra report to our reporting wizard. These were mainly necessary for the prototype and usability tests.
Prototype, Testing, Release
We also created a prototype of the actual report and conducted rapid, iterative usability testing with 8 participants, updating the prototype after every 3 participants. We only made changes that we all agreed were obvious and crucial to the first version.
After that, we were ready to build it and release it.
Thirty days after release, I dug into our success measurements and found:
- 68% drop in CS tickets concerning the ins and outs of keywords in a tag.
- 94% drop in repeat support tickets concerning this topic.
- It was our 4th most popular report out of 15.
- CS time spent on these requests dropped to under 30 minutes a week, down from 9.3 hours a week.