Before I discovered how to use analytics in software testing, I was working at a company and finding lots of bugs. As a tester, I started to wonder whether I was truly doing a good job of assuring the quality of the product I was testing. I would find lots of bugs of all kinds: severity 2, 3, 4. I would log them, triage them, and advocate for them, only to see that they did not make it into a sprint, or, if they did, they sat at the bottom of the sprint backlog, where most of the time they would fall over into the next sprint and then repeat, sprint after sprint.
Questions I started asking
- What difference am I making if the bugs are not getting fixed?
- What benefit am I providing to the customer?
- What can I innovate to get more bugs fixed?
- Are my bugs really not worth fixing compared to the new development and features that need to be released?
Being introduced to analytics platforms like Adobe Analytics and Google Analytics, I was intrigued by how much I could find out for myself about users' experience and preferences in relation to my bugs. I also realized that data, numbers, and statistics were driving most of the decision-making on our development team.
Learning Analytics
After going to a few brown-bag lunches on analytics and learning to query and dissect different segments, I was able to drill down and know, for the most part, exactly how many people were hitting each of the bugs I had logged. This was like finding hidden treasure.
I began to gather data from these tools about each bug and insert it into the defect description or notes before triage. During triage I would bring the data up for everyone to take note of. The results were instantaneous and amazing. It was like the fog lifting.
I would write comments like:
- “22 people a day on average select this link and get this bug to reproduce on them.”
- “100 people a month on average open this specific page on a Samsung device running Android.”
- “75 people a month, or 2+ people a day, open our home page using Internet Explorer 10 and see the cover image text overlapped.”
- Etc.
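Pulling these numbers was mostly a matter of filtering an analytics report down to the page, browser, and device segment a bug lives on and averaging over the reporting period. The sketch below is a minimal illustration of that idea in Python; the exported CSV, its column names (`date`, `page`, `browser`, `device`, `visitors`), and the `bug_segments` mapping are assumptions made for this example, not the exact reports I used.

```python
import csv
from collections import defaultdict

# Hypothetical segment definitions: which page/browser/device combination
# each logged defect lives on. The defect keys and values are examples only.
bug_segments = {
    "DEF-101": {"page": "/home", "browser": "Internet Explorer 10"},
    "DEF-102": {"page": "/checkout", "device": "Samsung"},
}

def daily_average_hits(csv_path, segment):
    """Average daily visitors matching a bug's segment in an exported analytics report."""
    hits_per_day = defaultdict(int)
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            # Keep only rows that match every condition in the bug's segment.
            if all(row.get(key) == value for key, value in segment.items()):
                hits_per_day[row["date"]] += int(row["visitors"])
    if not hits_per_day:
        return 0.0
    return sum(hits_per_day.values()) / len(hits_per_day)

if __name__ == "__main__":
    for defect, segment in bug_segments.items():
        avg = daily_average_hits("analytics_export.csv", segment)
        print(f"{defect}: about {avg:.0f} people a day hit this segment")
```

The output is exactly the kind of one-line evidence I would paste into a defect before triage.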
Seeing the true effect of each bug on customers resulted in many of the bugs being prioritized higher and inserted into the sprint above new development work, while some of the bugs actually got deprioritized. What mattered to me was that the right ones were now getting worked on.
Another Avenue
In addition, I found a second analytics avenue: the company call center. After setting up a few meetings and showing the representatives where the defects live in Jira, together we decided that every time a customer called about a certain defect, they would add a timestamped comment to that specific defect. The result was that some defects had a new comment every other day or, worse, every single day. Seeing these statistics, project management knew these had to be fixed ASAP, and those defects instantly became the highest priority.
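The representatives added these comments by hand, but the same idea is easy to automate. Below is a minimal sketch assuming Jira's REST API v2, with a placeholder instance URL, issue key, and credentials; it simply appends a timestamped comment to a defect each time a customer call about it is recorded.

```python
from datetime import datetime, timezone

import requests

JIRA_BASE_URL = "https://example.atlassian.net"  # placeholder Jira instance
AUTH = ("callcenter-bot@example.com", "api-token")  # placeholder credentials

def log_customer_call(issue_key, note=""):
    """Append a timestamped comment to a Jira defect for one customer call."""
    timestamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
    body = f"Customer called about this defect at {timestamp}. {note}".strip()
    response = requests.post(
        f"{JIRA_BASE_URL}/rest/api/2/issue/{issue_key}/comment",
        json={"body": body},
        auth=AUTH,
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

# Example: recording a call about the overlapping cover image bug.
# log_customer_call("DEF-101", "Customer on IE 10 could not read the home page text.")
```

Counting the comments per defect then gives the same kind of frequency signal for call-in issues that the analytics tools gave for the web bugs.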
Within about a year, our technical call center team went from 9 people down to about 2-3, because there was almost nothing left to call about. The defects were being fixed and the customers were happy. As for me, I felt that as a Quality Assurance Tester I was now truly assuring quality in our organization. What made the difference? Using analytics in software testing.