Intouch Insight is a data-driven software company in Ottawa, Canada. Their suite of software and services works together to help companies listen, interpret, and act on customer data so they can improve operations and deliver brilliant customer experiences.
These include a survey tool, a checklist tool, overt and covert mystery shopping services, and event data capture software. Together, they drive action and value from customer experience programs through one-of-a-kind CEM software.
I started working at Intouch Insight in February 2018. I was very lucky because I was allowed to look through our products and suggest improvements right from the beginning. After taking the time to properly understand our tools, one of the first things that caught my attention was what Intouch calls records.
What are records?
A record is a collection of data from a single point. For example, if someone were to fill out a survey, the resulting answers, as well as information about the person who completed the survey, and the resulting score of the survey, would be compiled into a record. Checklists, mystery shops, audits, as well as capture events would all result in records being created.
These records could be viewed in Intouch Insight’s CEM software.
When I was given the green light to explore records, I first started interviewing internal stakeholders. Often, it is not just clients that are looking through these records, but also a large number of our employees. I wanted to know the internal pain points. I interviewed several people from different departments who have different use cases for viewing records. They verbally explained the issues to me, and also showed me how they used it most frequently. Once that was completed, I turned to interview our clients.
What were their pain points, how did they use the records, and how did it differ from our internal users?
With the interviews done, it was time to compile the data. I took the pain points raised by internal and external users, ranked them from most to least frequent, and cross-referenced the two lists to see which issues overlapped between the groups. Whatever overlapped would be addressed first.
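The prioritization step above can be sketched in a few lines of Python. The pain points and their counts here are hypothetical placeholders, not the actual interview data: the idea is simply to rank the issues both groups mentioned by their combined frequency.

```python
from collections import Counter

# Hypothetical pain points gathered in interviews; the real data
# came from internal and external interview notes.
internal = ["scrolling to find answers", "scrolling to find answers",
            "losing the header context", "hard to scan questions"]
external = ["scrolling to find answers", "scrolling to find answers",
            "hard to scan questions", "exporting records"]

internal_counts = Counter(internal)
external_counts = Counter(external)

# Keep only pain points raised by BOTH groups, ranked by how often
# they came up overall (most frequent first).
overlap = sorted(
    set(internal_counts) & set(external_counts),
    key=lambda p: internal_counts[p] + external_counts[p],
    reverse=True,
)
print(overlap)  # most frequent shared issues first
```

The set intersection captures the cross-referencing, and the combined count captures the frequency ranking described above.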
It’s important to understand what a record looked like to begin with. The most important data was found at the top of the page. This included:
- The overall score;
- The name of the person who completed it;
- An ID for the record;
- The date, address and location of the completed record; and
- The duration that it took to complete.
Depending on the product or service, it could also include:
- Whether the record passed or failed;
- Any follow-ups that emerged during the completion of the record;
- The overall sentiment of the record; and
- The weather where the completion took place.
The rest of the record displayed questions and answers, in order of completion.
Very often, these records were quite long and the user had to scroll down for a long time to find the answer that they were looking for.
If they needed to reference anything at the top, they would have to find their way back up, then try and return to where they were before. That was the first major pain point.
The second major pain point was the distinction between questions and answers. The fonts were the same size, only distinguished by colour. In a very long document, it was hard to scan for specific questions or answers.
Now we knew what the challenges were:
- How do we make the data at the top more accessible?
- How do we distinguish the questions and answers in a functional and more user-friendly way?
- Employer: Intouch Insight
- Services: UX/UI
- Year: 2019
- Role: Research and Design
- Duration: 6 months
- Environment: Team effort
To address the first challenge, I moved the top portion to the left-hand side, and the questions and answers to the right-hand side. The left-hand side would remain static, and the right-hand side would allow for scrolling.
This way, the important data could always be referenced.
You’re probably wondering: but how is that mobile-friendly? The truth is, since our software is used on a desktop 90% of the time, we found that building for desktop was the more important priority.
The next challenge was a little different. It required a hierarchical analysis.
The more-referenced element would need a larger font and an accompanying visual icon. The answer was found to be of more importance, so a logical icon was the type of answer that was given. For example, if the user answered a radio question, a radio button icon would appear to the left of the answer.
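The icon idea above boils down to a simple lookup from answer type to icon. This is only an illustrative sketch; the answer types and icon names here are hypothetical, not Intouch Insight's actual identifiers.

```python
# Hypothetical mapping from answer type to the icon shown
# beside each answer in a record.
ANSWER_ICONS = {
    "radio": "icon-radio-button",
    "checkbox": "icon-checkbox",
    "text": "icon-text-field",
    "dropdown": "icon-dropdown",
}

def icon_for(answer_type: str) -> str:
    # Fall back to a generic icon for unrecognized answer types.
    return ANSWER_ICONS.get(answer_type, "icon-generic-answer")

print(icon_for("radio"))  # icon-radio-button
```

A lookup like this keeps the question-to-icon rule in one place, so adding a new answer type only means adding one entry.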
The designs were shown to several internal and external stakeholders. Of these, some had been involved in the user interviews while others had not.
This gave us a clear overview of whether we’d hit the mark with the majority of our users.
You can see the left-hand side panel, with all the important information displayed. That panel is static and always accessible. On the right-hand side, the questions and answers are easily distinguishable and the type of question is obvious. They are also clearly divided into their respective sections, with the score for each section prominently displayed.