In January 2020, the business decided to move the call centre team fully offshore, cutting work capacity by 30%.
The business requirement for us was: how can we improve the current "support" page to help customers find the answers they need, in order to mitigate call centre costs?
This made me think we needed to start from the current "support" page flow to understand what was working well and what needed to improve to achieve the goal. With this in mind, I set up meetings with the customer service manager to get a better understanding of the issues our customers were facing: What are the most frequent questions customers ask? Do customers visit the website before they call? And so on.
The results were quite surprising: the most frequently mentioned issue was a lack of “consistency and standards”.
Some representative quotes: "I thought camera assist had troubleshooting information." "Why do the 'Drivers and download' menu and the 'Self-help' menu take me to the same place?"
To arrive at the most practical solutions to these issues, I decided to run a workshop with the people involved in the project, who came from diverse backgrounds.
As a team, I had everyone map out the pain points I had collected during research, sketch their ideas on paper, and present their designs; we then voted on the best option for each pain point.
I gathered all of these ideas and sketched out a paper prototype based on our discussions. I then refined the flow and used Axure to build a mid-fidelity interactive prototype. To test our ideas, I facilitated five usability tests with our customers.
The usability tests were conducted as unmoderated "think aloud" sessions via Askable. In an unmoderated usability test, users are not observed live while they carry out tasks; instead, their interactions with the website and their verbal comments are video-recorded for later analysis.
Five users each carried out three tasks on the website in separate sessions. At the end of each session, they answered a set of pre-defined questions.
Through this process, I identified positive and negative feedback from users, as well as some unexpected user journeys they took while using the prototype.