Reducing call centre contact by 56%

The problem

A KPI for the product team that owned EE’s Help area was the volume of users who contacted an agent after using the website. As the sole Product Designer in that team, I was tasked with reducing that figure. I worked on several initiatives, but this case study focuses on the one that tied in with another business goal: reducing the number of complaints and refunds related to roaming (calling, texting and using data abroad).

Outcomes

56% reduction in users needing to contact call centres for roaming-related help


Research

Analytics analysis
I created analytics dashboards and an information architecture map to get a better understanding of which areas users were visiting the most and which of these were resulting in the most agent contacts. I found that roaming was a high-volume area with a relatively high proportion of visitors ending up contacting an agent. After showing my findings to the Help PO, we decided that this would be our area of focus.

Looking deeper into the roaming area’s usage, I found that users were bouncing between the wide variety of articles available before eventually contacting an agent. On reviewing these articles, I found a lot of duplicated and outdated content.

The ‘roaming cost calculator’ journey was by far the most visited page and was in turn the biggest driver of agent contacts.

Getting insights from call centres
I reached out to a representative of the call centre team and asked for any insights regarding roaming-related calls, as I thought this could provide more context on our users’ needs.

  • Most calls resulting in a complaint occurred once customers had returned from abroad and noticed unexpected charges on their bill, which led to a high number of refunds being issued. The call centre team also explained the particular policies that customers had been unaware of and which had caused them to incur these charges, so I was aware of them.

  • Before going abroad, customers would call up because they weren’t sure what charges they would incur and what options they had to reduce them.

Usability testing
Alongside a researcher, I usability tested the existing experience to identify pain points and areas of confusion. We found that users were getting lost in the sea of articles available and were often unaware of key policies that would cause them to incur charges. When using the ‘roaming cost calculator’ journey, users were often left confused as to what their potential charges would be.

I also tested several competitor experiences so we had insight into how other approaches performed.

Journey mapping
I documented my findings in journey maps which served as a great resource to educate team members on the problem space.

Ideation

Ideation workshop
I presented all of my research findings to team members and stakeholders so everyone was aligned on the problem space. We then ran a series of Crazy 8 sessions, focussing on a different element of the problem space each time. This proved to be a great way of getting everyone to come up with and express their ideas. We then dot-voted on which ideas we’d like to explore further.

We decided to focus on the ideas that aimed to educate users on roaming before they went abroad. This was because:

  • By doing this we could better answer the questions users naturally had before going abroad, as well as educate them on policies they needed to be aware of but had no idea existed.

  • We had control over the web experience and could quickly and easily test ideas without being too reliant on other business areas.

Content improvements
Analytics and usability testing had shown that there were issues with the roaming content within the Help area. We made the content team aware of this (as well as our other findings), which resulted in them adding tickets to their backlog to address these issues.

Wireframing and Prototyping
The existing ‘roaming cost calculator’ journey seemed like a great foundation to build from. Other areas of EE linked to it heavily and it received a large volume of traffic. I believed that, with improvements, it could serve users well.
Using the ideas generated during our Crazy 8 sessions, I sketched more detailed wireframes that addressed the issues we had identified. After multiple rounds of feedback and iteration, I landed on a design that was ready to test with users. I then used InVision to create a basic prototype that we could test with.

Usability testing
Alongside a researcher, I then tested the updated designs with users. We found that users could easily understand what charges they would incur, and there was increased awareness of the policies that had been causing complaints and refunds.

Final designs
Using Sketch, and adhering to the EE style guide, I produced the final UI designs and completed a handover with our development team. As the developers had been consulted throughout the design process, this was quick and easy to do.

Going live

Outcomes
After our first release, agent contact reports showed a 56% decrease in roaming-related queries. As this was released towards the end of my time at EE, I wasn’t around long enough to see how much these changes reduced the amount EE had to refund to customers who hadn’t understood how roaming worked (although initial feedback seemed positive).