Troubleshooting

Red Hat

Interaction design | UX research | Visual design | Analytics | User validation

Insights from user research showed that troubleshooting was a high customer priority and that customers are motivated to self-solve rather than wait for a support case response.

The troubleshooting tool helps users find the right answer and find it faster.

Business problem

When customers open support cases that match a published help article, they wait unnecessarily for a resolution and support engineers can’t spend their time on more critical or complex cases.

Solution

Created a troubleshooting tool that deflects support cases for simple issues by surfacing the solution before customers need to open a support case.

Contributed a progressive form component, which acts like a wizard with variable steps, to the PatternFly open source design system.

User benefits

  • Customers with simple product issues get answers without waiting
  • Support engineers are free to prioritize challenging cases
  • Customers with complex product issues get answers faster
  • The support site visually aligns with the product look and feel for a consistent customer experience

User challenges and design solutions

Customers ignore the recommended solutions shown when opening a support case.

On average, the correct solution was recommended for 40% of opened cases, yet only 3% of users clicked a recommendation and stopped opening a support case.
Created a dedicated troubleshooting tool that shows matching solutions to customers before they choose to open a support case.

Difficult to find solutions to product issues amidst the support site's content.

Promoted the troubleshooting tool to users who search for product issues, so the general site search could prioritize getting-started and product guides.

Low user adoption of initial troubleshooting tool.

Added a troubleshooting entry point to the customer support site home page.

Design journey

Top Tasks analysis

Using a top tasks analysis, 881 customers ranked the five tasks they find the most valuable from a list of 90 common tasks available on Red Hat's customer support site. Four of the five top tasks related to troubleshooting, justifying the need for a dedicated troubleshooting tool.

Interaction exploration

Explored a chatbot-style interaction that would ask the user questions to narrow in on a solution. AI technology wasn’t advanced enough at the time to ask the right questions.

Explored a search experience that updated as the user typed as little or as much as they wanted. It was too open-ended to guide the user to the right answer.

First release

The first release guided customers to choose the product and version, then provide a short issue summary, and optionally describe the issue in more detail. Recommendations were shown and narrowed down with each answer provided.

The customer’s responses were carried over if they still needed to open a support case, making sure that task flow wasn’t disrupted.
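To make the narrowing and carry-over behavior concrete, here is a minimal sketch in TypeScript. The data shapes and function names are illustrative assumptions, not the production implementation: recommendations are filtered by product and version, ranked by keyword overlap with the issue summary, and the same answers prefill the support case form.

// Hypothetical data shapes; names are illustrative, not the production API.
interface TroubleshootingInput {
  product: string;
  version: string;
  summary: string;
  details?: string; // optional longer description
}

interface Article {
  id: string;
  title: string;
  products: string[];
  versions: string[];
  text: string;
}

// Narrow the recommendations with each answer: filter by product and
// version first, then rank by keyword overlap with the issue summary.
function recommend(articles: Article[], input: TroubleshootingInput): Article[] {
  const terms = `${input.summary} ${input.details ?? ""}`
    .toLowerCase()
    .split(/\s+/)
    .filter(Boolean);
  return articles
    .filter(a => a.products.includes(input.product) && a.versions.includes(input.version))
    .map(a => ({ article: a, score: terms.filter(t => a.text.toLowerCase().includes(t)).length }))
    .filter(r => r.score > 0)
    .sort((a, b) => b.score - a.score)
    .map(r => r.article);
}

// If no recommendation solves the issue, carry the answers into the
// support case form so the customer never retypes them.
function prefillSupportCase(input: TroubleshootingInput) {
  return {
    product: input.product,
    version: input.version,
    subject: input.summary,
    description: input.details ?? input.summary,
  };
}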

Testing

On average, 30% of users did not end up opening a support case, far exceeding the 3% deflected when opening a support case. However, the sample size was small.

Ran an A/B test for adding the tool on the customer support site home page for one month. Traffic increased by 1160% and support case creation was reduced by 7.33%, which equated to significant cost and workload savings. This was the data needed to justify taking up prime home page real estate.

Recommendation algorithm improvement

Worked with the search engineering team to design the functionality of two additional features: AI-generated alerts and featured solutions curated by PMs.

Redesigned the user flow to focus on one step at a time and present results throughout the experience.

Gathered requirements from other product design teams for a focused but continuous form, inspired by Typeform. Contributed the design guidelines to the PatternFly design system for a progressive form: a component that progressively builds a form based on the user’s previous responses.
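As a rough sketch of the pattern (assuming a plain React + TypeScript setup; this is illustrative, not PatternFly's actual component code), a progressive form reveals each step only after every earlier step has an answer:

import React, { useState } from "react";

const steps = ["product", "version", "summary"] as const;
type Step = (typeof steps)[number];

export function ProgressiveForm() {
  const [answers, setAnswers] = useState<Partial<Record<Step, string>>>({});

  // A step becomes visible only once all previous steps are answered,
  // so the form builds progressively from the user's responses.
  const visibleSteps = steps.filter((_, i) =>
    steps.slice(0, i).every(s => answers[s])
  );

  return (
    <form>
      {visibleSteps.map(step => (
        <label key={step}>
          {step}
          <input
            value={answers[step] ?? ""}
            onChange={e => setAnswers({ ...answers, [step]: e.target.value })}
          />
        </label>
      ))}
    </form>
  );
}

Keeping every answered step on the page, rather than hiding it like a wizard, is what lets results update throughout the experience.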

Prototyping and user testing

Built Axure prototypes to test whether the next and back buttons should follow UX guidance for a form (buttons below the form) or a wizard (in a consistent location on the page).

In a user test with 17 customers plus a survey of 27 customers, twice as many chose buttons sticky to the bottom of the screen, yet none felt strongly about their choice. There was no significant difference in task completion time between the two scenarios. Since the user test did not eliminate either option, we chose sticky buttons because they required less development effort.