Design questions, and how I answered them.

Hint: it's all different ways of asking the user.

Usability Studies

Parature Information Architecture

When I joined Parature, there were a lot of questions about the organization of the site. As new features were added, they were "tacked on" to the design, and the structure became convoluted. When we began the Parature redesign, I wanted to make sure the information architecture stayed clear.

 

The Question

"How do current users mentally categorize this certain list of tasks?  How should they be nested?"

 

The Approach

To figure out users' mental model of the information hierarchy, I sent out an online card-sorting study. A card-sorting study has the advantage of letting users reorganize and categorize a predetermined set of terms, showing how they naturally group them together. With the software we used, participants could name the categories they placed the cards in, and add notes to individual cards, such as "I don't know this term" or "I could also see this in another category."

 

The Results

The results from this study were striking; I had never seen a card-sorting study generate such a clean dendrogram with so few branches. I enlisted a developer's help to remove abandoned sessions and other false entries from the JSON export, hoping the cleanup would reveal more nuance, but it only sharpened the results. The users had a very clear idea of how the pages should be organized, and when we tested the new structure, participants kept commenting on how intuitive the layout was.
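For readers curious what that analysis looks like mechanically, here is a minimal sketch. The JSON schema shown is hypothetical (one record per participant, with an "abandoned" flag and a "groups" mapping; the actual tool's export differed): it drops abandoned sessions, builds a card co-occurrence matrix, and feeds it to hierarchical clustering to produce the kind of dendrogram described above.

    import json
    import numpy as np
    from scipy.spatial.distance import squareform
    from scipy.cluster.hierarchy import linkage, dendrogram
    import matplotlib.pyplot as plt

    # Hypothetical export: one record per participant, mapping their
    # category names to the cards placed in them. Records flagged
    # "abandoned" are incomplete sessions to drop.
    with open("card_sort_export.json") as f:
        sessions = json.load(f)
    sessions = [s for s in sessions if not s.get("abandoned")]

    cards = sorted({c for s in sessions
                      for group in s["groups"].values() for c in group})
    idx = {c: i for i, c in enumerate(cards)}

    # Co-occurrence: how often each pair of cards shared a group.
    co = np.zeros((len(cards), len(cards)))
    for s in sessions:
        for group in s["groups"].values():
            for a in group:
                for b in group:
                    co[idx[a], idx[b]] += 1

    # Turn similarity into distance, cluster, and draw the dendrogram.
    dist = 1 - co / len(sessions)
    np.fill_diagonal(dist, 0)
    Z = linkage(squareform(dist), method="average")
    dendrogram(Z, labels=cards)
    plt.show()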

Parature Ticketing Behaviors

When Parature was acquired by Microsoft CRM, the plan was to merge the best of both worlds into a new product, the Interactive Service Hub. The new service desk would integrate the Knowledge Base and Ticketing capabilities of both companies. But before Parature customers could be ready to move to the new product, we had to make sure it fit their behaviors and needs, which meant finding out how they were actually using Parature.

 

The Question

"Who is Parature's user base?  In what ways are they using Parature?  Where are their frustrations so we can avoid repeating them, and what essentials do we have to keep in the redesign?"

 

The Approach

For such a broad question, I took a three-pronged approach.

 

Customer self-reporting survey

To get the broadest picture of who our users were, I decided to start with a survey. The advantage of reaching every single customer outweighed the disadvantages of self-reporting bias. Since I was focusing on quantitative results, I took extra care to write very specific questions while still giving respondents room to clarify any answer. The questions were structured so results could be broken down by role, experience, and other pivots. The survey went out to roughly 1,000 end customers.

 

A total of 334 participants completed the survey. A few issues held consistent even when role was taken into account; for example, 92% of users wanted to view multiple items at a time so they could multitask. The most common workaround reported was opening multiple windows and tabs to switch between tasks without losing their place. Armed with this information, I made sure the new designs let users view multiple items at once.
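As a sketch of how those pivots worked in practice (the file and column names here are hypothetical, not the actual survey fields), the per-role breakdown amounts to a one-line groupby:

    import pandas as pd

    # Hypothetical schema: one row per respondent, a "role" column,
    # and one column per question (e.g. "wants_multiview" as yes/no).
    df = pd.read_csv("survey_responses.csv")

    overall = df["wants_multiview"].eq("yes").mean()
    by_role = (df.assign(wants=df["wants_multiview"].eq("yes"))
                 .groupby("role")["wants"].mean()
                 .sort_values(ascending=False))

    print(f"overall: {overall:.0%}")   # e.g. the 92% figure above
    print(by_role)                     # the same share, per role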

 

Remote usability studies

To counteract the lack of detail and the potential bias of the survey, I also conducted remote usability studies via web conference. In-person studies would have been ideal, but were not possible at the time. I defined the participant requirements as a minimum of 10 users per role (ticketing, phone, chat, and knowledge base); half of these users would be current Parature users with at least 6 months of experience, and half would be new users.

 

To satisfy these requirements, I met with a total of 12 customer service representatives, ranging from tier 1 support to manager. I selected eligible participants across a range of ages, genders, companies, and experience levels to minimize potential skew.

 

In the studies I measured several metrics: time on task, task success rate, ease of use, and overall usability using the SUS scale. I also tracked markers to qualify these metrics, such as the number of times a participant needed prompting, alternate ways of completing a task, and any custom metrics as they arose. From these studies I learned a common frustration for phone and chat support agents: creating a new case for a customer took too long to load and required too much information before it could be created and saved. This led to the design of the "Quick Create" function, which let users enter the minimum amount of information and move on.
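The SUS part of that scoring is standard and easy to show: each of the ten items is answered on a 1-5 scale, odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is multiplied by 2.5 for a 0-100 score. A minimal sketch:

    def sus_score(responses):
        """SUS score from one participant's ten responses (each 1-5,
        in questionnaire order)."""
        assert len(responses) == 10
        total = 0
        for i, r in enumerate(responses):
            if i % 2 == 0:      # odd-numbered items (1, 3, 5, ...)
                total += r - 1
            else:               # even-numbered items (2, 4, 6, ...)
                total += 5 - r
        return total * 2.5      # scale to 0-100

    # Example: a fairly positive participant
    print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 2]))  # -> 82.5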

 

Web analytics

It can be difficult to gain meaningful insights from web analytics alone, but combined with the survey and usability studies, they helped flesh out the data. I requested a month's worth of data and noted where users spent the most and least time on the site, which helped me develop better questions for the survey and studies.

 

Using the survey and usability studies to give the web analytics context, we saw that the high hit counts on ticket pages were due in large part to agents switching between different tickets. Some pages with low hit counts and high bounce rates turned out to be where agents quickly looked up some information and left, rather than low-value pages as originally believed.
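The flagging step itself is simple; a sketch, assuming a hypothetical analytics export with per-page hits, bounce rate, and average time on page:

    import pandas as pd

    # Hypothetical export: one row per page for the month requested.
    pages = pd.read_csv("analytics_month.csv")

    # Candidate "low-value" pages: few hits and a high bounce rate.
    # The usability sessions showed many of these were quick-lookup
    # pages rather than dead weight, so this list fed follow-up
    # questions instead of removal decisions.
    suspects = pages[(pages["hits"] < pages["hits"].quantile(0.25))
                     & (pages["bounce_rate"] > 0.7)]
    print(suspects.sort_values("bounce_rate", ascending=False)
                  [["page", "hits", "bounce_rate", "avg_time_s"]])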

 

The Results

Based on the feedback from all three studies, we created a new design that integrated these findings. We added easy navigation between tickets, which could be organized into a queue by ticketing-manager roles if desired. "Quick Create" let agents entering information in real time create tickets with less up-front data, auto-populated fields where possible, and save tickets in draft form to flesh out after a call. A quick-notes section replaced the Post-it notes on users' monitors. The remote usability studies also served as a benchmark against the new design, which improved efficiency on the core tasks tested by roughly 32%.
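For the benchmark comparison, the efficiency figure comes from comparing time on task across the two designs; a sketch with hypothetical file and column names:

    import pandas as pd

    # Hypothetical benchmark data: time on task (seconds) per
    # participant per core task, with design = "old" or "new".
    bench = pd.read_csv("benchmark_times.csv")

    means = (bench.groupby(["task", "design"])["seconds"]
                  .mean().unstack("design"))
    means["improvement"] = 1 - means["new"] / means["old"]
    print(means)   # averaged ~32% faster on the core tasks tested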
