Redesigned the information architecture of a cloud product to improve productivity

* Note: The product is under NDA, so specific details are not included below

Team

My Role:

  • UX Researcher

Collaborators:

  • UX Designer

  • Product Director

  • Product Owner

  • Content Strategist

Timeline

Duration: 5 weeks

Study Presented: 02/28/2023

Methodologies

  • Qualitative Usability Testing

  • User Interviewing

  • Microsoft Desirability Toolkit (MDTK)

  • Likert Scale Survey

For a full case study explanation, including the study process, please schedule a case study review with me, either through the contracting agency representing me for the role you are hiring for or via email.

Problem Space

It was anecdotally understood from software engineer (SE) users that the information architecture of a global feature within an internal cloud platform was confusing. This confusion reduced productivity, and a solution was needed to make the information architecture match users' mental models.

Based on the anecdotal user feedback, a redesign of the global feature was completed in Figma.

The Product Director (PD) and UX Designer who were involved with the project wanted 3 main questions answered related to the redesigned feature:

  1. Did the removal of a feature create confusion when users arrived in a specific location of the platform?

  2. Did users expect to find a type of information of interest by going through a specified task flow?

  3. Did rearranging the information architecture of a resource page lead to a clearer understanding of how to find content of interest to users?

They also wanted to understand user attitude toward the redesign, using the Microsoft Desirability Toolkit (MDTK) as an additional qualitative measure of usability, and to learn whether the redesigned feature was disruptive or supportive of user task completion.

Study Impact

I communicated all findings and recommendations by presenting a formal share-out deck. The participant feedback that I gathered addressed the 3 main questions that the PD wanted answered.

  1. The removal of the feature in question did not create confusion for participants

  2. Participants did not expect to find the type of information of interest within a specified task flow, although it was a good secondary option

  3. Rearranging the information architecture of a page did lead to a clearer understanding of how to find content of interest to users

The MDTK revealed participant attitude toward the redesigned feature and a redesigned resource page.

For the redesigned feature, the top 3 words that participants chose were:

  1. Clean (N=6)

  2. Understandable (N=4)

  3. Easy to use (N=4)

A couple of explanations of why they chose specific words:

“Clean- Likes that the UI is not cluttered, there is a consistent look”

“Easy to use- If they got lost, it was easy to reorient”

For the redesigned resource page, the top 3 words that participants chose were:

  1. Easy to use (N=5)

  2. Clean (N=5)

  3. Attractive (N=4)

A couple of explanations of why they chose specific words:

“Attractive- Looks good for a page that communicates a lot of content”

“High quality- The information they want is there, likes how it’s aggregated”

Participant feedback was clear about whether the redesign caused disruption. Using a Likert Scale to rate all redesigns on a 1 to 5 scale from “Extremely Disruptive” (1) to “Extremely Supportive” (5), participant responses were either “Supportive” or “Extremely Supportive”.

One explanation from a participant regarding why they rated the redesigned elements tested as “Supportive”:

“The redesign is an improvement over what exists today, with still some room to clarify how to locate information when in a certain area of the internal platform”

One explanation from a participant regarding why they rated the redesigned elements tested as “Extremely Supportive”:

“A specific area of the platform is really called out, and a lot of the activities they want to do are in that area; design is overall really well organized”

Based on participant feedback, I recommended next steps in the categories of Content Strategy, Analytics, and Iterative Design.

Content Strategy

  • Work with the Content Strategist to continue to improve:

    • Site architecture

    • Use of microcopy

    • Button discoverability

  • Run studies to understand user mental models related to:

    • Site architecture

    • Platform content

Analytics

  • Review and synthesize user analytics to understand user flows and potential pain points for additional studies

Iterative Design

  • Iterate on the design based on feedback from this study and future studies

  • Run rapid prototyping usability studies to increase speed of iterative improvements

Approach

The feature prototype that I tested had been redesigned before my involvement, with anecdotal evidence as its rationale.

Because the redesign was not grounded in methodologies such as contextual inquiry or user interviewing, my task was to validate what had been redesigned.

I was asked to run a usability study using a Figma prototype.

I was asked to understand:

  1. Whether the redesign addressed the 3 main questions the Product Director (PD) wanted answered

    1. Did the removal of a feature create confusion when users arrived in a specific location of the platform?

    2. Did users expect to find a type of information of interest by going through a specified task flow?

    3. Did rearranging the information architecture of a resource page lead to a clearer understanding of how to find content of interest to users?

  2. User attitude toward the redesigned feature and a redesigned resource page, captured using the Microsoft Desirability Toolkit (MDTK)

  3. Whether the redesign was disruptive to or supportive of user task completion, rated via a Likert Scale question

The PD, who functioned as the Product Owner (PO) for much of the study before one was assigned, wanted to gather feedback from 3 user segments:

  1. Managers

  2. Data Scientists

  3. Software Engineers

I wrote a moderator guide with interview questions about each participant's role and responsibilities as they related to their use of the feature.

After the introductory interview questions, I asked participants to share their screen and demonstrate how they used the current feature to complete 3 specific tasks related to the redesign.

I then asked participants to complete those same 3 tasks using a Figma prototype of the redesign.

Next, I asked them to complete the MDTK for the redesigned feature and the redesigned resource page by choosing 5 words from a list of 25 that described their reaction and explaining why they chose each word.
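For readers curious how the "N=" counts reported in the Study Impact section are tallied from MDTK selections, here is a minimal Python sketch. The participant IDs and word selections below are hypothetical illustrations, not data from the actual study, and the study's 25-word list is not reproduced here.

    from collections import Counter

    # Hypothetical MDTK selections: each participant chose 5 words
    # from the study's 25-word list to describe their reaction.
    selections = {
        "P1": ["Clean", "Understandable", "Easy to use", "Organized", "Useful"],
        "P2": ["Clean", "Attractive", "Easy to use", "Understandable", "Efficient"],
        "P3": ["Clean", "Organized", "Useful", "Understandable", "Time-saving"],
    }

    # Count how many participants chose each word (the "N=" values).
    counts = Counter(word for words in selections.values() for word in words)

    for word, n in counts.most_common(3):
        print(f"{word} (N={n})")

With this hypothetical data, the first line printed would be "Clean (N=3)", mirroring the format of the results reported above.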

The session ended with participants filling out a Likert Scale question that asked them to rate all redesigns on a 1 to 5 scale from “Extremely Disruptive” (1) to “Extremely Supportive” (5) and to explain their rating.
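A similar sketch, under the same caveat, for summarizing the Likert responses: the ratings below are hypothetical, and the labels for points 2 and 3 are assumed, since the write-up names only "Extremely Disruptive" (1), "Supportive" (4), and "Extremely Supportive" (5).

    # Likert labels on the study's 1-to-5 scale; points 2 and 3 are
    # assumed labels, since only 1, 4, and 5 appear in the write-up.
    LABELS = {
        1: "Extremely Disruptive",
        2: "Disruptive",
        3: "Neutral",
        4: "Supportive",
        5: "Extremely Supportive",
    }

    # Hypothetical ratings, one per participant.
    ratings = {"P1": 4, "P2": 5, "P3": 4}

    for participant, rating in ratings.items():
        print(f"{participant}: {rating} ({LABELS[rating]})")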

Participant feedback, synthesized from the usability testing, the MDTK words for the redesigned feature and resource page, and the Likert Scale question, indicated that the redesigned feature and resource page met the business goal of an improved user experience that increases productivity.

What I Learned

I really enjoyed this study: the product was interesting, and the UX Designer was fun and easy to work with. I also experienced a lot of professional growth along the way.

  • When I began synthesizing, my focus was on all the features that needed improvement, but I then realized that my primary responsibility was to address the objectives set by the Product Director

  • I collaborated with a UX Designer on a daily basis; they even attended all the participant sessions and asked follow-up questions

    • I was confident in my knowledge of the product and the research questions my interview questions addressed, and the UX Designer was able to address any questions that came up; it was a great combination!