Empowering institutions to effectively assess academic program utility and effectiveness

Team
Product Manager, Front-End Engineer, Back-End Engineer, Business Analyst

Role
Product Design

Date
2019-2021

Scope
0 -> 1

Process
Contextualizing Program Review -> Ideate and Iterate -> Test, Improve, Release -> Results

The Approach

Contextualizing Program Review

Understanding Relevant Personas

The team kicked off the project with a three-day workshop focused on mapping out the program review process and cultivating empathy for the users involved. Some preliminary user interviews had already been conducted (the project had been started, then paused due to business constraints before I joined the team). From those, we knew we were designing for two main user groups: "administrators," who set up and project-manage the process, and "contributors," the faculty who actually write the narrative report.

Based on the user interviews, we drew up empathy maps to understand how each group's thoughts, feelings, and motivations relate specifically to the program review process.

Mapping the User Journey

Using what we learned from the preliminary interviews, we mapped the respective journeys of the administrator and the contributor to get a comprehensive picture of what each user's program review experience currently looks like. Visualizing the journey helps establish shared understanding among project stakeholders and teases out further questions and assumptions.

We expanded the map to include environmental and external factors and noted the perceived pain points of the current program review process. This served as a great launching point - we had an idea of which themes to delve into as we continued user interviews:


  • communication between administrators and contributors - those who actually write the report

  • contributors not feeling empowered to write an insightful report due to a lack of access to data

My Product Manager and I conducted additional user interviews based on the assumptions and questions generated during our workshop. We interviewed four administrators from four different institutions. It was important to talk with users from a variety of institutions - we wanted to make sure we were solving for as many use cases as we could. Some example questions included:


  • Describe your program review process. What is the timeline? Who is involved?

  • What are the pain points in your program review process? Are there any challenges that come to mind?

  • Who needs to look at individual program review reports? Is there feedback provided to the programs? How is this feedback provided today?


I had already started sketching - translating the ideas we were hearing into features and functionality - but after this round of user interviews, I had plenty to work from.

Ideate and Iterate

Sketching

Sketching is my preferred way to get ideas down quickly. I often time-box myself - for example, giving myself five minutes to come up with as many ways as possible to solve a given problem. That way, I don't have time to judge my ideas - it's useful for doing a brain dump.


I take pictures of sketches I think I can build on and keep them in my sketch document as I wireframe and, eventually, move to high fidelity. Having a timeline of artifacts helps give context when communicating ideas to the team.

Over the next several months, I collaborated with my fellow designers, our product owner, the development team, internal stakeholders, and, of course, users to test and gradually improve the administrator and faculty program review experience.


It was a fun challenge to not only balance user and business needs, but also reconcile a work-in-progress design system with technical constraints. Consistent communication between everyone - designers, developers, and product - was essential.

We worked up to a beta test and subsequent release.

Test, Improve, Release

Before the product's limited release, I wrote testing guidelines and scheduled and led beta testing sessions. In total, we spoke with 12 users (split between administrators and faculty) to validate our designs and continue smoothing out knots in the experience.

The tests served as an awesome practice opportunity for me - there's definitely an art to framing user tasks in moderated sessions.

We learned a lot about how we could improve our designs, especially the program review set-up flow. Themes included:


  • Some of the language we used was unclear, e.g. "reporting years" - users read it as the year the review is due rather than the years associated with the program review

  • Confusion around the steps of the process - users weren't aware of the micro-save that happens before moving on to the next step

  • Data integrations - specifically the program mission statement - were not clear

User Testing: Program Review Beta

I compiled a testing summary document that included participant details, notes, and quotes. Within the document, I took a pass at prioritizing next steps based on the user feedback we heard during testing. Overall, most of the action items were "quick fixes" - the logic of our design was sound thanks to iteration, collaboration, and consistent solicitation of feedback (always trust the process!).


Below is a high-fidelity prototype of the improved program review set-up flow, incorporating feedback from the beta tests.

Redesigning a powerful K-12 assessment analysis engine

Team
Product Manager, Front-End Engineer, Back-End Engineer, QA Engineer

Role
Product Design

Date
2021-2023

Platform
Web

The Opportunity

Aware is Eduphoria's flagship product: an assessment analysis engine used by ~700,000 people in K-12 school districts, primarily in Texas. Aware is an immensely powerful tool, but over its 15-year existence, a lack of user research, systems thinking, and human-centered design compounded into a bloated experience with unintuitive interactions, outdated visual conventions, and certain features that, simply put, did not provide customer value. All of this, in the context of Eduphoria's near-term goal of expanding into new markets, sparked a complete overhaul of the product - a new foundation we could build on sustainably, setting the company up for long-term growth.

Through user feedback and market analysis, it was clear that Aware needed to be revamped to enhance usability, focus on our customers' jobs to be done, and bring its visual identity up to modern standards.

The Approach

Our first goal was to establish parity with the current product, Classic Aware.

We started with the information architecture of the platform, with the goal of implementing a simplified navigational structure that provided clarity by better matching users' mental models and eliminating friction from customers' primary jobs to be done.

It was important to understand what exactly was in the product - what provided value and what didn't. To do so, I partnered with our head of training to audit the current functionality. We learned there was a lot of redundant functionality, as well as a few features that customers simply don't use - we were set up for many quick wins. I also learned that some very interesting interaction design choices had been made, e.g. triple-clicking menu items to expand associated information in some cases.

I worked with my product manager and front-end lead to strategize the next few cycles of work, prioritizing outcomes and getting an idea of what we'd do, and when.

Visually, we used a stopgap solution - a beta palette - while collaborating on our brand refresh with an external agency. We'll implement a visual experience consistent with our brand later this year.

Quick Views: The Main Value-Driver of Aware

The primary job customers use Aware for is analyzing assessment data through different lenses. Teachers and administrators can pivot the data based on different criteria, e.g. ethnicity, special education status, or grade. This functionality is called "Quick Views."

Despite being the highest-value task for users, accessing a quick view took too long: users had to navigate through several layers of a tree navigation menu. In talking with our internal SMEs and customers, we learned that the large majority of the time, users visit the same 3-5 views. We redesigned the quick view flow to reflect that: users can designate favorite views, bringing what they need most often to the highest level of visibility.

This was a huge improvement to the existing flow. I also sought to simplify the navigation by housing quick views under the umbrella of "data views," which also contains saved views. I then evaluated this solution with a handful of customers, and it was well received. It was also a great exercise in stakeholder alignment and communication: our CSO was not convinced by the new design, specifically quick views not having its own navigation item, but once we walked through the documented research findings, he was on board. At that point, I looped in our customer success team to design an onboarding campaign in UserPilot, the tool we use, to ensure customers were aware of the change in IA, why we made it, and the value it brings.

Beyond Parity

After a year and roughly six cycles of work, we achieved parity with Classic Aware. We moved on to new functionality that had been top of customers' - and our - minds for some time; we knew this from a combination of feature requests and conversations with districts. The most important piece was Single-Test Analysis (STA). At this point, I was not doing the day-to-day design work, but working with the designer who was, coaching them through the ups and downs of the project.

STA is a complex tool, and although similar to Quick Views, STA does X.

The addition of Single-Test Analysis, along with the simplified, intuitive redesign of the space, made up a new "Aware Premium" tier that drove a sizable amount of net-new business and upsells.
Results

Aware continues to be the biggest earner for Eduphoria, accounting for more than half of the company's annual revenue.

Once the new navigational design and additional analysis features were implemented, we saw a 17% increase in product sales year-over-year.

Through this project, I - along with the design team - forged a closer working relationship with customer success and support, which has proved extremely valuable in ensuring that we're not only building the right thing, but that it's desirable and usable, too.

Before I joined, Eduphoria had never tested designs with users, only with internal folks. Through this project, the team was exposed to the immense value of shortening assumption cycles through user testing. I was proud to lead that culture shift for the product org.

There is still work to be done within Aware as we prepare to sell to out-of-state districts - figuring out how best to accommodate the assessment practices of new markets.
