Ingrid Horng
UX Designer


Slate is a management application that faculty members use during the admissions cycle. Every year, the University of California Irvine Graduate Division receives thousands of applications for its graduate programs. I worked with a team of 5 to build the next version of the application and streamline the review process. I led the design process from ideation to prototype and was responsible for the Applicant Dashboard.

  • Role: UX Designer (1 of 2)

  • Timeline: 6 Months    


Applicant Dashboard


Slate is a third-party software program, so we were immediately constrained by what could be implemented by program administrators within the Grad Division versus what would need to be escalated to Slate’s parent company, Technolutions. This meant we had to spend time identifying where we could provide differentiated value for faculty members without full-on customization.


Understanding the problem

To kick off the discovery phase, I worked with the researchers to understand the problem by interviewing faculty members. These interviews uncovered the root causes behind the dissatisfaction with the current state of Slate and offered valuable insights into how it was being used in their daily work. Building on the interview findings, we ran a survey to gauge faculty members' perception of Slate's features. This quantitative approach allowed us to understand which areas needed improvement for a better user experience.

Interview Findings:

  • The larger the program, the more faculty members struggled to view multiple applications at once

  • Difficulty finding key features led faculty members to believe they don't exist

  • Heavy reliance on admissions processes and workarounds outside of Slate

  • Privacy concerns around protecting applicant data from being seen by others

  • Forced to do a multitude of limited actions in a set order, incurring repetitive stress

Survey Findings:

  • Dissatisfaction - 70% of respondents had a less than favorable experience

  • Low mastery - only 37% of respondents felt somewhat to highly confident in their mastery

  • Lack of feature utility strongly correlated with dissatisfaction - 100% of respondents who found filters and bins not at all useful were also very dissatisfied

Identifying usability problems

I took a closer look at the usability of the product by conducting a heuristic evaluation. Using Jakob Nielsen's 10 Usability Heuristics as the guideline, I identified the 5 most commonly violated principles that fueled the faculty members' unsatisfying experience:

  • Recognition rather than recall

  • Visibility of system status

  • Flexibility and efficiency of use

  • Aesthetic and minimalist design

  • User control and freedom


Evaluating competitors

From the market perspective, we wanted to understand how the product stood against its peers and identify any gaps in the market. I conducted a competitive analysis, examining 3 competitors' strengths and weaknesses. The insights helped us make informed decisions about which features would create a better user experience.



Walking in users' shoes

From the research findings, we knew there was a clear pain point for faculty members, but we wanted to identify where and when it was happening. I created a user journey map that showed the existing system was inflexible and limited, and that it didn’t fit faculty members’ mental models of their actual admissions processes. As a result, many faculty members were completing most of their admissions tasks outside of Slate.


User Flows

From the journey map, we identified 2 existing admissions process flows: Slate's suggested flow and faculty members' actual flow. Visually mapping out the flows guided us in making a call on the final design scope: remove interactions that were cumbersome and focus on tailoring the experience to faculty members' mental model.

Meeting the users

Within the faculty, each individual was assigned a role during the admissions review process: assigner or reviewer.


Problem Statement

How might we support faculty members from different departments to efficiently review applications to help them reach their goals?

With our research insights, we re-framed the findings into opportunity areas, focusing on faculty members' needs and problems instead of jumping straight to solutions.




My team and I went through a series of brainstorming sessions to finalize the concept design, refining it based on feedback from the stakeholders and the team. Once the concept design was finalized, I created a prototype for our first round of user testing to gather faculty members' feedback. I was responsible for the applicant dashboard, where I focused on breaking down complex data into digestible information using visual hierarchy and data visualization.

Iterating designs through user testing

We put our designs to the test with our first round of usability testing. We wanted to learn whether our solutions aligned with faculty members' mental models.


Applicant Dashboard: Data Visualization

Feedback: Charts were useful but faculty members requested greater interactivity.

Iteration: Redesigned the charts to be dynamic, updating to reflect what's currently shown in the table


Applicant Packet: Comment Feature

Feedback: Privacy was a concern when it came to commenting on the applicant materials. Faculty members were uncertain if the comments that they saw were public or private.

Iteration: Added an option to post comments either privately or publicly.


SIR Dashboard: Applicants Contact Information

Feedback: Most faculty members did not notice the CSV icon, which allows them to export applicants' contact information.

Iteration: Redesigned the feature so faculty members could simply select an applicant's email address and copy and paste it into their email.

Second round of user testing

Overall, we received positive feedback on the UI design and interactions. There were areas that required further exploration, such as the filter, form fields, and table functionality. Since we were not able to tackle those tasks with our given time, we added them to our future roadmap to revisit.



Reviewing applications in a more intuitive way

Over the 6 months, we developed a solution by identifying the “big rocks” – the overarching “process needs” that were consistent across departments. Together, we focused on providing visibility and efficiency by introducing 4 new features and 5 usability/visual improvements.

Design System

In addition to delivering the designs, I collaborated with another designer to establish the UI guidelines. Throughout the process, our goal was to ensure a seamless user experience by creating consistent patterns and components throughout the product.



Faculty members were excited to see that their mental model of the review process was finally acknowledged. A critical part of the new experience is that faculty members who review or assign applicants have a clear understanding of how to efficiently review large numbers of applications together as a team. The existing Slate platform did not support collaboration well, relying mostly on one person's contribution instead.