UTM Timetable Planner Redesign

Course Selection Simplified
My Role
UX Designer & Data Analyst
  • Prototyping
  • Testing protocols
  • Data analysis
Team
Jacob Lin
Manahil Masroor
Miguel Duterte
Shobhit Srivastava
Project Duration
12 Weeks
Tools
Figma
Google Forms
Excel
InVision

At a Glance

This project redesigned the course timetable planner website for the University of Toronto Mississauga, known as the UTM Timetable Planner. The redesign aimed to reduce excessive scrolling, improve system visibility through layout changes, and present the right amount of information so users are not overwhelmed.

A Mission to Reduce Scrolling and Improve Visibility

During early investigations, we noticed that to check the timetable, users must either scroll extensively or use the quick action bar to jump to the bottom. Adding a course after checking the timetable then requires scrolling back up (or jumping to the top via the quick action bar) and scrolling down again; there is no streamlined way to move through the course list. Adding to the challenge, once a course is selected, users must memorize its times from the course description, because the only way to confirm it was added, or to spot conflicts, is to scroll all the way back down to the timetable. This repetitive navigation disrupts the overall user experience.

Ideation & Design System

Initial Walkthrough

We invited several U of T students for an initial walkthrough of their course selection process. Based on our observations and the students' explanations, the typical user flow can be summarized as follows:

Areas of Opportunity

The process can be further grouped into three main actions: search, navigate, and add/remove. We chose to target navigation in particular because it was the step users spent the most time on and where they expressed the most frustration.

Our Aim

  1. Revamp how the system is navigated so users won't need external controls (e.g. Ctrl+F) to find what they need.
  2. Reduce the amount of scrolling by condensing the information being shown.
  3. Improve system feedback and visibility.

Ideate

10+10 Ideation: the Highlights

We used the 10+10 sketching method to quickly generate design ideas. We also took a more interesting approach by drawing inspiration from both the physical and the virtual world.

This design was inspired by the look of a honeycomb, which creates an interesting pattern while fitting more information into the same amount of screen real estate.

This design uses the metaphor of a deck of cards to sort and condense the overwhelming amount of information on a course planning website.

This design uses the metaphor of bubble wrap: a course (shown as a bubble) can be removed by clicking, i.e. popping, the bubble.

Testing & Evaluation

A/B Testing - the Designs

For our first design, we chose the honeycomb because it introduced a new way of navigating while achieving everything we set out to do (reducing excessive scrolling and information overload). The alternative design took the shape of the deck-of-cards idea from the 10+10 sketches. We chose it as design B for A/B testing because it shares a similar structure with the hexagon design but makes a fundamental change to the navigation on the left.

Design A: the Honeycomb

Design B: Cards

A/B Testing - Setup

Each participant was asked to complete three timed tasks: locating a course, adding a course, and removing a course. Afterward, they also completed the System Usability Scale (SUS) and satisfaction surveys. This case study focuses on the data for locating a course, as that was the primary focus of the redesign.

A/B Testing - Analysis

Comprehension

To evaluate comprehension, we compared participants' performance on the two designs using two-sample t-tests with an alpha level of 0.05, assuming equal variances, in Excel.

Since both p-values were greater than 0.05, we failed to reject null hypothesis 1 and concluded that the two designs showed no significant difference in user comprehension.

Satisfaction

Participants completed a satisfaction survey rating how satisfied they were with each design when locating a course, using a 5-point Likert scale ranging from (1) Not satisfied at all to (5) Very satisfied. Again, to see whether the difference was significant, we performed two-sample t-tests with an alpha level of 0.05, assuming equal variances, using Excel.

Since the p-value was less than 0.05, we rejected null hypothesis 2 and concluded that the difference in average satisfaction was significant.
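For illustration, the equal-variance two-sample t-test we ran in Excel can be sketched in Python. The Likert ratings below are hypothetical placeholders, not our study data, and the sketch compares the t statistic to a critical value rather than computing an exact p-value:

```python
from math import sqrt

def pooled_t_statistic(a, b):
    """Two-sample t statistic assuming equal variances (pooled variance)."""
    n1, n2 = len(a), len(b)
    m1, m2 = sum(a) / n1, sum(b) / n2
    ss1 = sum((x - m1) ** 2 for x in a)           # sum of squared deviations, group A
    ss2 = sum((x - m2) ** 2 for x in b)           # sum of squared deviations, group B
    pooled_var = (ss1 + ss2) / (n1 + n2 - 2)      # pooled variance estimate
    se = sqrt(pooled_var * (1 / n1 + 1 / n2))     # standard error of the difference
    return (m2 - m1) / se

# Hypothetical 5-point Likert satisfaction ratings for five participants per design
design_a = [3, 3, 4, 2, 3]
design_b = [4, 5, 4, 5, 4]

t = pooled_t_statistic(design_a, design_b)
# Two-tailed critical value for df = 5 + 5 - 2 = 8 at alpha = 0.05 is about 2.306
significant = abs(t) > 2.306
print(f"t = {t:.3f}, significant at alpha 0.05: {significant}")
```

With these placeholder ratings the statistic exceeds the critical value, mirroring the significant satisfaction result above; the actual analysis was done in Excel on the real survey responses.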

Reflection

Numbers Don’t Tell the Full Story

Despite the comprehension tests giving inconclusive results, the actual difference between the two designs was noticeable. Aside from the possibility of a Type II error, many participants offered insights into how they really felt about both designs.

Was the idea of hexagons viable?

  • Some users said they preferred the look of the hexagons, finding it unorthodox yet creative.
  • However, the way the hexagons are arranged disrupts users' perception when scanning through (as shown in the figure on the left), which reduces visibility.

Creative or Practical?

  • The card layout is plain but effective.
  • The cards can fit slightly more information than the hexagons, which is an advantage when that information is crucial to participants (e.g. prerequisites).