HCII Teacher Dashboard

Problem Statement

I worked in the Human-Computer Interaction Institute at Carnegie Mellon, designing and researching the best ways to implement a teacher dashboard for an online cognitive tutor. I was the designer on the team for two different dashboard projects: creating wireframes for the "Next-Day" concept, and designing materials for, and helping conduct, interviews for the "Real-Time" concept.

My role on the Next-Day Dashboard was to enhance teachers' ability to track the progress of their class, while my work on the Real-Time Dashboard centered on exploring ways to facilitate real-time teacher interventions for students. The use scenario for the dashboard was a blended classroom for middle school students. I worked directly with graduate students Françeska Xhakaj and Kenneth Holstein and presented my progress each week to Vincent Aleven and Bruce McLaren.

Questions I addressed:

  • How to design for uncertainties and adaptability?

  • How to layer micro level and macro level information into an interface?

  • How does a designer use UX to encourage certain behaviors?

  • How to incorporate participatory design results into a design prototype?

  • How can a designer accurately learn about a person’s behavior?

  • What are ways to construct an exploratory design interview? 


Solution Methodology


A cognitive tutor is unique in that it learns and adapts to what students know, moving students at their own pace and identifying the different misconceptions they might have. This presents a unique challenge for teachers planning their lectures and activities, as they must teach students who may be at vastly different places in the curriculum. Because students must enter every step they take, the tutor can provide powerful information on where a student may have a misconception and whether a student has truly mastered a subject.


"Skill or Mastery" is defined as the probability that a student will correctly answer a question on a topic if tested on it; however, applying a threshold to that probability reduces mastery to a predetermined binary measure.
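The thresholding issue can be illustrated with a minimal sketch; the 0.95 cutoff and the skill names and probabilities below are assumptions for illustration, not values from the actual cognitive tutor.

```python
# Sketch of how a mastery threshold collapses a continuous probability
# into a binary label. The 0.95 cutoff and the skills/probabilities are
# invented for illustration, not the tutor's real parameters.

MASTERY_THRESHOLD = 0.95

def mastery_label(p_correct: float) -> str:
    """Collapse the probability of a correct answer into mastered / not mastered."""
    return "mastered" if p_correct >= MASTERY_THRESHOLD else "not mastered"

skills = {"solving 1-step equations": 0.97, "combining like terms": 0.93}
for skill, p in skills.items():
    print(f"{skill}: p={p:.2f} -> {mastery_label(p)}")
```

Note how a student at 0.93 and one at 0.50 both read as "not mastered" on the dashboard, even though the underlying probabilities tell very different stories.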

"Errors or Misconceptions" are specific types of errors that the tutor detects as indicative of a misunderstanding; however, only about a third of these errors are categorized. While teachers loved seeing these misconceptions, the metric itself is much weaker than the analysis performed on mastery and skills.

Teaching style, or the teacher's level of involvement, is the final and arguably most difficult uncertainty to plan a dashboard around, given the range of teaching styles. Separating class and individual metrics helped address this issue by separating use cases and hiding complexity.



This is a prototype Françeska Xhakaj showed to teachers for early usability testing. The main issues I had with the prototype were repetitive information and a lack of strong hierarchy in the dashboard; these problems made teachers unlikely to use or value it. The UX didn't align with current teacher practices of talking to small groups, or with the desire to see who's struggling and why.


One of the biggest challenges I faced was learning enough about how the data was generated to make accurate representations of each metric. How student data and learning curves would affect what teachers would see and act upon was another semi-invisible variable I had to account for. My early iterations weren't as clear in affording certain teacher actions, but I moved toward providing enough information for a teacher to construct a lecture entirely from one screen of the dashboard.

Next-Day dashboard wireframes

My goal was to balance highlighting the most important information, so a low-time-investment teacher could quickly check how their class and individual students were performing, while providing enough detail to satisfy a high-time-investment teacher. Each dashboard needed to stand independently of the other, since a teacher making decisions with the help of both dashboards would be forced to rely on working memory.

Class-Level Screen

A teacher could use the data from the dashboard to adjust their lecture planning, optimizing learning and planning supplementary lectures. Whole skills, which are the basis for lectures, are highly emphasized, encouraging teachers to, at the very least, adjust their schedule based on how their class as a whole is learning. The dots of the error chart highlight patterns of errors, allowing the teacher to judge whether a certain error is still relevant to the class.
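The error chart can be thought of as a tally of detected error types per problem across the class, with larger counts standing out as denser dot clusters. A minimal sketch of that aggregation, with error names and log entries invented for illustration:

```python
from collections import Counter

# Hypothetical log of (student, problem, error_type) events from the tutor.
# The error names and entries are invented for illustration.
error_log = [
    ("s1", "P3", "sign error"),
    ("s2", "P3", "sign error"),
    ("s3", "P3", "combined unlike terms"),
    ("s1", "P5", "sign error"),
]

# Tally errors per (problem, error type); each count would render as a
# dot cluster on the class-level error chart, surfacing patterns a
# teacher can judge as relevant or not.
dots = Counter((problem, err) for _, problem, err in error_log)
for (problem, err), n in dots.most_common():
    print(f"{problem}: {err} x{n}")
```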

Student-Level Screen

A teacher could never use this screen and still be informed about how their class is moving through the material; however, this screen is especially useful for helping individuals and small groups of students. Various filtering methods show, literally, how groups of students are moving through a problem set, presenting teachers with a ready-made group they can talk to at once (a technique teachers already use). Showing students' direct, literal progress in the tutor also helps teachers more accurately understand how the tutor, mastery, and errors work.
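The grouping behind those filters amounts to bucketing students by where they currently are in the problem set. A minimal sketch, with student names and positions invented for illustration:

```python
from collections import defaultdict

# Hypothetical snapshot of each student's current problem in the set;
# names and positions are invented for illustration.
positions = {"Ava": "P4", "Ben": "P4", "Cal": "P7", "Dee": "P4", "Eli": "P7"}

# Group students by position, yielding ready-made small groups the
# teacher can address at once -- the pre-made groups the screen shows.
groups = defaultdict(list)
for student, problem in positions.items():
    groups[problem].append(student)

for problem, students in sorted(groups.items()):
    print(f"{problem}: {', '.join(students)}")
```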

Real-Time Interviews

I planned the interview protocol with Kenneth Holstein, but designed almost all of the interview materials myself. Our objective was to understand what needs a Real-Time Dashboard must fulfill while remaining open to different technologies, with the overarching goal of augmenting and enhancing a teacher's ability to circulate effectively and stage learning interventions. In addition to finding needs to satisfy, mapping how a teacher currently works was necessary to integrate the dashboard more smoothly into their existing teaching practice.

Card Sorting (Super Power Question)

"Card sorting is a participatory design technique that you can use to explore how participants group items into categories and relate concepts to one another...(Universal Methods of Design)"

By beginning the interviews with the question, "If you could have any super power to help you teach, what would it be and how would you use it?", I intended to open the minds of the teachers, many of whom might already have fixed ideas about what a dashboard is. Shifting teachers' expectations away from me providing all the ideas, with a simple open-ended question, also encouraged participation and ideation from the teachers themselves.

Directed Storytelling

"...directed storytelling is a shorthand means of collecting compelling stories from participants when time or other factors prevent direct observation...(Universal Methods of Design)"

Given contextual inquiry restrictions due to the time of year (summer), I printed, or asked the teachers to draw, a seating chart of their computer lab or group work area. I opened with open-ended questions such as, "What did a successful day look like?", then tried to prompt directed storytelling by following up with questions such as, "What did you do specifically that helped make that happen?"

Many teachers misrepresented their true behavior patterns; my colleague Kenneth Holstein noted that talking with students about non-academic subjects, or even being on one's phone, were common teacher habits. Watching how a subject actually behaves is critical to a researcher's success.


Speed Dating

"...design teams can learn how people react to new technology while also taking into account existing contextual and social factors (Universal Methods of Design)"

One of our primary goals in using speed dating was to uncover teachers' gut reactions to certain methods and technologies for interacting with their students. Grey areas such as "How much control and information is necessary?", "How much attention should be spent on a student versus the class?", and "How much do you trust what the dashboard is telling you?" were explored through discussion with teachers around these scenarios.

Encouraging teachers to respond to the idea instead of the exact details of the scenario was an essential part of the process. Most teachers would have a strong emotional reaction to each scenario; the questions below were used as prompters if necessary.

What I learned:

  • Methods for immersing oneself in a project

  • How to design for unknowns and adaptability

  • How to layer information within a hierarchy of use cases

  • How to translate exploratory interview responses into design decisions

  • Importance of contextual inquiry for accurate design research

  • How to run effective interviews and adapt on the spot


Affinity Diagrams

I went through the transcriptions of each interview, pulling out salient quotes. I found a digital affinity diagram useful for keeping my thoughts organized, mobile, and easily sharable.

Next-Day Dashboard: List of changes

When submitting my final designs for the Next-Day Dashboard, I also included an Excel spreadsheet describing all changes I made to Françeska Xhakaj's original dashboard, with teacher quotes to back up my proposals. Creating this spreadsheet was helpful because it forced me to explicitly cite the reasons for my decisions instead of relying primarily on intuition and supposition.


Vincent Aleven designed these two graphics, as well as other iterations, which I modified and applied to the Next-Day Dashboard. Connecting student progress in the cognitive tutor to the physical world through a simple calendar helped ground the dashboard in discrete time, which is easier to understand.

I used this graphic and its zooming feature, showing trends over time through changing levels of scale, as inspiration for the class-level page metrics in the Next-Day Dashboard.



I read published literature on teaching dashboards and research methods to understand why dashboards are an emerging tool, as well as specific implementations in educational environments. Information Dashboard Design by Stephen Few was especially helpful; I adapted many of his principles and a few of his graphics into the Next-Day Dashboard.

Literature I read:

"Cross-disciplinary Participatory & Contextual Design Research: Creating a Teacher Dashboard Application"

  • Troy D. Abel and Michael A. Evans

"Learning dashboards: an overview and future research opportunities"

  • Katrien Verbert, Sten Govaerts, Erik Duval, Jose Luis Santos, Frans Van Assche, Gonzalo Parra, and Joris Klerkx

"An Interactive Teacher’s Dashboard for Monitoring Groups in a Multi-tabletop Learning Environment"

  • Roberto Martinez Maldonado, Judy Kay, Kalina Yacef, and Beat Schwendimann

Information Dashboard Design

  • Stephen Few

Storytelling with Data: A Data Visualization Guide for Business Professionals

  • Cole Nussbaumer Knaflic

Universal Methods of Design

  • Bruce Hanington