Lynnette is online tutoring software for simple algebraic equations, which I redesigned to be more engaging.
The primary users were middle school students being overseen by a teacher in a computer lab.
I worked at CMU’s Human-Computer Interaction Institute on an interdisciplinary team under the direction of Vincent Aleven and Jonathan Sewall. Two other designers were in charge of branding, and I worked with an engineer on implementation.
Focus groups (secondary)
Workshop user testing
process for research
American Psychological Association - Top 20 Principles from Psychology for PreK-12 Teaching and Learning
Principle 10: “Students persist in the face of challenging tasks and process information more deeply when they adopt mastery goals rather than performance goals.”
Principle 12: “Setting goals that are short term (proximal), specific, and moderately challenging enhances motivation more than establishing goals that are long term (distal), general, and overly challenging.”
Emphasizing the idea of mastery was essential for student engagement, so I sought to include visual reminders within the problem solving interface, for example, by always displaying the progress throughout a level.
Focus groups and interviews
I watched earlier recordings of focus groups, interviewed a teacher, and tested the old tutor interface with middle school students. User testing with middle schoolers was a novel experience; kids are among the most talkative and, at times, most confusing users I have ever interacted with.
Old tutor interactions
After recording all of the necessary inputs on the old tutor interface for solving a problem, I looked for opportunities to create feedback which was immediate, informative, and fun.
The interaction for a correctly attempted step mirrors the skill bar update: the input box fills from left to right, just as the skill bar increases from left to right. The skills used in that step are temporarily highlighted, so a student interested in their progress can better understand the connection between each step and each skill.
The animation for an incorrectly attempted step also mirrors the corresponding skill bar update, filling the input box from right to left. However, for all future attempts, the animation changes. This is because the tutor only updates skill bars based on the first attempt. As such, the pulsing animations following the first attempt signify neither an increase nor a decrease in skill.
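The first-attempt rule above can be sketched in code. This is a minimal illustration, not the actual tutor logic; the names, the 0.1 step size, and the three animation labels are my own assumptions.

```typescript
// Hypothetical sketch of the first-attempt update rule, not actual
// Lynnette code. Mastery runs 0.0 – 1.0 and drives the skill bar width.
interface SkillState {
  mastery: number;
  attempted: boolean; // has this step already been attempted?
}

// Only the first attempt on a step moves the skill bar; later
// attempts trigger a neutral "pulse" animation instead.
function recordAttempt(skill: SkillState, correct: boolean): "fill" | "drain" | "pulse" {
  if (skill.attempted) return "pulse"; // no mastery change after first attempt
  skill.attempted = true;
  if (correct) {
    skill.mastery = Math.min(1, skill.mastery + 0.1); // fill left to right
    return "fill";
  }
  skill.mastery = Math.max(0, skill.mastery - 0.1); // fill right to left
  return "drain";
}
```

The returned label selects which animation to play, keeping the animation choice and the skill model update in one place.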
process for skills
Yanjin Long and Vincent Aleven - Students' Understanding of their Student Model
“We found that students pay close attention to the OLM and report that seeing [the OLM] change encourages them to learn…” (OLM = Open Learner Model, a.k.a. the skill display in the tutor interface)
“students inspected the OLM frequently to see their progress, but thought that the design of the [old] OLM was too simple…”
Mastery is most tangibly represented through skill bars, so the way skills were handled was crucial. Moreover, secondary research and user tests showed students were interested in and motivated by seeing their skill bars change.
I experimented with making skill bar updates quite explicit, to appeal to students who were interested in seeing how they were progressing. However, tracking the animation required an excessive degree of eye movement.
To reduce the degree of required eye movement, I experimented with skill bars appearing directly below the input box. While sketching helped me outline the concept, I had trouble detailing the motion and rhythm.
Students I showed this to were able to understand the connection between steps and skills, and commented that the animated increase for the skill bar was “satisfying to watch”. Upon further reflection, however, I realized the animation was slowing down the problem-solving process itself, a fatal issue.
I created this animation with a focus on reducing overall time spent. Like the animation above, I wanted to emphasize the overall degree of skill a student had accumulated, but decided that, for incorrect answers, the same animation would feel disingenuous.
Final animation specifics
The motion behind the first animation I tried was linear, and didn’t feel organic or dynamic. For my next iteration, I consciously tried to go too far with its dynamism, in order to better understand where the threshold of “just right” was.
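The contrast between the flat linear motion and the deliberately exaggerated iteration can be expressed as easing curves. These are illustrative functions only, not the curves actually used in the project; `easeOutBack` uses the conventional overshoot constant.

```typescript
// Hypothetical easing curves for the skill bar fill.
// t runs from 0 (animation start) to 1 (end); output is fill progress.

const linear = (t: number): number => t; // first iteration: flat, mechanical

// "Back" style ease-out: briefly overshoots past 1 before settling,
// an example of deliberately exaggerated, dynamic motion.
const easeOutBack = (t: number): number => {
  const c = 1.70158; // standard overshoot constant
  const u = t - 1;
  return 1 + (c + 1) * u * u * u + c * u * u;
};
```

Dialing the overshoot constant down from an intentionally excessive value is one way to search for the “just right” threshold described above.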
A student masters a level when they master every skill, an accomplishment I wanted students to enjoy and be intrinsically motivated by. Thus, an extra juicy interaction to celebrate level mastery was necessary.
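The mastery condition itself is simple: every skill bar full. A sketch, with an illustrative threshold and shape rather than the tutor’s real data model:

```typescript
// Hypothetical skill record; mastery runs 0–1, matching a full skill bar at 1.
interface Skill {
  name: string;
  mastery: number;
}

const MASTERY_THRESHOLD = 1.0; // illustrative: a full bar

// A level is mastered once every skill reaches the threshold —
// the moment that triggers the celebratory mastery animation.
function levelMastered(skills: Skill[]): boolean {
  return skills.every((s) => s.mastery >= MASTERY_THRESHOLD);
}
```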
process for mastery
I wanted to make sure that the beginning of the mastery animation was the same as the correct step animation (a left to right wave). Maintaining consistency between interactions was important in order to avoid confusion.
Mastery animation 1.0
This felt too fast, as a user may not understand that this mastery animation was a continuation of the correct step animation.
In comparison to my first iteration, this animation felt much more controlled, satisfying, and deliberate.
The top pain point I observed in user testing was students overestimating their success while solving a problem and then being surprised by a decrease in their mastery. By displaying errors on the problem-solving interface in real time, I sought to avoid this cognitive dissonance.
process for error history
Yanjin Long and Vincent Aleven - Students' Understanding of their Student Model
“…discrepancy between the student’s and system’s assessment may indicate inaccurate self-assessment abilities of the students.”
Detour - UI Paradox
Conventional UI wisdom is often about making a process easier for a user, in this case transforming the “next” button into a “done” button to signify to the user they are finished with the problem. However, this automated transformation assumes the student knows when they are finished with the problem, an assumption which might limit learning.
I saw this explanation step as a self-assessment tool, reducing the chance of a student guessing their way through a problem. Later in the project, it became unclear whether the explanation step would be included in the final interface, so I reduced its prominence.
Errors as an expectation
The golf metaphor of “par”, where a certain number of errors is expected, was a fun idea with a couple of fatal flaws:
- golf isn’t very accessible to middle school students
- an “expected” number of errors isn’t built into the tutor, so the number would need to be generated arbitrarily
Permanent error history
With these wireframes, I wanted to experiment with how errors and correct answers might be permanently shown on the problem solving interface.
Past input boxes
In order to make the interface look more like a paper worksheet, my project managers recommended making past input boxes disappear (screen on the right). I agreed that a visual distinction should be made between the past and current input boxes (unlike the screen on the left), but disagreed with entirely removing the past input boxes.
In the next meeting, I came back with the interface below. I showed all three interfaces side by side and was able to convince my project managers that reduced opacity for the past input boxes would be a better compromise.
Affordances for teachers
Showing the error history on the problem-solving interface also helps teachers. Through contextual inquiry, I found that students often avoid asking for help even when they need it. With error history shown, however, teachers can see who may need help just by looking at the concentration of grey symbols on a student’s screen.
To ensure a student won’t remain stuck indefinitely, the last hint for a step contains the correct answer. Through focus groups, it became clear that students often hated reaching this last hint prematurely, and avoided hints entirely as a result. Providing visual feedback on how many hints remain before the final hint alleviates this issue.
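The hint ladder described above can be sketched as follows. The data shape and names are hypothetical (real hints are authored per problem); the point is that the last hint is the “bottom-out” hint containing the answer, and the UI can always report how many hints remain before it.

```typescript
// Hypothetical hint ladder for one step; the final element is the
// bottom-out hint containing the correct answer.
interface HintState {
  hints: string[];
  shown: number; // how many hints have been revealed so far
}

// Reveal the next hint and report how many remain before the answer,
// which drives the visual "hints left" feedback.
function nextHint(state: HintState): { text: string; remainingBeforeAnswer: number } {
  const index = Math.min(state.shown, state.hints.length - 1);
  state.shown = index + 1;
  return {
    text: state.hints[index],
    // 0 means the answer itself was just revealed
    remainingBeforeAnswer: state.hints.length - 1 - index,
  };
}
```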