Bringing Behavioral Science to Teaching and Learning Innovation

Amy Homkes-Hayes, Lead Innovation Advocate

Holly Derry, Associate Director of Behavioral Science

Carly Thanhouser, Behavioral Scientist

Molly Maher, Behavioral Scientist

Behind every good learner is good behavior.

We continue to learn that much of the success students and online learners experience comes from choosing the right behaviors, but that doesn’t make choosing easy. That’s why the Office of Academic Innovation is dedicated to integrating behavioral science principles into teaching and learning. Amy Homkes-Hayes, Lead Innovation Advocate, sat down with Holly Derry, Associate Director of Behavioral Science, to explore how she and her team support Academic Innovation’s portfolio of work, including educational technology software, MOOCs, and more.

How would you describe behavioral science at Academic Innovation?

We use behavioral science to motivate people or spur behavior change. We draw on a collection of strategies and techniques that have been studied in lab settings or in the field, and we apply them to Academic Innovation’s tools. In our work, for example, we focus on how we want learners to behave to be successful in a course. We look through our behavioral science toolbox for ideas on how to keep learners accessing the material every week, completing assignments on time, or engaging on a deeper level in a discussion board. These are just a few examples of where we apply behavioral science at Academic Innovation.


Behavioral science is often used in healthcare settings, but it’s fairly new to edtech. What have you noticed about health behavior change versus behavior change in education?

We’ve been surprised by the similarities. In health and in education, people are working toward goals. In both contexts, some goals are concrete (think: scores) while others are more conceptual (think: feeling better or studying harder). The actions people take to reach their goals might be different, but the ways to motivate people toward these goals are not.

In addition, a lot of the action in healthcare and education happens outside the clinical setting or classroom. Both patients and learners need self-regulation skills, sustained motivation, and a sense of belonging to continue their health management or learning independent of physicians or faculty.


I hear you talk a lot about how central motivation is to behavioral science. Why?

Anything anyone does is motivated by something. It’s our job to find that something. ECoach, for example, uses both extrinsic and intrinsic motivators to help students succeed in large courses. While tapping into intrinsic motivation may have a longer-lasting and more meaningful impact, it’s not always a luxury we can rely on. (How many people are intrinsically motivated to read the textbook?) The extrinsic motivators we have access to (grade feedback, credit, and extra credit) are often better at encouraging students to do something they may not see value in at first. For example, after students use ECoach’s Exam Playbook, a metacognitive tool that helps them strategize and plan out their use of study resources before an exam, they may become intrinsically motivated to use it again, but we rely on extra credit to encourage them to give it a try.

In the end, we often rely on a blend of intrinsic and extrinsic ways to motivate learners because we know both are valuable in different ways.


How do you account for the differences in intrinsic motivation among learners? In other words, how do you determine what motivates them?

This is where personalization comes in. Each person is different, and when possible, it’s best to offer people choices that will fit with their situation, needs, and desires. This might mean giving people choices on whether they get credit or not, which parts of a course they complete, the time they need to complete it, and so on.

We can also go to the other end of the personalization spectrum and present students with messages written for their specific characteristics. All we need is the data. A few of our projects are powered by the Michigan Tailoring System (MTS), tailoring software developed here at the University of Michigan in 2008. Our projects that use MTS can collect data from people and then deliver a tailored experience to match why they’re taking a course, what grade they want to get, how motivated they are to get that grade, and so on. We can really tailor anything we want, and soon (summer 2018), all of Academic Innovation’s tools will be able to incorporate MTS and tailoring as well.

Many people assume the tailored messages are computer-generated — they’re not. We’re not using robots to do this. All of the tailored communication and interventions we use are carefully crafted by the behavioral science team.
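As a rough illustration of how rule-based tailoring of this kind can work, the sketch below selects among hand-written message variants based on learner data. The profile fields, thresholds, and messages are all hypothetical assumptions for this example; they are not the Michigan Tailoring System’s actual API or content.

```python
# Hypothetical sketch of rule-based message tailoring. Field names,
# rules, and messages are illustrative, not the MTS API. Every message
# is human-written; the code only selects among the variants.

# A small library of hand-crafted message variants, keyed by learner traits.
MESSAGES = {
    ("high_grade_goal", "low_motivation"): (
        "You said an A matters to you. A short study plan now can make "
        "that goal feel much more reachable."
    ),
    ("high_grade_goal", "high_motivation"): (
        "You're aiming high and ready to work. Here's how top students "
        "in this course structure their week."
    ),
    ("pass_goal", "low_motivation"): (
        "Passing is a fine goal. Focusing on just the weekly checkpoints "
        "keeps the workload manageable."
    ),
}

def tailor_message(profile: dict) -> str:
    """Pick the pre-written message variant matching this learner's profile."""
    goal = "high_grade_goal" if profile["target_grade"] in ("A", "A-") else "pass_goal"
    motivation = "high_motivation" if profile["motivation"] >= 4 else "low_motivation"
    # Fall back to a generic, also human-written, message if no variant matches.
    return MESSAGES.get(
        (goal, motivation),
        "Here are some strategies other students have found useful.",
    )

print(tailor_message({"target_grade": "A", "motivation": 2}))
```

The design point the sketch reflects is the one above: the software only decides which pre-written message a learner sees; people write every variant.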


How do you guide people toward successful behaviors without telling them what to do?

We respect a learner’s right to make their own choices. We see our role as helping people make informed choices. We don’t tell learners what their goals should be, and we don’t tell them what steps they should take. We may say, “here’s what the research says about the best ways of doing something, and here are recommendations you could follow based on that research,” but then it’s up to each learner what they do next. We never say “you must do this or can’t do that.”

We use behavioral science and tailoring to grab people’s attention, to help them see what’s most relevant to them, and to help them make informed choices. We want to share best practices and help people make decisions that are right for them.


What kinds of methods and techniques are you using?

We use methods from many fields, including Public Health, Behavioral Economics, Social Psychology, and User Experience. We rely heavily on Motivational Interviewing (MI), a counseling strategy that has its roots in substance abuse counseling but has expanded into many areas including education. It’s based on the notion that knowledge does not, in and of itself, change behavior. Motivational Interviewing (along with the ability to tailor messages) enables us to find out what motivates people, to build discrepancy between their desired behavior and actual behavior, and to help them more clearly see the path toward behavior change.

We also often use the behavior change recipe from the book Switch, by Chip and Dan Heath. They use the analogy of an elephant (motivation), a rider (logic and reason), and a path (the environment). The goal is to motivate the elephant (because otherwise the rider gets exhausted), direct the rider (so a motivated elephant doesn’t walk in circles), and clear the path (so that the two can get to where they need to go). This is exactly what we’re trying to do here in Academic Innovation by applying behavioral science to our suite of tools, experiences, and opportunities.

Exciting New ART 2.0 Features for Winter Backpacking and Beyond

Amy Homkes-Hayes, Lead Innovation Advocate

Sharing the Curricular History of U-M

Academic Reporting Tools (ART 2.0) has lived in the Office of Academic Innovation since 2015. During its time with us, faculty champion and ART 2.0 evangelist Dr. August Evrard, Arthur F. Thurnau Professor of Physics and Astronomy, and the ART 2.0 team have iterated on what we show the U-M community. Since March 2016, CourseProfile has enabled students to view course history, such as enrollment in a course by school or college and pre-, concurrent, and post-course selections. In addition, CourseProfile shows a subset of Student Evaluation of Teaching (SET) data on topics like how much or how little students wanted to take the course, or whether it increased their interest in the subject. In November 2016, the InstructorInfo deck was added, enabling students to view a subset of SET questions about U-M faculty, spanning topics from clarity to creating a respectful classroom environment. In Fall 2017, we released MajorMetrics, which includes a timeline of how many students have graduated with a particular degree (referred to as a major for undergraduate students) in the last 10 years, as well as statistics on joint degrees (co-majors) and minors.

In each iteration of ART 2.0, we revisit the tool’s original vision and mission: to promote a deep and shared U-M curricular history with our community, and to aid students in exploring, discovering, and selecting courses. We are enthused to share that since fall 2017, more than 21,000 U-M community members have accessed the tool 230,000 times, demonstrating the sustained growth of the ART 2.0 service. As we look to the future of ART 2.0, we are eager to announce new features that align with our mission and respond to student feedback.

New ART 2.0 Iterations

Illustration of a laptop screen with icons representing grade distributions

The first exciting new addition to ART 2.0 is grade distributions. Now, when users access CourseProfile, they will see grade distributions for many U-M courses. In fact, of the 500 most searched-for courses in ART 2.0, we have grade distribution data on 496. Showing grade distributions in a university-sanctioned tool is something students have asked for since ART 2.0 launched in 2016. Students tell us that seeing grades in combination with the perceived workload of the course (something we also show in ART 2.0) helps them make decisions about their course schedule. Of equal importance, showing grades in our service helps “bust myths” about classes, dispelling the notion that “no one gets an A in this course.” We are eager for students to use grade distribution data, in addition to the other rich data in ART 2.0, to help inform their class exploration and decision-making.

The second new ART 2.0 feature worth noting is in-tool feedback. Now, users can tell us whether they had a positive or negative experience with ART 2.0, as well as leave comments for the ART 2.0 team. We want to hear from our users, and we plan to use the feedback we receive when considering additional ART 2.0 features.

Finally, we have improved the ART 2.0 search functionality so users can easily toggle between searching for courses, instructors, or majors. Like all of the software we build in the Office of Academic Innovation, we took user feedback on the search experience in ART 2.0 and made improvements to it, making the process of searching easier and less confusing for our users.

Check out ART 2.0 and the new features we’ve added, and feel free to give us some feedback too. We will keep improving the tool in service of the university and its community.

Seeing the “Big Picture”: Using Design Representations to Promote Understanding and Reflection on Design

Rebecca Quintana, Learning Experience Designer

“Visual representations can render phenomena, relationships, and ideas visible, allowing patterns to emerge from apparent disorder to become detectable and available to our senses and intellect.” (Hansen, 2000, p. 198)

A well-known challenge for design teams is being able to get a sense of the “big picture” of a design without a mediational tool or aid (Arias et al., 1997). In the field of learning design, diagrammatic or iconic representations of curriculum design can be valuable because they can highlight the relationships among learning activities, giving the viewer a sense of flow and movement (Quintana et al., 2018). Researchers of Massive Open Online Courses (MOOCs) have only recently begun to make forays into creating design representations of MOOC curricula. For instance, Daniel Seaton and his colleagues at Harvard have developed methods that enable the creation of iconic representations of course elements (e.g., videos, textual readings) to promote understanding of relationships among course elements (see Seaton, 2016). Seaton’s methods and approach have inspired much of the work I describe in this blog post.

In Part 1 of this two-part series, I described how we use curriculum storyboards during the early stages of the design process to represent emerging ideas about curriculum design and to give project team members a sense of the curriculum sequence—including its rhythm and cadence. In this blog post, I will share our work in the academic research and development space at the Office of Academic Innovation, and describe our experimentation with two representational formats to understand their potential utility for enabling curriculum designers to understand and reflect on course structure once a course has been designed and has “launched.” I will detail our investigation of the following two design representations: (1) course composition diagrams, which are digital and interactive representations of curriculum design, and (2) beaded representations of course structure, which are tangible constructions made with traditional craft materials, such as beads and straws. Both of these representational formats portray an abstraction of course elements and sequences, and fall under the designation of design representations—the codification of a curriculum design that makes it available for review and critique. The goal of our investigations was twofold: (1) to understand the kinds of insights that could be elicited by each format, and (2) to assess the potential value of eliciting such insights among design team members at the conclusion of a design process.

Course composition diagrams

Following the methods described by Seaton (2016), we used a web-based tool for creating infographics to visualize the course structure of ten recently launched MOOCs on the Coursera platform. Yuanru Tan, a Learning Design and Accessibility Student Fellow at Academic Innovation, manually “scraped” data from each page of each course (e.g., video titles, video length, number of “in-video” questions in a video) and from Coursera’s administrative analytics pages (e.g., average quiz scores) as the basis for the course composition diagrams. We used abstract icons to represent the elements of each course (e.g., videos were represented by orange triangles), displaying them in chronological order. Two examples are shown in Figure 1 (course names have been anonymized).

A visual representation of two professional development courses with diamonds representing assessments, triangles representing video, a four-pointed star representing readings, a long dash representing section headings, and a circle representing discussion prompts.

Figure 1: Two examples of course composition diagrams, with individual course elements shown as icons

The visualization elements were interactive; as users “hovered” over an element, they could obtain more details about it, such as title (for a video) and average quiz score (for a quiz). See Figure 2.

A diagram of a third professional development course with a box of text connected to a triangle which reads: Video - what is a growth mindset and how can it help you succeed?, length - 14 minutes, type - lecture, and in-video questions - 1

Figure 2: Additional details are visible when a user hovers over a course composition diagram
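As a concrete simplification of the data behind diagrams like these, the sketch below maps chronologically ordered course elements to icons and computes the proportion of each element type. The field names and icon assignments are hypothetical, not our actual pipeline or icon set.

```python
# Illustrative sketch: build an icon sequence and per-type proportions
# from a chronological list of scraped course elements. The element
# schema and icon characters are assumptions for this example.
from collections import Counter

ICONS = {"video": "▲", "assessment": "◆", "reading": "✦", "discussion": "●"}

def compose(elements):
    """Return the icon sequence and per-type proportions for a course."""
    icons = "".join(ICONS[e["type"]] for e in elements)
    counts = Counter(e["type"] for e in elements)
    total = len(elements)
    proportions = {t: round(n / total, 2) for t, n in counts.items()}
    return icons, proportions

week1 = [
    {"type": "video", "title": "Welcome"},
    {"type": "reading", "title": "Syllabus"},
    {"type": "video", "title": "Core concepts"},
    {"type": "assessment", "title": "Quiz 1"},
]
icons, props = compose(week1)
print(icons)  # ▲✦▲◆
print(props)
```

The proportions are what let a viewer judge “balance” at a glance, which is the kind of observation our survey respondents made about the diagrams.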

We invited members of the ten design teams who had worked on these courses, including faculty members, design managers, learning experience designers, course development assistants, course advocates, media specialists, and marketing specialists, to interact with one or more course composition diagrams (i.e., for courses that they worked on) and to complete a short survey that asked them to comment on their experience of interacting with the representation.

Yuanru Tan and I qualitatively analyzed the survey data, following an iterative process outlined by Creswell (2015). Our analysis revealed several themes; however, in this blog post I will focus on only one: the affordances of course composition diagrams. Design team members described how the representations provided a high-level overview of course structure that revealed the proportion of one element type to another. The language in the responses to our survey echoed the principles of visual design, such as balance, variety, repetition, pattern, rhythm, emphasis, and movement. For instance, some comments related to balance with respect to the distribution of one or more element types. One respondent remarked, “Each module was relatively ‘even’ in terms of the number of content types.” Other comments related to balance with respect to the weighting or preponderance of an element type within a specific part of the course: “I see a similar trend of very heavy reading modules in the middle/end of the course.” Our analysis showed that the course composition diagrams allowed design team members to see “what was there” from an objective point of view, while enabling them to see the relationship of individual elements to the whole course “composition,” suggesting that design team members were able to understand the course structure in a nuanced way.

Beaded representations of course structure

In a second study, we explored an unconventional format—that of beaded representations—to represent the five MOOCs that are part of the School of Education MicroMasters program, offered on the edX platform. Noni Korf, Director of the Learning Design and Media Team, envisioned the idea while at the edX Partners conference in Paris. Noni said, “While in Paris, I found some striped beads that reminded me of icons for text documents, which inspired the first beaded representation.” She was interested in seeing if the differences between the Coursera and edX platforms might be revealed through the beaded representations: “I was interested in finding patterns in MOOCs—were there ways of describing our course experiences that didn’t involve us diving deeply into the content? Could MOOCs be categorized as molecules with different molecular weights? Or pie chart bubbles with differing ratios of content with which to interact? Looking across our portfolio, I wanted to see if patterns would emerge in courses that followed a strictly linear structure (i.e., our MOOCs on Coursera) and in courses that offer more flexibility for learners (i.e., our MOOCs on edX).”

Our learning experience design team developed a method for representing course structure using traditional craft materials (e.g., transparent, translucent, and opaque beads, and colorful drinking straws), stringing them together in chronological order. Each course element corresponded to a “type” of material (e.g., readings were depicted using striped, opaque beads). Because the edX interface uses a nested structure, with course elements organized in sections and subsections, we depicted each subsection as an individual beaded string. Participants could comprehend the hierarchical structure of the course by viewing the first string of beads, followed by the next string beneath it. The courses we depicted were not entirely linear; where learners could make choices about where to proceed next, we created a branching structure in the representation by crimping shorter lengths of string and attaching them to the stem (see Figure 3).

Five strands of varying styles of beads with branching strands attached to three of the five strands

Figure 3: The hierarchical structure of a section of a course is evident through the beaded representation. Each string depicts a subunit (an activity sequence that could include elements such as short and long videos, discussion forums, teamwork, and assessments).
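For readers who think in code, here is a rough sketch of the nested structure behind the beaded representations: each subsection becomes one “string” of beads, with optional branches where learners can choose their path. The names and bead symbols are hypothetical, not our actual encoding.

```python
# Hypothetical model of the nested edX structure behind the beaded
# representations. Bead symbols and element names are illustrative.
from dataclasses import dataclass, field

BEADS = {"video": "○", "reading": "▣", "assessment": "◇", "discussion": "◎"}

@dataclass
class Subsection:
    elements: list                                 # chronological types on the main string
    branches: list = field(default_factory=list)   # optional learner-chosen side paths

def string_for(sub: Subsection) -> str:
    """Render one subsection as a bead string, with branches in parentheses."""
    stem = "".join(BEADS[e] for e in sub.elements)
    for branch in sub.branches:
        stem += "(" + "".join(BEADS[e] for e in branch) + ")"
    return stem

# A section is read top to bottom, one string per subsection.
section = [
    Subsection(["video", "reading", "assessment"]),
    Subsection(["video", "discussion"], branches=[["reading", "reading"]]),
]
for sub in section:
    print(string_for(sub))
```

Stacking the rendered strings mirrors how participants read the physical artifact: the first string, then the next beneath it, with branches marking points of learner choice.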

We invited members of the design team that developed the MicroMasters series, including Dr. Don Peurach, the lead designer and course instructor, and Kathryn Gabriele, a graduate student and co-designer, to participate in a focus group session to view and interact with the beaded representations. Our conversation centered on the insights gleaned through viewing the beaded representations. Following a methodology similar to that of the first study, we analyzed transcripts from the discussion. Four uses for the beaded representations were evident from our analysis: (1) participants were able to make curricular connections related to course architecture and pedagogical approaches, (2) participants were able to come to a deeper understanding of the learner experience, (3) participants gained insights about the design process, and (4) participants reflected on the method itself.

As with our prior work with course composition diagrams, the beaded representations provided a “bird’s-eye view” of course structure, allowed participants to make connections among elements, and enabled them to notice aspects such as pattern and variety. We noted the beaded representations acted as “boundary objects,” allowing participants to externalize ideas, facilitate shared understanding, and bridge conceptual gaps (Arias et al., 1997). Our use of traditional craft materials was intended to introduce an element of playfulness and intrigue into our participants’ experience. One participant remarked on how he had a “visceral” reaction to the beaded representations: “What the beads are doing is driving me to think and rethink.”

Future work

We intend to deepen our research around design representations, as we continue to refine our methods. Future work will include creating cluster visualizations of courses that are similar along particular dimensions, such as by content type, sequencing, and other types of patterns. We will continue to involve design teams in our work, asking them for their insight toward understanding the utility of various design representations to support understanding and reflection on design at various stages of the design process.

We look forward to receiving feedback on this work at the American Educational Research Association (AERA) conference and at the ACM CHI Conference on Human Factors in Computing Systems (CHI). This will help us advance our work on design representations as part of the academic research and development happening at Academic Innovation.


Rebecca Quintana, Yuanru Tan, and Noni Korf will present their paper “Visualizing course structure: Using course composition diagrams to reflect on design” at the 2018 Annual Meeting of the American Educational Research Association (AERA) in New York where it has been awarded “Best Paper” in the Online Teaching and Learning Special Interest Group.

Rebecca Quintana and Kathryn Gabriele will present the work of their co-authors Yuanru Tan and Noni Korf at the 2018 CHI conference: “‘It’s just that visceral’: Eliciting design insight using beaded representations of online course structure.”


Selected references

  • Arias, E., Eden, H., & Fisher, G. (1997, August). Enhancing communication, facilitating shared understanding, and creating better artifacts by integrating physical and computational media for design. In Proceedings of the 2nd conference on Designing interactive systems: processes, practices, methods, and techniques (pp. 1-12). ACM.
  • Creswell, J. W. (2015). Educational research: Planning, conducting, and evaluating quantitative and qualitative research. Boston: Pearson.
  • Hansen, Y. M. (2000). Visualization for thinking, planning, and problem solving (pp. 193-220). Cambridge, MA: MIT Press.
  • Quintana, R., Tan, Y., & Korf, N. (2018, April). Visualizing course structure: Using course composition diagrams to reflect on design. Paper to be presented at the Annual Meeting of the American Educational Research Association (AERA). April 13-17. New York, New York.
  • Seaton, D. (2016, January 29). Exploring Course Structure at HarvardX: A New Year’s Resolution for MOOC Research [blogpost]. Retrieved from’s-resolution-mooc-research