Gallery Tool Unlocks Peer Feedback Possibilities for MOOC Learners

James Park, Design Manager

Rebecca M. Quintana, Learning Experience Design Lead

During our team’s work on the Storytelling for Social Change course with University of Michigan (U-M) faculty member Anita Gonzalez, we recognized the need for a tool that would allow learners to share their text- or image-based coursework with other learners in an easy and open manner, and to receive robust feedback on that work from their peers. Because of the nature of our learners’ work in the course, and because of the expectations and experiences we wanted them to have, we sought an alternative to the course platform’s peer-review tool: one where work and feedback could both be shared more freely, and in a way that prioritized high-quality interactions (especially dialogue) over numerical scores and one-way assessment. Ultimately, we ended up with a Gallery that facilitates this sort of learner interaction and empowers learners to share work—or multiple works—without fear that criticism of their work or the particular “rules” of a peer-review tool would impede their successful progress in the course.

The Gallery tool is being used not only in the Storytelling for Social Change course, but also in our Python Basics course, which introduces the basics of Python 3. For Python Basics, we wanted learners to have an opportunity to practice their Turtle programming skills and to submit their work for peer feedback. We wanted a lightweight option, something that would allow learners to share their work in a “low-stakes” environment, without the formality and restrictions of peer-graded assignments. The Gallery Tool allowed us to create a forum where learners upload their drawings and create prompts that ask their peers for specific feedback. We set the tool up to allow learners to filter by type of drawing, such as abstract, animal, building, logo, and nature.

We are already seeing a tremendous range of subject matter in the Gallery, including spider webs, pyramids, U-M logos, nature scenes, and many, many abstract drawings. Learners are asking for feedback on topics such as how to create color effects, how to create specific shapes, and areas for improvement. Interestingly, learners are also asking questions of other learners that relate to skills they have demonstrated in their drawings, such as “How do you fill a shape?”
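That last question has a concise answer in Turtle: wrap the drawing commands in begin_fill()/end_fill(). A minimal sketch, assuming Python 3’s standard turtle module (the helper names here are our own, not from the course):

```python
import math

def polygon_points(sides, radius):
    """Vertices of a regular polygon centered at the origin (pure math,
    so it can be checked without opening a turtle window)."""
    return [
        (radius * math.cos(2 * math.pi * i / sides),
         radius * math.sin(2 * math.pi * i / sides))
        for i in range(sides)
    ]

def draw_filled_polygon(t, sides=5, radius=80, color="purple"):
    """How do you fill a shape? Wrap the drawing in begin_fill()/end_fill()."""
    pts = polygon_points(sides, radius)
    t.fillcolor(color)
    t.penup()
    t.goto(pts[-1])
    t.pendown()
    t.begin_fill()   # everything drawn from here until end_fill() is filled
    for x, y in pts:
        t.goto(x, y)
    t.end_fill()

# Usage (requires a display):
#   import turtle
#   draw_filled_polygon(turtle.Turtle(), sides=6, color="gold")
#   turtle.done()
```

The geometry helper is kept separate from the drawing function so the shape math can be exercised on its own; the drawing function simply replays those points between the two fill calls.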

Abstract drawings of turtles and a multi-colored spiral

Figure 1: Two Turtle drawings, published in the Python Basics course using the Gallery Tool


What does the Gallery Tool do?

Submission page for gallery tool

Figure 2: The Upload Submission screen learners see before submitting a piece for peer feedback.

Learners can upload a text- or image-based artifact, or a link to an artifact in another medium, to the Gallery, where they can also provide a synopsis of the artifact and pose relevant questions to a potential reviewer. In turn, they can browse other learners’ work and provide feedback on it, guided by the very questions their colleagues have posed in association with the work. The Gallery is very much a place of reciprocity; it thrives on learners contributing and receiving meaningful thoughts and reactions from others.

A collaboration across Academic Innovation teams

The creation of this tool was very much a joint effort across Academic Innovation teams, namely the Online Tools Team, who did the heavy lifting of designing and building the tool, the Design Management team, and the Learning Experience Design team. Our former colleague Steve Welsh provided a lot of early guidance on the tool’s design from a learning-experience perspective, and Anita Gonzalez also contributed helpful ideas about its purpose and execution, as well as a thoughtful early critique of a prototype. Together, members of all of these teams met regularly to assess the Gallery’s features, design, and future efficacy when employed in the context of our courses.

One personally eye-opening aspect of the development process was striking a careful balance: designing a robust tool that would be truly effective in Storytelling for Social Change—natural, since that course was the impetus for the tool in the MOOC context—yet easily adaptable to other courses and contexts, from both pedagogical and programming perspectives.

What’s next for the Gallery Tool?

We can see lots of potential for use of this tool in future projects. Essentially, this tool is a forum for learners to participate in a “show and tell” of their work, because it allows them to share creative artifacts and receive feedback from peers. Some courses ask learners to complete a final project. The Gallery Tool would be a great place for learners to share sketches and drafts, and receive responses to questions about their work before they submit their project for summative evaluation. Learners can also browse through previous examples, before beginning work on a challenging project. Some of our courses are hosted on two platforms simultaneously. Since the Gallery Tool works through Learning Tools Interoperability (LTI) integration, the tool could be a bridge between both versions of a course. Learners on Coursera would be able to share work and interact with learners on edX, and vice versa. In sum, Learning Experience Designers and others at Academic Innovation are excited by the flexibility that the tool affords, and are eager to use the tool in situations where learners would benefit from the opportunity to share and showcase their early work with a receptive and constructive audience.


Personalized Electronic Coaching: The Origin of ECoach

Amy Homkes-Hayes, Lead Innovation Advocate

Personalization is a popular concept in and outside of higher education, yet definitions vary, sometimes widely, of what it means to “personalize” educational experiences for students. ECoach, a tailored communication system, uses personalization backed by well-researched behavioral science, smart user experience design, and ongoing software development to help students succeed in large courses. Professor Tim McKay, Arthur F. Thurnau Professor of Physics, Astronomy, and Education and founder of ECoach, explains what it was like for him to grapple with meaningfully and successfully reaching hundreds of students in his large introductory physics course. Listen as Professor McKay talks about the “a-ha” moments that motivated him to create a digital edtech solution that provides the right information, at the right time, and in the right way to his students. Hear Professor McKay examine how ECoach has evolved over time, and what the future may look like for ECoach and for thoughtful, student-centered, technology-driven personalization in higher education.


Take a listen by clicking below.



Infographic: Growth and Adoption of ECoach Across the University of Michigan

Amy Homkes-Hayes, Lead Innovation Advocate

ECoach is a digital platform originally developed by a research team led by Professor Timothy McKay, Arthur F. Thurnau Professor of Physics, Astronomy, and Education, to create a tailored communication system for introductory large-scale courses at the University of Michigan. Currently implemented in courses such as statistics, chemistry, economics, and biology, ECoach provides personalized and timely feedback to students on how to succeed. ECoach content is informed by behavioral science techniques, such as motivational affirmation, and by multiple data streams, including input submitted by students themselves. This digital tool helps learners navigate big (and sometimes overwhelming) classes by providing tailored communication, insights into their progress, and ways to approach common obstacles. By making information more transparent and personalized to each student, the hope is to increase student motivation and engagement with the course. In the past few years, this electronic personal coaching platform has grown immensely, and its use continues to expand.

Since ECoach’s inception, the personal coaching platform has grown to support more than 24,000 students at the University of Michigan and continues to grow, with potential future uses in admissions, student orientation, student life, and career counseling. The accompanying infographic reports:

  • Total number of U-M students who have used ECoach: 24,136 (2,055 in 2015; 8,953 in 2016; 19,313 in 2017; 24,136 by Winter 2018).
  • 44 unique U-M instructors are using or have used ECoach.
  • Types of courses using ECoach: Statistics, Computer Science, Chemistry, Engineering, Biology, Physics, Applied Liberal Arts, and Economics.
  • 56% of currently enrolled undergraduates have used ECoach at some point in their academic careers.

Student testimonial: “I think ECoach is directly responsible for my success in the course…Probably the hardest part of traversing from high school to college was knowing what to do and when, and ECoach really set that up for you. It’s really helped and I feel like as a student who uses it over students who don’t, I definitely had an advantage.” – ECoach Student

Gathering Hands-On Student Feedback with “Pop Up” User Testing on Campus

Ning Wang, Fall/Winter Innovation Advocacy Fellow

At the Office of Academic Innovation, we improve our digital tools through feedback from students and other users. As a former Innovation Advocacy Fellow at Academic Innovation, my work focused on helping to initiate innovative forms of usability testing. In this blog post, I will talk about one form of usability testing we’ve conducted in the past and how it is a valuable means of collecting feedback to inform iterative improvements to our digital tools.

What are “Pop-up” tests and what advantages do they provide?

A table with an Academic Innovation table cover and a pull-up display

Figure 1: “Pop-up” test on north campus.

“Pop-up” tests are an experimental form of usability testing that I worked on from its initial stage during my time with Academic Innovation. Unlike traditional forms – such as one-on-one interviews and focus groups – “pop-up” tests free us from the constraints of small, enclosed meeting spaces and a traditional Q&A format. Instead, they allow researchers to meet students during their daily routines, encouraging more interaction between participants and interviewers. Advantages of this approach include gathering quick feedback from a larger and wider student body in a short period of time, making more students and faculty aware of digital tools developed by Academic Innovation, and ample opportunity to collect feedback. Through these tests we realized that feedback-gathering activities need not be confined to rigorous interviews. Because of the flexibility of the “pop-up” environment, participants can shift from passive to active roles, and their responses and reactions can even change the direction of the activity. With this in mind, we came up with a hands-on activity for a “pop-up” test researching the course page layout of our data visualization tool, Academic Reporting Tools 2.0 (ART 2.0).

Using “pop-up” tests to inform layouts that make the most sense for students

ART 2.0 helps students, faculty, and staff make more informed decisions by providing access to, and analysis of, U-M course and academic program data. By allowing students and faculty to access data on courses and majors from past academic terms, ART 2.0 allows data-driven information to lead toward better decision making and new opportunities at U-M. With this tool, students can see what majors other students like them pursue and what courses they might consider taking the following semester. Many students report that they like to use it alongside Wolverine Access to backpack courses.

Screenshot of the ART 2.0 interface including several examples of data visualization such as bar graphs for grade distributions, enrollment, major, and school/college.

Figure 2: ART 2.0 Course Page.

Although ART 2.0 is already an established website (see Figure 2), we still wanted to learn what layout of information is optimal for student users. I proposed an alternative, hands-on activity to engage student participants instead of a traditional Q&A format for gathering user feedback. To accomplish this, we took the website and created a foam board with the information displayed on the page separated into small components. We put Velcro on the back of these components so students could combine and move the pieces around until they reached the layout that made the most sense to them (see Figure 3). By offering this hands-on activity, it is easier to assess intrinsic factors, like curiosity, instead of only extrinsic factors, such as treats or rewards, in participants’ decision-making process. It is also a “free of fail” activity: unlike a Q&A format, where participants may be embarrassed by not knowing the correct answer to a question, here we expect different people to have different preferences.

As we expected, no two of the 30 samples we collected were identical. Some students preferred a more concise layout, and others proposed combining similar groups of information for a particular class, for example pre-enrollment, co-enrollment, and post-enrollment. From there, we assigned different scores to different areas of the board (upper, middle, lower). Components placed in the upper section received three points, those in the middle section two points, those in the lower section one point, and all others zero points. With this strategy, and our experience interacting with participants, we were able to identify some general patterns:

  • The top three factors students take into consideration when deciding on a course are grade distribution, instructor reviews, and student evaluations.
  • Graduate students pay less attention to school, major, enrollment trends, and grade distribution because they have fewer instructors to choose from.
  • Different schools/colleges also have their own ways of collecting course evaluations, and students wish to see more information tailored to their own school/college.
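The scoring scheme described above is simple enough to sketch in a few lines of Python; the placements below are hypothetical examples, not data from the study:

```python
# Points awarded by board section: upper = 3, middle = 2, lower = 1,
# anything else (off the board) = 0.
SECTION_POINTS = {"upper": 3, "middle": 2, "lower": 1}

def score_component(placements):
    """Total the points one layout component earned across all participant boards."""
    return sum(SECTION_POINTS.get(section, 0) for section in placements)

# One hypothetical component, as placed by four participants:
placements = ["upper", "upper", "middle", "off-board"]
print(score_component(placements))  # 3 + 3 + 2 + 0 = 8
```

Summing these scores per component gives a rough ranking of which pieces of information students consider most important when choosing a course.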

During this first round of hands-on, “pop-up” usability testing, we were able to gather valuable feedback while identifying a process that we could keep improving upon. We are confident in the advantages of a substantial user pool and in the feedback collected locally by U-M students. Through this process, we hope Academic Innovation will keep creating and improving tools that best serve students.

A poster with Velcro strips on a table with smaller laminated examples of data visualizations scattered next to it.

Figure 3: “Pop-up” test.

Origin Stories Showcases M-Write

Amy Homkes-Hayes, Lead Innovation Advocate

What happens when an English faculty member and a Chemistry faculty member partner to create a writing-to-learn program? You get M-Write.

M-Write logo above an illustration of a microphone with text that reads “Origin Stories Podcast Series.”

Listen to the latest episode in the Origin Stories podcast as Anne Ruggles Gere, Arthur F. Thurnau Professor, Gertrude Buck Collegiate Professor of Education and English Language and Literature, Director of the Sweetland Writing Center, and President of the Modern Language Association, and Ginger Schultz, Assistant Professor of Chemistry, discuss how they came together, from disparate fields, to create the M-Write program. Hear how M-Write uses pedagogy and the creation of software tools to help students use writing exercises to learn science, economics, and engineering concepts in large STEM courses. Professors Gere and Schultz talk to us about how they partnered with the Office of Academic Innovation to help scale M-Write, and explore their long-term plans for the program.

Take a listen by clicking below.

Scaling Homegrown Educational Technology

Amy Homkes-Hayes, Lead Innovation Advocate

“Let’s take a leadership role in edtech since it’s part of our core business,” said Candace Thille, former Assistant Professor of Education at Stanford and current Director of Learning Science and Engineering at Amazon, in a recent webinar on the evolving role of faculty in an era of increasing digital education technology. This is what we are doing in the Office of Academic Innovation, where we launched a “homegrown” edtech accelerator that’s building and scaling digital pedagogy within and beyond the University of Michigan.

What happens to faculty technological innovation?

In many instances its usefulness does not extend beyond the academic department in which it was born. Why? Because the infrastructure does not exist to scale it. The Office of Academic Innovation solves this problem by providing a team of software developers, user experience designers, and behavioral scientists who work with faculty champions to iterate quality educational technology. The Office of Academic Innovation, then, can do what faculty and departments cannot do on their own: grow educational technology tools from innovation to infrastructure, personalizing education at scale.

What’s the Office of Academic Innovation Building?

We started building software in 2015, and currently have seven tools in our portfolio. Although diverse, these tools all center on the intersection of teaching, learning, and technology. Some, like ViewPoint, make it easier for faculty to implement simulation pedagogy. ViewPoint takes what was a paper-and-pencil process and turns it into a web-based application that lets instructors plan, and students execute, a deep learning experience. Hear more about the origins of ViewPoint in my recent podcast with ViewPoint creator, Dr. Elisabeth Gerber.

Screenshot of the ViewPoint interface including left-hand navigation and windows for "my role," "groups and roles," "my schedule" and "news."


Others, like ECoach, are tailored communication systems that provide individualized messages to students in large courses, increasing engagement with, and ultimately the academic success of, learners. ECoach uses complementary data streams, including institutional data, course data, and data students submit themselves, to provide timely and personalized messages on how to navigate big classes where faculty cannot provide ample individualized attention to every student (our Statistics 250 class, for example, has upwards of 2,000 students enrolled every fall and winter term).

Screenshot of the ECoach interface with a personalized message for a user named "Kathleen" including a course grade scale showing a a current grade of 83 percent and a goal of 96 percent, a "to do" list for the week of October 17, quick links, and a message center.


ART 2.0 visualizes course and instructor data in meaningful ways to help guide class discovery and selection. ART 2.0 helps bust myths on the University of Michigan campus about things like “the workload in this class is overwhelming” or “no one ever gets an A in this course.”

A screenshot of the ART 2.0 interface for a Physics 140 class including a description of the course, a grade distribution bar graph, advisory prerequisites, enforced prerequisites, credits, and course evaluations.


M-Write, founded on writing-to-learn pedagogy, uses smart software to more easily implement writing exercises in large STEM courses. The M-Write team has developed and implemented a dashboard and process for peer reviews so students can evaluate one another’s work on concepts in courses like chemistry and economics. M-Write integrates seamlessly with the Canvas Learning Management System for a positive user experience for students.

A screenshot of the M-Write interface showing a peer review page for a student including a due date, word count, and topics covered as well as boxes detailing the reviews given and reviews received.


These examples showcase the breadth of technological innovation happening in the Office of Academic Innovation, while not minimizing its focus on improving teaching and learning through digital intervention.

How does the Office of Academic Innovation increase the number of faculty and courses employing its tools while ensuring the pedagogy on which our tools rely scales in parallel with the technology?

In the Office of Academic Innovation we grapple with this question as we try to learn from diverse strategies that increase faculty engagement with our digital tools. When we talk about our work and our team, we say we blend thought partnership with exemplary service. This approach embodies how we work with faculty adopting our tools. Some teams, like ECoach, invite faculty to work with them frequently throughout an academic term on the content and cadence of the messages delivered to students. Other teams, like GradeCraft, host communities of practice to ensure that pedagogy and technology are not divorced from one another as our user bases expand. We know, for example, that core tenets of gameful course design include giving students many choices in course assignments and helping them try (and sometimes fail at) assignments without jeopardizing their grade. Not only do we offer faculty who collaborate with us opportunities to help inform new iterations of our tools, but we also use our software to conduct teaching and learning research. For example, we have used data collected in tools like Problem Roulette and ECoach to study gender performance differences in STEM courses.

Our approaches to partnering with faculty to scale our digital educational technology will continue to expand as our user base does too, positively changing the teaching and learning landscape at the University of Michigan in its third century.

Amy Homkes-Hayes will present on “Growing Digital Pedagogy in the Office of Academic Innovation at the University of Michigan” this week at the 2018 OLC Innovate Conference in Nashville, TN where her talk has been awarded “Best in Track” in Effective Tools, Toys, and Technologies.

Get to Know ViewPoint: Next Up in the Origin Stories Podcast Series

Amy Homkes-Hayes, Lead Innovation Advocate

ViewPoint. Origin Stories podcast series.

What happens when a U-M faculty member comes to the Office of Academic Innovation with a good idea for software that doesn’t exist? We build it. In our next episode of the Origin Stories podcast series we talk to Dr. Elisabeth Gerber, Jack L. Walker, Jr. Professor of Public Policy and Associate Dean for Research and Policy Engagement at the Gerald R. Ford School of Public Policy, who came to the Office of Academic Innovation with an idea to build software to support simulation pedagogy. For those new to Origin Stories, we tell the tales of why and how our digital edtech tools were created. We focus on learning from our faculty partners about how their ideas for improving teaching and learning through educational technology software came to be.

Dr. Gerber talks to us about the “a-ha” moment she had when planning one of her largest role-playing simulations, the Integrated Policy Exercise, and how that moment spurred her to reach out to us, where ViewPoint was taken from concept to reality. Listen to Dr. Gerber talk about ViewPoint’s features and how it makes planning and running simulations far easier, producing better learning experiences for students. Dr. Gerber shares what it is like working with the Office of Academic Innovation team, and her ideas for the future of ViewPoint. Take a listen by clicking below.

Bringing Behavioral Science to Teaching and Learning Innovation

Amy Homkes-Hayes, Lead Innovation Advocate

Holly Derry, Associate Director of Behavioral Science

Carly Thanhouser, Behavioral Scientist

Molly Maher, Behavioral Scientist

Behind every good learner is good behavior.

We continue to learn that much of the success students and online learners experience comes from choosing the right behaviors, but that doesn’t make choosing easy. That’s why the Office of Academic Innovation is dedicated to integrating behavioral science principles into teaching and learning. Amy Homkes-Hayes, Lead Innovation Advocate, sat down with Holly Derry, Associate Director of Behavioral Science, to explore how she and her team support Academic Innovation’s portfolio of work including educational technology software, MOOCs, and more.

How would you describe behavioral science at Academic Innovation?

We use behavioral science to motivate people or spur behavior change. We draw from a collection of strategies and techniques that have been studied in lab settings or in the field, and apply them to Academic Innovation’s tools. In our work, for example, we focus on how we want learners to behave in order to be successful in a course. We look through our behavioral science toolbox for ideas on how to keep learners accessing the material every week, completing assignments on time, or engaging on a deeper level in a discussion board. These are just a few examples of where we apply behavioral science in our work at Academic Innovation.


Behavioral science is often used in healthcare settings, but it’s fairly new to edtech. What have you noticed about health behavior change versus behavior change in education?

We’ve been surprised by the similarities. In health and in education, people are working toward goals. In both contexts, some goals are concrete (think: scores) while others are more conceptual (think: feeling better or studying harder). The actions people take to reach their goals might be different, but the ways to motivate people toward these goals are not.

In addition, a lot of the action in healthcare, and education, happens outside of the classroom, or clinical setting. Both patients and learners need self-regulation skills, sustained motivation, and a sense of belonging to continue their health management or learning independent of physicians or faculty.


I hear you talk a lot about how central motivation is to behavioral science. Why?

Anything anyone does is motivated by something. It’s our job to find that something. ECoach, for example, uses both extrinsic and intrinsic motivators to help students succeed in large courses. While tapping into intrinsic motivation may have a longer-lasting and more meaningful impact, it’s not always a luxury we can rely on. (How many people are intrinsically motivated to read the textbook?) The extrinsic motivators we have access to (grade feedback, credit, and extra credit) are often better at encouraging students to do something they may not see obvious value in at first. For example, after students use ECoach’s Exam Playbook, a metacognitive tool that helps them strategize and plan out their use of study resources before an exam, they may become intrinsically motivated to use it again, but we rely on extra credit to encourage them to give it a first try.

In the end, we often rely on a blend of intrinsic and extrinsic ways to motivate learners because we know both are valuable in different ways.


How do you account for differences in intrinsic motivation among learners? In other words, how do you determine what motivates them?

This is where personalization comes in. Each person is different, and when possible, it’s best to offer people choices that will fit with their situation, needs, and desires. This might mean giving people choices on whether they get credit or not, which parts of a course they complete, the time they need to complete it, and so on.

We can also go to the other end of the personalization spectrum and present students with messages written for their specific characteristics. All we need is the data. A few of our projects are powered by the Michigan Tailoring System (MTS), tailoring software developed here at the University of Michigan in 2008. Our projects that use MTS can collect data from people and then deliver a tailored experience to match why they’re taking a course, what grade they want to get, how motivated they are to get that grade, and so on. We can really tailor anything we want, and soon (summer 2018) all of Academic Innovation’s tools will be able to incorporate MTS and tailoring as well.

Many people assume the tailored messages are computer-generated — they’re not. We’re not using robots to do this. All of the tailored communication and interventions we use are carefully crafted by the behavioral science team.
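To make that concrete, a hand-authored, rule-based tailoring step might look like the sketch below. The field names, rules, and message text are invented for illustration; this is not MTS code:

```python
# Hand-written message variants: a person authored every string here; the
# software only selects one and fills in the student's own data.
MESSAGES = {
    "high": "You told us you're motivated to reach your goal grade of {goal}. "
            "Here's this week's plan to stay on track.",
    "default": "A grade of {goal} is within reach. Small weekly habits, like "
               "planning your study resources before each exam, get you there.",
}

def tailor_message(profile):
    """Choose and fill a message variant from student-submitted data."""
    template = MESSAGES.get(profile.get("motivation"), MESSAGES["default"])
    return template.format(goal=profile.get("goal_grade", "B"))

# A highly motivated student aiming for an A gets the "high" variant:
print(tailor_message({"goal_grade": "A", "motivation": "high"}))
```

The point of the sketch is the division of labor: people write every variant, and the system matches each student to one based on the data they submitted.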


How do you guide people toward successful behaviors without telling them what to do?

We respect a learner’s right to make their own choices. We see our role as helping people make informed choices. We don’t tell learners what their goals should be, and we don’t tell them what steps they should take. We may say, “here’s what the research says about the best ways of doing something, and here are recommendations you could follow based on that research,” but then it’s up to each learner what they do next. We never say “you must do this or can’t do that.”

We use behavioral science and tailoring to grab people’s attention, to help them see what’s most relevant to them, and to help them make informed choices. We want to share best practices and help people make decisions that are right for them.


What kinds of methods and techniques are you using?

We use methods from many fields, including Public Health, Behavioral Economics, Social Psychology, and User Experience. We rely heavily on Motivational Interviewing (MI), a counseling strategy that has its roots in substance abuse counseling but has expanded into many areas including education. It’s based on the notion that knowledge does not, in and of itself, change behavior. Motivational Interviewing (along with the ability to tailor messages) enables us to find out what motivates people, to build discrepancy between their desired behavior and actual behavior, and to help them more clearly see the path toward behavior change.

We also often use the behavior change recipe from the book Switch, by Chip and Dan Heath. They use the analogy of an elephant (motivation), a rider (logic and reason), and a path (the environment). The goal is to motivate the elephant (because otherwise the rider gets exhausted), direct the rider (so a motivated elephant doesn’t walk in circles), and clear the path (so that the two can get to where they need to go). This is exactly what we’re trying to do here in Academic Innovation by applying behavioral science to our suite of tools, experiences, and opportunities.