
Gallery Tool Unlocks Peer Feedback Possibilities for MOOC Learners

James Park, Design Manager

Rebecca M. Quintana, Learning Experience Design Lead
@rebquintana

During our team’s work on the Storytelling for Social Change course with University of Michigan (U-M) faculty member Anita Gonzalez, we recognized the need for a tool that would let learners share their text- or image-based coursework with other learners in an easy, open way and receive robust feedback on it in return. Because of the nature of the learners’ work in the course, and because of the expectations and experiences we wanted them to have, we sought an alternative to the course platform’s peer-review tool: one where both work and feedback could be shared more freely, and that prioritized high-quality interactions (especially dialogue) over numerical scores and one-way assessment. Ultimately, we ended up with a Gallery that facilitates this sort of learner interaction and empowers learners to share a work, or multiple works, without fear that criticism of their work or the particular “rules” of a peer-review tool would impede their successful progress in the course.

The Gallery Tool is being used not only in the Storytelling for Social Change course, but also in our Python Basics course, which introduces the basics of Python 3. For Python Basics, we wanted learners to have an opportunity to practice their Turtle programming skills and submit their work for peer feedback. We wanted a lightweight option, something that would allow learners to share their work in a “low-stakes” environment, without the formality and restrictions of peer-graded assignments. The Gallery Tool allowed us to create a forum where learners can upload their drawings and create prompts that ask their peers for specific feedback. We set the tool up to allow learners to filter by type of drawing, such as abstract, animal, building, logo, and nature.

We are already seeing a tremendous range of subject matter in the Gallery, including spider webs, pyramids, U-M logos, nature scenes, and many, many abstract drawings. Learners are asking for feedback on topics such as how to create color effects, how to create specific shapes, and areas for improvement. Interestingly, learners are also asking questions of other learners that relate to skills they have demonstrated in their drawings, such as “How do you fill a shape?”
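
To make this kind of submission concrete, here is a minimal Turtle sketch of the sort learners might upload, touching on two of the questions above: creating a color effect and filling a shape. The specific shape, colors, and commands are illustrative choices, not drawn from any learner’s actual submission.

```python
import turtle

# A small abstract spiral, the kind of drawing learners share in the Gallery.
pen = turtle.Turtle()
pen.speed(0)

colors = ["red", "orange", "gold", "green", "blue", "purple"]

# Color effect: cycle through pen colors as the spiral grows.
for i in range(60):
    pen.pencolor(colors[i % len(colors)])
    pen.forward(i * 3)
    pen.left(59)

# Filling a shape: wrap the drawing commands in begin_fill()/end_fill().
pen.penup()
pen.goto(120, -120)
pen.pendown()
pen.fillcolor("gold")
pen.begin_fill()
for _ in range(4):  # a filled square
    pen.forward(60)
    pen.left(90)
pen.end_fill()

turtle.done()
```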

Figure 1: Two Turtle drawings (abstract turtle designs and a multi-colored spiral), published in the Python Basics course using the Gallery Tool.

 

What does the Gallery Tool do?

Figure 2: The Upload Submission screen learners see before submitting a piece for peer feedback.

Learners can upload a text- or image-based artifact, or a link to an artifact in another medium, to the Gallery, where they can also provide a synopsis of the artifact and pose questions they would like a reviewer to address. In turn, they can browse other learners’ work and provide feedback on it, guided by the very questions their colleagues have posed alongside the work. The Gallery is very much a place of reciprocity; it thrives on learners contributing and receiving meaningful thoughts and reactions from others.

A collaboration across Academic Innovation teams

The creation of this tool was very much a joint effort across Academic Innovation teams: the Online Tools team, which did the heavy lifting of designing and building the tool; the Design Management team; and the Learning Experience Design team. Our former colleague Steve Welsh provided a lot of early guidance on the tool’s design from a learning-experience perspective, and Anita Gonzalez also contributed helpful ideas about its purpose and execution, as well as a thoughtful early critique of a prototype. Members of all of these teams met regularly to assess the Gallery’s features, design, and likely efficacy in the context of our courses.

One personally eye-opening aspect of the development process was striking a careful balance between designing a robust tool that would be truly effective in Storytelling for Social Change (naturally, since that course was the impetus for the tool in the MOOC context) and one that would be easily adaptable to other courses and contexts, from both pedagogical and programming perspectives.

What’s next for the Gallery Tool?

We see lots of potential for this tool in future projects. Essentially, it is a forum for learners to participate in a “show and tell” of their work, because it allows them to share creative artifacts and receive feedback from peers. Some courses ask learners to complete a final project; the Gallery Tool would be a great place for learners to share sketches and drafts, and to receive responses to questions about their work, before they submit the project for summative evaluation. Learners can also browse previous examples before beginning work on a challenging project. Some of our courses are hosted on two platforms simultaneously. Since the Gallery Tool works through Learning Tools Interoperability (LTI) integration, it could serve as a bridge between the two versions of a course: learners on Coursera would be able to share work and interact with learners on edX, and vice versa. In sum, Learning Experience Designers and others at Academic Innovation are excited by the flexibility the tool affords and are eager to use it in situations where learners would benefit from the opportunity to share and showcase their early work with a receptive and constructive audience.
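
Because LTI is what makes that cross-platform embedding possible, here is a minimal, hypothetical sketch of how an LTI 1.1 launch request is signed with OAuth 1.0 using the oauthlib package. The consumer key, shared secret, tool URL, and launch parameters are placeholders rather than the Gallery Tool’s actual configuration; in practice the platform (Coursera or edX) signs the launch and the tool verifies the signature on receipt.

```python
from urllib.parse import urlencode

from oauthlib.oauth1 import Client, SIGNATURE_TYPE_BODY

# Hypothetical LTI 1.1 launch parameters the platform would POST to the tool.
launch_params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "gallery-week-3",  # placeholder resource id
    "user_id": "learner-123",              # placeholder learner id
    "roles": "Learner",
}

# Placeholder credentials shared between the platform and the tool.
client = Client(
    client_key="example-consumer-key",
    client_secret="example-shared-secret",
    signature_type=SIGNATURE_TYPE_BODY,    # LTI 1.1 signs the form body
)

# Sign the form-encoded launch body with HMAC-SHA1; the receiving tool
# recomputes the signature to confirm the launch came from a trusted platform.
uri, headers, body = client.sign(
    "https://tool.example.edu/lti/launch",  # placeholder launch URL
    http_method="POST",
    body=urlencode(launch_params),
    headers={"Content-Type": "application/x-www-form-urlencoded"},
)
print(body)  # includes oauth_consumer_key, oauth_signature, and the LTI fields
```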

 

Personalized Electronic Coaching: The Origin of ECoach

Amy Homkes-Hayes, Lead Innovation Advocate
@amynhayes

Personalization is a popular concept inside and outside of higher education, yet definitions of what it means to “personalize” educational experiences for students vary, sometimes widely. ECoach, a tailored communication system, uses personalization backed by well-researched behavioral science, smart user experience design, and ongoing software development to help students succeed in large courses. Professor Tim McKay, Arthur F. Thurnau Professor of Physics, Astronomy, and Education and founder of ECoach, explains what it was like to grapple with meaningfully and successfully reaching hundreds of students in his large introductory physics course. Listen as Professor McKay talks about the “a-ha” moments that motivated him to create a digital educational technology solution to provide the right information, at the right time, and in the right way to his students. Hear Professor McKay examine how ECoach has evolved over time, and what the future may look like for ECoach and for thoughtful, student-centered, technology-driven personalization as part of the future of higher education.

 

Take a listen by clicking below.

 

 

Infographic: Growth and Adoption of ECoach Across the University of Michigan

Amy Homkes-Hayes, Lead Innovation Advocate
@amynhayes

ECoach is a digital platform originally developed by a research team led by Professor Timothy McKay, Arthur F. Thurnau Professor of Physics, Astronomy, and Education, to create a tailored communication system for large introductory courses at the University of Michigan. Currently implemented in courses such as statistics, chemistry, economics, and biology, ECoach provides personalized and timely feedback to students on how to succeed. ECoach content is informed by behavioral science techniques, such as motivational affirmation, and by multiple data streams, including input submitted by students themselves. This digital tool helps learners navigate big (and sometimes overwhelming) classes by providing tailored communication, insights into their progress, and ways to approach common obstacles. By making information more transparent and personalized to each student, the hope is to increase student motivation and engagement with the course. In the past few years, this electronic personal coaching platform has grown immensely, and its use continues to expand.


Since ECoach’s inception, the personal coaching platform has grown to support more than 24,000 students at the University of Michigan and continues to grow, with potential future uses in admissions, student orientation, student life, and career counseling. The infographic’s key figures:

  • Total number of U-M students who have used ECoach: 24,136.
  • U-M students using ECoach by year: 2,055 in 2015; 8,953 in 2016; 19,313 in 2017; 24,136 in Winter 2018.
  • 44 unique U-M instructors are using or have used ECoach.
  • Types of courses using ECoach: Statistics, Computer Science, Chemistry, Engineering, Biology, Physics, Applied Liberal Arts, Economics.
  • 56% of currently enrolled undergraduates have used ECoach at some point in their academic careers.
  • Student testimonial: “I think ECoach is directly responsible for my success in the course…Probably the hardest part of traversing from high school to college was knowing what to do and when, and ECoach really set that up for you. It’s really helped and I feel like as a student who uses it over students who don’t, I definitely had an advantage.”

Gathering Hands-On Student Feedback with “Pop Up” User Testing on Campus

Ning Wang, Fall/Winter Innovation Advocacy Fellow
@nwangsto

At the Office of Academic Innovation, we improve our digital tools through feedback from students and other users. As a former Innovation Advocacy Fellow at Academic Innovation, my work focused on helping to initiate innovative forms of usability testing. In this blog post, I will talk about one form of usability testing we have conducted and how it is a valuable means of collecting feedback to inform iterative improvements to our digital tools.

What are “Pop-up” tests and what advantages do they provide?

Figure 1: A “pop-up” test on North Campus, with an Academic Innovation table cover and a pull-up display.

“Pop-up” tests are an experimental form of usability testing that I worked on from its initial stage during my time with Academic Innovation. Unlike traditional formats, such as one-on-one interviews and focus groups, “pop-up” tests free us from the constraints of small, enclosed meeting spaces and a traditional Q&A format. Instead, they let researchers meet students during their daily routines, encouraging more interaction between participants and interviewers. Advantages of this type of activity include gathering quick feedback from a larger and wider student body in a short period of time and making more students and faculty aware of the digital tools developed by Academic Innovation. Through these tests we realized that the activities used to gather feedback need not be confined to rigid interviews. Because the environment of a “pop-up” test is so flexible, participants can shift from passive respondents to active contributors whose responses and reactions can even change the direction of the activity. With that in mind, we came up with a hands-on activity for a “pop-up” test researching the course-page layout of the data visualization tool Academic Reporting Tools 2.0 (ART 2.0).

Using “pop-up” tests to inform layouts that make the most sense for students

ART 2.0 helps students, faculty, and staff make more informed decisions by providing access to, and analysis of, U-M course and academic program data. By allowing students and faculty to access data on courses and majors from past academic terms, ART 2.0 lets data-driven information lead to better decision making and new opportunities at U-M. With this tool, students can see which majors other students like them pursue and which courses they might consider taking the following semester. Many students report that they like to use it with Wolverine Access to backpack courses.

Figure 2: The ART 2.0 course page, showing data visualizations such as bar graphs for grade distributions, enrollment, major, and school/college.

Although ART 2.0 is already an established website (see Figure 2), we still wanted to learn what layout of information is optimal for student users. I proposed a hands-on alternative to the traditional Q&A format for gathering user feedback, one that would engage student participants directly. We took the website and created a foam board with the information displayed on the page separated into small components. We put Velcro on the back of these components so students could combine and rearrange the pieces until they reached the layout that made the most sense to them (see Figure 3). This hands-on activity makes it easier to assess intrinsic factors, like curiosity, rather than only extrinsic factors, such as treats or rewards, in students’ decision-making process. It is also a “free of fail” activity for participants: different people simply have different preferences, in contrast to a Q&A format, where participants may be embarrassed by not knowing the correct answer to a question.

As we expected, no two of the 30 samples we collected were identical. Some students preferred a more concise layout, and others proposed combining similar groups of information (for example, pre-enrollment, co-enrollment, and post-enrollment data for a particular class). From there, we assigned different scores to different areas of the board (upper, middle, lower). Components placed in the upper section received three points, the middle section two points, the lower section one point, and all others zero points. With this scoring strategy (sketched in code after the list below), and our experience interacting with participants, we were able to identify some general patterns:

  • The top three factors students take into consideration when deciding on a course are grade distribution, instructor reviews, and student evaluations.
  • Graduate students pay less attention to school, major, enrollment trends, and grade distribution because they have fewer instructors to choose from.
  • Different schools/colleges also have their own ways of collecting course evaluations, and students wish to see more information tailored to their own school/college.
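
To make the tallying step concrete, here is a minimal sketch of the scoring scheme described above. The component names and placements are hypothetical sample data; only the zone weights (three, two, one, and zero points) come from the actual activity.

```python
from collections import defaultdict

# Zone weights from the "pop-up" activity: components placed higher on the
# board are treated as more important to the student who arranged them.
ZONE_POINTS = {"upper": 3, "middle": 2, "lower": 1}  # anything else scores 0

# Hypothetical placements: one dict per participant board, mapping each
# information component to the zone where the student placed it.
sample_boards = [
    {"grade distribution": "upper", "instructor reviews": "upper", "enrollment trends": "lower"},
    {"grade distribution": "upper", "student evaluations": "middle", "enrollment trends": "off-board"},
    {"instructor reviews": "middle", "student evaluations": "upper", "grade distribution": "middle"},
]

def score_boards(boards):
    """Sum zone points for each component across all participant boards."""
    totals = defaultdict(int)
    for board in boards:
        for component, zone in board.items():
            totals[component] += ZONE_POINTS.get(zone, 0)
    return totals

# Components with the highest totals are the ones students, in aggregate,
# placed most prominently on their preferred layouts.
for component, points in sorted(score_boards(sample_boards).items(),
                                key=lambda item: item[1], reverse=True):
    print(f"{component}: {points}")
```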

During this first round of hands-on, “pop-up” usability testing, we were able to gather valuable feedback while identifying a process we can keep improving. We are confident in the advantages of a substantial user pool and in feedback collected locally from U-M students. Through this process, we hope Academic Innovation will keep creating and improving tools that best serve students.

Figure 3: The “pop-up” test setup, a poster with Velcro strips and smaller laminated data visualization components for students to arrange.