
Gathering Hands-On Student Feedback with “Pop Up” User Testing on Campus

Ning Wang, Fall/Winter Innovation Advocacy Fellow
@nwangsto

At the Office of Academic Innovation, we improve our digital tools through feedback from students and other users. As an Innovation Advocacy Fellow at Academic Innovation, my work focused on helping to initiate innovative forms of usability testing. In this blog post, I will discuss one form of usability testing we’ve conducted and how it serves as a valuable means of collecting feedback to inform iterative improvements to our digital tools.

What are “Pop-up” tests and what advantages do they provide?

A table with an Academic Innovation table cover and a pull-up display

Figure 1: “Pop-up” test on north campus.

“Pop-up” tests are an experimental form of usability testing that I worked on from its initial stages during my time with Academic Innovation. Unlike traditional formats, such as one-on-one interviews and focus groups, “pop-up” tests free us from the constraints of small, enclosed meeting spaces and a traditional Q&A format. Instead, they let researchers meet students during their daily routines, encouraging more natural interaction between participants and interviewers. Advantages of this type of activity include gathering quick feedback from a larger, more diverse student body in a short period of time and making more students and faculty aware of the digital tools developed by Academic Innovation. Through these tests we realized that feedback-gathering activities need not be confined to rigid interviews. Because of the flexibility of the “pop-up” environment, participants can shift from passive respondents to active contributors whose responses and reactions can even change the direction of the activity. With that in mind, we designed a hands-on activity for a “pop-up” test researching the course page layout of the data visualization tool Academic Reporting Tools 2.0 (ART 2.0).

Using “pop-up” tests to inform layouts that make the most sense for students

ART 2.0 helps students, faculty, and staff make more informed decisions by providing access to, and analysis of, U-M course and academic program data. By allowing students and faculty to explore data on courses and majors from past academic terms, ART 2.0 lets data-driven information lead to better decision making and new opportunities at U-M. With this tool, students can see what majors other students like them pursue and what courses they might consider taking the following semester. Many students report that they like to use it alongside Wolverine Access to backpack courses.

Screenshot of the ART 2.0 interface including several examples of data visualization such as bar graphs for grade distributions, enrollment, major, and school/college.

Figure 2: ART 2.0 Course Page.

Although ART 2.0 is already an established website (see Figure 2), we still want to learn what layout of information is optimal for student users. I proposed an alternative, hands-on activity to engage student participants instead of a traditional Q&A format for gathering user feedback. To accomplish this, we took the website and created a foam board with the information displayed on the page separated into small components. We put Velcro on the back of these components so students could combine and rearrange the pieces until they reached the layout that made the most sense to them (see Figure 3). This hands-on approach makes it easier to assess intrinsic factors, like curiosity, rather than only extrinsic factors, such as treats or rewards, in participants’ decision-making process. It is also a “free of fail” activity: unlike a Q&A format, where participants may be embarrassed by not knowing the correct answer to a question, there are no wrong layouts, since different people simply have different preferences.

As we expected, no two of the 30 samples we collected were identical. Some students preferred a more concise layout, and others proposed combining similar groups of information, for example pre-enrollment, co-enrollment, and post-enrollment for a particular class. From there, we assigned different scores to different areas of the board (upper, middle, lower). Components placed in the upper section received three points, the middle section two points, and the lower section one point; all others received zero points. With this strategy, and our experience interacting with participants, we were able to identify some general patterns (a sketch of the scoring arithmetic follows the list below):

  • The top three factors students take into consideration when deciding on a course are grade distribution, instructor reviews, and student evaluations.
  • Graduate students pay less attention to school, major, enrollment trends, and grade distribution because they have fewer instructors to choose from.
  • Different schools/colleges also have their own ways of collecting course evaluations, and students wish to see more information tailored to their own school/college.
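For readers curious about the tallying, here is a minimal Python sketch of the scoring described above. The component names and placements are hypothetical examples, not our real data; only the point values (3/2/1/0 by board section) come from the description above.

```python
from collections import defaultdict

# Points awarded by board section; components left off the board score 0.
SECTION_POINTS = {"upper": 3, "middle": 2, "lower": 1}

# One dict per participant board, mapping component -> section it was placed in.
# These placements are hypothetical, for illustration only.
samples = [
    {"grade distribution": "upper", "instructor reviews": "upper", "enrollment trends": "lower"},
    {"grade distribution": "upper", "student evaluations": "middle", "school/college": "lower"},
]

def rank_components(boards):
    """Total each component's placement points across all participant boards."""
    totals = defaultdict(int)
    for board in boards:
        for component, section in board.items():
            totals[component] += SECTION_POINTS.get(section, 0)
    # Components with the highest totals were placed most prominently most often.
    return sorted(totals.items(), key=lambda item: item[1], reverse=True)

for component, score in rank_components(samples):
    print(f"{component}: {score}")
```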

During this first round of hands-on “pop-up” usability testing, we were able to gather valuable feedback while identifying a process that we can keep improving upon. We are confident in the advantages of a substantial user pool and in feedback collected locally from U-M students. Through this process, we hope Academic Innovation will keep creating and improving tools that best serve students.

A poster with Velcro strips on a table with smaller laminated examples of data visualizations scattered next to it.

Figure 3: “Pop-up” test.

Origin Stories Showcases M-Write

Amy Homkes-Hayes, Lead Innovation Advocate
@amynhayes

What happens when an English faculty member and a Chemistry faculty member partner to create a writing-to-learn program? You get M-Write.

M-Write logo above an illustration of a microphone with text that reads "Origin Stories Podcast Series."

Listen to the latest episode of the Origin Stories podcast as Anne Ruggles Gere, Arthur F. Thurnau Professor, Gertrude Buck Collegiate Professor of Education and English Language and Literature, Director of the Sweetland Writing Center, and President of the Modern Language Association, and Ginger Schultz, Assistant Professor of Chemistry, discuss how they came together from disparate fields to create the M-Write program. Hear how M-Write combines pedagogy and software tools to help students use writing exercises to learn science, economics, and engineering concepts in large STEM courses. Professors Gere and Schultz talk to us about how they partnered with the Office of Academic Innovation to help scale M-Write, and explore their long-term plans for the program.

Take a listen by clicking below.

Exciting New ART 2.0 Features for Winter Backpacking and Beyond

Amy Homkes-Hayes, Lead Innovation Advocate
@amynhayes

Sharing the Curricular History of U-M

Academic Reporting Tools (ART 2.0) has lived in the Office of Academic Innovation since 2015. During its time with us, faculty champion and ART 2.0 evangelist Dr. August Evrard, Arthur F. Thurnau Professor of Physics and Astronomy, and the ART 2.0 team have iterated on what we show the U-M community. Since March 2016, CourseProfile has enabled students to view course history, including enrollment in a course by school or college and pre-, concurrent, and post-course selections. In addition, CourseProfile shows a subset of Student Evaluation of Teaching (SET) data on topics such as how much (or how little) students wanted to take the course and whether it increased their interest in the subject. Starting in November 2016, the InstructorInfo deck was added, enabling students to view a subset of SET questions on U-M faculty spanning topics from clarity to creating a respectful classroom environment. In Fall 2017, we released MajorMetrics, which includes a timeline of how many students have graduated with a particular degree (referred to as a major for undergraduate students) in the last 10 years, as well as statistics on joint degrees (co-majors) and minors.

In each iteration of ART 2.0, we revisit the tool’s original vision and mission: to promote a deep, shared U-M curricular history with our community and to aid students in exploring, discovering, and selecting courses. We are enthused to share that since Fall 2017, more than 21,000 U-M community members have accessed the tool 230,000 times, demonstrating the sustained growth of the ART 2.0 service. As we look to the future of ART 2.0, we are eager to announce new features that align with our mission and respond to student feedback.

New ART 2.0 Iterations

Illustration of a laptop screen with icons representing grade distributions.

The first exciting new addition to ART 2.0 is grade distributions. Now, when users access CourseProfile, they will see grade distributions for many U-M courses. In fact, of the 500 most searched-for courses in ART 2.0, we have grade distribution data on 496. Showing grade distributions in a university-sanctioned tool is something students have asked for since ART 2.0 launched in 2016. Students tell us that seeing grades in combination with the perceived workload of a course (something we also show in ART 2.0) helps them make decisions about their course schedule. Of equal importance, showing grades in our service helps “bust myths” about classes, dispelling notions like “no one gets an A in this course.” We are eager for students to use grade distribution data, alongside the other rich data in ART 2.0, to inform their class exploration and decision-making.

The second new ART 2.0 feature worth noting is the feedback feature we have added to the tool. Users can now tell us whether they had a positive or negative experience with ART 2.0, as well as leave comments for the ART 2.0 team. We want to hear from our users, and we plan to use the feedback we receive when evaluating additional ART 2.0 features.

Finally, we have improved the ART 2.0 search functionality so users can easily toggle between searching for courses, instructors, or majors. As with all of the software we build in the Office of Academic Innovation, we took user feedback on the search experience in ART 2.0 and made improvements, making the search process simpler and less confusing for our users.

Check out ART 2.0 and the new features we’ve added, and feel free to give us some feedback too. We will keep improving the tool in service of the university and its community.
