
Using Data to Inform Course Development and Assess Impact in a Hybrid Course

Cathy Hearn, Fall & Winter Course Advocate

Rebecca Quintana, Learning Experience Designer
@rebquintana

This is the fifth blog post in a series on the School of Education’s 2018 Winter Cohort initiative. In the first post in this series, Professor Donald Peurach introduced the 2018 Winter Cohort: a learning experience in which University of Michigan graduate students collaborated with online learners from across the globe to complete content from the School of Education’s MicroMasters program in Leading Educational Innovation and Improvement.

In this post, Cathy Hearn and Rebecca Quintana discuss research efforts connected to the 2018 Winter Cohort Experience. Faculty and students from the School of Education and staff from the Office of Academic Innovation formed a community of inquiry around the Winter Cohort. The core course team collected weekly reflections from learners both to improve the course experience week-by-week and to inform longer-term, larger-scope course development. The larger research team collected additional data, including learner demographics, interviews, and activity logs, with the goal of answering several questions about teaching and learning articulated by the team. A secondary goal was to develop conference papers and presentations in order to share their learning with others. The research team hopes this work serves as a use case for thoughtful and effective data collection and use within innovative educational programs.

Data collection to inform minor weekly changes

During the class sessions, we strove to create a culture of openness to feedback on the course.

We frequently held informal focus groups with students, soliciting opinions on topics such as the pace of work, study materials, enrichment opportunities, and office hours. Our learners also completed weekly surveys in which they were asked to provide both ‘warm’ and ‘cool’ feedback. Reviewing these surveys weekly allowed us to introduce changes or additions on a week-by-week basis. For instance, learners expressed interest in receiving feedback from other teams. In response, we designed a lesson where students were able to do precisely that: we invited campus-based and online learners to take part in a live session in which teams exchanged work artifacts and worked through a feedback protocol. After the session, learners expressed appreciation that we had listened to, and acted upon, their input in real time.

Data collection to respond to driving research questions

In addition to these data sources, our research team intentionally recorded other aspects of the cohort experience. This took the form of learner and course staff interviews, pre- and post-course surveys, video and audio recordings, work artifacts, edX analytics, and more. Based on themes that emerged in our early analysis, we are now using these data sources to investigate social interaction, learner diversity, the effects of curating an online course, and ecologies of resources, drawing on Luckin’s (2010) framework. We have assembled small groups of researchers around each topic and have submitted a proposal to present our findings at the 2019 American Educational Research Association (AERA) Conference. In preparing this proposal, our fundamental aim is to share what we have learned with others setting out to do similar work.

Recommendations that stem from findings

Our attention to collecting and analyzing a rich body of data has helped us evaluate our course design and content. We believe many of the recommendations that stem from our findings will be useful for others designing and facilitating similar online and blended courses. Our recommendations include:

  • Developing a “Week Zero” or tutorial week aimed at acquainting learners with the platform and with course norms;
  • Highlighting the presence of course leaders and active course peers, which can be vital to learner persistence in an online learning setting;
  • Creating a simple course structure, and emphasizing important information in order to ensure that guidance is clear; and
  • Ensuring all learners feel represented and included in the course by providing additional examples and case studies from outside of U.S. or mainstream contexts.

Our community of inquiry centered on understanding what it means to support online and on-campus learners who are interacting together in an online environment. We also learned a great deal about what it means to form and support a diverse group of researchers focused on a shared phenomenon of interest – a practice we hope to adopt in future investigations.

Read these other blog posts from the 2018 Winter Cohort of the University of Michigan’s Leading Educational Innovation and Improvement MicroMasters program:


Connect with the School of Education MicroMasters team @UMLeadEdHub

References

Luckin, R. (2010). Re-designing learning contexts: Technology-rich, learner-centred ecologies. New York: Routledge.