Join us on Monday, February 4, from 12:00 p.m. to 1:30 p.m. in the Hatcher Gallery (1st floor) of the Hatcher Graduate Library (913 S. University) for AIM Analytics, where members of the U-M community will share learning analytics projects they will be presenting at the Learning Analytics & Knowledge Conference (LAK), March 4–8, 2019.
AIM Analytics is a biweekly seminar series for researchers across U-M who are interested in learning analytics. Learning analytics is a multidisciplinary and interdisciplinary field that brings together researchers from education, the learning sciences, computational science and statistics, and all discipline-specific forms of educational inquiry.
To register for this event, please RSVP HERE. Lunch will be provided.
Presentations include:
Title: It’s My Data! Tensions Among Stakeholders of a Learning Analytics Dashboard
Abstract: Early warning dashboards in higher education analyze student data to enable early identification of underperforming students, allowing timely interventions by faculty and staff. To understand perceptions regarding the ethics and impact of such learning analytics applications, we conducted a multi-stakeholder analysis of an early-warning dashboard deployed at the University of Michigan through semi-structured interviews with the system’s developers, academic advisors (the primary users), and students. We identify multiple tensions among and within the stakeholder groups, especially with regard to awareness, understanding, access, and use of the system. Furthermore, ambiguity in data provenance and data quality results in differing levels of reliance on, and concerns about, the system among academic advisors and students. While students see the system’s benefits, they argue for more involvement, control, and informed consent regarding the use of student data. We discuss our findings’ implications for the ethical design and deployment of learning analytics applications in higher education.
Title: Beyond A/B Testing: Sequential Randomization for Developing Interventions in Scaled Digital Learning Environments
Abstract: Randomized experiments ensure robust causal inference, which is critical to effective learning analytics research and practice. However, traditional randomized experiments, like A/B tests, are limiting in large-scale digital learning environments. While traditional experiments can accurately compare two treatment options, they are less able to inform how to adapt interventions to continually meet learners’ diverse needs. In this work, we introduce a trial design for developing adaptive interventions in scaled digital learning environments: the sequential randomized trial (SRT). With the goal of improving learner experience and developing interventions that benefit all learners at all times, SRTs inform how to sequence, time, and personalize interventions. In this paper, we provide an overview of SRTs and illustrate the advantages they hold compared to traditional experiments. We describe a novel SRT run in a large-scale data science MOOC. The trial results contextualize how learner engagement can be addressed through inclusive, culturally targeted reminder emails. We also provide practical advice for researchers who aim to run their own SRTs to develop adaptive interventions in scaled digital learning environments.
Title: Exploring Learner Engagement Patterns in Teach-Outs
Abstract: MOOCs have developed into multiple learning design models with a wide range of objectives. Teach-Outs are one such example, aiming to drive meaningful discussions around topics of pressing social urgency without the use of formal assessments. Given this approach, it is crucial to evaluate learners’ engagement in the discussion forum to understand their experiences. This paper presents a pilot study that applied unsupervised natural language processing techniques to understand what and how students engage in dialogue in a Teach-Out. We used topic modeling to discover the emerging topics in the discussion forums and evaluated the on-topicness of the discussions (i.e., the degree to which discussions were relevant to the Teach-Out content). We also applied content analysis to investigate the sentiments associated with the discussions. We have taken a step toward extracting structure from students’ discussions to understand the learning behaviors that happen in the discussion forum. This is the first study to analyze discussion forums in a Teach-Out.