Student Privacy and Autonomy in the Age of Digital Innovation

Mika LaVaque-Manty, Arthur F. Thurnau Associate Professor of Political Science

This year’s Enriching Scholarship keynote reminded us that new digital tools will be sources of rich data about our students: what they do and how they learn.

If you are like me, you think, “Cool!” But it’s also possible you think, “Holy NSA, Batman! Creepy!” For many, phrases like “rich data profile” evoke images of a manipulative Big Brother. While taking the unease seriously, I will suggest why such data is a valuable component of educational innovation.

We live in a world of datapolitik, as the political theorist Davide Panagia has called it. Our digital footprints are increasingly part of who we are, whether we like it or not. The thing to do is to develop responsible ways of living in that world.

I write as a political theorist who has found himself very invested in learning technologies. I have revised my own teaching, largely because of my research interests in autonomy. My courses try to foster autonomy, getting students to take charge of their own learning.

Data plays an important part. A regular course I teach has 300 students. I eventually get to know most of those students, but I want to know something about them before we begin. To have any meaningful learning objectives, I need to know who I am teaching. So I survey the students for their interests, background, preferences, and whatever else they want to tell me. I use the Academic Reporting Toolkit to get a sense of what other courses they are taking and have taken. During the semester, I keep close track of their choices and performance. This gives me a sense of where the course is going, both as a whole and for different individuals. GradeCraft, a “gameful” gradebook tool, allows me and my GSIs to see quickly where students stand, who might be falling behind, and why. I wish I had even more data, and I’m hoping, for example, that the E2Coach used in STEM courses is soon available more widely.

The Grade Predictor tool lets students plan out their coursework to get the grade they want.

Students are able to see where they stand in relation to their classmates within GradeCraft.

“OK,” you might say, “that individualizes your 300 students, but what about their autonomy?” Well, the students get much of this data, too. They know how they are doing, where they stand, and, most importantly, they get to play with the data to decide how to move forward in my course: what assignments to tackle and how to weight them. This fosters their autonomy as learners.
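The kind of planning this enables can be sketched in a few lines. This is purely illustrative: the assignment names, point values, and grading scheme below are hypothetical, not GradeCraft’s actual data model.

```python
# Illustrative sketch of "playing with the data" in a gameful gradebook:
# a student chooses which assignments to attempt, predicts a score for
# each, and projects the resulting course total. All names and point
# values here are hypothetical.
planned = [
    # (assignment, points possible, student's predicted score)
    ("reading quizzes", 100, 90),
    ("short essay", 200, 170),
    ("group project", 300, 255),
    ("final blog post", 150, 0),  # this student opts out of this one
]

course_total = 750  # hypothetical points needed for the top grade

earned = sum(score for _, _, score in planned)
print(f"Projected: {earned}/{course_total} = {earned / course_total:.0%}")
```

Changing which assignments to tackle, or how much each counts, immediately changes the projection, and that feedback loop is what lets students chart their own path through the course.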

Generally, the goal of many DEI initiatives is to share much of the information that we have been collecting with more of our community — students, faculty, staff — for all of us to be better informed in our actions. But we also know that not all data is automatically informative, and important practical and ethical questions about who should know what, when, and how remain. If I know, for example, that a student might be at risk in my course, I might be prepared to help him or her — but some attempts to “help” might do the very opposite. And although Michigan students generally appreciate it when we know them as individuals, some like the anonymity our size makes possible. Starting anew with this professor, without the baggage of an unsuccessful pre-med experience, say, may be the thing that saves someone’s college career. That can also be an autonomous choice, the kind we want to foster. In our use of data we must remember those desires, too.

But since total anonymity is no longer possible, more information — thoughtfully used — is better than less. After all, the sort of “profiling” policies many of us dislike exist because of poor information.

Integrating Curricular & Co-Curricular Learning through Digital Badges

Steven Lonn, Ph.D., Assistant Director, Assessment and Evaluation at the Office of Digital Education & Innovation

On any given day at the University of Michigan, hundreds of guest lectures, student group events, workshops, tutorials, and exhibits present a tremendous variety of learning opportunities available only in this rich residential setting. Unfortunately, U-M does not yet have a comprehensive way to track, recognize, or validate these learning opportunities, nor a framework to help integrate them into students’ curricular pathways.

Digital badges may be one way to facilitate this process. The digital nature of these certifications allows rich metadata about the specific setting, requirements, and the learners’ own evidence to travel with the badge. This provides students with an avenue for personalized and reflective learning at specific and contextualized moments over time. Built on the Mozilla Open Badges framework, digital badges allow learners to legitimize their learning and growth wherever it occurs and to connect their skills and competencies with their academic and professional careers in an open, transportable, and validated format.
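To make “rich metadata travels with the badge” concrete, here is roughly what a hosted Open Badges assertion looks like. Field names follow the Open Badges 1.x assertion format; the identifiers, URLs, and evidence link are invented for illustration.

```python
import json

# A hosted Open Badges assertion: the metadata that travels with a badge.
# Field names follow the Open Badges 1.x specification; all values and
# URLs here are hypothetical.
assertion = {
    "uid": "mblem-2015-0042",
    "recipient": {
        "type": "email",
        "hashed": False,
        "identity": "student@umich.edu",
    },
    # URL of the BadgeClass describing the setting and requirements
    "badge": "https://badges.example.edu/classes/3d-lab-fabrication.json",
    # How a consumer (e.g. an employer) verifies the badge is genuine
    "verify": {
        "type": "hosted",
        "url": "https://badges.example.edu/assertions/mblem-2015-0042.json",
    },
    "issuedOn": "2015-04-20",
    # The learner's own evidence travels with the badge
    "evidence": "https://portfolio.example.edu/student/3d-lab-project",
}

print(json.dumps(assertion, indent=2))
```

Because the assertion is plain, openly specified JSON, any badge backpack or profile site can display and verify it, which is what makes the format open, transportable, and validated.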

At U-M, an exploration into using digital badges to recognize co-curricular learning began with a Teaching & Learning in a Third Century Quick Wins Grant in Winter 2014 investigating how undergraduate engineers could use digital badges to integrate their out-of-class activities with their professional identities. In Fall 2014, the UM 3D Lab and the School of Information Entrepreneurship Program launched pilots of their own. In a student-generated naming competition, the participating students voted to name U-M digital badges “Mblem” — in many ways, these badges are emblematic of the unique learning opportunities available at a premier residential research institution. To date, nearly 400 Mblem digital badges have been issued in recognition of students’ varied experiences, skills, and competencies.

Digital badges can be used not only to facilitate student learning at U-M, but also to recognize faculty and staff knowledge. One experiment in this area is a new series of Canvas Mblems offered by the Teaching and Technology Collaborative to promote growing expertise with this new learning management system. More than 375 Canvas Mblems were awarded in May 2015 as part of this pilot initiative.

The Learning, Education & Design (LED) Lab is investigating how digital badges can mediate formal and informal learning in higher education and how the metadata contained within the badges can be utilized to better understand learners’ motivations, identity, and integration of disparate educational contexts. To build on the initial success of the pilot programs toward a rich ecosystem of digital badge use at Michigan, several events over the course of the Winter 2015 term were designed to engage the U-M community around the concept of digital badges, including two very popular LED / CRLT workshops attended by over 70 members of the Michigan community. DEI is committed to promoting personalized and lifelong learning opportunities through creative uses of technology that help redefine the residential education experience — a vibrant and integrated curricular and co-curricular ecosystem of Mblem digital badges could be an important component of that new vision.


Please visit the Mblem website for more information, or nominate your unit or program for a digital badges pilot.

Launching a New, Large, Web-blended Course: Reusing and Curating Digital Content

Anne Sales, PhD, RN, Professor, Systems Leadership and Effectiveness Science,
School of Nursing

Starting a new course from scratch is never easy. It’s particularly difficult when the course has content that’s never been taught in the school before, needs to be taught to a large class, and must be taught in a web-blended format, in which class participants only meet face to face once a month for 3 hours.

The web-blended format of the course makes it essential to take a curating approach to content. Much of the material has to be assimilated by students without a great deal of contact with the instructor. Being able to use existing content, rather than creating content de novo, was very important.

I decided to use existing case studies (from Harvard Business Publishing) as a core component of the class, and to have students work in groups to support learning. Systems and models are core concepts in this course, which focuses on optimal systems and models for healthcare delivery. While there is a lot of information about systems in both online and print media, finding compelling material that helps learners build, from the ground up, the ability to recognize systems in the world around them is not always easy.

However, I was aware of the various Massive Open Online Course (MOOC) offerings from the University of Michigan (as well as many other universities) on the Coursera platform, and had already checked out a number of courses focusing on systems thinking and analysis. One of these, the first UM MOOC offering, is Model Thinking, offered by Scott E. Page, Leonid Hurwicz Collegiate Professor of Complex Systems, Political Science, and Economics at the University of Michigan.

Scott’s course opens with several introductory videos that explain thinking in terms of models and systems, and provides an overview of what can be quite complicated and complex issues. While the students in my class don’t need to know technical details of systems analysis, they do need to recognize the ubiquity of systems, and the relationship between models and systems in describing and analyzing what we experience.

Being able to reuse digital content created for one purpose in a different course with an entirely different group of learners than those for whom it was initially intended, to support learning objectives in a very different context, has been invaluable. There is a lot of content out there in all kinds of forms, increasingly in the digital and online worlds—but discerning the good from the mediocre or, worse, the bad (as in misleading or outright incorrect)—is critical for learners, and an important function of teaching or supporting learning.

Issues this raises: how did I know that this material existed? How did I decide that it was good rather than mediocre, and how did I judge that it could be used in a context other than its original MOOC? The answer to the first question is largely serendipity—I happened to have served on the UM committee that approves MOOCs, so I have more awareness of what’s out there than many faculty. I also happened to be interested in this area personally, so I had initially signed up for Scott’s MOOC (twice, actually!), although I was one of the vast number of enrollees who never complete a course (twice). So I’d had a chance to judge it for myself, and found the content interesting, engaging, and relevant. That made me confident that it was a good idea to use it. But we need better ways to let everyone know what’s out there, and give people some idea of the quality of offerings. This will take some creative thinking and input—one of the great things about living in the digital age!

Learning From Experience

Timothy McKay, Arthur F. Thurnau Professor of Physics, Director of the LSA Honors Program & DIG Principal Investigator

Performance prediction is all the rage in higher education. Testing agencies, researchers, and software vendors all promise to “predict” student outcomes. Some find this prospect alarming, imagining that predictions might determine a student’s fate, or somehow restrict their potential. It’s all a little less scary if we take a step back and think about what it is we’re really doing – learning from experience, understanding the past. Educators don’t learn about the past to predict the future; we learn about the past to change the future.

There are two ways to predict the future. One way is with theory. Given a model for how the world works, you can take what you know about it now and deduce what will happen in the future. Absent defensible theory, experience is our only guide. Assuming that what happened in the past is likely to recur in the future is simple induction, and it’s usually your best bet. Wonderful research[1] by people like Philip Tetlock and Daniel Kahneman[2] shows that even ‘expert’ human predictors are less accurate than simple extrapolation of this kind.

To predict the grade a student, call her Amy, might receive in a class, we examine how students like her have done in the past. What sorts of differences matter? Again, only experience can help us decide. Imagine we do this really well, identifying many students from the past who are just like Amy across all the characteristics we know about. Other things being equal, records of what happened to students like Amy in the past would be our very best predictor of Amy’s performance in the future. But of course we don’t have to let other things be equal.
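This kind of induction can be sketched very simply: find past students who match Amy on what we know about her, and average their outcomes. The records and the matching characteristics below are hypothetical, chosen only to illustrate the idea.

```python
# "Prediction" as learning from experience: average the grades of past
# students who are like Amy on the characteristics we know about.
# The records and matching criteria are hypothetical.
past_students = [
    {"major": "biology", "prior_gpa": 3.4, "grade": 3.3},
    {"major": "biology", "prior_gpa": 3.5, "grade": 3.6},
    {"major": "history", "prior_gpa": 3.5, "grade": 3.0},
    {"major": "biology", "prior_gpa": 3.6, "grade": 3.5},
]

amy = {"major": "biology", "prior_gpa": 3.5}

def predict_grade(student, records, gpa_tolerance=0.2):
    """Simple induction: mean grade of sufficiently similar past students."""
    matches = [
        r["grade"]
        for r in records
        if r["major"] == student["major"]
        and abs(r["prior_gpa"] - student["prior_gpa"]) <= gpa_tolerance
    ]
    return sum(matches) / len(matches) if matches else None

print(predict_grade(amy, past_students))
```

The number this produces is not destiny: change what students like Amy experience, or what Amy herself does, and the “predictive model” breaks, which is exactly the point of the paragraphs that follow.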

Learning what’s happened to students like Amy in the past might inspire us to change what we do. Discovering that students who take chemistry lab and lecture concurrently have done better in the past[3] might inspire us to require all students to do this in the future – changing their outcomes – in effect breaking our ‘predictive model’. Faculty and staff at the University of Michigan have always used experience to inform their teaching. Learning analytics is helping us learn from the experience of all students, instead of just the few we know well.

Perhaps more important, sharing information about the past with Amy might inspire her to change what she does in the class. Learning that successful students regularly spent 12 hours a week on homework, taught their study group members what they’d learned, and began preparing for each exam ten days in advance might inspire Amy to change her own approach to the class. UM’s Digital Innovation Greenhouse aims to put this kind of information in her hands. When Amy learns from the past, she can change her future.

So let’s stop talking about predicting the future. What we’re really doing is learning from experience, because that’s the only reliable way we can hope to change the future.

[1] Tetlock, Philip. Expert political judgment: How good is it? How can we know? Princeton University Press, 2005.

[2] Kahneman, Daniel. Thinking, fast and slow. Macmillan, 2011.

[3] Matz, Rebecca L., et al. “Concurrent enrollment in lecture and laboratory enhances student performance and retention.” Journal of Research in Science Teaching 49.5 (2012): 659-682.



Introducing DEI’s New Blog

James DeVaney, Associate Vice Provost for Digital Education & Innovation

A little more than a year ago we created the Office of Digital Education & Innovation to investigate new approaches, find pathways to scale, and institutionalize what we learn from targeted experimentation. The University of Michigan is a trailblazing institution with a long history of innovation in teaching and learning. It is a community bound together by a shared commitment across disciplines to discovery and to developing leaders and citizens who will challenge the present and enrich the future.

It is also a community where one of the most commonly heard statements is, “I didn’t realize we were doing that.”

So why are we creating a blog and why now? From the beginning, DEI has sought out answers to the following question: what is only possible at a great public residential research university? We decided early on that part of being a public institution is making a commitment to share our journey as we go. And we already have much to share.

Our Massive Open Online Courses (MOOCs), created at the Digital Education & Innovation Lab (DEIL), have now reached more than 3M lifelong learners. More than 21K students are impacted by digital engagement tools harvested in the newly created Digital Innovation Greenhouse (DIG). The Learning, Education & Design (LED) Lab is conducting leading research in learning analytics, alternative assessment, micro-credentials, and large-scale online education. In a relatively short amount of time, we have partnered with 15 of our 19 colleges and schools to launch more than 40 initiatives.

DEI partners with innovative faculty and staff across campus to challenge assumptions about teaching and learning. Through targeted experiments, we aim to unlock and enable personalized, engaged, and lifelong learning through the creative use of technology. As we develop these partnerships, we are consistently asked to strengthen knowledge sharing opportunities. We hope this blog provides a place to showcase innovation taking place within DEI and across U-M.

We’ve had a very exciting academic year in DEI working closely with innovators across campus. We’re now poised for an exhilarating summer. We look forward to growing and enhancing our portfolio of digital learning initiatives, launching three major initiatives in the newly created Digital Innovation Greenhouse, issuing multiple calls for proposals and ideas, and further establishing U-M’s leadership at the intersection of digital learning and learning analytics.

We hope you’ll visit often. We plan to showcase DEI initiatives, discuss trends and issues in digital learning, and help increase faculty awareness and participation. We have asked faculty and staff innovators to contribute to this blog through guest posts in order to share their unique perspectives as they innovate within and beyond the U-M community.

In the coming weeks, we will hear from innovators who are creating MOOCs, remixing and integrating content within blended courses, developing digital engagement tools, and exploring digital badges and alternative credentials. Please join us for a discussion about the future of higher education.