Previewing the 2019 Gameful Learning Summer Institute

Evan Straub, Learning Experience Designer – Gameful/Connect
@estraub

When I talk to people about gameful learning, I frequently hear skepticism about the idea of turning school into a “game” or into something “fun.” To understand what makes gameful learning so successful, we have to unpack why we consider games “fun.” Although we frequently associate a positive emotional reaction with playing a game, some games are more entertaining than others. Therefore, we have to reflect on when and why playing a game becomes enjoyable.

Ask a competitive athlete in the middle of a tough competition whether they are “having fun” at that moment. Most likely, they are so engaged with the game play, the strategy, and optimizing their physical performance within the competition that “enjoyment” is probably not the first idea that comes to mind. Instead, we talk about the experience of “challenge.” While challenge in and of itself is not generally considered pleasant (at least in the moment), it can make success extremely satisfying.

Evan Straub, Learning Experience Designer, leading the GradeCraft workshop at the 2018 Gameful Learning Summer Institute

For example, consider a game based primarily on luck versus one based on skill: compare what it takes to win the card game “war,” where the winner is simply whoever happens to hold the higher card at that moment, to a game like chess, which involves deep strategy to defeat an opponent. Winning at chess will most likely be the more satisfying experience. Often, you will even hear a defeated player express gratitude for the experience of playing a well-matched opponent; the challenge was the reward, regardless of the outcome.

Gameful learning is trying to capture that same experience. Learning should be challenging. Learning science research consistently suggests that we learn more when we are actively engaged in the topic and when the learning is appropriately challenging. Similar to how games encourage differentiated paths, we know that every instructor is a little different as well. Therefore, at the 2019 Gameful Learning Summer Institute, we are so pleased to highlight educators who are willing to share what they have learned in their own classroom.

Attendees of the GradeCraft workshop at the 2018 Gameful Learning Summer Institute.

One theme that has emerged from our presenters is the use of role-play in the classroom. Naomi Norman, Associate Vice President for Instruction at the University of Georgia, and T. Chase Hagood, Director of the Division of Academic Enhancement at the University of Georgia, are returning to talk about their use of “Reacting to the Past,” which puts learners in control of a situation, guided by historic texts. A similar session last year was incredibly popular, so we are thrilled they are returning! Similarly, Nick Noel, Instructional Designer and Media Producer at MSU Information Technology Services, is exploring how educators are similar to “game-masters” and how tabletop role-playing games could be adapted for educational settings.

We are also so excited to have Angie Romines, Senior Lecturer of English at The Ohio State University, presenting on how to create and use escape rooms as a collaborative pedagogy. Hopefully, an escape room is never a truly authentic task; however, Angie notes that success “can only happen when participants collaborate, bounce ideas off of each other, and get comfortable with failure.”

Erin Baumann, Associate Director of Professional Pedagogy at the Harvard Kennedy School gives the keynote presentation at the 2018 Gameful Learning Summer Institute.

Finally, I would be remiss if I didn’t mention our exciting keynote speaker, William (Bill) Watson, Associate Professor of Learning Design and Technology at Purdue University. His work with the Serious Gaming Center there explores how games can create engaging and innovative educational opportunities at all levels of instruction.

Of course, there are many more opportunities for new ideas at the 2019 Gameful Learning Summer Institute. We hope that you will join us, feel challenged to take some ideas back to integrate into your own teaching, and have fun.

 

 

Register for the 2019 Gameful Learning Summer Institute at the event registration page by Friday, July 5!

Reimagining Assessment in Undergraduate Education: Progress, Resilience, Mastery

Barry Fishman, Faculty Innovator-In-Residence, Arthur F. Thurnau Professor of Information and Education
@barryfishman

Cynthia Finelli, Director, Engineering Education Research Program; Associate Professor, EECS and Education
@cindyfinelli

Melissa Gross, Associate Professor of Movement Science, School of Kinesiology, and Art and Design, Penny W. Stamps School of Art and Design
@MMelissaGross

Larry Gruppen, Director, Master of Health Professions Education program, Professor, Department of Learning Health Sciences

Leslie Rupert Herrenkohl, Professor, Educational Studies, School of Education

Tracy de Peralta, Director of Curriculum and Assessment Integration, Clinical Associate Professor, Dental School
@tracydeperalta4

Margaret Wooldridge, Director, Dow Sustainability Fellows Program, Arthur F. Thurnau Professor, Departments of Mechanical Engineering and Aerospace Engineering

 

Given the resources of a major public research university, how would you design undergraduate education if you were starting with a blank page? That is the question we explore in The Big Idea project, and our answer is that undergraduate education should reflect the true strengths of the institution. In the case of the University of Michigan, that is research, discovery, and real-world impact. The Big Idea undergraduate program is problem-focused, with students directly engaged in research and scholarship as they work towards ambitious learning goals. Learners who meet these goals will be prepared to address the world’s pressing—and often ambiguous—problems. The program is designed to emphasize progress towards mastery — thus, it does not use grades, courses, or credit hours to mark student progress or readiness-to-graduate. In this post, we first discuss some critical features of the current higher education landscape and explain why we think they are ripe for re-examination, and then we outline how assessment for and of learning will work in the Big Idea program.

 

How would you design undergraduate education if you were starting with a blank page?

 

How Business-as-Usual in Higher Education Works… Just Not for Learning

The traditional mechanisms for measuring progress and readiness-to-graduate in higher education are grades, grade-point averages (GPAs), and credit hours. If you mention these terms, almost anyone involved with higher education will know what you are talking about. These are combined with course sequences that define both (a) the general education or “lower division” phase of college, where students pursue the broad distribution requirements usually associated with the liberal arts, and (b) the “upper division” years of college, where students pursue a focused major. These mechanisms have served higher education for more than a century, and they offer a relatively straightforward way to direct student learning pathways and to rank the relative performance of learners within those programs. However, in providing structure, these mechanisms also limit learners and curtail the potential of educational programs in key ways. In this section, we explore these limitations to explain why we are taking The Big Idea in a new direction.

 

Grades and GPAs: What Do They Measure?

Assigning grades as a way to record or report students’ performance feels like a naturally occurring practice in formal education. But the use of grades, like many components of modern education practice, did not become widespread until the early-to-mid 1900s (Brookhart et al., 2016). Grades were part of a movement to make education more “scientific.” These efforts were led by Edward Thorndike, a prominent behavioral scientist who was instrumental in the spread of scientific management and measurement in education. The work of Thorndike and others led directly to the standardized testing movement that defines much of K-12 education and college admissions in the United States today (Rose, 2016). The history of 20th century education reform has been described as a struggle between the constructivist and project-based ideas of John Dewey and the instructivist and standardized approaches of Thorndike. The consensus view is that Thorndike won and Dewey lost (Lagemann, 2000). As a result, the current structure of undergraduate education is more about the sorting of students for purposes that come later—such as college, graduate school admissions, or employment—than it is about supporting the learning of the students involved.

The use of grades actually removes information about student learning from the academic record-keeping process. Even when a letter grade is based upon a strict set of criteria about what a student has learned, the letter itself communicates little to the world beyond the classroom where it was issued. If a student has earned an “A,” we know that they did well in the course, learning (hopefully) the majority of the material taught. But what does a “B” mean? We generally assume that it means the student has learned roughly 85-90% of the material, but what material was not learned? If the course is part of a sequence, later instructors are given little information about the understanding or capability of their incoming students. “Grading on the curve,” meant to normalize outcomes across students, further masks information about student learning. A grading curve may communicate comparative performance, but without reference to the goals of the course.

 

The use of grades actually removes information from the academic record-keeping process.

 

The problem becomes even more pointed when trying to gauge a student’s overall learning in college. Currently, the GPA is the primary tool for reporting overall accomplishment. High GPAs are seen as signs of academic accomplishment, and they are rewarded with honors and other recognitions. Employers and graduate schools often use GPA as a sorting mechanism. This is problematic for several reasons. One problem is that GPAs can be “gamed,” with students seeking to take courses solely for the purpose of boosting their overall GPA. Another problem is that a great deal of the variance in a student’s final GPA can result from their first-year grades, which are earned as students adjust to the new environment of college. This places first-generation students or others facing transitional challenges in college at a long-term disadvantage that does not reflect their actual capability upon graduation. Some institutions, such as Swarthmore College, have declared that all classes in a student’s first semester will be recorded only as “Credit/No Credit” to reduce anxiety and encourage risk-taking during this important transitional period.
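To make the arithmetic concrete (with hypothetical numbers), consider a student who earns a 2.8 GPA over a 30-credit first year while adjusting to college, then performs at a 3.8 level over the remaining 90 credits. Because the GPA is a credit-weighted average, the early deficit is locked in:

\[
\mathrm{GPA} = \frac{(30)(2.8) + (90)(3.8)}{30 + 90} = \frac{426}{120} = 3.55
\]

A classmate who performed at the 3.8 level for all four years graduates a quarter-point higher, even though the two students were indistinguishable for their final three years; the average permanently encodes the transition period.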

Another reason that GPAs are problematic involves the growing concern that grades—or more accurately, a focus on grades as indicators of self-worth or future opportunity—are a contributor to the growing mental health crisis on college campuses. Many students, especially at selective universities, have never had an experience of academic “failure” in the form of a low grade or standardized test score. Yet learning scientists have long known that “failure” is a key component of successful learning. If a learner only ever gets the right answer or only ever performs at a high level, there is a good chance that they weren’t truly being challenged in the first place. Furthermore, knowing the “right” solution to a problem or challenge is often not as revelatory as working to understand why a solution is “wrong” and how to repair it. Deep learning is not about knowing answers as much as it is about understanding problem and solution spaces. The typical approach to grading in higher education imposes penalties on learners for taking on challenges and not succeeding on the first try, instead of encouraging continuous progress towards mastering the material. Think, for instance, of courses where the entire grade is based on a single exam or a small set of exams.

Healthier and more productive grading systems (such as the gameful learning approach pioneered at U-M) emphasize progress and “safe” or “productive” failure, encouraging students to work beyond their comfort zones with confidence that they are not being judged on each result. In the Big Idea program, we choose mastery-based assessment over traditional grading systems to support continuous effort and progress, and to reflect true student accomplishment rather than averages or comparisons.

 

Credit Hours: The Time Clock of Education

The credit hour, sometimes referred to as a “Carnegie Unit,” was created in the early twentieth century by the Carnegie Foundation for the Advancement of Teaching as part of their effort to create a pension system for college professors. To qualify for the Carnegie pension system, institutions were required to adopt a range of standards, including the newly-introduced Carnegie Unit (Silva, White, & Toch, 2015). From today’s vantage point, those who originally conceived of the credit hour might be surprised to see how its use has expanded to become the “time clock” for virtually all aspects of educational programs. The original metric was useful for determining the amount of time students spent in contact with instruction, but with expanding options for co-curricular, online, and other forms of learning, the question of what should “count” as learning or instruction has become muddied. Furthermore, the amount of time devoted to instruction does not necessarily translate to learning. As we’ve heard at least one critic observe, “If we’re measuring seat time, that’s the wrong end of the student.”

 

The question of what should “count” as learning or instruction has become muddied in higher education.

 

The Big Idea is based not on time, but on student accomplishment. We expect students to move at different paces and to approach their learning in a sequence that reflects personalized pathways. Therefore, we are designing the program in a way that we believe will take students roughly the same amount of time to complete as a “traditional” bachelor’s degree, but that does not use time to regulate progress.

 

General Education, Majors, and Degrees

Lower-division undergraduates take a “distribution” of courses intended to expose them to different ways of thinking and expression, in order to develop intellectual breadth. Upper-division students engage in a series of courses defined by their major, designed to develop intellectual depth. Readiness for graduation is measured by earning at least a passing grade in all of the courses specified by the selected program of study and by accruing the required number of credit hours. This system is designed to provide both guidance to students in selecting courses and managerial efficiency in terms of university planning. Unfortunately, it also does little to promote learning and can suppress individual agency.

Within the system of distribution requirements and majors, it is unusual for a degree-granting program to employ program-level assessment of individual student learning as the qualification for graduation. Normally, assessment is conducted at the course level, with students being assigned grades based upon their performance within courses, and with those grades being compiled and reported as an average GPA across courses. If a student takes a predefined set of courses and maintains an acceptable GPA, that student may graduate with a degree in that area. In specialized cases, undergraduates may complete a capstone or summative project, such as a thesis. But for the majority of programs these are optional exercises, and the assessment criteria for summative projects are not usually aligned with any program-wide criteria for learning. In the Big Idea project, we have come to refer to business-as-usual as “faith-based” education, because one needs to have faith that students are learning something, even if the program’s assessment is not designed to record or report that learning.

Charles Muscatine, in his book “Fixing College Education,” argues that the current system of majors exists in no small part to create “peace among the departments” (Muscatine, 2009, p. 39), ensuring that students continue to take courses in different areas of the university, even if the divisions that are created are not grounded in reality. As Muscatine pointed out, there has never been research conducted on whether the division of learning into sectors like “humanities,” “science,” and “social science” has real benefit to learners, or even reflects a valid distinction among different ways of thinking. He recommends distinguishing between disciplines in terms of the methods employed for sensemaking—“logical, empirical, statistical, historical, critical (analytic, evaluative), creative” (Muscatine, 2009, p. 40)—and making sure students have practice in each. This is what might once have been described as a truly liberal education, meant to liberate the mind and broaden our ability to think in flexible ways.

 

Assessment in the Big Idea: Business-as-Unusual

As we introduce our plans for assessment in the Big Idea, we note that, as with other elements of our design, the ideas presented here are not necessarily new; similar approaches have existed in different educational contexts in the past, and similar ideas continue to be employed in specialized applications today. However, the use of these practices at the core of a degree-granting undergraduate program within the modern research university is unusual.

In contrast to the checklist approach to graduating with a major described in the previous section, the Big Idea program starts with an ambitious set of program-wide learning goals and has only one criterion for graduation: that a student has met all the goals at an acceptable level, as evidenced by his/her “mastery transcript,” a document designed to record accomplishment, provide evidence of that accomplishment, and allow tailoring to meet different student, program, employer, or other needs. (Our thinking about this kind of documentation is inspired by the work of the Mastery Transcript Consortium.) There are no required courses, and no minimum GPA. In fact, we do not intend to record or report grades for students in the Big Idea program. Nor is progress measured by the number of credit hours completed, which is a metric of time rather than learning. What we are interested in is the progress a student makes towards mastery of the learning goals.

 

In the Big Idea program, there will be no required courses, and no grades… what counts is the progress a student makes towards mastery of the learning goals.

 

Our approach to assessment is designed to promote personalization of learning pathways, agency, and self-authorship. It is designed to promote resilience and personal growth. It is designed to support learners from diverse backgrounds. Where most current assessment paradigms are geared towards ranking and sorting students, the Big Idea assessment model is designed for transparency with respect to learning.

 

Assessment for Learning

There are two primary types of assessment: formative and summative. Summative assessment is a report of student knowledge, skill, or accomplishment at the end of some defined period or event, such as a course or high school. Examples of summative assessment include final exams, which are typically used to measure end-of-course understanding and serve as the end of a student’s relationship with a course and instructor, and the SAT or ACT tests, which are meant to “sum up” a student’s academic potential at the end of secondary school and provide a measure of their readiness for post-secondary education. Formative assessment, on the other hand, is meant to inform learning, to give feedback on student progress, or to serve as a milestone towards a larger goal. Assessment in the Big Idea program is, by design, intended to be almost entirely formative.

Students in the Big Idea are expected to always be making progress towards the learning goals. Feedback on learning will come from many different places in the program — research supervisors, faculty across different learning experiences (including courses), and members of various communities where students conduct research and learning. The learning goals themselves are not meant to be a “final” report of a student’s potential or accomplishment; rather, they represent a certain level of attainment that can continue to be built upon throughout a learner’s life and career.

 

Students in the Big Idea are expected to always be making progress towards the learning goals.

 

Paths Towards Competence and Mastery

To emphasize the importance of progress and practice towards mastering the learning goals (no learning goal is a box to be checked), we describe paths towards mastery as encompassing several levels of proficiency: Awareness, Literacy, Competency, and Mastery. These terms are defined as follows in the Big Idea program:

  • Awareness

Students who achieve this level know that this goal, practice, or skill exists, and they understand how it fits within the larger field or profession.

  • Literacy

Students who achieve this level are in the middle of learning this goal, practice, or skill: they understand its dimensions, they can apply or demonstrate it in a basic manner, and they are able to learn more about it through additional work. Such a student could play a supporting role in a project that employs the skills or practices inherent in this goal, where someone else is leading, and would understand what the other person was doing.

  • Competency

Students who achieve this level are ready to begin professional work requiring this practice or skill. A competent student can perform a task requiring this skill on their own, knows how to ask well-formed questions in the area or provide sound answers to others’ questions, and could advance their understanding through self-study.

  • Mastery

Students who achieve this level are ready to employ the practices or skills embodied in this learning goal in the real world. Such students could supervise, guide, or teach others with respect to this goal. We note that “Mastery” for the purposes of graduation from the Big Idea program is not necessarily lifelong mastery; our rubrics will emphasize areas for ongoing growth even beyond our program goals.

Note that while we expect all learners in the program to achieve competency in all the learning goals, we expect each learner to achieve mastery in only a subset of the overall learning goals. Which goals a learner masters will depend on their particular focus and choices made during undergraduate study.
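As a minimal sketch of how these ordered levels and the graduation criterion might be encoded (the goal names, the subset rule, and the enum representation here are illustrative assumptions, not the program’s actual tooling):

```python
from enum import IntEnum

class Level(IntEnum):
    """Ordered proficiency levels: higher values subsume lower ones."""
    AWARENESS = 1
    LITERACY = 2
    COMPETENCY = 3
    MASTERY = 4

# Hypothetical goal names for illustration; the program's actual goals differ.
GOALS = ["statistical_analysis", "computation", "communication", "resilience"]

def ready_to_graduate(transcript, mastery_goals):
    """Graduation check: competency in every goal, mastery in a chosen subset."""
    all_competent = all(
        transcript.get(goal, Level.AWARENESS) >= Level.COMPETENCY for goal in GOALS
    )
    subset_mastered = bool(mastery_goals) and all(
        transcript.get(goal, Level.AWARENESS) >= Level.MASTERY for goal in mastery_goals
    )
    return all_competent and subset_mastered

# Example: competent everywhere, with mastery in two chosen goals.
transcript = {
    "statistical_analysis": Level.MASTERY,
    "computation": Level.MASTERY,
    "communication": Level.COMPETENCY,
    "resilience": Level.COMPETENCY,
}
print(ready_to_graduate(transcript, {"statistical_analysis", "computation"}))  # True
```

Because the levels are ordered, a record of Mastery automatically satisfies any Competency check, matching the idea that each level builds on the one below it.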

 

Getting Started (even before the “start”)

We expect that students in the Big Idea will arrive with some initial proficiency in many of the learning goals for the program, and we will invite students to present evidence of that learning. This would be a natural outgrowth of the kinds of experiences one might have had in secondary education (including extra-curricular experiences) that might lead a student to be interested in the Big Idea in the first place. We note, however, that traditional markers of student “accomplishment,” such as Advanced Placement scores, will not be considered as evidence of learning in and of themselves. We will invite students to present a case for their current level of learning that might include information about courses taken or tests passed, but that also requires demonstration of their actual knowledge or skills.

One of the first formalized activities of the Big Idea curriculum, to be conducted as part of a “Forum” experience (meant as a multi-year home base for students—similar to homeroom in secondary education—with access to more experienced peers and an advising faculty member), involves helping students become familiar with the learning goals, understand why the goals were chosen and how they will be assessed, and gain exposure to a range of examples (perhaps provided by the more advanced students in the program). We will also engage learners in self-assessment activities to allow them to calibrate their current level of proficiency in each of the learning goals.

 

Demonstrating Achievement

We will develop detailed rubrics for each learning goal, providing descriptions and examples at each level of learning towards each goal. The rubric is meant as a guideline for learning and for assessment, not an “answer key” or template. We expect broad individual variation in the way each learner expresses their current levels of learning, to reflect individualized interests, choices, and pathways.

Students will use an electronic portfolio as a tool to record their work and accomplishments, to share/present those accomplishments for evaluation and feedback, and (following graduation) to offer evidence of learning in the future. We intend the portfolio to be a “mastery transcript,” as it will serve as a replacement for traditional transcripts, among other uses. Portfolios have long been used as a means to encourage self-authorship by students, allowing them to shape the narrative of their accomplishments, and even customize that narrative for different audiences and purposes. Electronic portfolios also allow for the inclusion of many different forms of evidence in support of claims of learning, including links to assessment evidence, and they can be assembled in different ways for different needs and audiences. Understanding how to communicate one’s own abilities using the portfolio (and underlying data) will be an important component of the Big Idea program, related to the “Communication” learning goal.

 

We expect broad individual variation in the way each learner expresses their current levels of learning.

 

Digital badges, also known as micro-credentials, are one mechanism for tracking and reporting accomplishment that can be employed in an electronic portfolio. Digital badges have many useful affordances. They can be used to establish pathways towards complex learning, to record progress, and, when used as credentials, to signal accomplishment. Digital badges can also be used to expand assessment beyond our formal processes by encouraging students to make progress towards the learning goals in all areas of their life, whether part of the formal activities of the Big Idea program or elsewhere.

In addition to the portfolio-based mastery transcript, assessment in the Big Idea will include in-person interviews or performances. These sessions will be personalized to each student, allowing individual cases to demonstrate achievement in individual ways.

 

Formal (and Formative) Assessment in the Big Idea

Students in the Big Idea program are responsible for making an evidence-supported case for their progress towards or accomplishment of learning goals. There will be two levels of panel review for students to demonstrate achievement, both of which include review of the mastery transcript and in-person interviews or performances. As part of the overall learning process within the Big Idea, these two levels of review allow personalized assessment and feedback at a large scale with a manageable workload. The process is managed by the Forum/homeroom instructor, who (together with more advanced students in the Forum) can give advice to students about how to assemble their materials for review and when they are ready to start the process.

The first level involves review by a panel composed of more advanced, “arm’s-length” students, working under the supervision of a faculty member. This arrangement allows for frequent assessment of student progress at a large scale and is a key part of the learning process for student panelists, as they learn to give constructive feedback to more junior students with respect to each of the learning goals. The goals of this first-level review are both to mark progress and, more importantly, to provide feedback to the student. This review level is expected to be employed regularly for students at the awareness and literacy stages of proficiency. While the actual number of reviews might vary for each student according to their needs and pace, we would expect students to engage in this first-level review several times a semester with respect to different learning goals. We also anticipate that not all students will succeed in each review step (though we hope that feedback from Forum instructors and peers helps mitigate this). The Big Idea is designed to support progress towards meeting the learning goals, so we would not necessarily consider this a “failure,” but rather progress towards eventual success.

As an example of how the first-level review might work, consider a student who believes she is making good progress on the statistical and computational learning goals (Ways of Knowing) and also the resilience goal (Personal Good), because of the various difficulties she encountered while working towards these goals. The student discusses her readiness to be reviewed with her advisor and other students in Forum, collecting input on what evidence to include in her mastery transcript and how to assemble it. This evidence could include examples of the work done in statistics and computation, such as a computer program that displays statistical analyses of a public health dataset about water quality (the dataset comes from a project the student is working on, led by a faculty member in the School of Public Health), and a reflection statement about what was learned and how it represents progress towards achieving competency on those learning goals. To present evidence of resilience, the student writes a narrative discussing the various challenges she faced as she worked with this data and how she worked through those challenges. The panel reviews the material and provides feedback and questions to the student, and the student is invited to respond in writing. An in-person conversation could be scheduled for the student to meet with the panel for further discussion, if needed. Finally, the panel would issue feedback and a decision about what level of proficiency the student had reached for each learning goal. Students who believe they are ready to be reviewed for competency or mastery in particular learning goals would also use this student-run panel, and the panel would “approve” portfolios for review at the second level.

The second, more advanced level of assessment involves a panel of faculty, advanced students, community members, alumni, and others who will review and give feedback to students. This panel will primarily hear cases at the competency or mastery level of the learning goals. Students will be encouraged to present cases that combine multiple learning goals (though we do not expect any single case to contain all learning goals), and they would engage in this second level of review for each learning goal, with each review spanning multiple goals so that, across reviews, the full range of learning goals is represented. This stage will also include a public presentation and discussion, similar to a doctoral thesis defense. As part of the Forum activities, we would work to prepare students to be proficient in a range of presentation modalities (again, the Communication learning goal), recognizing that this is another area where students will vary. We hope to make these community events. Once the program is operating at scale, when scheduling individual events becomes difficult, we envision an annual (or semi-annual) public celebration involving a poster fair, talks, and panels of students and others involved in the research.

 

The assessment infrastructure of higher education has evolved to emphasize efficiency and to simplify the management of learners, courses, and programs, but without a focus on learning or support for student individual differences or independence.

 

Summary

The use of grades, credit hours, and majors has served to help the modern university “manage” the processes of undergraduate education, but these tools were not designed to support learning or make it more transparent. Eventually, these structures became the tail that wags the dog. The assessment infrastructure of higher education has evolved to emphasize efficiency and to simplify the management of learners, courses, and programs, but without a focus on learning or support for student individual differences or independence. Periodic attempts at reform often focus on the re-introduction of learner-focused ideas such as project- or problem-based learning, but these efforts strain against the boundaries of the existing infrastructure. We need “infrastructuring” work (Star & Ruhleder, 1996), a conscious reshaping of the practices, technological supports, and cultural norms that guide our thinking about assessment and support for learning in education. Our proposal for assessment in The Big Idea requires us to re-engineer the infrastructure for higher education assessment to emphasize progress, resilience, and eventually mastery of ambitious learning goals. Re-shaping these structures is a key component of our plan to design undergraduate education to take best advantage of the resources and opportunities of a major public research university.

 

References

Brookhart, S. M., Guskey, T. R., Bowers, A. J., McMillan, J. H., Smith, J. K., Smith, L. F., … Welsh, M. E. (2016). A Century of Grading Research: Meaning and Value in the Most Common Educational Measure. Review of Educational Research, 86(4), 803–848. https://doi.org/10.3102/0034654316672069

Lagemann, E. C. (2000). An elusive science: The troubling history of education research. Chicago: University of Chicago Press.

Muscatine, C. (2009). Fixing College Education: A New Curriculum for the Twenty-first Century. Charlottesville: University of Virginia Press.

Rose, T. (2016). The end of average: How we succeed in a world that values sameness. New York: HarperOne.

Silva, E., White, T., & Toch, T. (2015). The Carnegie Unit: A century-old standard in a changing education landscape. Retrieved from http://www.carnegiefoundation.org/resources/publications/carnegie-unit/

Star, S. L., & Ruhleder, K. (1996). Steps Toward an Ecology of Infrastructure: Design and Access for Large Information Spaces. Information Systems Research, 7(1), 111–134. https://doi.org/10.1287/isre.7.1.111

See What’s Next for our Graduating Student Fellows

Hannah Brauer, Communications Writing Fellow
@brauhan

Student fellows power our work at Academic Innovation — their hard work and commitment help to further our mission to improve education at the University of Michigan and support lifelong learners around the world. This year, we would like to thank the 24 student fellows who are graduating this May and recognize their significant contributions to innovation at U-M.

Our graduating fellows are heading to destinations across the country, including JPMorgan Chase & Co., Carnegie Mellon University, and a start-up company in the San Francisco Bay Area. Some are even sticking around as post-graduate research fellows at Academic Innovation.

Xucong Zhan, U-M College of Engineering ‘19, is graduating after working as a Software Development Fellow. Following his positive experience with the fellowship program, he will join Academic Innovation as a full-time Software Developer.

“I got a chance to work on something I think is meaningful,” he said. “It was encouraging to work with a lot of people with the same goal as me.”

Zhan recounted his work with Academic Reporting Tools (ART 2.0), which allows students to easily research U-M courses and instructors. In a lightning talk at the Academic Innovation Student Showcase last month, Zhan presented his work and his experience using Django and Python to add new features to the tool. He said the Academic Innovation Student Showcase was one of his greatest learning experiences, as it provided him with public speaking skills he could apply to future positions.

“I had never done something like [the lightning talk] before,” he said. “The [public speaking] workshops helped me to decide my topics and present more confidently. Those were definitely helpful.”

Wenfei Yan, U-M College of Engineering ‘19, worked as a Data Science Fellow this year and is now heading to Carnegie Mellon University to complete her Master of Computational Data Science degree. Like Zhan, she credited the fellowship program with helping her communication skills.

“Working with the awesome people at [Academic Innovation], I got better at explaining my analysis to others, and I also became a more joyful person,” she said.

Yan worked on a case study for the Privacy, Reputation, and Identity in a Digital Age Teach-Out, which uncovered how digital reputation shapes social interaction.

“We explored the learner engagement pattern through the lens of the discussion forums,” Yan said. She identified the top keywords users mentioned in discussions and whether learners were staying on-topic.
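As a rough illustration only (Yan’s actual analysis pipeline is not described in this post), identifying top keywords in forum posts can start with a simple frequency count:

```python
from collections import Counter
import re

# Hypothetical forum posts; the Teach-Out's real discussion data is not shown here.
posts = [
    "Your digital reputation shapes how employers see you.",
    "Managing privacy settings protects your online reputation.",
    "Reputation and identity are linked in a digital age.",
]

# A tiny stopword list for the example; a real analysis would use a fuller one.
STOPWORDS = {"your", "how", "you", "and", "are", "in", "a", "the", "see"}

words = (w for post in posts for w in re.findall(r"[a-z']+", post.lower()))
top_keywords = Counter(w for w in words if w not in STOPWORDS).most_common(5)
print(top_keywords)  # e.g., [('reputation', 3), ('digital', 2), ...]
```

A real study would add stemming, topic modeling, or on-topic classification, but the core idea of surfacing the most frequent terms is the same.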

Brandon Punturo, U-M School of Information ‘19, also worked as a Data Science Fellow this year. Punturo shared his work with ViewPoint, a role-play simulation tool, at the Academic Innovation Student Showcase poster session. His project used natural language processing to understand users in a policy simulation run through the U-M Ford School of Public Policy.

After graduation, Punturo will continue his growth as a Data Scientist at JPMorgan Chase & Co. in New York. He said he hopes to work in a similar environment to Academic Innovation in his future job.

“I think that’s the really cool thing about [Academic Innovation] — you’re doing a lot of important work, but it’s not a stressful environment,” he said. “You feel like you’re at home, even though you’re at work.”

The student fellowship program at Academic Innovation strives to provide students with work experience in their field as equal members of our team. Since the program began in 2015, 89 student fellows have worked on foundational projects in software development, graphic design, UX design, data science, educational technology, and public engagement.

Marissa Reid, Student Program Coordinator, is proud of the 24 student fellows graduating this year.

“Student fellows are the backbone of Academic Innovation,” she said. “They keep us moving forward and working toward our goals every day.”


If you are interested in our fellowship program, there are various positions open for the upcoming academic year. We’re looking for well-rounded, innovative, responsible, driven, and flexible fellows to help us design and develop digital applications that facilitate engaged learning at the University of Michigan.

Our fellows include undergraduates, graduate students, doctoral students, and recent graduates. U-M student fellows are appointed for one academic term with the possibility of extending to multiple terms. Visit our student opportunities page for more information.

Two Academic Innovation Tools Awarded Provost’s Teaching Innovation Prize

Hannah Brauer, Communications Writing Fellow
@brauhan

Each year, the University of Michigan Office of the Provost, Center for Research on Learning and Teaching (CRLT), and University Library honor five innovative projects that aim to improve student learning with the Provost’s Teaching Innovation Prize (TIP). The winners were recognized Monday at the annual, campus-wide “Enriching Scholarship” conference; they include two digital tools in our current portfolio (ViewPoint and Problem Roulette) and one graduated tool (M-Write). We are excited about the recognition of our team’s tireless efforts working with faculty innovators on these award-winning tools.

ViewPoint

Professor Elisabeth R. Gerber, Jack L. Walker, Jr. Collegiate Professor of Public Policy in the Ford School of Public Policy

ViewPoint is a digital simulation tool created by Professor Elisabeth R. Gerber, Jack L. Walker, Jr. Collegiate Professor of Public Policy in the Ford School of Public Policy. This platform provides customized, hands-on role-playing simulations that allow students to experience strategy development, collaboration, advocacy, and communication.

Gerber began working on ViewPoint three years ago after using policymaking simulations in her own classroom for many years. As her simulations became more complicated and difficult to manage manually, she reached out to the Office of Academic Innovation for help.

“I had been thinking about some version of ViewPoint for many years but did not have the knowledge or skills to do the programming, user experience design, etc.,” she said.

Through her partnership with Academic Innovation, Gerber built a tool that lets her students gain hands-on experience with policymaking.

Jackson G. Voss, U-M Ford School of Public Policy ‘18, recommended Gerber for the award after using ViewPoint for a policy negotiation activity in the three-day Integrated Policy Exercise (IPE) simulation for master’s students.

“Basically, we held a simulation premised on the government making an important decision, with students role-playing as elected officials, high-level government employees, lobbyists, interest group leaders, and journalists,” he said.

Voss received his Master of Public Policy in 2018 and now works as a professional staff member in the Washington, D.C. office of U.S. Senator Gary Peters. He noted Gerber’s exercise was the best preparation for his position on the Hill.

“IPE is largely about balancing the interests of a wide array of people and groups with what you think is the best thing to do from a moral, political, or policy perspective — and that’s largely what I try to help do here in Washington,” he said.

While Gerber initially envisioned ViewPoint as a way to create and run simulations in her own classroom, the tool is now growing into a powerful pedagogical platform accessible to a variety of instructors. Multiple institutions currently use the tool, including the U-M Schools of Education, Public Health, and Social Work and U-M Flint, as well as UC Merced, Boston College, Stanford, and Ball State University.

Daniel Hummel, Assistant Professor of Public Administration at U-M Flint, uses ViewPoint in his “Politics, Policy and Public Administration” course in the Master of Public Administration degree program. In the course, students learn about the policy process in the state of Michigan and participate in simulations based on actual state legislation, after which they must produce a final bill for the Governor to sign.

Before using ViewPoint, Hummel found teaching the policy process through lectures and readings to be difficult.

“I noticed a limitation in this approach as the process is quite complex and abstract,” he said.

Since implementing the tool, Hummel has noticed a significant difference in student engagement.

“The energy in the classroom increases when we’re doing the simulation,” he said. “It’s learning by doing.”

Gerber said she is honored to receive the TIP award and feels proud to be part of the team that created this powerful tool.

“When I lead students through a role playing simulation, I see them learning in real time: actively engaging with new ideas and situations, working together to solve problems, grappling with unfamiliar perspectives,” she said. “ViewPoint has the potential to bring this profound learning experience to a much larger and diverse pool of students.”

Problem Roulette

Professor August (Gus) Evrard, U-M Arthur F. Thurnau Professor of Physics and Astronomy

Problem Roulette is a study tool offering access to previous exam problems for selected courses at U-M. By giving students equal access to prior exams, Problem Roulette saves instructors the time normally devoted to uploading old exams every semester and allows students to assess their problem-solving accuracy and speed as they develop subject competency.

Professor August (Gus) Evrard, U-M Arthur F. Thurnau Professor of Physics and Astronomy, created the tool in 2011 when he noticed a need for equal access to study materials for courses. He reached out to Academic Innovation to continue developing and expanding the tool, which led to the launch of a new version in Fall 2017.

“The [Academic Innovation] service made dramatic changes to the front- and backends,” Evrard said. “[Students now have] multiple study mode options, and a new Study Group feature [in Problem Roulette] allows multiple students to work in synchrony on a set of problems. The backend has brought all the content together into a single relational database, and the service now connects to other student data sources, which allows us to personalize the experience.”

Professor Brenda Gunderson, Senior Lecturer in the U-M Department of Statistics, said she endorses the tool for students in her “Statistics 250 – Introduction to Statistics and Data Analysis” course.

“After students are introduced to a new topic in lecture… they need more practice to help both solidify their acquired knowledge and identify areas or concepts that they need to strengthen,” she said. “Problem Roulette is an awesome, low stakes, high payoff tool for productive practice.”

Jon Reid, U-M College of Literature, Science, and the Arts ’20, found the tool beneficial while studying for his Statistics 250 course and recommended Problem Roulette for the TIP award.

“Stats 250 is a notoriously difficult class at the University of Michigan, and as a History major, I was very anxious in preparation for the exams,” Reid wrote in his letter of recommendation. “Problem Roulette was incredibly helpful in a group setting. With their innovative group work feature, I would virtually meet up with friends in the class and we would work on the same problems at the same time.”

Not only did he find the group feature to be helpful, but the tool made individual studying more efficient as well, according to his letter.

“I could identify areas of weakness during my studying,” wrote Reid. “I could target these weaker sections by reviewing my notes and then return to Problem Roulette to see if my understanding of these topics improved.”

Evrard said he is excited to receive the TIP award for his work on the tool.

“It helps justify a lot of work by many people on this campus since 2012,” he said.

Since its creation, Problem Roulette has served more than 7 million problems to more than 20,000 students across eight introductory courses at the University of Michigan. But Evrard said he isn’t finished.

“The work’s not over,” he said. “There’s more to be done!”

What’s next?

The Provost’s Teaching Innovation Prize provides $5,000 to winners, funding that is designed to allow Gerber and Evrard to enhance their tools into the future.

“We are hoping to build out more simulation content [for ViewPoint],” said Gerber. “Instructors can start with a [prepared] simulation template and adapt it to their own unique classroom needs.”

For Problem Roulette, Evrard envisions the tool expanding beyond U-M.

“I imagine a future where a [Problem Roulette]-like service exists on many campuses, each built on authentic local content but all committed to openly sharing student engagement data for the Learning Analytics community to interrogate,” he said.


Faculty looking to partner with our office on an innovative project in support of personalized learning at scale, curricular innovation, learning analytics, or public engagement are encouraged to explore the Academic Innovation Fund proposal process. Faculty interested in implementing ViewPoint, Problem Roulette, or one of our other digital tools in their classroom can reach us at academicinnovation@umich.edu.

Engaging Learners in Educational Product Development for Michigan Online

James DeVaney, Associate Vice Provost for Academic Innovation
@devaneygoblue

Syed Amaanullah, Senior Product Manager  
@syedamaanullah

Vishal (Vish) Chandawarkar, Product Management Fellow

 

What does product management look like in an educational context? What does it mean to explore the connection between a digital product and its manifestation in the physical world? What does it look like to co-create new products to solve real problems faced by learners and learning communities? How can universities empower students to explore new knowledge and skills and enable student-led innovation?

As we foster a culture of innovation in learning at the University of Michigan, the Office of Academic Innovation builds products with faculty and learner communities, not simply for them. Remarkable breakthroughs happen at research universities every day. Yet the strength of the research enterprise doesn’t necessarily translate to educational product development and innovation. Bridging the gap between early-stage innovation and widespread adoption is a challenge that institutions know all too well.

I sat down with Syed Amaanullah, Senior Product Manager, and Vishal (Vish) Chandawarkar, Product Management Fellow and current MBA student in the Ross School of Business, to demystify product management in an educational context. We talked about Michigan Online – one of our signature initiatives – and how we think about product development, student-led innovation, and the connections between online, hybrid, and residential learning.

 

Q: Syed, what is Michigan Online and how has the product evolved over the last eleven months since launch? What factors are contributing to its rapid growth trajectory?

Michigan Online is the destination for online learning experiences created by the University of Michigan. With a growing portfolio of courses and programs in emerging fields like design, data science, leadership, and technology, we’re quickly becoming the go-to resource for students, professionals and lifelong learners to equip themselves with the knowledge and skills that are relevant in today’s world. A significant part of our core value proposition is based on extending the University’s mission of developing leaders and citizens who will challenge the present and enrich the future, and we’re proud to be able to offer Michigan Online as a free resource for all U-M students, staff, faculty, and alumni. The value of the University of Michigan community extends far beyond the time that students spend on campus, and through Michigan Online, we’re able to support the learning needs of Wolverines throughout their careers and lives.

A lot has transpired since we first launched a year ago, from simply getting the technical infrastructure in place needed to aggregate courses from both Coursera and edX, to facilitating free access specifically for the U-M community, and recently completing work on a site redesign that made significant improvements to the discoverability of content and overall user-experience. Learners will find Michigan Online to be a welcoming and intuitive platform that gives them access to a broad range of learning experiences. As with many new products, the biggest hurdle for us to surmount has been awareness, and so we’ve worked very closely with on-campus and alumni groups to get the word out that this incredible resource exists. And as more and more learners come to Michigan Online and engage with our courses and content, we’re finding that they share this information and recommend the platform to their peers, which has really driven the rapid growth of our user base.

 

Q: Vish, when did you first learn about Michigan Online? What was your first impression as a grad student? What possibilities did you see?

I first learned about Michigan Online from a university-wide email announcement from Provost Philbert about the product launch last spring. As an incoming grad student at the time, I saw the opportunity to take free online courses as an amazing way for me to get a head start on learning subjects I have always been interested in, but did not have access or bandwidth to pursue. I am pivoting from a career in brand management to one in product management, and courses like “Introduction to SQL” and the “Python For Everybody” series were extremely helpful in supplementing my creative background and MBA curriculum with a more technical skillset. Gaining access to the wide range of learning opportunities provided through Michigan Online, I saw an opportunity to fill an important gap faced by business students which led me to found the Michigan Code Academy.

 

 

Q: Syed, when you approach product management in an educational context, how do you think about engaging learners and users in product development?

Building educational products requires a balance of leading with what we know and adequately exploring what we don’t know. Given that our courses and programs are developed through a collaboration of expert faculty and our own highly skilled Learning Experience Designers, we have a strong sense of what it takes to design and develop effective online learning content. Where we benefit from engaging more deeply with our users is in understanding what types of content are most valuable to them, how they want to discover and engage with that content, how they might want to interact with their instructors or fellow learners, and what their expected outcomes might be. One way that we ensure that we’re incorporating the voice of users into product development is by building meaningful relationships with our users that go beyond the important, but transactional, touchpoints of usability testing and user feedback surveys. At the University of Michigan, we have a unique competitive advantage by virtue of our thriving campus of nearly 50,000 students. We’ve been able to leverage that advantage by working deeply with student, staff, faculty, and alumni groups to understand and address their needs as learners. It doesn’t hurt that we’re able to bring on talented students like Vish, as student fellows, to work as members of our teams and ensure that the voice of the learner is heard each and every day.

 

Q: Vish, tell us about the Michigan Code Academy (MCA). What problems are you trying to solve?

The Michigan Code Academy is a learning community that empowers U-M students to activate, advance, and apply practical technical skills. Employers now expect candidates to have not only business acumen and leadership skills, but also the technical skills to navigate an increasingly data-intensive workplace. I started the club last fall because a significant proportion of MBA students, including myself, were interested in advancing their data analysis skill set but did not have a formalized way to do so during their core curriculum.

I found the Michigan Online coursework to be extremely valuable in addressing this need and decided to partner with Academic Innovation to translate relevant online learning experiences into engaging peer-led, in-person bootcamps. By forming a community around extracurricular learning, we are breaking down the barriers and intimidation that comes along with exploring new, unfamiliar subjects like programming and data science. What we’ve created with Academic Innovation as our sponsoring organization has provided an amazing outlet for student-led learning which is a great complement to the awesome learning experiences we have at Ross.

 

Q: Syed, why is it important to support and incubate both unique and abstract use cases when creating a successful educational product?

It’s important for product teams to avoid being overly prescriptive with how a product might evolve. Although it is necessary to establish a sound product identity, the early stages of product development provide a tremendous opportunity to nurture a broad spectrum of potential use cases that could grow and evolve into valuable product features. The reality is that today’s users are very adept at manipulating a product beyond its intended purpose in order to get the value that they desire. As product leaders, we need to pay close attention to the way our users behave, because it often uncovers an underlying problem that is worth solving. With Michigan Online, we’ve been able to leverage user behavior to better understand how to engage learners with different motivations. We’re learning about the nuanced ways in which a learner looking to upskill and develop themselves professionally might differ from a learner who needs to reskill and change careers. And although we’re committed to serving all learners, we want to ensure that our content and features reflect the specific needs and use cases that our learners demonstrate through their behavior.

 

Q: Vish, after launching the initial MCA activities, what new opportunities emerged? What might we see next from the MCA team?

Because the core MBA curriculum is very structured, MCA found an opportunity in providing first-year MBAs with flexible extracurricular learning experiences to accelerate their learning.

For example, the MBA1 experience culminates in Multidisciplinary Action Projects (MAP), a capstone seven-week component of the program in which students consult with companies and organizations around the world to solve real business challenges. MCA wanted to ensure students were equipped to take on any large datasets or databases they might encounter. The Michigan Code Academy conducted a very successful 50-person Pre-MAP SQL training session for MBAs. We applied Academic Innovation’s learning experience design expertise to make the session interactive and engaging. The student response was overwhelmingly positive, and we received feedback that many MBAs wished they had had this kind of training earlier in the school year. Connor Nickell, a Ross student and one of the founding members of Michigan Code Academy, attested that he immediately applied his understanding of databases to help a hospital client assess and size transfusion markets using nationally syndicated data.
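To give a flavor of the kind of querying such a session covers, here is a minimal sketch in Python using the standard library’s sqlite3 module. The table, columns, and rows are hypothetical stand-ins for a client dataset, not material from the actual training.

    import sqlite3

    # In-memory database standing in for a client dataset; the schema and
    # rows below are hypothetical, for illustration only.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE transfusions (hospital TEXT, region TEXT, units INTEGER)")
    conn.executemany(
        "INSERT INTO transfusions VALUES (?, ?, ?)",
        [
            ("General", "Midwest", 120),
            ("St. Mary", "Midwest", 95),
            ("Lakeside", "Northeast", 140),
        ],
    )

    # A typical bootcamp-style aggregation: size each regional market by
    # total units, largest first.
    query = """
        SELECT region, SUM(units) AS total_units
        FROM transfusions
        GROUP BY region
        ORDER BY total_units DESC
    """
    for region, total_units in conn.execute(query):
        print(region, total_units)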

We will expand the scope of our activities into a larger bootcamp: we are now partnering with the Business Analytics Club and the Data Analytics Consulting Club to design and run a full-day training on data querying and visualization, available to all incoming MBAs this August.

 

Q: Syed, why is it important to explore the connections between your digital products and their manifestations in the real world?

I believe that this goes back to building a deep understanding of user needs and behaviors in order to effectively provide value to them. Although digital products often live exclusively in the digital world, our users don’t necessarily adhere to the borders and boundaries of that space, and are very comfortable bringing things like online learning content across the divide and into the real world. Seeing a self-organized learning community like the MCA utilize our content for in-person trainings and bootcamps is really powerful and serves as a great example of how providing broader access to U-M expertise and networks helps Ross create data-fluent and tech-savvy leaders. Supporting such real-world use cases can lead to the germination of interprofessional and interdisciplinary communities of early adopters that have the potential to shape the future of our product. So, we’re excited to continue working with the MCA to co-design experiences, tools, and resources that would benefit them and other similar learning communities. If we had put our heads down and focused solely on user analytics within the platform itself, we might have missed this opportunity.

 

Q: Vish, what kind of work have you been focusing on as a Product Management Fellow at Academic Innovation? What are you doing this summer?

Because Academic Innovation is very much like a start-up and moves quickly, the Product Management team has exposed me to a wide range of projects over the past few months. I conducted competitive analyses as part of our product discovery initiative. I helped ensure the Michigan Online web redesign launched on time by defining acceptance criteria for our QA process. Currently, I am working on a feature prioritization exercise to determine what features should be added to the product roadmap in the next development cycle.

This summer I’ll be working at LinkedIn as a Product Marketing Intern on the Consumer Growth team. I’ll conduct quantitative analysis on LinkedIn product usage data and provide recommendations on how the organization should re-prioritize the ecosystem to drive user engagement. I’m very excited to apply the skills I learned from Michigan Online and through the SQL and “Statistics with Python” courses to query and analyze large data sets.

 

Q: Syed, what might we expect to see next with Michigan Online?

Now that we’ve established a strong baseline for Michigan Online as an online learning platform for students, professionals, and lifelong learners, we’re committed to growing our catalog of learning experiences in emerging content areas like FinTech, AR and VR, Mobility, and Product Management to provide the most relevant learning experiences for today’s world. Beyond an aggressive content strategy, we’re actively working toward personalization features that would not only provide a holistic overview of learners’ progress and achievements, but could also make intelligent recommendations and suggest learning pathways based on the interests and motivations of the learner. And of course, we’re committed to working very closely with our learners to ensure Michigan Online has the tools, features, and resources to support their lifelong and life-wide learning needs.

University of Michigan Library Unbound: Learning Beyond and Across Courses

Laurie Alexander, Associate University Librarian for Learning and Teaching
@lauriea_umich

Barry Fishman, Faculty Innovator-In-Residence
@barryfishman

With: 

Anne Cong-Huyen, Digital Scholarship Strategist @anitaconchita

Breanna Hamm, Instructional Technologist

Jamie Niehof, Engineering Librarian

Amanda Peters, Student Engagement Librarian

Rob Pettigrew, Senior Academic Technologist

Justin Schell, Director, Shapiro Design Lab @612to651

Meghan Sitar, Director, Connected Scholarship @meghansitar

 

The Academic Innovation at Michigan: Transforming Residential Undergraduate Education (AIM:TRUE) series is designed to provoke our thinking as we explore new possibilities for how we prepare students at Michigan for their future lives and careers. AIM:TRUE sessions are part of a larger project called “The Big Idea,” which poses the question: Given the resources and opportunities provided by a major public research university, how would you design undergraduate education if you could start with a blank page? There are many possible answers to this question, but we believe few would resemble the current structure of undergraduate education. You can learn more about the proposed Big Idea program in this post, and the learning goals we propose for the program in this one. You can also revisit our first two AIM:TRUE sessions, which focused on a completely rethought academic transcript and the challenging landscape of current higher education.

In the third AIM:TRUE session, we turned our attention to campus experimentation and collaborations focused on student learning. The U-M Library, in partnership with faculty and students, is actively reimagining the way undergraduates interact with the university research enterprise and engage in real-world problem solving. Through a series of lightning talks (described below), we connected recent projects with the core learning outcomes of the Big Idea (ways of knowing, personal good, public good, and team good), and trends emerged around innovative scholarly practices and new possibilities for students to connect scholarship to real-world challenges. Central to the discussion was the need for connection, partnership, and creative application. We invite you to explore with us through the examples below.

Anne Cong-Huyen: Collaborative Digital Pedagogy and Wikipedia Edit-A-Thons

Anne Cong-Huyen, Digital Scholarship Strategist, discussed two projects. First was her work with collaborators and partners from across the library and LSA to help Professor Jason Young transform a 200-level U.S. history lecture course. A second project involved the use of Wikipedia editing assignments to engage students in rigorous peer-reviewed public scholarship. In Professor Young’s course, library experts helped design three digital assignments that asked students to 1) participate in annotating primary texts in an online database, 2) curate and author an online exhibit of 19th century texts from Special Collections, and 3) make arguments with maps using census data. These assignments actively engaged students in alternate modes of scholarly production that were open and contributed to the public good, and which also pushed them to engage in unfamiliar activities that helped them grow as individuals, as collaborators, and as scholars.

Jamie Niehof: Information Literacy Through Canvas Modules in Engineering

Engineering librarian Jamie Niehof created two Canvas modules for students in the Multidisciplinary Design Program. These modules introduced students to information literacy concepts such as citation management in a lab setting and finding engineering literature in the Scopus database. By fall 2019, the modules will undergo further adaptations and reach more than 700 first-year engineering students.

You can find the Canvas Modules here: https://umich.instructure.com/courses/301354

Amanda Peters: Engagement Fellows, Mini Grants  
As an academic hub for all disciplines, the U-M Library is committed to actively engaging with the campus community to extend learning beyond the classroom. The Student Engagement Program provides and supports transformative student experiences, enabling practical opportunities for students to explore, experiment, create, lead, and reflect — capacities and skills that are critical to addressing 21st century problems in any field. The Student Mini Grants opportunity is one way we support student-driven work. Students can receive up to $1000 to support innovative and collaborative projects that make a real-life impact. Projects must strengthen community partnerships, enhance global scholarship and/or advocate for diversity and inclusion. Students who are awarded grants are then paired with librarian specialists for mentorship.

How can libraries contribute to engaged learning and student-driven work? As one mini-grant recipient stated, “The library enhanced my scope of the university and broadened the depth of my project.” Not only can librarians connect students with research sources for their projects and with support for things like data visualization, 3D printing, and video and podcast creation, we can also offer them the right kind of collaborative spaces for their work and exhibit spaces to showcase their projects. Librarians are also the connective tissue that helps many students find other stakeholders on campus and in the community who are integral to taking their projects to the next level.

Justin Schell: Citizen + Community Science

Building on his background in community and citizen science, Justin Schell and the Shapiro Design Lab have helped develop a number of projects that facilitate public use of and engagement with science. These include two projects built on the Zooniverse crowdsourcing platform; collaborations with communities in Dearborn and Detroit on environmental justice issues; and a partnership with the U-M Museum of Natural History on a Project Incubator program that helped graduate students, faculty, and staff develop a variety of projects. All of these projects take a real-world problem as their focus and bring together a variety of expertise. For instance, the Zooniverse project Unearthing Michigan Ecological Data involves collaboration among students, faculty, and staff with expertise in project design, community engagement (in this instance on Zooniverse’s message boards), historical and cultural understanding of scientific processes, and multiple dimensions of data science to analyze volunteer classifications and explore algorithmic identification of information.

Rob Pettigrew & Breanna Hamm: Faculty Collaboration and Project Design

In the past year, we worked with a number of English 125 sections on projects to transform a writing assignment from earlier in the semester into an oral and visual presentation. While we do some instruction on lesser-known, yet useful, features of presentation software, the majority of our instruction now focuses on incorporating visual design principles to create well-designed, clear, and engaging visuals that enhance the student’s own oral presentation. Working with a Writing 120 course, we collaborated with LSA-IT to provide support for students who were creating comics. Again, our collaboration centered on themes of digital literacy and communication as students transformed a written paper into a digital format. In a History 224 course, we aided a project in which students created zines. We provided the guidelines for creation; had students visit the Labadie Collection for inspiration; talked about composition, mood, design, and storytelling; and provided consultations and office hours where students could drop in for assistance and support. Few of the zines were created using digital tools, so in this course learning the process was more important than learning the tools. The zines were not graded; rather, students wrote a reflection on their process, as we wanted the focus to be on their planning, creation, decision making, and topic selection.

Meghan Sitar: ScholarSprints and Library as Research Lab

Inspired by similar programs at the University of Kansas and the University of Minnesota, the University of Michigan’s ScholarSprints program aligns resources to help faculty and graduate students overcome challenges in their research and teaching through an intensive immersion with expertise from across the library and across campus. The intent is for the sprint team to work intensely for a short period of time and to produce a tangible product or outcome. Sprints, which typically last four days and are hosted in ScholarSpace, differ from standard consultations in their timing and depth of interaction, in their orientation to public scholarship, and in their aim to build sustainable campus community connections. Sprints illustrate a strong commitment to the values of diversity, equity, inclusion, and accessibility. We believe that the expertise of scholars and librarians can enrich the goals and outcomes of a collaborative project. We remain attentive to the issue of equitable labor in scholarly collaborations. In evaluating applications, support is prioritized for research, teaching, and creative projects that contribute to the public good. We are offering another version of the program in a one-day format, ScholarDash, to focus on narrower projects this spring. This model of focus and project management could easily be translated into an undergraduate-level challenge, with students working on an independent or collaborative project and using new ways of knowing while engaging with resources and expertise customized for their challenge.

The University of Michigan School of Information and U-M Library received a 3-year IMLS grant to embed three research labs in the Library, composed of UMSI graduate students, librarians, and UMSI faculty, with the goal of providing real problem-solving experiences to future library professionals. The three labs focus on the assessment of student learning, the assessment of research and scholarship, and design thinking for library services. This model of intergenerational learning and applying theory to real challenges faced by organizations provides students with the opportunity to explore problems and their solutions in creative ways that aren’t typically presented by the curriculum, where projects are often already scoped and scaffolded to reach an outcome. In the Design Thinking for Library Services lab, students picked up a project that had already been started by another group of librarians and had to make sense of what came before and what needed to happen to continue. They created multiple prototypes of an exercise that translated user research into an alternative strategy for creating stories about our users, with these stories fostering empathy and shaping more nuanced research questions about our community when developing library services. Students worked within the complexity of an organization and had to employ systems thinking to understand how their work connected or challenged those complexities and power structures, gaining mentorship and networking opportunities along the way.

Conclusion and Coming Attractions

These examples illustrate not only interesting approaches to ambitious learning aligned with the Big Idea, but also new forms of collaborative partnership for instructional innovation. One aim of the Big Idea project is to break down barriers between disciplines and programs at the University of Michigan. The University Library, which lives almost literally at the center of our university campuses, is a natural hub for building connections that enhance teaching and learning.

Our next AIM:TRUE talk is scheduled for Tuesday, May 14 at 11am. Sean Gallagher, Executive Professor of Educational Policy and Executive Director of the Center for the Future of Higher Education & Talent Strategy at Northeastern University, will address the question, “What Does College Prepare Students For?” and explore how employer demands and online credentials are reshaping college education. The next Big Idea post will focus on assessment, so stay tuned for that as well.

Teach-Outs: Reimagining Public Engagement in Online Learning

Benjamin Morse, Design Manager

Rachel Niemer, Director, Strategic Initiatives
@rkniemer

We need new modes of teaching, learning, and connecting in an increasingly digital society. We must challenge how we define expertise by connecting scholars with engaged citizens and by bridging the gap between digital and physical communities. At the Office of Academic Innovation, we believe that we must come together with other institutions of higher education to collectively reimagine how we engage with the global public and how we can create engaged learning experiences with diverse learners.

We are thrilled to announce the second annual Teach-Out Academy at the University of Michigan, which will be held on June 11-12. We will once again convene a group of like-minded institutions to catalyze an emerging mode of public engagement: Teach-Outs, which bring together people from around the world to learn about, discuss, and address important topics in society. Inspired by the collective discourse embodied in the original Teach-Ins, which started at the University of Michigan during the Vietnam War era, Teach-Outs echo the sentiment that we can and must bring people together to collectively address our world’s most complex social issues. They are free and open online learning events where learners engage with diverse experts, discuss and connect with other participants, and consider actions to take in their own lives and communities.

A group of three women having a discussion

One of many collaborative discussions from the 2018 Teach-Out Academy.

Since March 2017, the University of Michigan has produced 23 Teach-Outs and engaged with over 76,000 learners from across the world. In developing these Teach-Outs, we have worked with over 200 experts, both from inside higher education and beyond, to foster conversations about a range of salient social issues such as self-driving cars, immigration and family separation at the border, free speech, and the aftermath of Hurricane María.

In May 2018, we hosted the first cohort of Teach-Out Academy participants from Brown University, Davidson College, Emory University, MIT, Stanford University, Texas A&M University, University of Colorado, University of Illinois, University of Notre Dame, and University of Pennsylvania. As a community, we discussed what kinds of institutional support are needed to engage our publics through digital platforms, how important storytelling is to bring participants into the conversation, and how to design calls to action that give participants the opportunity to impact an issue in a way that is aligned with their values. Since then, we have seen several Teach-Outs designed and developed by our colleagues at the University of Notre Dame, Emory University, University of Leiden, and Davidson College.

We partnered with the University of Notre Dame to create the Listening to Puerto Rico Teach-Out, which captured the stories and perspectives of Puerto Ricans one year after Hurricane María devastated the island on September 20, 2017. In June 2018, teams from both institutions traveled to Puerto Rico to film testimonies from Puerto Ricans from all walks of life. By listening, we hoped to take notice of, and act on, what Puerto Ricans said. To learn more about this Teach-Out and the continued work that the University of Notre Dame is leading, visit the Listening to Puerto Rico website. The Teach-Out, and all of its videos, are also available on Michigan Online.

Another 2018 Teach-Out Academy participant, Emory University, launched the “Making” Progress Teach-Out on April 15. This Teach-Out is “an invitation to think about what progress means, and how you can look for it wherever you are—in your city, community, or neighborhood—and reflect upon your own ideas about the place you live in.” The Teach-Out focuses on Atlanta, Georgia, and on how communities change and evolve over time and how physical spaces are defined. Many of these questions are “comparable to many of other places in the world on what it means to progress.” In this Teach-Out, learners gain insights into the process of finding “the history of public spaces in any community and how to reflect upon the idea of progress.” You can join the Teach-Out by signing up on Coursera.

In October 2018, the University of Leiden hosted the Global Human Rights Teach-Out. It was designed to coincide with the opening of the “Young One World” Summit in The Hague where more than 1500 young people from across the world came together to discuss the issue of human rights. The Teach-Out consisted of an “online discussion on various human rights with scholarly input in the form of podcasts from over 20 academic instructors, including some contributions from advocacy groups addressing the urgency of issues.” The Global Human Rights Teach-Out is still available on the Coursera platform.

Another Teach-Out Academy participant, Davidson College, continues to produce online learning experiences as part of their Davidson Now initiative. Davidson College has produced four courses as part of this initiative, the most recent being the Community, Race, and Space in the U.S. and France course. This course explored questions such as: “When we think of where we live, who do we imagine as our neighbors? What actions invite some people in and keep others out?” This course was designed to foster dialogue about what forces determine the makeup of our cities and communities. Learn more on the Davidson Now website.

Building on this momentum, our goal for the 2019 Teach-Out Academy is to provide in-depth, hands-on consultation for institutions interested in producing a Teach-Out of their own within the next 12 months. We will explore the various dimensions of instructional design, project management, media production, and product management through a range of workshop activities and collective conversations. We will also feature a panel of representatives from other institutions that have developed Teach-Outs of their own, so we can all learn from insights generated beyond the context of the University of Michigan. We are also beginning to engage with organizations outside of higher education that are exploring how the Teach-Out model can help their missions of engaging diverse publics.

Two women engaging in a hands on experience within the Academic Innovation filming studios

The 2018 Teach-Out Academy attendees explored all areas of a Teach-Out, including the studio filming experience.

We are proud to see this new mode of public engagement expand and are excited to welcome another cohort of institutions to Ann Arbor this summer as we seek to foster global problem-solving communities composed of diverse teachers and learners who aspire to solve the world’s most complex problems.

The application is live at http://teach-out.org/academy/2019 and will remain open until May 10, 2019.

Thank you, and we look forward to exploring this growing movement of institutions that seek to redefine public engagement in online learning.

Presenting Our Research at the 2019 AERA Annual Meeting

Rebecca Quintana, Learning Experience Design Lead
@rebquintana

Yuanru Tan, Learning Experience Designer for Accessibility
@YuanruTan

Ricky LaFosse, Compliance and Policy Lead
@rglafosse

Jeff Bennett, Design Manager

AERA in Toronto

This year’s annual meeting of the American Educational Research Association (AERA) in Toronto, Ontario brought together a community of more than 14,000 educators, researchers, policymakers, and school leaders. Over the five-day conference, attendees navigated a phonebook-sized program of over 600 sessions featuring more than 6,000 papers. With this year’s theme, “Leveraging Education Research in a ‘Post-Truth’ Era: Multimodal Narratives to Democratize Evidence,” many sessions contributed to centuries-old conversations about expanding access to educational opportunities and welcoming a greater diversity of learner and educator perspectives in educational policy, research, and design.

Surrounding the Toronto Metro Convention Center and the three nearby hotels needed to accommodate such a large conference, the city of Toronto offered its guests an endless supply of entertainment, fine dining, vibrant neighborhoods, and beautiful parks. In a city that lives up to its international reputation, a short walk downtown can feel like a trip across the globe.

Several staff members from the Office of Academic Innovation attended this year’s meeting. Rebecca Quintana and Yuanru Tan, two members of our Learning Experience Design team, presented their research on learning design in MOOCs in two different sessions. One of our newest team members, Ricky LaFosse, presented a paper he co-authored with former colleagues at Indiana University, which focused on distance education compliance challenges. Jeff Bennett, Design Manager, presented MOOC design research with Academic Innovation co-authors in a structured poster session, along with doctoral student Alison Bressler from the School for Environment and Sustainability. The School of Education had a strong presence at AERA, including participation from Professor Chris Quintana, who contributed to the structured poster session with Academic Innovation staff.

Session Types and Reflections from Presenters  

AERA sessions now come in ten varieties, including off-site visits and tours. Presenting authors from Academic Innovation briefly describe their experiences engaging and presenting in just three of these formats.   

The Paper Session: Reflections from Rebecca Quintana, Learning Experience Design Lead; Yuanru Tan, Learning Experience Designer for Accessibility

A typical structure for a Paper Session allows authors to present their papers in 10-15 minute time slots. Usually, four or five papers investigating similar research topics are grouped into one session. Questions and commentary from the audience follow each presentation. At the conclusion of the paper presentations, a discussant (an expert in the field covered by the session) provides prepared observations and critiques. Rebecca and Yuanru presented their research, Characterizing MOOC Pedagogies: Exploring Tools and Methods for Learning Designers and Researchers, in a paper session titled MOOCs: Pedagogies, Participation, and Perspectives. Their talk highlighted the work they have done to refine the Assessing MOOC Pedagogies (AMP) instrument, which was developed by Professor Karen Swan and her colleagues. Interestingly, the AMP instrument was based on a framework for computer-based education developed in the 1990s by Professor Thomas Reeves, who served as the “virtual” discussant for our session. In his recorded remarks, Dr. Reeves encouraged us to consider how the refined AMP instrument could be used in educational design research that seeks to produce more effective MOOCs. He also pointed us to a book he co-edited on the subject, Conducting Educational Design Research. We look forward to diving into the book, which he has sent by mail to our team!

Rebecca Quintana presenting at the 2019 AERA conference

The Roundtable Session: Reflections from Ricky LaFosse, Compliance and Policy Lead

Roundtable sessions, as the name implies, position presenters and guests across from each other at a table to encourage more interaction and conversation than a paper session might allow. For this session, presenters were paired by theme at the same table; in our case, “Organizational Leadership as a Praxis for Higher Education Institutional Change.” With so many presenting groups scheduled at the same table, even a session scheduled for 8 a.m. on a Saturday was well attended.

Going first, a former colleague and I briefly described the contents of our paper, What Happens When Compliance Officers and Online Educational Design and Support Mingle?, and, not surprisingly, the discussion quickly moved toward practical implications and leadership strategies. Because our presentation highlighted how distance education compliance is moving closer to individual course decisions, our table of higher education administrators discussed how to engage faculty and course designers in compliance efforts and in guidance for high-risk areas. Discussion of the other two featured papers evolved similarly, with practical significance being of paramount importance. Within the 90 minutes allotted, we covered an impressive range of topics and strategies. Given the intimate session format, it was also quite easy to exchange contact information and continue these conversations after the conference.

The Structured Poster Session: Reflections from Jeff Bennett, Design Manager  

After five days of presentations, our intrepid team still had one more session to go. In the last hour of the conference, we participated in a structured poster session titled “Innovating MOOC Pedagogies,” organized by Rebecca Quintana and her colleague Hedieh Najafi from the University of Toronto. The session featured U-M authors on four of the seven papers and posters and was a great showcase of the diverse educational design research behind innovative, effective, and high-quality MOOCs.

The unique Structured Poster Session format tasks authors with presenting the central message of their paper in four minutes or less, after which attendees may visit authors at their posters and receive feedback. Following this informal discussion opportunity, the authors assemble in a panel and a discussant surfaces important aspects from each paper and makes connections to broader themes across the papers. Audience members may engage in a question and answer period with the authors.

Professor Carol Rolheiser chaired the session alongside discussant Professor Jim Hewitt, both from the University of Toronto. Professor Chris Quintana, Rebecca Quintana, and I shared our work, Exploring the Integration of Project-Based Learning Approaches into MOOCs. Interestingly, while our poster and research primarily focused on sharing novel ways we are implementing project-based learning into the MOOC context, many audience questions centered on the use of software tools within U-M MOOCs. It was a great example of how socializing our team’s work can spark ideas for MOOC practitioners beyond U-M. Authors who participated in the poster session hope to assemble a special issue of a journal to expand and share each paper with a wider audience.

 

The AERA conference experience was a very positive one for our team members. In addition to having the opportunity to present Academic Innovation research to an international audience, it allowed team members to reflect on emerging research from colleagues outside of the University of Michigan and to consider how new ideas might impact our ongoing work.

The 3 Things to Know about Tandem

Holly Derry, Associate Director of Behavioral Science

Group projects.

Many people we talk to about group projects say things like “One person totally took over” or “I ended up doing all the work.” We also hear “As the only woman, I got stuck taking notes.” We believe it doesn’t have to be that way, so we set out to make group work better.

Academic Innovation teamed up with Robin Fowler, Laura Alford, and Stephanie Sheffield from the College of Engineering to create Tandem, a tailored software tool that supports students working on group projects.

 

“A bad teaming experience can sour students on collaboration, and can convince them that their voices aren’t being heard or valued. With Tandem, we get to help students recognize the value of their own voices; that these voices are likely to be those of underrepresented populations makes a support system like Tandem even more necessary.”  

~ Stephanie Sheffield, Faculty Innovator

 

And now, as the title promises, here are the 3 things to know about Tandem:

1. Tandem uses data to connect students and instructors.

It’s human nature to want feedback on our performance and progress. The Tandem team believes that if students and instructors receive timely information about how things are going, teams may resolve issues before they spiral out of control.

Tandem charts weekly team checks for students on five dimensions: how the team is doing overall, logistics, idea equity, workload equity, and confidence that the team will do well.
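To make the five dimensions concrete, here is a minimal sketch of how one week of team checks could be represented and averaged in Python. The field names mirror the dimensions described above but are our own invention, not Tandem’s actual schema.

    from dataclasses import dataclass
    from statistics import mean

    # One student's weekly team check; the fields are hypothetical stand-ins
    # for Tandem's five dimensions, each rated on a 1-5 scale.
    @dataclass
    class TeamCheck:
        overall: int          # how the team is doing overall
        logistics: int        # scheduling and coordination
        idea_equity: int      # is everyone's voice heard?
        workload_equity: int  # is the work shared fairly?
        confidence: int       # confidence the team will do well

    FIELDS = ["overall", "logistics", "idea_equity", "workload_equity", "confidence"]

    def dimension_averages(checks):
        """Average each dimension across a team's responses for the week."""
        return {f: mean(getattr(c, f) for c in checks) for f in FIELDS}

    week = [TeamCheck(4, 3, 2, 2, 4), TeamCheck(5, 4, 3, 2, 5)]
    print(dimension_averages(week))  # workload_equity averages 2.0, a flag-worthy dip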

Tandem screenshot

Teams can use the information to check in with each other and make course corrections. They can see, at a meta level, how their own personalities mesh with others’ and learn new ways to work better with people who are different from them.

 

“Tandem scaffolds students as they reflect on what is working well (or not) for their teams. It provides them with evidence-based strategies to try, tailored to the particular struggles of their team. We hope it will improve team experiences in the moment but also improve students’ ability to abstract from this team experience and transfer this knowledge about themselves and about interacting with others to new teaming situations.”  

~ Robin Fowler, Faculty Innovator

 

In addition, instructors can see how teams are doing and which dimensions need improvement. The Instructor Dashboard also gives instructors additional context so they know why teams might need support.

 

“One enlightening moment I had while working on Tandem was hearing from instructors how challenging it is to not assign stereotypical labels (e.g. slackers, overachievers, etc.) to students who are having issues with their teamwork. It means that Tandem needs to be just as effective at engendering empathy and equity in instructors as it is in students.”   

~ Molly Maher, Behavioral Scientist

 

Screenshot of tandem

 

“Tandem takes in hundreds of thousands of data points over time and converts these into a small amount of useful output. The Instructor Dashboard is ‘mission control’ for instructors and clearly tells them which teams and individuals need additional support (as determined by our algorithm), as well as providing an overall view of everything happening in the current class at a glance.”


~ Ollie Saunders, Software Developer
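To make the triage idea concrete, here is one way such a step could work in principle. This is a deliberately simplified heuristic of our own devising (flag any team whose average on a dimension falls below a fixed threshold), not Tandem’s actual algorithm.

    # Simplified triage heuristic, not Tandem's real algorithm: flag any
    # team whose weekly average on a dimension drops below a threshold.
    THRESHOLD = 3.0

    def triage(team_averages):
        """Map each team to the dimensions where it may need support."""
        flags = {}
        for team, dims in team_averages.items():
            low = [d for d, score in dims.items() if score < THRESHOLD]
            if low:
                flags[team] = low
        return flags

    averages = {
        "Team A": {"overall": 4.2, "workload_equity": 2.1},
        "Team B": {"overall": 4.8, "workload_equity": 4.5},
    }
    print(triage(averages))  # {'Team A': ['workload_equity']}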

 

2. Tandem is tailored for each team and individual.

Tandem’s beginning-of-term survey asks students about their personalities and work preferences, including how likely they are to speak up in a group or to work close to a deadline. The tool then uses this information, along with weekly team checks, to tailor students’ lessons and activities.

 

“We are lucky to work with an incredibly diverse student population, and we know that everyone learns and works differently. Tandem’s beginning of term survey helps both faculty and students understand what their strengths are and what areas we want to work on over the coming semester. I love that Tandem gives students feedback and strategies that are specific to them and their current team situation — just like I would do if I was meeting with each student every day.”  

~ Laura Alford, Faculty Innovator

 

A team may have trouble sharing ideas equitably, so they’re assigned a lesson on communication, which points out when a student’s survey answers indicate they’re more likely to listen than to speak. The lesson talks about why that might be a challenge for them or the team, and in the accompanying activity, students see thought-provoking reflection questions personalized to their communication styles. By raising awareness early on, individuals have more insight into their differences and gain a shared language for a productive conversation about them. When conflict does arise, team members may have healthier attributions for it: “Maybe my ideas aren’t in the final design because I didn’t speak up much, not because the team doesn’t care.” And over time, teams may also learn to adapt to each other’s styles: “Tom’s quiet again. How can our team make sure he has the space to speak up?”
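As a rough sketch of how survey answers and weekly checks might combine into a rule like the one just described, consider the following. The decision logic and profile fields are invented for illustration; Tandem’s real tailoring is considerably richer.

    # Invented tailoring rules for illustration; not Tandem's actual logic.
    def assign_lesson(profile, team_check):
        """Pick a lesson from a student's survey profile plus the latest team check."""
        if team_check.get("idea_equity", 5) < 3:
            # A team-level struggle takes priority; the communication lesson
            # is then personalized using each member's own survey answers.
            return "communication"
        if profile.get("works_close_to_deadline") and team_check.get("logistics", 5) < 4:
            return "planning_and_deadlines"
        return "general_reflection"

    student = {"speaks_up_in_groups": False, "works_close_to_deadline": True}
    check = {"idea_equity": 2, "logistics": 3}
    print(assign_lesson(student, check))  # communication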

3. Tandem is the product of many disciplines.

  • Behavioral scientists work with instructors to figure out which positive and negative team and individual behaviors to target. They also work closely to develop the surveys, lessons, and activities.
  • Learning experience designers help the behavioral science team align activities with learning objectives of the lessons.
  • User experience designers ensure Tandem’s user interface is easy to use so students can see which surveys and lessons are due, as well as make sense of their team’s ongoing data.
  • Software developers make the tool operate smoothly so that students get the right message at the right time based on the correct data.
  • User research specialists interview instructors and students so that Tandem focuses on topics and feature sets that bring value to users.
  • Data scientists meet with the Tandem team each week to make sure the algorithms that assign lessons and triage teams are adjusted properly.

 

“The Manage Members page was one of the most challenging pages within the site to style and code. It allows faculty to build teams by dragging and dropping members onto or between teams. The drag and drop functionality was new for me and was fun to get familiar with, and I worked very closely with one of our developers, Ke Yu.”  

~ Kristen Miller, UX Designer

 

What’s next for Tandem?

This term, we’re piloting Tandem in Engineering 100 with a group of 60 engineering students. In the next year, we plan to expand into courses that will stretch Tandem’s capabilities even further. We’ll think more about group formation, changing groups multiple times in a term, and adding a wider range of lessons and reflection activities. In the future, we aim to move the tool beyond hand-programmed algorithms and use machine learning to form teams, predict possible issues, and suggest ways to address them.