Jeremy Nelson, Director of XR Initiative

Tom Finholt

In the final episode of season one of the MiXR Studios podcast, we talk with Tom Finholt, dean and professor at the University of Michigan School of Information. Dean Finholt was instrumental in the early work of collaborating with faculty and other deans at U-M to explore what the university should be doing with XR for research and pedagogy. He was strongly encouraged by alum and external advisory board member Jamie Voris to invest broadly in XR technologies across the university to enhance teaching and learning. As we wrap up our first season, I couldn't think of a better person to share insights about where we have been and where we should go with the XR Initiative.

As dean of the School of Information, Tom worked closely with Mark Newman to create the XR Graduate Certificate program and enlisted the support of other deans. XR has been an important strategic focus area for the School of Information, and Dean Finholt sees these technologies as a natural extension of the User Experience Design teaching that is a core part of the curriculum. He shares his perspective on how broadly these technologies can be applied to almost any domain, and why Michigan, with its breadth and depth of programs and faculty, is a perfect place to experiment and lead. Additionally, there is a great opportunity for junior faculty to make a big impact on the XR design space by creating toolkits, best practices, and foundational research.

Microsoft Garage Reality Room

We discussed many examples of XR technologies with the potential to shape the future of education, ranging from space exploration to medicine to social work to remote expert assistance. Dean Finholt was part of the faculty team that visited Microsoft in 2017 to learn more about their work in mixed reality, and he talks about some of the amazing technologies they experienced at the Microsoft Garage Reality Room. The School of Information created an augmented reality application, MGoView, for the university's 2017 bicentennial celebration that allows people to see autonomous vehicles at Mcity, view lions at the Natural History Museum, and learn more about the history of the university.

Evan Hoye, MGoView project manager from the School of Information, demonstrates the MGoView app. Image credit: Jeffrey Smith, U-M School of Information

Dean Finholt shares his perspective on the future of learning and how XR could play an integral role in scaling learning to more students. Concepts like nanostructures and 3D objects are challenging to teach in two-dimensional space, and augmented and virtual reality can make these invisible constructs visible. Additionally, teaching about complex machines and architectural structures can be done in virtual environments built with game engines like Unity and Unreal Engine, broadening access to things that would otherwise be very challenging. Humanities education could be transformed by using immersive technology to visit historical places and time periods via these engines, and could even incorporate more gameful pedagogical techniques.

We talk a lot about the goals of the XR Initiative and how our program can plant many seeds and evaluate which ones have broad applicability and really impact learning. It is important to Dean Finholt that we reduce barriers to entry for students and faculty creating XR content. This starts with building frameworks and platforms that allow the creation of content with low or no coding requirements. His goals align well with the current strategy of the XR Initiative to fund another round of XR projects for faculty that will use 360 video and commercial authoring platforms to create XR content supporting courses across many different schools and colleges. Dean Finholt sees a natural opportunity to engage more with the Department of Film, Television, and Media (FTVM) to explore virtual production and to apply the knowledge gained over the last 100 years of motion picture production to expedite XR production.

Virtual Production of Disney’s The Mandalorian

We conclude our conversation by discussing some of the exciting work being discussed with live performance units across the University such as the University Musical Society. Dean Finholt is facilitating a group of faculty, performers, staff, and corporate partners to explore how to recreate live musical performance at Hill Auditorium in a remote and immersive way. With the current restrictions on large gatherings because of COVID, there is a collective gap in society around shared immersive experiences and XR technologies can begin to bridge that gap. In the future there may be more ways for people to experience events remotely combined with people attending in person.

I had a great conversation with Dean Finholt, and I am proud of the work we did in season one. I would like to give special thanks for all of the work our exceptional team at the Center for Academic Innovation did to help produce and distribute all 27 podcast episodes. Thank you, Sean Corp, Matt Lima, and U-M student fellow Nick Froelich.

Please share with us what you would like to learn more about in the XR space at jernel@umich.edu.

Subscribe on Apple Podcasts | Spotify

Transcript: MiXR Studios, Episode 27

Jeremy Nelson (00:10):
Hello, I’m Jeremy Nelson, and today we are talking with Tom Finholt, who is a professor and dean of the School of Information at the University of Michigan. We are talking about his work convening a faculty working group to explore XR technologies for research, teaching, and learning, and how he sees students shaping the future of XR here at Michigan, coming up next on our MiXR podcast.

Jeremy Nelson (00:49):
Good afternoon, Tom. Thank you for joining us.

Tom Finholt (00:51):
Great to be here. Thanks for having me.

Jeremy Nelson (00:54):
Yeah, I’m excited to have our conversation today and explore your perspective on XR across Michigan. As I understand it, you were one of the key people who helped bring together all the exciting work happening across the university, which ultimately led to the XR Initiative, and I’m thankful for that. I’d love for you to share with our listeners your role in this. How did it start from your perspective?

Tom Finholt (01:27):
Sure. So it’s pretty recent history, all things considered. I was appointed interim dean of the School of Information in 2015. At that time we had, and still have, on our external advisory board Jamie Voris, who’s the CTO of Disney Studios. He was constantly bugging us about augmented reality, which is what we called XR back in the day, saying this was an important technology. Disney thought it was important; their technology partners, such as Microsoft, thought it was important; and there was a great opportunity for the University of Michigan, in terms of both research and pedagogy, to jump into this area. So that led us to explore who the likely participants might be on campus, and we orchestrated a number of lunchtime conversations with faculty from probably 14 or 15 of the 19 units, plus a couple of the centers and institutes.

Tom Finholt (02:40):
Out of that emerged a consensus within a group of deans to convene a faculty committee, and Mark Newman chaired that committee. It had three elements to its charge: one, examine the XR research space and determine what the opportunities are for the University of Michigan; two, look at the nature of instruction about XR and what gaps Michigan would need to fill; and three, articulate some rules of the road governing access to donated equipment. At that time, Microsoft was anticipating a fairly substantial in-kind gift of HoloLens sets to the university, which have since been deployed; Michael Nebeling uses those in his lab, for example. The result of that committee’s work was the creation of a research white paper, which I think in part guided the formation of your initiative and the activities that were promoted and led out of the provost’s office, such as by Sara Blair and others.

Tom Finholt (04:03):
And also the architecture of the graduate study certificate that was proposed to the Rackham graduate school and ultimately approved to be offered just this past winter term. I think there were six or seven students already taking it, and it’s built on a foundation of courses, including two that Michael Nebeling on the School of Information faculty developed from scratch. So that really got our toe in the water and articulated some principles about things Michigan wanted to do. There were things in the mix early on that fell by the wayside; I’d say the big one was engagement with athletics. At one time we talked a lot about using XR to augment the fan experience in the Big House, something that will be a little bit moot this year at least, and other kinds of production-scale entertainment venue applications. It was an interesting experience. We made a number of field trips; we went out to Microsoft Research and got the whole HoloLens dog and pony show. We walked on the Martian surface and so forth. It was a fun time, and I met a lot of really interesting people across campus. Very exciting.

Jeremy Nelson (05:36):
Yeah. That was one of the things that really drew me to this position: the broad scope and interest from so many different schools and faculty already deep in this space, exploring and learning. To me there was a lot to work with here. And as I’ve looked at other institutions and work around the country and around the globe, we have a pretty unique opportunity here at Michigan with our cross-disciplinary interests and the graduate certificate. I haven’t heard of anyone else with one, so I think it might be the only one as far as I know at this point.

Tom Finholt (06:14):
Yeah, there are sprinklings of coursework. There’s some at the University of Washington, for example, some at USC, and some at Carnegie Mellon. But I think we might be the first to have a formal graduate program. The way those certificates work is that you’re getting your degree in some primary area, and then you pick up the certificate as a kind of 12-to-15-credit sidecar, and you can do some double counting to make it easier to get there. We thought the certificate was an important starting place in terms of knitting together the broader community; we’d seen this process work very successfully for high-performance computing at the University of Michigan, and also for data science. So we were excited to do it, and I think it’s going to be a big success.

Jeremy Nelson (07:11):
Yeah, I’m very excited about that and am directing a lot of folks toward it. Obviously, in the current situation, folks are asking, can we do this remotely? I think we’ll explore that as we move along, but it wasn’t quite built for that yet. So as you got into this, or got that advice, why did you think XR was important? What did it mean to you?

Tom Finholt (07:35):
Yeah, to us it represented a couple of important paths forward. One, we saw it as a critical frontier in user experience. The primary modes of user experience have been two-dimensional interaction, initially at the desktop and increasingly on the mobile platform, our smartphones and so forth. XR to us represented the natural extension of the user interface into the environment, paralleling what we’ve seen in the underlying infrastructure of the Internet of Things. And one of the things we saw missing in that space is that the whole array of toolkits, components, and building parts that have accumulated over 50 years of graphical user interface development was largely nonexistent in the XR space.

Tom Finholt (08:45):
So it was an opportunity to plow a greenfield, especially for junior faculty; it seemed like a low threshold for them to really make their mark. In terms of application areas, one of the things that was very exciting to us is that XR has a broad array of applications, and that was a great match for the University of Michigan, which is comprehensively excellent across a host of domains: medicine, engineering, natural sciences, social sciences, humanities, arts, and so forth. In each of those domains, XR has a principal presence. You can see that in the composition of the governing committee of the graduate certificate and of the signatories to that original committee that Mark Newman chaired. We had faculty from art and design, from dentistry, from medicine, from computer science, from civil engineering, from music and dance, and from the College of LSA (literature, history), and of course a lot of interest from the technology side in electrical engineering, the School of Information, and the School of Public Health.

Tom Finholt (10:06):
So across the waterfront, a lot of interesting opportunities. It was also identified as a significant growth sector for technology development, and I think that promise hasn’t quite been realized. We’ve seen some interesting breakout successes; Pokemon Go, I think, is one most people are familiar with. And things like the Ikea catalog are now commonplace, such that people don’t even realize they’re dealing with an XR app; that’s just the way you shop at a place like Ikea. I think the pedagogical implications are vast but not fully realized yet. Partly this is because the barrier to entry for production is so high. Faculty want to develop PowerPoint slides; we have a 40-year legacy of presentation software to draw on.

Tom Finholt (11:09):
You want to develop your XR app? It’s: go learn Unity, or go learn the Unreal Engine, or whatever. Some faculty are going to do that, but not very many. So we lack that middle layer of enabling technology that will really allow this to break out. When Apple, in their iOS release four or five cycles ago, started including ARKit, we saw a flowering of apps as a result. The Washington Post and the New York Times have units that produce XR content now for their media platforms; you can have your triceratops in your living room with you and stuff like that. And there are a lot of exciting cutting-edge technology companies operating in this space at the hardware level, trying to optimize battery life against CPU performance. Form factor has been a big issue in Microsoft’s design thinking: making the headsets comfortable to wear and so forth. So I think there are a lot of reasons to be excited. Clearly the entertainment companies are keeping an eye on it and investing, because they think it may be the direction of mass entertainment, and certainly the experience of the pandemic suggests they’ll accelerate that program.

Jeremy Nelson (12:45):
Yeah. I tested out the NBA through the Oculus last week, and it was interesting. Obviously it’s a first-gen experience, a blend of avatars and something computer-game-ish with a real viewing experience. But I could see the potential; it’s definitely well on its way. And the investment they’ve made, from Oculus and from Facebook, has really pushed the whole industry forward. They’re really driving down the costs, and the Quest is in many ways a revolutionary device. I would carry one around in my backpack as I met with faculty across campus last fall. “Have you ever tried it?” “No.” Boom, we’re in VR in less than a minute. It’s very powerful.

Jeremy Nelson (13:33):
Especially with some of those other schools we talked about. I tried to meet with at least one person from every school, including some you maybe wouldn’t normally associate with XR. I met with folks from Public Policy and showed them a couple of experiences, and they began to ideate immediately: “You mean we could create a whole simulation where students can try out working a bill through Congress, or negotiating?” Yes. That’s the power of it. So I see many opportunities as we continue to help enable more content generation. The point you made about creating something right now being a high bar is key. We’re very interested in whether we can enable or procure tools that let more people create faster, without a lot of deep C++ or 3D modeling coding.

Tom Finholt (14:27):
Yep. That’s the dream.

Jeremy Nelson (14:30):
Yup, we’re getting closer. Well, what are some examples you’ve experienced that you really enjoyed? I think you started to mention going to Mars, or what you saw at Microsoft.

Tom Finholt (14:41):
Yeah. So in the speaker series that you sponsored, I was very impressed with the applications Dr. Courtney Cogburn has been developing to allow non-Black audiences to experience anti-Black bias and racism. I think that’s an outstanding and very imaginative application of the technology. Even though it’s not perfectly realized from a production point of view, I found even the primitive experience very evocative and moving. So that would be one impressive example. From the application side, I’ve been impressed with a demo application Case Western’s med school did on the HoloLens, which allowed you to have the equivalent of Gray’s Anatomy at your disposal while you’re looking at, say, the body. You could reference anatomical structures and blow them up in an exploded view while looking at the actual patient, or the bone or vessel or tissue you’re concerned with.

Tom Finholt (16:20):
I liked the Mars simulation at Microsoft. They have a four-car-garage-sized room in the basement of that building, Building 90 or one of the others, right across the lawn from the research building. You put your HoloLens on, and then you can physically explore a space that’s basically about the size of the field of view of one of the Mars landers. You can turn your head and see the horizon, the 360 view, and all that kind of stuff. That was really cool. The other application area I’ve been very impressed with is field maintenance and repair. The classic or canonical example is, if you’re working in telecommunications out in the field, you open up a Western Electric box from circa 1948.

Tom Finholt (17:21):
That thing was built 70 years ago, or 30 years before you were born, or something like that. With your XR set, it can superimpose known references onto this alien technology: this wire right here is actually the same as what you’re familiar with in the RJ standard as the ground, that kind of thing. And I think Microsoft has a demo at the lab of a motorcycle repair, where you can have the expert’s view superimposed onto the bike: did you look here, did you adjust this? Or, in the medical care setting, it’s the experience where the resident is doing the workup, and then she steps into the hall with the attending.

Tom Finholt (18:17):
And the attending says, well, for a hand injury, did you check the snuffbox? That little space between your index finger and the thumb. And the resident hurries back in to do the inspection of that site. That’s the kind of thing you could imagine occurring through some sort of collaborative interaction mediated through XR. In terms of the blue-ribbon demos, I think the Magic Leap one, the breaching whale in the gymnasium, is pretty awesome. In terms of wide-scale deployment, I think Pokemon Go is a very interesting achievement. And I have a lot of respect for the stuff the New York Times and Washington Post have done with their XR tech, including their use of those Google 360 cameras, which they sent out everywhere with the Cardboard. And we did our own little XR implementation for the bicentennial. We had a little game you could play on campus, and at various locations you could bring up some predesigned XR elements. If you went out to the autonomous vehicle test track, an autonomous vehicle would materialize and drive around in front of you, and the lions would come to life at the old Exhibit Museum, that kind of thing.

Jeremy Nelson (19:51):
That’s great. The examples you gave were great, and we’re actually exploring some of those right now, particularly in medicine. With COVID, they’re very interested in how to continue letting med students get their rounds and their time in. So we’ve got an upcoming pilot that I think could be pretty transformative for med student education.

Tom Finholt (20:16):
I think the coolest one I saw in the medical space was an ER app that addressed a big problem in the ER: as patients come and go, you don’t know where they’ve gone off to. This would create a kind of XR trace of the patient’s path. Say five minutes ago you left your patient on a gurney at this location, went to check on something, and came back and the patient was gone. Where the heck did my patient go? You could look down at the floor and see an indication of the direction away from that space, and then an indicator of the destination. And that trace might be set up to decay to show the passage of time: a faint representation means that’s where they were an hour ago, and a very bright, flashy one means it just happened. Which I think speaks to the variety of UX opportunities here. As you’re overlaying the UX onto the world, that creates all kinds of interesting chances to innovate and develop novel ways of interacting with technology. That part, I think, is super exciting.

Jeremy Nelson (21:32):
Yeah. Well, that leads into my next question, and maybe you’ve answered it, but how do you see XR as an important technology for students, and what can they bring to the space?

Tom Finholt (21:42):
Yeah, I think there are a couple of ways of answering that. The instrumental way is to say that students who wish to master this technology have to have the opportunity to work with it and learn about it. People who want to become fluent in a 3D or XR engine like Unity or Unreal Engine, or what have you, can develop student projects, perhaps work with external clients to develop applications, and become fluent in their development and deployment. That’s one obvious area, and I think a lot of the graduate certificate as launched addresses that need. There’s a second critical need, one often articulated by Joanna Millunchick in the College of Engineering, which is that these technologies can dramatically expand the experience of students.

Tom Finholt (22:44):
She’s a materials scientist, and of course they’re dealing with structures that are nano at best, so no one will ever see them with the naked eye, and the intuitions you have to develop about these 3D structures, especially crystals, are very difficult to gain from 2D representations. Now she can, as the instructor, pull up a representative structure and beam it to the 20 students in the class, who can all independently manipulate it to start to understand features of its structure and its characteristics as a material. That would have been very difficult to accomplish previously. So, in the realm of materials, we can now render things that are not visible to the naked eye.

Tom Finholt (23:42):
In the context of the humanities and sciences, we can create an experience that’s not possible to replicate in the modern time, like going back to a thriving city in Asia Minor, or a trading port that was part of the Atlantic triangle, to understand the heinous nature of the slave trade and the commerce supporting it between the New World and the Old World, that kind of thing. Those could be embedded in game engines or some kind of gameful metaphor, which I think could make it very exciting. And then there’s the case where XR creates the opportunity to manipulate and work with things that are too costly to provide on a mass scale. For example, we’re not going to be able to put each student in front of a precision-controlled machining setup.

Tom Finholt (24:45):
But in XR, you can imagine each student having their own machine setup. This is the dream of the Deweyites: that you would learn by doing through your immersion in these XR technologies. I don’t think the state of the art is sufficient to support that widely right now, but that’s definitely why there’s a lot of excitement among educators about what these things could deliver. And that’s significant for the mission of the public university: we want to broaden access to knowledge and use our university as a platform for social mobility, which happens through exposure to and manipulation of things that would otherwise be impossible. So I think that’s really the exciting aspect of it.

Tom Finholt (25:47):
And I think there are a lot of interesting things at the margins, like collaboration support, with people using XR technology to reference a shared virtual space, much as we would do in a gaming situation. We would be in the rendered world, and that rendered world could be the classroom, or the lounge, or the operating theater, or the machine shop, or the interior of a jet engine, or the inside of a DNA molecule. These are all places we can go. It’s like that science fiction thing, Fantastic Voyage, where we all get reduced down to sub-nanoscale and, you know, the hemoglobin...

Jeremy Nelson (26:34):
The Magic School Bus. Yeah, exactly, that kind of stuff. Well, I’ve tried a couple of those. Working from home, fortunately I’ve had access to devices, and I’ve met somebody from D.C., somebody from Greece, and somebody from London in a virtual operating room, and we did a knee replacement surgery all collaboratively. It was very powerful, and it opens up a lot of thinking and opportunities, especially as we’re trying to teach students more remotely, and whatever survives past the pandemic. I think there will be opportunities to continue that distance learning through the technology. Well, we’ve talked about all the exciting things, the great things. What concerns do you have about the future of this technology, or what areas should we keep an eye out for or help shape?

Tom Finholt (27:26):
Yeah. Well, an obvious concern is that we don’t allow the bias and systemic dysfunction present in real life to infect virtual life, or XR life, and that these technologies be deployed for good and not for malevolent purposes. I’m okay with commercial application; I wouldn’t go so far as to say gaming companies shouldn’t be able to benefit from them. But I wouldn’t want to reproduce some of the mistakes that have been made with video games in terms of what some have called endemic misogyny, and so forth. And there’s always a danger in a new, technology-centric field that it will, by that very virtue, create barriers to entry for others.

Tom Finholt (28:32):
So one of my interests in the Initiative at Michigan is to make sure that we reduce those barriers to entry, so that there’s more opportunity for expression and engagement that doesn’t require the 20 years of nerding out you currently see if you go look at a development shop for a game publisher, that kind of thing. So that’s one concern. Obviously, XR also creates an avenue for new kinds of harassment and abuse that could be experienced much more viscerally than would be the case through, say, text-based technologies, and even video-based technologies. So all of the issues we have with technology come to the fore here. Privacy, I think, is a significant concern, particularly if people are wearing these devices as part of their apparel or everyday glasses. We saw some of this with the Google Glass project: who is looking at whom, and what are they doing with that information?

Tom Finholt (29:51):
How is it being aggregated and processed? So I think those things are all incumbent on XR developers, as much as they are on all technology developers.

Jeremy Nelson (30:04):
Right, right. We're keenly interested in laying the foundation for some of these areas of security and privacy in the projects we're working on, to make sure we address them now, even though it's easier to skip over them. We've seen in other areas of software development over the last 10 or 20 years how that comes back to cause a lot of pain down the road, with data breaches, access to information, and hacking. So we're trying to keep that at the forefront of all the work we're doing. So, what do you want to see Michigan do? What would you like us to do with the Initiative? Where do you want to see us take this from here?

Tom Finholt (30:46):
Yeah. In the research space, I think the function of the Initiative is to plant a lot of seeds, and then, as things sprout and mature, determine which of those pathways look particularly promising and identify those for more support and more funding, whether that's from the University, corporate partners, or federal agencies and so forth. I think it's a great model, and one that the University should use more often: small grants competitions with a low threshold for entry — competitive, but not NSF-competitive, where only one out of 15 or 20 proposals gets funded. So that would be one direction.

Tom Finholt (31:49):
I'd like to see the development of an XR production unit that faculty could turn to when they have ambitious projects, to develop entire course-length programs on platforms that students could access across a host of devices, from smartphones to purpose-built headsets. I'd also like to see film and video and some of the other behind-the-camera operations become invested in this area and start addressing some of the production challenges. In the VR sense, where do you hide all the gear? You've got a 360-degree view; there's no plane you can hide all the cameras and people behind. And I think understanding point-of-view photography and cinematography is very different from what we're accustomed to seeing from DPs on motion pictures.

Tom Finholt (33:10):
So I think there's a lot to be done in that area, and I think we could do more to engage Film and Video in LSA. They've been a fairly quiet player for the most part, and I think it would be interesting because they've got the whole book on technique and approach. All of these things were invented at the turn of the 20th century to support motion picture production, and a lot of those mistakes could be avoided for XR with a little bit of understanding. And then I think there's a great role for humanists to play in thinking about how these technologies should be used: the kinds of things that Dr.

Tom Finholt (34:05):
Cogburn was trying to accomplish with her demonstration application, to see whether you can really create the sense of being in someone else's shoes, and whether that ultimately changes your behavior. Does it make you more likely to be an anti-racist ally? And are there streamlined approaches we could use to convey this information that would, if you will, shock people out of their comfort zones and spur them to change their behavior, and to hold others accountable for their behavior? I think that's really exciting. I know our colleagues in LSA have been very interested in that; Sara Blair in particular has pushed that line pretty hard, and it's independent of the underlying technology, which I think is important. It's important to have voices at the table that aren't going to geek out over the frame rate of the headset or the battery life of the device.

Jeremy Nelson (35:12):
Yeah, she was working on the Uncle Tom's Cabin project we did with the folks at the Duderstadt Center for her course in the fall of 2019. She was really interested in exploring perspective from multiple points of view, so they were able to recreate a scene from the book, and you could experience it from three different points of view. It was just her students who went through it, and they had a lot of interesting reactions to being the main character, being the pursuer, and taking a bird's-eye view. From a technological standpoint, it's, I'll say, easy to move the virtual camera around, but for the person experiencing it, it opens up a lot of different affordances.

Tom Finholt (35:53):
Yeah. We've recently initiated a conversation with UMS about staging a virtual performance of some kind, and we've discussed the involvement of various corporate partners who might be willing to donate in-kind resources, particularly programming talent and code libraries, to, for example, create a replica of Hill Auditorium: the experience of entering the auditorium, finding your seat, the preamble to the performance, where you could actually hear the rustling and the talking of the crowd, and then the hush as the house lights go down and the performance unfolds before you. I think UMS is understandably concerned that they're not going to have the ability to stage large-audience live performances anytime short of 12 months, to be realistic. So they're very interested in ways of maintaining connection with audiences and artists that can be done in the virtual space. And of course, this is all novel territory for them. It's not really a space they're used to thinking about. I don't think a lot of them play Fortnite, for example.

Jeremy Nelson (37:24):
No, no. Yeah, I would imagine there's some interesting work happening with popular artists doing performances on Altspace and other platforms. There's a new title called The Under Presents, where you can actually purchase a ticket to go see a live performer in a virtual environment and navigate through it. It's all graphics, not 360 camera, but the avatars you're interacting with are real people on the other side. So there are some really interesting ways to experience things that are difficult to conduct in a pandemic: live theater, musical performances, large crowds. Yeah.

Tom Finholt (38:12):
Yeah. And I think these are the kinds of experiments that we need to be conducting, and we are the people to do it, because society is going to need these outlets. Our artists have to make a living, and we benefit from the experience of their performances. If we limit it to just audio recordings or televised presentations, that's not really the same thing. I also think it will be a driver for technology development, particularly around three-dimensional audio, so that the concert hall is experienced as much more vivid than even what you would get through stereo headphones.

Jeremy Nelson (39:01):
Yeah, on spatial audio, we're working with Anıl Çamcı in SMTD, and he's doing a lot of work in that space to make it easier for non-sound-engineers to create spatial audio in these applications. So, very exciting work. Well, I have really enjoyed our conversation today, learning quite a bit more about how we got here, and I appreciate all the work you've done and continue to do. I look forward to where we take this.

Tom Finholt (39:31):
Great. Well, it's been a pleasure chatting with you. The XR Initiative is very exciting, and I'm glad you're serving in this role. I think these podcasts, and the speaker series when it can resume, are a great thing for convening the community and presenting a set of provocative and thoughtful researchers, both local and from afar.

Jeremy Nelson (39:58):
Yeah. Well, thank you.

Tom Finholt (39:59):
Great stuff. Take care.

Jeremy Nelson (40:02):
Bye.

Jeremy Nelson (40:13):
Thank you for joining us today. Our vision for the XR Initiative is to enable education at scale that is hyper-contextualized using XR tools and experiences. Please subscribe to our podcast and check out more about our work at ai.umich.edu/xr.