Jeremy Nelson, Director of XR Initiative
In this week’s MiXR Studios podcast, we explore XR development in the College of Engineering. We talk with Talal Alothman, an XR developer at the University of Michigan, to learn more about the projects he is working on and his interest in creating VR interactions and hand interfaces. Talal has been instrumental in the early shaping of the XR Initiative and provided great insight into our initial XR Innovation fund projects and the hiring of our XR team.
Talal talks about his work with faculty in the College of Engineering to help them with early prototyping and shaping the scope of their XR projects. One of the more unique projects he is working on is with Robert Hovden, an assistant professor of materials science and engineering, using a Looking Glass display to view crystal structures in a manner that doesn’t require a Head Mounted Display (HMD). Talal created unique interactions with a Leap Motion controller that allow students and faculty to manipulate the 3D model with their hands.
We also discuss the work that Talal has been doing with Arthur F. Thurnau Professor of Materials Science and Engineering and Associate Dean for Undergraduate Education in the College of Engineering Joanna Millunchick. The College of Engineering funded a set of XR projects targeted at Engineering faculty, and Talal is instrumental in helping shape those projects.
I was excited to learn more about Talal’s passion and get an in-depth take on how best we can create XR experiences that will help students transition to XR. Please share with us what you would like to learn more about in the XR space at email@example.com.
Subscribe on Apple Podcast | Spotify
Transcript: MiXR Studios, Episode 2
Jeremy Nelson (00:10):
Hello, my name is Jeremy Nelson, and today we’re talking with Talal Alothman, an XR developer at the University of Michigan College of Engineering. We’re going to be talking about XR interactions, hand gestures, and all the cool projects he’s been working on, coming up next in our MiXR podcast. Good afternoon, Talal. How are you?
Talal Alothman (00:36):
Doing well, thanks for asking.
Jeremy Nelson (00:40):
Yeah, yeah. Thanks for joining us today. I appreciate you taking the time now.
Talal Alothman (00:44):
I’m happy to be here. Excited to talk VR and XR.
Jeremy Nelson (00:47):
Yeah, yeah, for sure. All things AR, VR, XR. Well, why don’t you give folks a little background about yourself, what you do at Michigan, the areas you focus on.
Talal Alothman (00:58):
Sure, yeah. I work as an XR developer for the College of Engineering, specifically for CAEN, which is the College of Engineering’s IT support group. My role is really to consult with faculty on different XR approaches, or different XR tools that they can integrate into their classrooms. Additionally, I work with them on designing and developing XR tools for their own classes. So part of it is, I’m going to consult you on what tools to use, but it also involves a lot of work developing some of these tools for the faculty.
Jeremy Nelson (01:45):
And are you focused particularly within the College of Engineering, or do you support faculty across the university?
Talal Alothman (01:53):
The main focus for me has been supporting faculty who have been funded by the College of Engineering’s XR grant program. But I also work with a lot of folks like you outside of the College of Engineering to consult with them on different issues as well. So yeah, it’s mainly the College of Engineering, but also the university as a whole.
Jeremy Nelson (02:23):
Yeah, I know. I think we’ve really been trying to bring together a community of folks in this space who are doing work in XR or interested in XR, and to expand that out. Well, tell us a little bit about some of these projects you’ve worked on so far and what you’ve liked about them. What are good examples that our listeners could imagine or look into?
Talal Alothman (02:47):
Sure. Yeah. One project that I worked on for a couple of months was a funded project for Dr. Hovden out of materials science and engineering. They had a holographic display, which is not your typical XR; it’s a thin line, a gray area, whether it falls into XR or not. But the idea is they had these Looking Glass displays that show off 3D content in a 3D environment that can be viewed by multiple people. It’s not a headset that someone puts on; it’s actually a monitor that a group of people can sit in front of and see content in a 3D manner, and they’re really cool monitors.
Talal Alothman (03:43):
The company that makes them, Looking Glass, has really pushed them in the last couple of years, and they’ve gotten some success. But the idea is they wanted to visualize different crystal structures with these monitors, and they’d already done a lot of the work to do that, but they didn’t have a way to interact with it. So my role was to come in and design and build out an interface for them. And the interface we chose to go with was a Leap Motion interface. For those who don’t know, Leap Motion is a hand sensor. It picks up on your hand and rigs the hand to a set of bones that you can then use to visualize your hand in that space and do a lot of really cool interactions. We worked on that for a while. Right now the project is wrapping up, and it was one of the ones I really enjoyed working on because it involved a lot of design work, a lot of interface work, and a lot of development work, a good mixture of that, a healthy mixture.
Jeremy Nelson (04:47):
Nice. And when you say a monitor, this is like a glass cube, almost, or a structure that sits on a desk, right? So how did you do that? Were you working with Unity or some SDK for Looking Glass? How did you bring these models in, and what was Dr. Hovden’s hope for how people would use this device?
Talal Alothman (05:11):
Yeah, so the SDK that’s available for the monitor does have Unity support. So it really is a process of importing the Unity classes and components that you need to get up and running with the monitor. And after that, it really is your typical development cycle: if you’re trying to build an interface, you sit down and figure out what kind of interface you want, and work on implementing it. Integrating it with the other component, Leap Motion, was also an interesting challenge. Leap Motion also comes with a lot of Unity libraries to get you up and running, but a lot of the core basic interactions aren’t really there. There are general guidelines, but they’re really focused on egocentric VR interaction and not any other type of interaction, and we were doing something different with it. So there were some challenges there that we approached really openly, and I think we came up with a pretty cool interface. Students and some of the people we tested the interface on really enjoyed it and found it to be a fun, interactive tool.
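The Leap Motion and Looking Glass integrations described above live in Unity (C#), but the core gesture logic is language-agnostic. As a minimal sketch, here is one hypothetical interaction of the kind discussed, pinch-to-rotate, written in Python: fingertip distance gates whether horizontal palm motion spins the model. The threshold and gain values are illustrative assumptions, not values from the actual project.

```python
import math

def pinch_strength(thumb_tip, index_tip, max_dist=0.05):
    """Return a 0..1 pinch strength from thumb/index fingertip positions (meters)."""
    dx, dy, dz = (thumb_tip[i] - index_tip[i] for i in range(3))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    return max(0.0, 1.0 - dist / max_dist)

def rotate_model(model_yaw_deg, prev_palm_x, palm_x, pinch, gain=300.0):
    """While pinching firmly, map horizontal palm motion to model yaw (degrees)."""
    if pinch < 0.8:  # ignore motion unless the pinch gesture is engaged
        return model_yaw_deg
    return (model_yaw_deg + (palm_x - prev_palm_x) * gain) % 360.0
```

In a real Unity setup, the fingertip and palm positions would come from the hand-tracking rig each frame, and the resulting yaw would drive the crystal model’s transform.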
Jeremy Nelson (06:36):
Nice. So was this part of a course in materials science to learn about structures, or was this more research on how this type of device and its interactions might be used to explore them?
Talal Alothman (06:47):
The end goal is to integrate this into a classroom. At this point it really is mainly a research project. But the end goal, once we can go back to the classroom at some point, is to bring this into the classroom and see what role it has in bettering students’ understanding of these different crystal structures, which are scanned in through the tomographic process that Dr. Hovden really specializes in, from my understanding.
Jeremy Nelson (07:25):
So how is this different than how students learn today or before this?
Talal Alothman (07:29):
Yeah, that’s a great question. I think the typical approach right now is to use 3D manipulation software with your typical mouse-and-keyboard interface, right? And those tools are very powerful, I think. But it’s still a 3D body or a 3D mesh projected onto a 2D surface, the monitor, and there are challenges with that projection. Are we seeing all the components of the structure? Are we seeing the angles of the structure that we would see in 3D, or are we missing out on some of that content once it gets projected onto a 2D monitor? That’s the big question, really.
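The depth-loss problem described here can be made concrete with a toy example: under an orthographic projection onto the screen plane, two distinct points at different depths collapse to the same 2D position. This is a simplified sketch; real renderers use perspective projection, which has the analogous ambiguity along each view ray.

```python
def project_ortho(p):
    """Orthographic projection onto the screen plane: drop the depth (z) coordinate."""
    x, y, z = p
    return (x, y)

# Two distinct atoms at different depths land on the same screen position,
# so the viewer cannot tell them apart from this single projection.
a = (1.0, 2.0, 0.5)
b = (1.0, 2.0, 3.0)
```

A volumetric display like the Looking Glass sidesteps this by presenting many views at once, so depth relationships stay visible without a headset.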
Jeremy Nelson (08:27):
Yeah, that’s exciting. I like that. It’s not your traditional XR, as you say, but it gives another affordance: the opportunity to be multi-user. More than one person can be looking at this Looking Glass display and interacting with it, a shared experience versus everybody being in a VR or AR headset. Nice. Well, that’s a great example, something folks could check out when we’re back on campus. So what other examples of XR have you seen, either at Michigan or more broadly in education, that you really like or think are transformative?
Talal Alothman (09:06):
Yeah, I think the projects recently funded by the Associate Dean for Undergraduate Education’s office. They’ve funded XR projects for the last two years, and the recent round brought in about five new projects, with some really interesting, very different approaches across them. They’re all engineering-focused, but they span mechanical engineering and computer science; one of them is in computer science but really focused on understanding whether spatial reasoning gets better or not, and that knowledge spans the fields of engineering, and other fields as well. One of the most interesting ones, which has already done a good portion of the work, is Dr. Altaweel’s disaster visualization project. Dr. Altaweel does a lot of simulations of what happens to buildings in specific disasters like hurricanes or earthquakes and so forth.
Talal Alothman (10:12):
And he has all this data that they can currently simulate in a 2D manner, but it’s really hard to get across the idea of the destruction and the problems that could occur. The idea is to take those same simulations and really put folks into VR with them, giving them a first-person perspective of the possible issues that could arise when a disaster like this happens. So that’s a really interesting, challenging project, because right now he has tons of really complex data being visualized all at the same time. How do you do that while also rendering to a VR headset, which itself requires a lot of compute? It’s an interesting challenge that we’re working on.
Jeremy Nelson (11:08):
What are these disasters? Is this mostly climate science simulation? Floods?
Talal Alothman (11:13):
Yeah, they’re mainly buildings, so it’s how earthquakes affect buildings. All this content is generated in very specialized, mathematically based tools, and the data that gets output from that, which is lots and lots of data, describes what happens to the structures. The goal is to try to bake it into a Unity simulation, a VR environment, so that it can be replayed in a more approachable manner without having to run the initial disaster simulation; it would just be the baked data. So it’s an interesting challenge, and he has a really cool guy working with him, Andrew Hillinka, a Unity developer, who I think you guys would be interested in talking to as well.
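One common way to “bake” a simulation for replay, as described here, is to store sparse keyframes from the solver and interpolate between them at playback time, so the expensive disaster simulation never has to run again. A small illustrative sketch in Python follows; the `(time, value)` keyframe format is an assumption for illustration, not the project’s actual data layout.

```python
from bisect import bisect_right

def sample(keyframes, t):
    """Linearly interpolate a baked (time, value) track at playback time t.

    keyframes: list of (time, value) pairs sorted by time, e.g. a node's
    displacement recorded at sparse solver steps.
    """
    times = [k[0] for k in keyframes]
    if t <= times[0]:          # clamp before the first keyframe
        return keyframes[0][1]
    if t >= times[-1]:         # clamp after the last keyframe
        return keyframes[-1][1]
    i = bisect_right(times, t)             # first keyframe strictly after t
    t0, v0 = keyframes[i - 1]
    t1, v1 = keyframes[i]
    u = (t - t0) / (t1 - t0)               # normalized position in the segment
    return v0 + u * (v1 - v0)
```

At playback, a VR engine would call something like this per frame per tracked node, which is far cheaper than re-running the structural solver in real time.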
Jeremy Nelson (12:05):
Yeah, for sure. We’ll reach out to both of them. That’s awesome. Well, you’re well aware that part of our goal for the XR Initiative is to bring XR tools and technologies university-wide, institution-wide, both online and residentially. Do you have any concerns about the future of XR in teaching and learning? What are some of the challenges or concerns you see in being able to do this successfully?
Talal Alothman (12:30):
Really, I would say hardware accessibility is a pretty big issue. Whether or not you have the funds to actually get hardware, you’re kind of in the same boat, because hardware is just harder to get these days. So I think that’s a big challenge. But beyond that, headsets are still an investment, and not a cheap one. I mean, VR headsets really are more affordable now, but AR or MR headsets are still really, really expensive. And the go-to of using your phone is a good kind of middle ground as an entry point, but it doesn’t get the job done in a lot of ways, unfortunately. We have challenges with that with some of the products that we’ve developed. Mobile AR is good, but does it actually benefit a student when they’re trying to learn? It’s a mixed bag, really, in terms of research.
Jeremy Nelson (13:35):
Yeah, you’re still looking at a 2D screen basically.
Talal Alothman (13:38):
Yeah, exactly. Right.
Jeremy Nelson (13:40):
Yeah, no, the headsets, that’s a real challenge. We were fortunate to just get a couple of HoloLens 2 devices, and obviously they’re not cheap. The opportunities they will afford are, I think, very exciting to begin to explore. But scaling that out to thousands of students doesn’t seem feasible from a financial standpoint.
Talal Alothman (13:58):
Yeah. Yeah. I agree.
Jeremy Nelson (14:00):
But I think there are some interesting opportunities, especially in this time of sheltering in place and being more remote. I’ve seen a lot of opportunities for VR, and even AR, for learning and collaboration for students. So hopefully we’ll see some unique products or solutions come out of this, if we can get the headsets, right? Well, you know, we’ve talked to a number of other institutions and people working in XR in those spaces. What would you want to see Michigan do that could be unique in this space of enhancing student education?
Talal Alothman (14:38):
I think, really, my opinion is that there isn’t enough academic research that points toward where this technology really shines, right? We know from a lot of the research out there that it works: it’s helpful in building empathy, it’s helpful with spatial learning, and so forth. And I feel like Michigan is really a good place to be if you’re trying to figure out what VR is good at or not good at from an educational standpoint. I think that’s what some of these projects are really attempting to do: figure out whether the VR or AR intervention has helped their students with the content they’re trying to learn, and whether that could extend to other disciplines and fields. And I think Michigan, being a research-focused university, is the right place to have a lot of XR interest.
Jeremy Nelson (15:54):
Yeah. I mean, we’ve had broad interest across almost all of the schools, right? Folks you’re working with specifically in the School of Education, researching with Carolyn and some of the other folks, are interested in how we build these experiences that will enable students to learn better, or absorb the information in a way we’d want them to, or they’d want to.
Talal Alothman (16:13):
Yeah. And by no means is this a one-field kind of question to answer. It really is a very interdisciplinary field that brings together educators, folks in pedagogy and education, folks in the specific domain whose content you’re trying to teach, and folks who can build the content out. And I think that’s what’s really cool about the space: it really is a collaborative, interdisciplinary space at the moment.
Jeremy Nelson (16:42):
Well, what are you looking forward to next in your work or in this field, in this space? What excites you?
Talal Alothman (16:49):
Yeah, I think I’m really interested in hand interfaces, interfaces that utilize your hands and not a controller to get things done. I’ve been fascinated with those for a long time. I got the Leap Motion controller about eight years ago because I was super interested in what we could do with it, and I played around with it here and there, but it wasn’t until VR came about that I really felt it was useful in a lot of ways. But hand interfaces are really the way forward, because they remove a lot of the friction that comes with having to learn how to use another tool, right? Your hands are really the perfect tool; you’ve used them for a very long time.
Talal Alothman (17:43):
We use them all the time, and extending them into the digital space, being able to use them without an intermediary controller or other tool, is, I think, a really exciting direction that would open the door for more natural interactions. And it’s really not easy. I’ll tell you, it’s not an easy space. It’s still really hard to build things that make sense to people, to build interfaces that feel natural and comfortable and don’t cause pain because you have to hold your arms out in a specific way. There are just a lot of really interesting challenges.
Jeremy Nelson (18:27):
Right, right. You’ve been writing a bit about this. If folks wanted to learn more, where could they go to learn more about what you’ve been doing or writing?
Talal Alothman (18:37):
Yeah, so I like to write about these interfaces that I work on. My blog is talothman.github.io, and there’s some content there. I try to update it on a regular basis, which doesn’t always work out. But yeah, I like putting out a lot of the interface work there for folks to learn from and to communicate back to me: did I do this the right way or not? Was this a crazy idea? Are there better ways to do things?
Jeremy Nelson (19:12):
Yeah, no, it’s great. I mean, it really adds to the field, and as we’re all exploring and learning and building on top of each other’s work, I think it’s super important. Well, I really appreciate you spending some time with us today and sharing your thoughts. What other topics should we explore on this podcast, or what else would you like to know more about?
Talal Alothman (19:30):
Yeah, I haven’t heard the first one that you guys did, so I’m not sure what happened there. But I think it would be interesting to look at easily accessible XR experiences that folks can try really quickly, right? Maybe VR is a bit harder to bring to people to try out than AR is; you can get AR on your phone, and you can get VR on your phone, but it’s a bit different, not that high-quality an experience. But I think really exploring available content so that people can get exposure to what this tech actually looks like would help. Because I think one of the biggest issues at this point is that people still don’t know what the difference between AR and VR is. A lot of people haven’t really tried these mediums at all. And I think that makes it harder for us to actually see whether adoption is a reality or not, whether growth in adoption on campuses or among students is a reality or not, when only small groups of people have used this content. So the more people that get exposed to this content, I think the
Talal Alothman (20:48):
better it is for our ability to talk to folks about this content. So, yup.
Jeremy Nelson (20:56):
Yup. A lot of this, you have to try it: put on a headset, or try AR on your phone, or something. So this is great. Well, I really appreciate all the time you spent with us. I’m excited to see you in person when we’re able to get back to campus. But thanks again, Talal.
Talal Alothman (21:14):
Thanks for having me.
Jeremy Nelson (21:18):
Thank you for joining us today. Our vision for the XR initiative is to enable education at scale that is hyper contextualized using XR tools and experiences. Please subscribe to our podcast and check out more about our work at ai.umich.edu/XR.