InstaVR Interviews is a blog series where we turn the spotlight on our customers. We find out why they create VR, how they use InstaVR, and what the future of VR will look like. To read more interviews, visit the InstaVR Interviews homepage.
Grayson was born in Austin, TX and earned his BFA and MFA in Communication Design from Texas State University. After working in the field as a brand, advertising, interaction, and game designer, he returned to Texas State to teach in 2004.
He teaches User Experience in the MFA graduate program. From a research perspective, Grayson has created mobile applications for museums, high-school Government classes, physician-physician communication, and campus safety. Currently, he is developing new rapid prototyping techniques for virtual reality (VR) and working with his students and co-researchers, designing environments to help veterans cope with PTSD.
Other people involved in the VR projects produced by Texas State University include:
Commander Noble, Keith | Austin-Travis County EMS
Mr. Cayetano, Marbenn | City of Austin
Dr. Lehr, Ted | City of Austin
Humphrey, Victoria | Graduate, School of Social Work
Garrard, Benjamin | Undergraduate, Department of Computer Science
Shivesh, Jadon | Undergraduate, Department of Computer Science
Kim, Kevin | Undergraduate, Department of Computer Science
Mr. Lawrence focuses on user experience, with VR being a new technology where UX greatly matters
Question: Tell us a bit about yourself. What do you focus your teaching and research on?
Answer: I’m an associate professor of communication design at the School of Art & Design at Texas State University in San Marcos. I teach mainly user experience design, user interface design, mobile app, web — those types of courses.
My research goes in lots of different directions, mainly in user experience. Since I’ve been hooked up with the Virtual Reality Lab at Texas State, it’s also included user experience in virtual reality. Rapid prototyping and user testing, which we’re used to in making apps and web sites, we’re not as used to in virtual reality. So that’s my base interest.
Texas Ambulance Bus Used for Large Scale Emergencies
VR Technology Lab at Texas State built by Dr. Scott Smith at the School of Social Work; many early applications have been designed for returning veterans
Question: Can you tell us about the use of VR on your campus?
Answer: The VR Technology Lab was started by Dr. Scott Smith in the School of Social Work. This is a real interdisciplinary group — it’s social scientists, computer science, engineering, and art & design — all working together to create these environments.
This lab started off as Dr. Smith’s lab. He was using it, and we’re using it, for things like treating alcoholics in virtual bars. With mindfulness training, we give them the ability to experience it in relative safety, with a social worker there with them.
We’re also using it to help returning veterans. We have a high population of returning veterans on campus. And so a lot of them have anxiety around stuff like just being in class, really huge classrooms with 200 people in them, or walking across the quad in between classes. You have hundreds of people walking past you and it’s crowded and noisy. Or even going to the grocery store.
We can put them in a big class with no one in it and then ramp them up to half full, then full. Or get them to go through the grocery store on a Wednesday when there’s no one there, and then slowly build them to going to a grocery store on a Saturday, when it’s crowded.
GoPro Hero Used For Capturing the Ambulance Bus
The VR Technology Lab does expansive research, including gathering biometric feedback, to determine if immersive VR can create a more emotional response in the user
Question: Can you expand a bit on why the intersection of user experience & VR matters?
Answer: When we talk about user experience, it’s less about how the thing looks and more about how the thing works. When somebody is using a virtual reality experience, how easy is it for them to get around in the environment? And also, how real does the environment have to look to still evoke emotion and empathy, and still make you feel like you’re emotionally in it?
We’ve been doing a lot of testing on questions like: does it have to be full-on 360-degree video for somebody to feel like they’re there? Or can it be more roughly rendered 3D graphics? We’ll even hook people up to biosensors and check their heart rate and their galvanic response to test whether they really feel like they’re there. We want to answer what level of realism you need to get someone to feel, emotionally, like they’re really there.
Planning the 360 Shoot
Exploring rapid prototyping for VR helps organizations determine how long it takes to go from idea to marketable product
Question: You also mentioned rapid prototyping before. Can you explain that in a VR context?
Answer: From a rapid prototyping standpoint, we’re looking at iterative design and design thinking, and the idea of prototyping very rapidly to validate an idea before you spend hundreds of thousands of dollars and years of software development time. Is there a way to easily validate an idea with as little programming and as quickly as you can? To get it in front of the users as quick as you can, user test it, make sure that it’s working, and that it’s doing the things that you want it to do.
And then you get to that point 3-4 versions down the road where you can say — “OK, this is what we want. Now we’re going to hire the programmers and the developers to spend the year developing this final product.” That’s what I was really interested in InstaVR for.
InstaVR helps reduce Mr. Lawrence’s project scope from 5-6 students to 1-2, and build time from a year down to 1-2 weeks
Question: Why did you first decide to incorporate InstaVR into your workflow?
Answer: The current project we’re working on is kind of a complex environment, and we don’t know if our idea will work or not. So the quickest way we could do this is shoot 360 still imagery and stitch it together. Then make it interactive using InstaVR.
So we could do in a week or two, with me and a student worker, what would normally take five or six students a year to fully render in 3D and program. Even then, we still wouldn’t know if it would work. So this way we can really use your product (InstaVR) to quickly validate whether this is going to work before we move on to the next step.
Setting up the GoPro Hero and Tripod
Mr. Lawrence’s team will film in 360 initially, look at biometric feedback, and then design virtual environments based upon response rates from 360 videos
Question: Can you talk a little bit about your research design approach? Are you using 360 video?
Answer: When we’re going through these environments, we hook our test subjects up to biosensors. So heart rate, breathing rate, galvanic skin response for sweat. We also have a thermal imaging camera on them to see if their face flushes, or things like that.
We’re using 360 video initially to quickly prototype the idea. For example, we think the grocery store is going to be a scary environment for them (returning veterans with PTSD). We’ll interview them before building the environment. We have ideas like maybe the bakery is the most scary part, or the dairy case, or the cereal aisle.
What we’ll do is film 360 video, because we can do it very quickly, without creating a grocery store from scratch. We can go to a grocery store, shoot video in those areas, and put test subjects through that to get feedback very quickly. Then we’ll say, “We thought the bakery was going to be the scariest part. But it’s not. It’s actually the fruit aisle.”
This informs us as we make the full-on 3D rendered version — the full virtual reality version. One of the disadvantages of 360 video is we can script it, but we can’t trigger very specific things that cause people anxiety. Everybody experiences anxiety from slightly different things.
So we can make assumptions based on interviews for the first round. And then we can look at the data and say these are the areas that are the scariest parts of the grocery store. Or this is the level of crowd size we need in the grocery store to create an anxiety response. Then we can go to the next step and program something like a baby crying or broken glass or carts smashing in a true 3D rendered environment. We can trigger those things on-demand, which we can’t do with 360 video.
EMTs in Texas are given only one brief training session each year on Ambulance Buses, which are used for mass casualty or disaster events; VR is used to increase memory recall for the Ambus
Question: Can you dig into some other projects, ones specifically built and distributed using InstaVR?
Answer: We also train medical personnel on equipment that’s hard to get a hold of. The project we’re currently doing with InstaVR is used to train ambulance bus medical technicians.
The ambulance bus is a gigantic school bus that is fitted to be an ambulance. It can handle between 20 and 40 people at a time. It’s used for rare occasions — mass casualty events (like shootings) or disaster events like hurricanes (where you have to evacuate people). The problem with the Ambus is there are only 13 in the state of Texas.
The normal EMT training for the Ambus only happens once per year. They get one hour to walk through the Ambus, and then they get a PowerPoint presentation. And that’s it. So it might be another eight months before they get on the Ambus and have to perform.
The memory recall is tough — there are 14 drawers in the thing and equipment everywhere. Can you remember where the gauze is? Can you remember which drawer has the IV solution? There’s very poor memory recall after that amount of time.
Adding Navigation Links to Move Within the Bus
Ambulance Bus VR experience helps drivers and EMTs with memory recall, and leverages Hotspots to make the training interactive
Question: How specifically are you using VR, and InstaVR in particular, to create this training for Ambulance Bus EMTs?
Answer: With InstaVR, we’re using a 360-degree camera and we took shots all the way down the bus. And we took shots with the drawers closed and with the drawers open. And we’re using Hotspots so the users can walk through the bus, stare at a drawer, and see what’s inside it.
For the driver, there’s a bunch of switches. So the prototype helps with remembering where the wiper switches are, where the light switches are, where the exhaust fan is. We’re giving them close ups of all the switch gear in that area.
Our theory is that we can help them with memory recall, as well as augment their training, replacing the PowerPoint with something that will truly immerse them in the Ambulance Bus.
Mr. Lawrence’s team will be A/B testing older training model vs. newer training model created using InstaVR to determine factors like recall and confidence
Question: How do you determine if the VR training you create is successful or not?
Answer: We just finished the prototype and sent it off to the main commander in the Austin and Travis County EMS. We sent it to get his feedback: is our labeling correct? Are there any spots we missed on the bus? What we’re doing now is creating a training protocol to see if the experience works or not.
The idea is to put half the trainees through the old training and half through our training. Then test things like time on task, how fast can they figure out where things are, generally memory recall. And also satisfaction and confidence level.
One of the pieces of feedback from EMTs when we originally did the study was they didn’t feel ready, they didn’t feel confident that they knew where everything was. Even something like confidence can be something that’s important to them.
Hotspots Added to the Driver’s Area
Plan is to leverage InstaVR to publish widely — including both Oculus headsets and the App Store
Question: What’s the distribution plan for your Ambulance Bus application?
Answer: What I love about InstaVR is we can put this on a full-on headset like Oculus Rift, or put it on a phone and distribute it through the App Store. We can train all these people using just a $20 headset, the ones that fit with your phone. That’s extremely affordable and makes a whole lot of sense.
The Ambus is hard enough to get access to. So let’s make this as easy and affordable as we possibly can to get these people access. I’m cognizant of taxpayer money, so if we can create great training that’s affordable, I think that’s a win.
Exciting use cases for VR span many categories and help solve big problems that aren’t as easily solved in other mediums
Question: What areas are you seeing VR used? What excites you about the medium?
Answer: There are a lot of exciting things in VR and AR. There’s the training and the medical aspect that I work on at the VR Lab with Dr. Smith and the others. Then my department is also into advertising and brands, and the stuff you can do with advertising is amazing. I had a student do an Honors Thesis with a colleague of mine using Augmented Reality for a project with Knoll. And it was a great thesis.
IKEA using VR with their catalogs was impressive. Think about how much is saved when someone doesn’t buy a chair only to realize it’s too big. That’s restocking fees, shipping fees, all money wasted because they couldn’t envision it in their house.
I’ve seen plenty of opportunities for real estate agencies and short-term rentals. Using virtual reality walkthroughs is an obvious one that’s seeing a lot of use.
There’s a lot of good use cases in the research space and the retail space. It’s kind of wide open. It’s nice to see a resurgence in VR.
User experience is all about solving a specific problem for a specific group of users in a specific situation. I’d rather start with “What’s the problem this person has and how can I help them solve it?” If I can help them solve it with an app or a brochure or a web site – great. But if virtual reality allows me an opportunity to solve the problem, that’s just one more arrow in my quiver.
It’s a very nice medium to solve some big problems that weren’t as easily solvable in other mediums.
Hotspots Help EMTs Identify What Is in the Drawers
Thank you to Grayson Lawrence for his time and insight! You can read more about him and his projects at: https://www.lorenamartinezdesign.com/just-in-time-vr and http://www.txst-vrtlab.com/