XR Research in the Summer


There is a strong emphasis on fostering cross-disciplinary collaboration in extended reality (XR) at Studio X. Over 50 researchers across the UR use XR technology for their research and teaching, and many come to Studio X for consultation and advice on program development or engineering. As an XR Specialist at Studio X, I had the opportunity to work on two XR-related research projects this past summer, one in collaboration with the Brain and Cognitive Sciences Department (BCS) and the other with the Computer Science Department (CS). These projects were supported by a Discover Grant through the Office of Undergraduate Research, which supports immersive, full-time summer research experiences for undergraduate students at the UR.

The research with BCS involved digitizing Kodak Hall at the Eastman School of Music and bringing it into VR. The result will be used to provide a more realistic environment for user testing to better study how humans combine and process light and sound. The visit to Kodak Hall was scheduled back in March, and many preparations were made beforehand, including figuring out power supply and cable management, stage arrangement, clearance, and more. We also discussed which techniques we would use to scan and capture the hall. Three object-scanning techniques were tested before and during the visit: photogrammetry, 360-image, and time-of-flight (ToF).

Photogrammetry creates 3D models of physical objects by processing photographic images or video recordings. By taking images of an object from many different angles and processing them with software like Agisoft Metashape, the algorithm can locate and match key points across multiple images and combine them into a 3D model. I first learned about this technique by attending a photogrammetry workshop at Studio X led by Professor Michael Jarvis. It has been very helpful for this research, since it captured fine detail on the mural in Kodak Hall where other techniques had failed.
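Agisoft Metashape also exposes a Python scripting API, so this kind of pipeline can be automated. Below is a rough sketch of what that automation might look like; the method names reflect recent Metashape releases as I remember them, and the image folder path is a hypothetical example, so check the current API reference before relying on it.

```python
# Rough sketch of an automated photogrammetry pipeline using Agisoft Metashape's
# Python API (method names from recent releases; verify against the API docs).
# The "mural_photos" folder is a hypothetical example.
import glob
import Metashape

doc = Metashape.Document()
chunk = doc.addChunk()

# Add photos of the object taken from many different angles
chunk.addPhotos(glob.glob("mural_photos/*.jpg"))

# Detect and match key points across images, then estimate camera positions
chunk.matchPhotos(downscale=1)
chunk.alignCameras()

# Build dense geometry and a textured 3D model from the aligned photos
chunk.buildDepthMaps()
chunk.buildModel(source_data=Metashape.DepthMapsData)
chunk.buildUV()
chunk.buildTexture(texture_size=4096)

doc.save("mural.psx")
```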

Photogrammetry model of the mural in Kodak Hall

A 360-image, as its name suggests, is a 360-degree panoramic image taken from a fixed location. With the Insta360 camera borrowed from Studio X, a capture session requires almost no setup and can be quickly previewed using the app on a phone or other smart device.

360 image of Kodak Hall, captured from the stage

The time-of-flight (ToF) technique emits light and measures how long the reflected light takes to travel back in order to compute depth. ToF hardware is easy to find on modern devices, such as recent iPhone Pro and iPad Pro models equipped with a LiDAR scanner. I tested the ToF scanner on the iPad Pro at Studio X. It provides a great sense of spatial orientation and has a fairly short processing time.
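The math behind a ToF depth reading is simple: depth is the round-trip travel time of the light multiplied by the speed of light, divided by two. A quick back-of-the-envelope sketch:

```python
# Depth from time of flight: the pulse travels to the surface and back,
# so depth = (speed of light * round-trip time) / 2.
SPEED_OF_LIGHT = 299_792_458  # meters per second

def tof_depth(round_trip_seconds: float) -> float:
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A surface about 3 meters away reflects the pulse in roughly 20 nanoseconds
print(round(tof_depth(20e-9), 2))  # ~3.0 meters
```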

3D capture of Studio X from an iPad Pro.

We used the Faro laser scanner to get a scan with higher accuracy and resolution. Each scan took 20 minutes, and we conducted 8 scans to cover the entire hall. The result was a 20+ GB model with billions of points. To load the scene onto the Meta Quest 2 VR headset, we dramatically shrank the model's size and resolution using tools such as gradual selection, Poisson distribution adjustments, and material painting. We also deleted excess points and replaced flat surfaces, such as the stage and mural, with higher-quality images. The end result is a good-looking model with decent detail at around 250 MB, light enough for the headset to run.
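Our reduction was done with Metashape's built-in tools, but the general idea, thinning a dense point cloud and discarding noise until a standalone headset can handle it, can be sketched with the open-source Open3D library. This is purely illustrative; the file names and voxel size below are made-up examples, not the settings we used.

```python
# Illustrative only: shrinking a dense laser scan with Open3D.
# The actual Kodak Hall model was reduced in Metashape (gradual selection,
# Poisson adjustments, etc.); file names and parameters here are hypothetical.
import open3d as o3d

# Load the full-resolution scan (the real dataset had billions of points)
pcd = o3d.io.read_point_cloud("kodak_hall_full.ply")

# Keep roughly one point per 2 cm voxel to cut the point count dramatically
downsampled = pcd.voxel_down_sample(voxel_size=0.02)

# Drop stray points far from their neighbors (scanner noise, passers-by, cables)
cleaned, _ = downsampled.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)

o3d.io.write_point_cloud("kodak_hall_light.ply", cleaned)
```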

partial 3D model of Kodak Hall.

The model was handed over to Shui’er Han from BCS as a Unity package; she will implement the audio recording and spatial visualization before conducting the user testing. It is amazing to see so many people bring together their experience and knowledge to make this cross-disciplinary project a reality. I would like to thank Dr. Duje Tadin, Shui’er Han, Professor Michael Jarvis, Dr. Emily Sherwood, Blair Tinker, Lisa Wright, Meaghan Moody, and many more who gave me the amazing opportunity to work on this fun research and provided so much help along the way. I can’t wait to see what they achieve beyond this model and research project.


You can read more about this cross-disciplinary collaboration here.

Hao Zeng

XR Specialist

My Summer At Studio X

a group of pre-college students posing for a group photo.

This summer, I worked full time at Studio X. Even though the campus felt pretty empty with almost all the other undergrads home for summer, there was a lot going on in Studio X! For example, for two weeks in July, we held a pre-college program called “XR: Content Creation and World Building.” In this program, high schoolers came from all across the country to learn about the world of extended reality, or XR.

“Learn how XR (the umbrella term for augmented and virtual reality) experiences are created! Students will study the history of immersive technologies and gain technical skills by exploring both the basics of 3D graphics for asset creation and how to develop XR environments with Unity, a popular game engine. We will also discuss the applications and impact of XR across humanities, social science, and STEM fields. All learning levels welcome.”

It was really exciting to be a part of this program and teach passionate students about XR creation. As we prepared for the students’ arrival, we asked ourselves, “How can we introduce a dozen high school students to the complex and technically challenging world of XR development, all within two weeks of half-day sessions?” This was a challenge indeed. We knew that we wanted the students to walk away with a basic understanding of the fundamentals of Blender, a 3D modeling and content creation tool, and Unity, a game engine commonly used for VR development, but we did not want to overwhelm them with too much new material all at once. We decided that we would have to create a highly detailed plan, carefully crafting how we would use the two weeks that we had with the students.

Over the course of June and early July, we worked to create this plan, taking every little detail into consideration. The first major obstacle we faced was ensuring that each student would have the necessary hardware and software to complete the activities we were planning. Blender and Unity can both be very taxing on computers, and folks often lack the necessary hardware, even among our undergraduates. It was very important that this program be open to anyone who was interested and that technical experience or personal hardware not be a limitation. We decided that instead of having each student bring their own computer, we would use the high-powered workstations we already have in Studio X. This, however, raised the question of how to organize a dozen PCs in our space, each drawing a very large amount of power. With 12 high-powered PCs running at the same time in the same place, we actually ended up blowing a circuit and had to rethink our plans. We considered several options, including using another space or splitting the group into different rooms, but we eventually decided to completely reorganize Studio X in order to keep the group together in one space. I really liked the way we eventually configured the space, as it allowed us to keep the whole group together and helped us build a stronger community as we worked.

An image showing Studio X configured to have all 12 PCs in the same place
Studio X configured to have all 12 PCs in the same place

After solving our issue of how to organize the computers, we could focus our energy entirely on planning how to best use the two weeks with the students. The first week was focused on learning Blender. We wanted to give an introduction to 3D concepts, Blender basics, and character modeling. We felt that this would give our students a foundational understanding of how to navigate Blender while still being realistic about the time we had. Blender can be a very challenging program to learn; there are so many things you can do with the software that it can feel overwhelming the first time you try it out. Although we felt like we were trying to introduce a lot in a short amount of time, we were very excited to see what the students could make. At the end of the week, each student had their very own 3D-modeled character. The students did an amazing job creating their characters in Blender. It was so impressive how fast they were able to learn, and it felt so good to see our planning pay off.

Image showing an example of Blender's UI
An example of Blender’s UI

The second week of our program was focused on learning Unity. We wanted to teach the basics of Unity, get the students thinking about core game design principles, and introduce the world of VR development. The end goal for this week would be that each student would create their very own VR mini game, using the 3D character they modeled as the antagonist in their experience.

With so little time, it was really important that we had milestones to reach each day to make sure we stayed on track. On the first day working on their games, the students got an introduction to a template VR Unity project. I created this template using a beginner VR asset from the Unity Asset Store, a place where you can find free or paid packages to help you create games. The asset I used is linked here: VR Escape Room. This package handled a lot of the initial setup for a VR project, which can be very complex, allowing the students to focus on their game concepts without being tied down by too much coding. I also created a full VR mini game myself to give the students an example of what their final project could look like. My game was called Jellyfishin, a game where the player has to go around catching jellyfish. It highlighted some of the main mechanics of the template and was also fun for the students to play around with.

Image showing a screenshot from the template project provided to the students
Screenshot from the template project provided to the students

After being introduced to the template project, day 2 was all about environmental design. The students learned how to find resources to build their game worlds using a combination of free models, primitive objects, and the 3D characters they made the week prior. By the end of day 2, the games really came together. I was really amazed at how much detail and care each student put into their project, especially considering how little time they had. The final development day was used to polish and finalize the games. We made sure that each student’s game could be played start to finish and that there were no major problems with the experience. Each project was really unique despite coming from the same template. It was so rewarding to see the tools we had created be used so well to build these awesome experiences.

On our final day with the students, it was time for the showcase. Staff members from all over the library came to Studio X, and each student had the opportunity to present their game. One-by-one they gave a quick introduction to their concept and then showed off some gameplay. In the world of game development, you never know if something is going to go wrong. One minor bug could throw off an entire demonstration. Thankfully, these students did an amazing job finalizing their games, and everything went off without a hitch. After two challenging weeks, our students left with a complete VR game, a 3D modeled character, and a set of skills they can continue to grow and use on their journey with XR.

XR Content Creation & World Building – Final Showcase

Being a part of this pre-college program throughout the summer has been an amazing learning experience for me. Through all of the preparation and thinking that went into making our goals possible, I really had to put my technical skills to the test. In the end, our planning made all the difference and is what made the program run so smoothly. It was a great challenge to figure out how to teach so much information in such a short amount of time, and I’m really proud of what we all accomplished. I can’t wait to see how this program continues to evolve and find more ways to lower the barrier to entry to the world of XR. Overall, it was a pretty great summer in Studio X.

Liam O’Leary

Karp Library Fellow, XR Developer

In a World Full of 3D Models, Researchers Build a New One for Leukemia

hand holding the bone-marrow-on-chip device.

A Wilmot Cancer Institute scientist has published data showing that a new microchip-like device his lab developed can reliably model changes in the bone marrow as leukemia takes root and spreads.

Ben Frisch, PhD, holds the bone-marrow-on-chip device in his lab.

Ben Frisch, Ph.D., assistant professor of Pathology and Laboratory Medicine and Biomedical Engineering at the University of Rochester, and colleagues have been building what is known as a modular bone-marrow-on-chip to enhance the investigation of leukemia stem cells. The tiny device recapitulates the entire human bone marrow microenvironment and its complex network of cellular and molecular components involved in blood cancers.  

Similar tissue-chip systems have been developed by others, but they lack two key features contained in Frisch’s product: osteoblast cells, which are crucial to fuel leukemia, and a readily available platform.

The fact that Frisch’s 3D model has been published in Frontiers in Bioengineering and Biotechnology and is not a one-off fabrication will allow others in the field to adopt a similar approach using the available microfluidics system, he said.

Read more.

Sensory Processing – in a Virtual Kodak Hall

a binaural microphone set up with a dummy head.

Rochester researchers will harness the immersive power of virtual reality to study how the brain processes light and sound.

A cross-disciplinary team of researchers from the University of Rochester is collaborating on a project to use virtual reality (VR) to study how humans combine and process light and sound. The first project will be a study of multisensory integration in autism, motivated by prior work showing that children with autism have atypical multisensory processing.

The project was initially conceived by Shui’er Han, a postdoctoral research associate, and Victoire Alleluia Shenge ’19, ’20 (T5), a lab manager, in the lab of Duje Tadin, a professor of brain and cognitive sciences.

“Most people in my world—including most of my work—conduct experiments using artificial types of stimuli, far from the natural world,” Tadin says. “Our goal is to do multisensory research not using beeps and flashes, but real sounds and virtual reality objects presented in realistically looking VR rooms.”

UR students working on the project are looking at information on a laptop with Kodak Hall in the background.
Members of the team begin the setup for audio and visual data collection. From left to right are Shui’er Han, a postdoctoral research fellow in Duje Tadin’s lab; brain and cognitive sciences major Betty Wu ’23; computer science and business major and e5 student Haochen Zeng ’23, who works in River Campus Libraries’ Studio X; and Victoire Alleluia Shenge ’19, ’20 (Take Five), who earned her degree in brain and cognitive sciences and is a manager in Tadin’s lab.

A cognitive scientist, a historian, and an electrical engineer walk into a room . . .

Tadin’s partners in the study include Emily Knight, an incoming associate professor of pediatrics, who is an expert on brain development and multisensory processing in autism. But in creating the virtual reality environment the study participants will use—a virtual version of Kodak Hall at Eastman Theatre in downtown Rochester—Tadin formed collaborations well outside his discipline.

Faculty members working on this initial step in the research project include Ming-Lun Lee, an associate professor of electrical and computer engineering, and Michael Jarvis, an associate professor of history. Several graduate and undergraduate students are also participating.

Many of the tools they’ll use come from River Campus Libraries—in particular, Studio X, the University’s hub for extended reality projects, as well as the Digital Scholarship department. Emily Sherwood, director of Studio X and Digital Scholarship, is leading the effort to actually construct the virtual replica of Kodak Hall.

The group recently gathered in the storied performance space to collect the audio and visual data that Studio X will rely on. University photographer J. Adam Fenster followed along to document the group’s work.

Read more.

Exploring Extended Reality in the Libraries with Studio X

Senior Creative Writing major and Karp Library Fellow Ayiana Crabtree '22 was featured in this post for the UR admissions blog! Link to original post at the end.

Located on the first floor of Carlson Library, as the hub for extended reality at the University of Rochester, Studio X fosters a community of cross-disciplinary collaboration, exploration, and peer-to-peer learning that lowers barriers to entry, inspires experimentation, and drives innovative research and teaching in immersive technologies.

Studio X runs tons of workshops and events that aim to make XR fun and easy to understand. For example, I run an Intro to XR workshop every semester that teaches participants, no matter their skill level, all about the basics of XR through a fun, hands-on learning experience. There are other workshops too, like Blender and Unity tutorials to teach you the basics of 3D modeling and game development. If workshops aren’t your thing, we also have events like our Beat Saber competition and a speaker series called Voices of XR, where you can learn about XR directly from professionals in the field.

Studio X has a wide range of XR technologies that students, faculty, and staff can use both inside and outside of the space. Our most popular attractions are the Meta Quest 2 VR headsets, which can be borrowed and taken back to your dorm for up to three days at a time. The VR headsets come with a bunch of fun pre-downloaded games and experiences, like Beat Saber, Walkabout Mini Golf, Job Simulator, and more! In addition to the VR headsets, we have 360 cameras and 360 audio recorders, which can also be taken back to your dorm for a three-day period. If you don’t mind staying in the space, you can ask to try one of our Microsoft HoloLens 2 headsets (MR headsets) or use one of our high-end workstations for homework. You can also use any of the aforementioned technology in the space if you don’t want to take it back to your room.

Studio X’s main goal is to break down any barriers that may be preventing students from getting into XR technologies. Whether that be making resources readily available, or giving introductory tutorials, Studio X is here to help!

Read the full article here!

First and Lasting Impressions of VR

Personal Experience

I first encountered the idea of virtual reality (VR) when I read the book Ready Player One by Ernest Cline. As an avid reader of science fiction books, I loved the idea of being able to escape to some virtual world through a VR headset. Soon after I read the book, the movie was released and seeing the concept executed in a visual form only increased my interest in the subject. Despite my fascination, I took it as the book genre labeled it. Fiction. I believed that there were no VR headsets, as I had never seen or heard of anyone having them.

In the fall of 2020, I happened across an advertisement for the Oculus Quest 1. My interest in the novel had not wavered, but nevertheless, I was shocked. I hadn’t realized that the concept introduced to me through a science fiction novel was real in the form of a readily available and relatively affordable technology. I had been saving money for a while and prepared to make my purchase. Luckily, a friend encouraged me to wait a few months, as in October 2020, the Oculus Quest 2 was released. I ordered the headset and eagerly awaited its arrival.

Photo of the author of the article, a young woman, with a virtual reality headset resting on top of her head

When I finally got my hands on it, I was over the moon. It may not have looked like the vision Cline painted in his novel, nor like the version in the movie, but it was virtual reality nonetheless. In the time between ordering and receiving the headset, I researched various games and experiences that I wanted to try upon its arrival. Beat Saber, a rhythm game, was top of my list and was my first purchase on the device. I’d never been one to read instructions for consoles, games, or anything at all, so I dove right in and set up my account.

As soon as I began playing, I was hooked. Whether it was the idea of actually experiencing VR or the catchy songs of Beat Saber, I absolutely fell in love with my Quest 2. I played it every moment I had time. As I danced around my living room slashing to the beat of the songs, my parents asked me what I was doing.

I excitedly explained to them what VR was, and how it worked. It was at this point I had my first experience sharing VR with someone else. After a long tutorial on how to wear the headset, how to navigate the menu, and how to play the game, my parents tried out VR for the first time.

This was all back over the winter break of 2020-2021, just before I interviewed to join the Studio X team as a Karp Library Fellow to do XR research. The pandemic was still pretty bad at the time, and VR provided an escape from the harsh reality around me. It helped my anxiety and allowed me to relax, even if just a little bit, while I was immersed in the world of VR.

Ever since that initial experience, I made an effort to introduce as many of my family members as I could to virtual reality.

Family Experiences

Mother

It was the various times of having my mum try VR that really inspired me to explore the topic of user interaction with VR further. Her reactions to the various experiences I had her try really made me understand the impact that VR can have on people’s lives. Her first experience was with Beat Saber, which she thoroughly enjoyed due to its catchy songs, but it was Job Simulator that really captured her attention. “I thought it was going to be dumb,” she said. “When I saw you doing it, it looked silly, but when I tried it, it blew my mind. It was a strange experience because it really made me feel like I was in the room.” For me, it was especially funny watching her play Job Simulator. I had to make sure she wouldn’t forget about the guardian boundary, as she kept trying to walk down the virtual hallway when, in reality, she was about to crash into the coffee table. Another interesting thing was how she was worried about dropping the virtual coffee cup, because she didn’t want it to break or make a mess on the floor.

a screen grab from the game job simulator

Father

While he didn’t try Job Simulator, my dad tried Walkabout Mini Golf. He’s not much of a mini-golf fan, but he was blown away by how realistic the physics were in the game. He said he kept feeling like he was going to fall off the edge of the map and even tried walking from hole to hole (which would have required a lot more space than we have in our living room). “You really don’t know what it’s like until you try it, and when you do, you can see all kinds of applications this technology may have in the future.”

Grandparents

I, of course, wasn’t going to have them do Beat Saber, and I didn’t have Walkabout Mini Golf at the time, so I had them watch a few Oculus TV videos.

Having my 96-year-old Great Grandma try VR was quite an interesting experience. She was in awe at the capabilities of the technology and loved the fly-over nature documentary about the ice caps.

My Gran tried a few different parachuting and paragliding videos. “It was amazing to feel like I was there. I feel like I could do paragliding now!”

My Grandpa watched a few shorter space documentaries and was thrilled to be immersed in the galactic environment.

a photo of an elderly woman with a vr headset on
Photo of my 96-year-old Great Grandma trying VR for the first time

Running a Survey

After seeing the unique reactions from all my relatives, I was curious to know how others felt about their experiences with VR. I had joined several VR-focused Facebook groups to see the kinds of conversations people were having about VR and then decided to run a survey to directly ask the community about their experiences.

With the survey titled “How do Users Experience VR,” I asked a range of questions about age, their perception of VR, what they used it for, and if they had any stories they wanted to share. After about a week and a half of running the survey, I had 282 responses to go through.

One of the things that interested me the most was the age range distribution of those who responded. I’m not sure if this is directly related to the survey being run on Facebook and the demographic of Facebook users, but it doesn’t feel like a misrepresentation, as the group was specifically for Oculus Quest users, and Oculus, now the Meta Quest, is owned by the same company as Facebook.

a histogram of the age range distribution of people who participated in the survey

Quotes

The following information is taken directly from the survey results. Some of the quotes have been reformatted a bit for coherency.

This first quote is from a friend who had some previous experience with VR. I had them try VR on my headset before asking them to fill out the survey. Their unique perspective on the potential threat VR poses to society is one that I haven’t seen discussed much elsewhere, which is why I believe it is important to include it here.

Jenna, 21, Non-Binary

“At first, I thought it was super cool, but a little bit scary. As I’ve had more experience with VR games, I still think it is an awesome technology with a lot of potential uses, but I fear that VR video game violence will further desensitize users to violence in the real world. I was at an arcade once and played a VR zombie game and had to ask the worker to stop the game because it felt too much like I was killing real living things. Hence my fear of it desensitizing people to violence.”

I included the next quote because it shows VR being used as a tool for long-distance interaction and also how people in your environment perceive you as you play VR.

Tracie, 49, F

“I saw it as an opportunity to stay connected and play with my friend who lives thousands of miles away from me. I used to live in my RV where there was very little space to play, so I would take my headset to the laundromat and play while doing laundry. Can't tell you how many times people were freaked out by what I was doing. I always tap the headset to bring up passthrough when someone came in and when I started interacting with one guy he was totally weirded out ... ‘you can see me!!!’ ... lol. Yes. Yes, I can. (I was in a closed RV community laundry place with the offices and rec center in the same building -- it was completely safe).”

These next two quotes are particularly interesting as they show the potential uses for the elderly and the health benefits of using VR.

Bonnie, 79+, F

“I became so enthusiastic, it was fun, and I moved my body. I bought a headset and began to realize all the possibilities. I finally got my Quest2 in September and found Fitxr and SuperNatural. I have continued to use my quest 2 every day and have barely explored all the apps. My enthusiasm prompted five other sales among my friends as they noted my weight loss and toning of my body. I never thought that an old person could gain strength and balance. Just thought we went backwards physically. I had given away my cross-country skis and now wish I had them back as I have gained strength in my entire body. My balance has improved so much and although I have “bat wings” on one side of my arms I actually have muscle “bumps” on the other side. I can do step ups - more and more each week or so. I can do squats, as many as 40 at a time. Every day my muscles ache, but I LOVE IT as I realize it is a good ache and I earned it. The technology allows me to socialize with others, visit sites I had traveled to previously and brings back happy memories. The technology allows me not to travel to a gym (not that I would have) and to have privacy.”
M, 75, F

“I am seeing more uses for homebound, elderly... seeing it as a way of connecting friends and family scattered around the country. wonderful experience taking my 85-year-old brother to the top of mount everest! Getting to play golf with a group of women every week. exploring worlds in altspace”
an elderly man with a vr headset on

The following quote shows the emotional impact VR can have on people through the experiences it allows one to have.

Sherry, 57, F

“I bought an Oculus for my 11 year old granddaughter for Christmas. She brought me to the kitchen and told me to stand in a spot. She put it on me and told me to close my eyes. When it was on, she told me to open them. I was in a beautiful mountain lodge. Out the window were mountains. I was overwhelmed and began to cry. It was as if I had been transported to my home in the mountains 25 years ago. I could not believe my eyes. Literally! I just kept saying … is this real? I knew immediately that I must get my own Oculus and pretty much immediately ordered one for myself based on that 5 minutes of standing in a room looking at the mountains.”

These next quotes show an optimistic perspective for the future of VR technology.

a photo of a man with a vr headset on in front of a tv screen
Anonymous, 59, F

“At first, it was just a music game that I played. I've since added more experiences with various game types, the Multiverse, and more. Rather than this just being a gaming system, I can see a future for business, education, research, social interacting (that can actual involve talking to one another vs just texting), shopping, and so much more!”
Susan, 61, F

“I came to see it could be used for exercise and education and other non-gaming applications. I have come to see that it is a powerful educational tool, particularly for people who are limited, either physically or not able to travel to other parts of the world. Also, I believe it could be used to deepen educational experiences in a variety of ways. I also continue to believe it is probably pretty addictive and should not be used many hours of the day as it is basically an escape and not particularly productive in general. I think it’s a great tool for people who are disabled or otherwise housebound. I have a concern that entering a VR world takes away from the time that people spend outdoors, which in the end is far more important.”
Skye, 45, F

“It was more real than I thought it would be. And I immediately saw the potential applications to things that I cared about - like art, exercise, and experience with others. When my brother bought everyone in the family an Oculus for Christmas, that was a game-changer. I was SHOCKED with how much further the technology had come and am a total convert and trying to get others to get a headset so we can hang out in virtual worlds and other experiences. I now see VR as being something that is relevant for my life now and into the future. I see how it can improve my interactions with family and friends (we spend more time together...especially since Covid and distance limits our in-person opportunities), and it has given me new ideas for how to approach and use it for engagement for my wellness company and clients.”

The next several quotes show the potential therapeutic and mental benefits of using VR.

Audrey, 38, F

“I have ADHD. I was diagnosed at 37 years old, and I have found that the exercise component for VR allows me to keep engaged in a way no other exercises have previously. I still do other types of exercise (such as strength training or hiking) but when I’m not in the mood to workout, the menu of options in VR still brings excitement for me.”
Anonymous, 42, F

“I live in the Midwest, and it is dark by 4 o'clock in winter. In vr, I can hop into real fishing vr and spend time on the lake in sunshine. It doesn't matter that it's not real, your body still relaxes, endorphins are released. It has helped a lot with seasonal affective disorder this year.”
a photo of a woman surrounded by lights with a vr headset on
Anonymous, 41, F

“My mom passed away June 2020, she had prefrontal dementia, she slowly lost all her motor skills and eventually mobility. One of the last happy memories that I have with her was me putting the oculus quest on her face and guiding her through a tour of the African Sahara. She actually reacted and reached out to try to touch lions and I swear I saw her smile when she saw elephants. Looking forward to seeing what VR therapy for people with dementia, Alzheimer's and other debilitation can bring in the future.”
Susie, 46, F

“I bought a VR to study the exercise game Supernatural and its effect on learning and motivation for Neurodiverse individuals. (Specifically adhd) I realized pretty quickly that this is the platform of the future. Way beyond games. I see it used for mental health/therapy, exercise, social connection, work interaction, performance/skills enhancing (like public speaking) etc. I’m literally applying for a PhD program so I can study VR some more. It’s changed my life! Supernatural daily has decreased my adhd symptoms tremendously. I feel my brain starting to work better. I can see this tool being an alternative to meds for those who can’t take them.”
a photo of a woman with a vr headset on

This next quote shows how giving VR a second chance can completely change your perspective on the technology.

Anonymous, 24, F

“My first experience was poor. I tried it at the mall when it was fairly new, and it was a video simulation of an amusement park ride. Sitting down, I got a very intense feeling of motion sickness and did not enjoy the video at all. It was a very bland video simulation. Although my 1st experience was bad, I gave it another try at a friend's house. This was a totally different experience compared to my first. I played beat Saber and it was an overwhelming, awe-inspiring time. From that point forward, I began thinking of VR as the future and one of the most advanced types of technology to exist yet. Almost all experience I have had after that has been incredibly immersive and entertaining. I look at VR as an opportunity to take a break from our physical world and enter another world.”

The last quote I leave you with is a pretty cool perspective on how the technology has changed over time, and how it has impacted this person’s life and social interactions.

Gnossos, 65, F

“Early 90s I was hired by a Space Museum to consult on a VR exhibit and traveled to Boston, Chicago and LA to test drive early concepts. First experiences were so bad that I told the Space Museum to hold off on purchasing VR until it was more developed. Oculus Quest's first experience did not disappoint. My perceptions shift with the technology development, of course. I still see it in its infancy - it’s the Pong Era of VR meaning it sucks but we don’t realize it yet. It’s going to be 100 times better in 10 years. I was surprised by having a crush on a guy in Rec Room who played Paint Ball like he was a trained assassin. Crushes are a distant experience for me, so having one with only a voice and a cartoon avatar really surprised me. I think the safety of my anonymous state helped create an openness to flirting that’s not my normal way. It inspired me to wonder more about the potential for intimacy in VR - especially if these spaces were developed by women.”

Reflection

VR is rapidly growing to be one of the most popular forms of XR. It is estimated that in 2020, nearly one in five people in the US (19% of consumers) used VR. Due to this increasing demand, nearly 15 million AR and VR devices are expected to be shipped to customers worldwide in 2022. Source

The quotes provided above shine a small spotlight on the many ways that people are being impacted by VR every day. From new ways of socializing to new methods of staying physically and mentally fit, VR has the ability to benefit everyone in some way, shape, or form due to its versatility. It is this social and emotional impact that has allowed VR to become so popular, as people feel directly connected to the experiences they try in VR. The ever-present description of VR as the ultimate empathy machine is growing more and more accurate as the technology progresses and the range of possibilities expands.

Education from the sciences to the humanities, job training, interpersonal relationships, concerts, work meetings: all of these areas can benefit, and many already are benefiting, from VR technologies. More and more people are being exposed to VR every day, and soon enough it will become a household staple, much like cellphones and TVs. And why? Because of the ways we as users experience VR. It is the consumer perspective that shapes the industry, which is why it is so important to understand why people react the way they do to these technologies.

I personally believe that VR has shaped my perspectives on the world in ways I wouldn’t have been able to imagine due to some of the experiences I have tried. VR has opened my mind to new perspectives on personal space, human interaction, disabilities, and even the way I view myself as a person existing in the real world versus in the digital one.

VR has a unique ability to change perspectives and influence emotions, and it is up to the people using it to decide what path VR ultimately goes down.

a photo of a child with a vr headset on sitting in a field of grass
Ayiana Crabtree

Karp Library Fellow, XR Research

Is Extended Reality Shaping the Future of Academic Libraries? This Dean Thinks So.

Studio X salon area. Shows students sitting or standing near the entrance of Studio X.
Mary Ann Mavrinac, vice provost and dean of the University of Rochester Libraries, shares insight into how the campus community directed the development of Studio X, the library’s new extended reality hub featuring advanced technology and expert training.

“I don’t believe in ‘if you build it, they will come.’ You can build something, but they won’t come if you don’t know what your users want,” Mavrinac said. It’s the guiding principle she and her team followed throughout the ideation and planning of the library’s new high-tech hub, Studio X. Located on the first floor of the Carlson Science and Engineering Library, the 3,000-square-foot space allows students and faculty to participate in immersive learning experiences.

Equipped with technology that supports virtual reality (VR), augmented reality (AR), and everything in between (collectively, extended reality or XR), Studio X allows researchers to perform tasks such as visualizing large data sets and safely experimenting with hazardous materials in a virtual environment. Studio X broadens the range of possibilities for discovery and instruction, but what makes it truly special is its source of inspiration. CannonDesign collaborated with the university to design a facility that the campus community not only requested but also intimately shaped. From inception to completion, student and faculty preferences were integrated with expert knowledge to deliver a space tailored to serve the entire campus community.

We spoke with Dean Mavrinac to learn more about the process and impact of the project. She wanted to underscore that the success, to date, of Studio X is a team effort, much of it led by Digital Scholarship and Studio X director, Emily Sherwood.


There aren’t many academic libraries that offer a space like Studio X. What is it, and how did the project begin?

The project began in fall 2017 when Lauren Di Monte joined our team and learned from the faculty that there was a lot of research activity in extended reality and other immersive technologies. We thought it was something the library could get involved in since we had close to 50 researchers engaged in XR technologies. So, we set out to better understand that landscape and how the researchers would engage with any initiative we developed, whether it was a space or specialized expertise. We knew a generic CAVE wouldn’t work for them, so we thought about what we may be able to do to help them tackle specific research questions. As it turned out, we pivoted to a space and service that would provide an easy on-ramp to those less familiar with these technologies and related needs.

Today, Studio X is a collaborative hub for extended reality where students and faculty are immersed in learning and teaching in ways that just aren’t possible without advanced technology. It’s a high-tech space that allows exploration, experimentation and experience that truly brings education to life.

What was the goal of Studio X? Who is it for?

The overall goal was to offer physical space, a program, services, technology and expertise that students and faculty needed—and expertise was really big. The user research told us that they wanted a space and experts in the space to teach them how to use and apply the technologies. We approached this goal by providing an on-ramp that made it easy for people to gain access to and experience with XR technologies.

Whether a person is an advanced researcher or a novice user, we’re good at helping people feel comfortable to explore their questions. The library is an interdisciplinary crossroads at the university, so it could be someone studying history, biomedical engineering, neuroscience, religion, ethics—whatever it is—if they’re interested in using XR technologies, we provide the support they need to feel welcome.

Read the full interview.

XR and Accessibility Resources

Person in a VR headset using assistive VR hardware.

Summary: This post reviews resources on XR (extended reality) and accessibility and summarizes best practices for centering accessibility when engaging with these technologies.

Technology in general creates many barriers for disabled users, and as XR technologies rapidly grow in popularity, they exacerbate these challenges. When creating an XR product, whether that be a VR (virtual reality) headset, an AR (augmented reality) game, or something else, people tend to think more about their product’s aesthetic or its usability for the average user. What people fail to remember is that not every user will be “the average user.” The world is a diverse place, with people of all ages, genders, races, and abilities, and when creating XR, it is important to keep this diversity in mind. XR accessibility is itself a new area and a moving target. Because of this, many new developments are in the works, and these resources may be outdated in just a year’s time.

Before we dive into XR, let’s first define some terms: What are Accessibility and Universal Design?

Accessibility is the ability to access something and be able to benefit from its intended purpose. It sometimes refers to specific characteristics that products, services, and facilities have that can be used by people with a variety of disabilities.

Accessible Design is a design process that specifically considers the needs of people with disabilities.

Universal Design is the process of creating products that are accessible to people with a wide range of abilities, disabilities, and other unique circumstances.

Resources:
Usability, Accessibility, and Ethical Design from San Diego State University
What is the difference between accessible, usable, and universal design? from University of Washington

The following resources are divided into 5 categories:
Education
Design, UI/UX
Development
Software
Hardware


Education

XR Access

woman showing a man how to use a vr headset while an audience watches

Link to Webpage
Education, Teaching, Research, Organization, Conferences, Resources

XR Access is a community committed to making virtual, augmented, and mixed reality accessible to people with disabilities. Their mission is to modernize, innovate, and expand XR technologies, products, content and assistive technologies by promoting inclusive design in a diverse community that connects stakeholders, catalyzes shared and sustained action, and provides valuable, informative resources. 

The site provides a plethora of materials for those interested in their efforts. Their research network provides valuable information about the accessibility research happening across the XR Access community. They have workstreams, which are community-led efforts to inform the design, development, and production of accessible XR. In addition, they offer a wide variety of other resources to aid people in their own research, including their annual XR Access Symposium reports (see below for more about the symposium). XR Access also curates stories of disabled folks who have used technology both successfully and unsuccessfully to help advocate for accessible XR technology. Those interested can sign up for their newsletter or join their robust Slack community.

Accessibility Needs of Extended Reality Hardware: A Mixed Academic-Industry Reflection

a man looking at his hands while wearing a vr headset with a tv screen behind him

Link to Article
Education, Hardware

This journal publication walks the reader through the reasoning behind the need for accessible XR hardware and software. Starting with an explanation of the benefits of XR, the authors show why the accessibility movement should start with hardware: if a user cannot wear a headset, then they cannot experience its software. The 2019 XR Access Symposium allowed many people to connect and expand upon their individual ideas, which helped them establish goals for XR hardware accessibility. They identified a need to understand related fields’ accessibility guidelines, determine the most pressing obstacles, consider industry guidelines, and increase public awareness of the issues at hand. With those needs in mind and a community-centered approach, they believe the lack of accessible XR hardware can be overcome.

Barriers to Supporting Accessible VR in Academic Libraries

Link to Article
Education, Libraries

Although XR technologies offer new opportunities to engage students, they also present more challenges for disabled students. Technology, in general, already tends to exclude these users, and XR’s rapid rate of development further complicates things. The article shares statistics as of 2019 from the U.S. Department of Education’s National Center for Education Statistics: “19.4% of undergraduates and 11.9% of graduate students have some form of disability.” The authors argue that academic libraries, as leaders in supporting and sharing new technologies, are well poised to address accessibility challenges for XR and must create clear policies and service models that support all users. While no clear accessibility guidelines currently exist, several promising initiatives, such as the XR Access Symposium, are working towards this goal. The authors detail two accessibility initiatives underway at Temple University and the University of Oklahoma, and they conclude with a list of key takeaways:

Plan for Accessibility from the Beginning: Libraries can save time and resources by thinking about accessibility issues at the start of a program or project.

Lack of Standards: As of 2020, there are no standards for accessible VR design, but there are related standards that could lay the groundwork for their development.

Developer Support is Essential: Libraries that intend to develop VR experiences need to have sufficient developer support with accessibility expertise.

Importance of Auditing and Reporting: Out-of-the-box VR experiences will pose different accessibility challenges from one person to the next and should be audited to better understand these barriers to access. If a library lacks a developer to modify software or create new software, at the very least, available software needs to be audited and have a corresponding accessibility report produced.

VR is Not the Pedagogy: VR should be another tool in an educator’s arsenal, not the sole focus of a class (unless VR is the course subject). As Fabris et al. (2019) suggest “Having VR for the sake of having VR won’t fly; the VR learning resources need to be built with learning outcomes in mind and the appropriate scaffolds in place to support the learning experience” (74).

Acknowledge the Limits of VR Accessibility: There are limits to making VR accessible. The reality is that there will be students who are unable to use VR for a variety of reasons. Therefore, there should always be an alternative access plan developed so that students have access to non-VR learning methods as well.

XR Accessibility Initiatives in Academic Libraries

cover of the asis&t proceedings booklet

Link to Article
Education, Survey, Libraries

As libraries traditionally take the lead in accessibility initiatives, a survey was conducted to examine the accessibility of their digital resources. Three questions were sent to various academic libraries, and responses came back from 30 universities:

  • Question 1: What is the level of development of accessibility support for XR technologies in academic libraries?
    • The majority of institutions surveyed did not have policies or dedicated staff to support accessibility for XR resources.
  • Question 2: What XR accessibility knowledge do library staff and administrators currently have?
    • Nearly all participating spaces had some awareness of the challenges that XR presents and were able to find resources to assist when needed.
  • Question 3: What are the main barriers to developing accessibility support for XR technologies in academic libraries?
    • The top three barriers to developing accessibility policies and processes were lack of staff knowledge, lack of funding, and lack of time.

The concluding result was that XR accessibility in academic libraries is still developing, so policies and staff are not yet in place. Many institutions, however, plan to begin implementing strategies soon.

DLFteach Toolkit: Lesson Plans on Immersive Pedagogy

man in a wheelchair with his hands in the air wearing a vr headset

Link to Toolkit
Education, Libraries, Teaching

The Digital Library Federation (DLF) has put together a toolkit of lesson plans that facilitate interdisciplinary work engaged with XR technologies. The toolkit is focused on a decolonial, anti-ableist, and feminist pedagogical framework for collaboratively developing and curating humanities content for emerging technologies.

The introductory materials section of the toolkit contains three particularly useful resources: recommendations for accessible pedagogy with immersive technology, an immersive technology auditing checklist, and instructions on how to create an equally effective alternative action plan for immersive technologies.

Recommendations for Accessible Pedagogy with Immersive Technology – provides background on the increasing need to create educational resources for disabled learners. The list of materials provided is intended to guide educators on how to incorporate immersive technologies into their teaching while keeping disabled learners in mind. It is split into three sections: accessibility and disability, readings on the accessibility of immersive technologies, and recommended administrative considerations. It ends with a series of questions to keep in mind when teaching.

Immersive Technology Auditing Checklist – serves to identify and document the various challenges of making immersive technologies accessible. It divides the workflow into three steps: purchasing software and hardware, providing technical support for software and hardware, and ensuring user access to software and hardware. The checklist then walks you through a series of important questions for each phase of the process, such as “What hardware is required?” and “Is there an accessibility page for the software?” It also dives into questions about ease of operation and perception, the robustness of the technology, and any available documentation.

Creating an Equally Effective Alternative Action Plan for Immersive Technologies – instructs the reader on how to create an Equally Effective Alternative Action Plan (EEAAP). An EEAAP is a document used when there is an accessibility barrier in a technology (i.e., when a technology cannot be used by a person or group with a disability). The components of an EEAAP are a description of the issue, the person or group affected, the responsible faculty, how the EEAA will be provided, the additional EEAA resources required, repair information, and a timeline for unforeseen events. Some examples of EEAAPs are listed at the end of the resource.

Exploring Virtual Reality Through the Lens of Disability

young girl wearing a vr headset with her hands in the air

Link to Article
Education, Teaching Resource

This resource comes directly from the DLF Toolkit. It provides a lesson that takes an interdisciplinary approach to introducing VR immersions through the lens of disability studies. The authors are not aiming to represent how all people experience disability; rather, they are trying to create an activity that includes discipline-specific theory and criticism. They then describe the different types of VR: cinematic VR uses filmmaking techniques; simulation VR simulates the real and fictional while the user is an active participant; representational VR creates immersive experiences through sensory embodiment; and therapeutic VR is designed for various treatments.

The resource then becomes an instructional guide on how to try several disability-related experiences. It outlines the recommended audience, curricular context, learning outcomes, and materials needed; explains how to prepare for the experiences; and provides a long list of sample instructions. Following this, the authors list several applications they recommend trying: Notes on Blindness, The Party, and InMind VR. Each experience is paired with a plethora of questions and other relevant external resources.

  • Notes on Blindness – This experience tells the story of a man who lost his sight and how he coped by keeping an audio diary. For three years, he recorded over sixteen hours of material.
  • The Party – A VR film by The Guardian that allows you to enter the world of an autistic teenager at a surprise birthday party. You will hear her internal thoughts about how the experience affects her and share the sensory overload that leads to a meltdown.
  • InMind VR – A short adventure that allows the user to journey into a patient’s brain and search for the neurons that cause a mental disorder.

Design, UI/UX

Designing XR for Accessibility and Inclusion

diagram of a vr application called SeeingVR

Link to Article
Design, UI/UX 

When you are in the beginning stages of creating something in an XR medium, whether that be a device or an experience, it is important to keep in mind the various factors that might make it less accessible. Accessibility can involve anything from differences in motor function or sensory ability to differences in wealth and societal standing.

VR has a plethora of positive features that could benefit differently abled users, such as the ability to enhance spatial sound on one side of the body, render visuals with higher contrast, and enable those in wheelchairs to experience what it would feel like to “walk around” in VR. However, as with any technology, VR also presents many accessibility challenges, such as the heavy emphasis on motion controls, the use of the body to control many experiences, and the requirement to stand during some VR experiences.

Considering these and other challenges, here are some things to keep in mind while trying to make XR design more inclusive: 

  • Hardware – What equipment do people need to participate in a VR environment? Is a standalone headset and controllers all that’s required, or is special equipment or a computer to run the experience also needed?
  • Navigation and Interfaces – How understandable is the XR environment? If a user had no context or guidebook upon entering the space, would they know what to do and how to interact? Either label things clearly or make a guide or some form of instructions available. This could be an avatar that appears to give instructions along the way, an instruction dialog box, or a guidebook shipped with your product.
  • Communication – How are speech and body language communicated? Do you have an avatar that represents you in an environment? Is there full body tracking, or does your avatar just float from place to place? Do you speak using a microphone, or are there pre-written text options to choose from? Is captioning available? 
  • Customization and Interoperability – Allow users to customize the XR environment to their needs. Can they adjust color contrast? Can they toggle captioning on and off when needed? Are there a variety of sound options? (A minimal settings sketch follows this list.)
  • Avatars and Embodiment – Make sure that there are a wide range of options so people can feel accurately represented. Is there a wide range of skin tones, hair colors, hairstyles, clothing, etc. that will enable any person from anywhere in the world to feel as if they are properly represented in the VR space?
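To make the customization point concrete, here is a minimal sketch, in TypeScript, of what a per-user accessibility profile might look like. The setting names, defaults, and loading logic are illustrative assumptions, not taken from any particular engine, SDK, or the article above.

```typescript
// Hypothetical sketch: a per-user accessibility profile that an XR app
// could load at startup and expose through an in-experience settings menu.
// All names here are illustrative, not from any specific engine or SDK.

interface AccessibilityProfile {
  captionsEnabled: boolean;        // toggle captions on/off at any time
  captionScale: number;            // 1.0 = default text size
  highContrastUI: boolean;         // render UI panels with higher contrast
  monoAudio: boolean;              // mix spatial audio equally to both ears
  seatedMode: boolean;             // recenter interactions for seated users
  locomotion: "teleport" | "smooth" | "none";
}

const defaultProfile: AccessibilityProfile = {
  captionsEnabled: false,
  captionScale: 1.0,
  highContrastUI: false,
  monoAudio: false,
  seatedMode: false,
  locomotion: "teleport",
};

// Merge a user's saved overrides onto the defaults so settings added in
// later versions still get sensible values.
function loadProfile(saved: Partial<AccessibilityProfile>): AccessibilityProfile {
  return { ...defaultProfile, ...saved };
}

// Example: a seated user who needs captions and higher contrast.
const profile = loadProfile({ captionsEnabled: true, highContrastUI: true, seatedMode: true });
console.log(profile.locomotion); // "teleport" (inherited from the defaults)
```

Keeping overrides separate from defaults also means a profile saved on one device could carry over to another, which speaks to the interoperability concern above.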

Try out the space yourself and see if it works from several perspectives of ability: seated, standing, with sound, without sound, etc. Think about the users you want to be able to access the device and try to see it from their perspective. Another approach is to run testing sessions in which differently abled people try out your device or program and offer feedback.

An Accessible Future – XR: Considerations for Virtual, Mixed, and Augmented Reality

woman wearing a vr headset with her arms in the air

Link to Article
UI/UX, Metaverse, Conferences

There are many XR applications for the workplace, such as virtual orientation events and training sessions. Imagine being able to attend a conference with people from all over the world using VR: you could still get the experience of being among professionals in your field without ever having to leave your home or office. For example, the XR Access Initiative used VR during its annual symposium to foster engagement. It created virtual rooms that conference participants could explore and interact with, held virtual demonstrations, and provided captioned rooms and rooms with ASL interpreters. 

The XR Access Initiative emphasizes three key accessibility factors for virtual conferences: captions, sign language communication, and keyboard and screen reader usage. 

  • Captions – Captions should follow a user and remain legible regardless of the angle from which they view the environment (a small positioning sketch follows this list). 
  • Sign Language – Sign language interpreters should be located in high-visibility areas, and those who need interpreters should have easy access to them. 
  • Screen Reader/Keyboard – Those who are unable or do not wish to attend in VR should still be able to interact with the space in the same way a person in VR could, though with simplified controls. Cross-platform capability is important.
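As a rough illustration of the captions point, the sketch below keeps a caption panel a fixed distance in front of the viewer and rotates it to face them each frame, so the text stays legible from any viewing angle. The vector type, yaw convention, and update loop are assumptions for this example; the XR Access Initiative's actual implementation is not described in the article.

```typescript
// Minimal sketch (not from the article): keep a caption panel in front of the
// viewer and always facing them, so it stays legible from any angle.

type Vec3 = { x: number; y: number; z: number };

// Place the caption a fixed distance in front of the head, dropped slightly
// so it does not block the center of the view. Yaw 0 is assumed to face +z.
function captionPosition(headPos: Vec3, headYaw: number, distance = 2, drop = 0.3): Vec3 {
  return {
    x: headPos.x + Math.sin(headYaw) * distance,
    y: headPos.y - drop,
    z: headPos.z + Math.cos(headYaw) * distance,
  };
}

// Yaw that rotates the caption panel to face back toward the viewer.
function captionYaw(captionPos: Vec3, headPos: Vec3): number {
  return Math.atan2(headPos.x - captionPos.x, headPos.z - captionPos.z);
}

// Per-frame usage: const pos = captionPosition(head.position, head.yaw);
//                  const yaw = captionYaw(pos, head.position);
```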

This virtual symposium showcases how VR can make conferences and other virtual events accessible to many people.

Why VR/AR Developers Should Prioritize Accessibility in UX/UI Design

image of a man's hands holding vr controllers

Link to Article
UI/UX, Development, Inclusive Design, Accessibility Settings

An important point this article touches on is how a lack of accessibility in VR can make people feel left out or ignored. For example, the easier it is for people to understand a game, the more likely they are to play it. Some things you might not think about for inclusive design are different hair types or users who experience arthritis. If you have long hair in a ponytail or buns, or even fluffy hair, putting on a headset can become difficult because you have to rearrange your hair to get the headset on. People with arthritis may need to sit down in the middle of a game, or their fingers or hands may get sore after a while; making controls easy to change mid-game or mid-experience would be very helpful in these cases. Ways to make VR more accessible for glasses wearers could include adjustable vision settings or better glasses adapters for current headsets. 

It is hugely important to have a diverse group of people in your testing groups to ensure that people of all genders, ethnicities, abilities, socio-economic backgrounds, and other identities are able to interact with your product with ease. It may be impossible to accommodate every unique circumstance, but taking diverse voices into consideration while making your product will lead to a better end result. While it may take a little more time to make sure everyone is included, the end design will be more profitable and beneficial to a larger community, which is what matters most.

Computers Helping People with Special Needs

Link to Resource
Conference, Resource

This link is to the proceedings of the International Conference on Computers Helping People with Special Needs (ICCHP). The 2020 ICCHP proceedings include a section on XR and accessibility, with several articles covering a wide range of subjects, from vocational training for students with disabilities and AR for people with low vision to guidelines for inclusive avatars, and more.

Unity UI Accessibility Plugin

image with the following text: UAP make your game accessible to visually impaired players

Link to Store
Development, UX/UI 

This is a plugin offered on the Unity Asset Store that makes the UI for a Unity project accessible to blind and visually impaired players with just a few clicks.


Development

Introducing the Accessibility VRCs

Link to Article
Developers, Oculus, Game Development

This is Oculus’ guide for developers on how to create with accessibility in mind. The Accessibility VRCs (Virtual Reality Check guidelines) focus on audio, visuals, interactions, locomotion/movement, and other aspects of accessible design. By applying these guidelines, Oculus ensures that every application officially available on its platform meets certain accessibility requirements, which can make the platform usable for more people. 
Link to the VRC Webpage: https://developer.oculus.com/resources/publish-quest-req/

Initiative aims to make virtual, augmented, and mixed reality accessible

Link to Article
Development

This article links to a webinar about a new initiative to make XR accessible to more people. Larry Goldberg, Senior Director and Head of Accessibility at Verizon Media, discusses emerging technologies and how his company deals with this technological growth. The webinar highlights how existing technologies can serve as a jumping-off point for creating new technologies that are accessible from the beginning, or, as Goldberg puts it, “born accessible.”

W3C Accessibility User Requirements

Link to W3C
Development, UI/UX

This guide from the World Wide Web Consortium provides a plethora of technical guidelines and considerations for developing accessible products.

XRA’s Developer Guide: Accessibility & Inclusive Design in Immersive Experiences

XRA’S DEVELOPERS GUIDE, CHAPTER THREE: Accessibility & Inclusive Design in Immersive Experiences

Link to Guide
Development, UI/UX

The XRA’s (XR Association’s) developer guide serves as a starter resource for developers looking to create XR experiences. The guide offers a series of industry-backed best practices for developing accessible platforms.

Oculus’ Designing Accessible VR

Link to Guide
Development, Production

This is Oculus’ guide for those wishing to develop accessible content for their platform. They note the importance of accessibility as it pertains to widening the potential customer base.

Accessible Mixed Reality

Link to Webpage
Development, News

This is Microsoft’s project that considers how to design mixed reality technologies in a way that makes them usable and useful to people of all abilities. This webpage links to those involved with the project, publications, and other news surrounding their efforts.


Software

WalkinVR Add-on Makes VR More Accessible to Disabled Gamers

image of a woman in a wheelchair using a vr headset with her arm outstretched

Link to Article
Software, Gaming, Accessibility Settings

A custom locomotion driver for SteamVR applications introduces four new features for those with disabilities. The four features – virtual move, motion range boost, hand tracking, and Xbox controller move – can be adjusted to an individual user’s needs on the fly. 

  • Virtual move allows players to use their controllers’ joystick to move, rather than having to physically move their arms.
  • Motion range boost changes the origin point of motion controllers to amplify movement, translating a small real movement into a larger virtual one (see the sketch after this list). 
  • Hand tracking allows the position of motion controllers to be emulated based on hand movements rather than having to use actual controllers. 
  • Xbox controller move allows users to use a gamepad to emulate VR controller inputs. 
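The motion range boost feature is essentially a scaling of controller displacement about a chosen origin. The WalkinVR driver's internal math is not published in the article, so the following is only a conceptual sketch of that idea.

```typescript
// Rough sketch of the "motion range boost" idea described above: scale a
// controller's displacement from a chosen origin so a small real movement
// becomes a larger virtual one. Illustration only, not the driver's code.

type Vec3 = { x: number; y: number; z: number };

function boostMotion(raw: Vec3, origin: Vec3, gain: number): Vec3 {
  return {
    x: origin.x + (raw.x - origin.x) * gain,
    y: origin.y + (raw.y - origin.y) * gain,
    z: origin.z + (raw.z - origin.z) * gain,
  };
}

// A 10 cm reach with gain 3 reads to the application as roughly a 30 cm reach.
const virtual = boostMotion({ x: 0.1, y: 1.2, z: 0 }, { x: 0, y: 1.2, z: 0 }, 3);
console.log(virtual); // x ≈ 0.3, y = 1.2, z = 0
```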

This driver is free to download and works only with SteamVR headsets and applications. You must also have a Steam account to download it.
Link to the Steam store: https://store.steampowered.com/app/1248360/WalkinVR/

Using AI, people who are blind are able to find familiar faces in a room

man holding a laptop with a camera attached pointing at a woman

Link to Article
Microsoft, Developers, Software, HoloLens, AI

Project Tokyo is a Microsoft initiative that aims to help members of the blind and low-vision community with intelligent personal agent technology that leverages AI to extend their capabilities. The long-term goal of the project is to show that this XR technology can be used by anyone and can even assist those with disabilities. The focus is to create a way for those who are blind or have low vision to see the world, or at least perceive it in a way similar to how sighted people do.

The article provides several examples. For instance, it demonstrates the device’s ability to notify a user that someone is looking at them; if the wearer turns in the direction of another person, the AI can identify that person’s name for the wearer. “Whenever I am in a situation with more than two or three people, especially if I don’t know some of them, it becomes exponentially more difficult to deal with because people use more and more eye contact and body language to signal that they want to talk to such-and-such a person, that they want to speak now,” one individual working on the project said. “It is really very difficult as a blind person.” Social cues, whether conveyed verbally or physically, are essential for interaction. Rather than starting from scratch, the team is using a modified Microsoft HoloLens, since the HoloLens provides the AI with the information it needs to read the environment.


Hardware

Accessibility, Disabilities, and Virtual Reality Solutions

an image of the Microsoft canetroller with its parts labeled: brake, slider, tracker, voice coil, controller

Link to Article
Education, Healthcare, Assistive Hardware

Accessibility is a major priority for those in education fields. Approximately 15% of the world’s population has some form of disability, and one in four adults in the US has a disability that affects “major life activities.” As VR evolves, it provides a whole new range of opportunities and experiences for many people. For example, many visually impaired users can actually see better in VR because of the depth perception headsets provide. Moving forward, VR creators should consider the wide-ranging needs of users from the beginning of the development process.

Microsoft has developed several XR products with accessibility in mind:

  • Canetroller [Link] – The Canetroller, a Microsoft-patented haptic device, works as a white cane that visually impaired people can use to experience a virtual environment. 
  • SeeingVR [Link] – SeeingVR is a set of tools that make VR more accessible to those with low vision. The tools include a magnification lens, a bifocal lens, a brightness lens, a contrast lens, edge enhancement, peripheral remapping, text augmentation, text to speech, depth measurement, and more (a rough sketch of one such lens follows this list).
  • Braille Controller [Link] – The Microsoft-patented, braille-displaying controller attaches to the back of an Xbox controller, allowing for an alternative way for the visually impaired to experience games. The inspiration for this particular project was to make text-heavy video games more accessible to the visually impaired.
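As a purely illustrative aside, a “contrast lens” of the kind SeeingVR describes can be thought of as stretching pixel values away from mid-gray. The sketch below shows that general idea applied to a raw RGBA buffer; it is an assumption about the technique, not SeeingVR's actual code.

```typescript
// Illustrative sketch only: one way a "contrast lens" could work at the
// pixel level, by stretching channel values away from mid-gray (128).

// channel: 0-255 color value; factor > 1 increases contrast.
function contrastChannel(channel: number, factor: number): number {
  const stretched = (channel - 128) * factor + 128;
  return Math.min(255, Math.max(0, Math.round(stretched)));
}

// Apply to an RGBA pixel buffer (e.g., a captured frame), leaving alpha alone.
function contrastLens(pixels: Uint8ClampedArray, factor: number): void {
  for (let i = 0; i < pixels.length; i += 4) {
    pixels[i] = contrastChannel(pixels[i], factor);         // R
    pixels[i + 1] = contrastChannel(pixels[i + 1], factor); // G
    pixels[i + 2] = contrastChannel(pixels[i + 2], factor); // B
  }
}
```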

Hospitals are beginning to use VR to find new ways of relieving pain and offering palliative care to patients. While no existing technology can restore someone’s sight, tools such as the IrisVision [https://irisvision.com/] can assist those living with vision impairments by providing vision-aid features, a personal voice-command assistant, a text-to-speech reader, and high-contrast fonts. AR is also being studied to determine whether such devices could help those who suffer from age-related macular degeneration.

The article also links to a variety of informational videos and links to accessibility groups and associations.

Inclusivity of VR and AR Accessibility for the Visually and Hearing Impaired

image from the London National Theatre with the following text: Just Enjoy Cinema: diverse audio versions and subtitles absolutely wherever you want - simply from your own smart device

Link to Article
Assistive Hardware

There are a plethora of companies working on applications to enhance the experiences of differently abled users, and this article highlights a small sample of those projects. Microsoft has created the “canetroller,” which allows a blind or visually impaired person to access virtual reality through a controller that resembles a white cane and uses haptic and audio feedback. Nearsighted VR Augmented Aid is an Android application that uses a mobile device’s camera to display images in stereoscopic view. London’s National Theatre did something similar with the help of Epson’s latest smart glasses, which display subtitles in the user’s field of vision so that even if a viewer looks away, they can still see the subtitles. Many more projects are linked in the article. 

Ayiana Crabtree

Karp Library Fellow, XR Research

The Future of VR: Haptic Immersion vs Full Dive

While the concept of virtual reality (VR) is not new, VR in practice has only become ubiquitous in recent years. Due to an increase in media exposure, new technology developments, and an explosion of use cases, VR is swiftly becoming an in-demand medium for a wide variety of users. This article will give you a look into the past, the present, and the future of VR, using the novels Ready Player One and Ready Player Two by Ernest Cline as a framework.

Let’s start out with a definition. Virtual reality, or VR, is an immersive experience, also known as a computer-simulated reality. Wearing a VR headset, the user is immersed in an experience whose images and sounds can either replicate the real world or create an imaginary one. Popular examples include the Oculus line of headsets from Facebook (now Meta), the Quest and the Rift. Other common VR platforms include the HTC Vive, the Valve Index, and the Pico VR.

Did you know that you can try out an Oculus Quest headset in Studio X and even check one out for a few days?!

A Brief History of VR


Haptic Immersion vs Full Dive

The Ready Player duology, consisting of Ready Player One (2011) and Ready Player Two (2020), is one of the most mainstream works to discuss VR technology in great depth. The novels tell an exciting story through the eyes of Wade Watts, a teen living in the year 2045 on a dystopian future Earth. He and many others use the OASIS, a virtually simulated utopia, to escape from a boring, impoverished life and the declining state of the planet. In the novels, the OASIS is used to play games, watch videos, hang out with friends, attend class, go to work, and much more, mirroring how VR is used in real life. The first novel focuses mainly on haptic immersion technology, while the second novel shifts to full dive technology. As the books are set in a not-so-distant future centered on emerging technology we are beginning to see, they provide an interesting framework for considering the potential impact of this technology.

Let’s talk more in-depth about two main categories for this technology:

Haptic Immersion

Haptic immersion is the branch of VR that relies heavily on haptic sensory technology to give the user a physical sensation, such as a vibration, while experiencing VR. By using haptic gloves, and in some cases haptic vests or suits, this branch of VR lets a user put on a headset and experience VR with more realism. Haptic immersion also relies heavily on the physical motions a player makes, sometimes incorporating a treadmill or slide-pad (a slippery surface to run on with low-traction shoes).

The haptic sensory technology gives the user a new dimension of sensory input from the games they are playing. Some simple applications of haptics might be letting the user feel as if they are actually holding objects or enabling more player interaction, such as tapping another avatar on the shoulder. Imagine you are playing a first-person shooter and are attacked by an enemy from behind. If you are playing on a computer, the damage would typically be shown around the center-screen crosshair, with the direction indicated by an arrow or line. With haptic immersion, however, you would feel the exact spot where the damage is dealt, letting you react more quickly to attacks and providing a more immersive experience.
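To make that concrete, here is a hedged sketch of how a game might map the direction of an incoming attack to a region on a haptic vest. The eight-region layout and function names are assumptions for illustration; real haptic SDKs expose their own APIs for this.

```typescript
// Hedged sketch: convert the direction an attack came from (relative to where
// the player is facing) into a haptic region on a vest. The 8-way layout and
// region names are illustrative assumptions only.

const regions = [
  "front", "front-right", "right", "back-right",
  "back", "back-left", "left", "front-left",
] as const;

// attackYaw and playerYaw in radians, measured around the vertical axis.
function hapticRegion(attackYaw: number, playerYaw: number): string {
  // Angle of the attack relative to the player's facing, normalized to [0, 2π).
  let rel = (attackYaw - playerYaw) % (2 * Math.PI);
  if (rel < 0) rel += 2 * Math.PI;
  // Split the circle into 8 sectors of 45° each, centered on the labels above.
  const sector = Math.round(rel / (Math.PI / 4)) % 8;
  return regions[sector];
}

// Example: an attack from directly behind the player triggers the "back" region.
console.log(hapticRegion(Math.PI, 0)); // "back"
```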

The first description of haptic immersion we see in Ready Player One is most similar to the technologies we currently have available today.

“The wireless one-size-fits-all OASIS visor was slightly larger than a pair of sunglasses. It used harmless low-powered lasers to draw the stunningly real environment of the OASIS right onto its wearer's retinas, completely immersing their entire field of vision in the online world. The visor was light-years ahead of the clunky virtual-reality goggles available prior to that time, and it represented a paradigm shift in virtual-reality technology-as did the lightweight OASIS haptic gloves, which allowed users to directly control the hands of their avatar and to interact with their simulated environment as if they were actually inside it. When you picked up objects, opened doors, or operated vehicles, the haptic gloves made you feel these nonexistent objects and surfaces as if they were really right there in front of you. The gloves let you, as the television ads put it, ‘reach in and touch the OASIS.’ Working together, the visor and the gloves made entering the OASIS an experience unlike anything else available, and once people got a taste of it, there was no going back.”

-Ernest Cline, Ready Player One, Page 58

Then, we have descriptions of things that still lie in our future.

“I spent the majority of my time in my Shaptic Technologies HC5000 fully adjustable haptic chair. It was suspended by two jointed robotic arms anchored to my apartment's walls and ceiling. These arms could rotate the chair on all four axes, so when I was strapped in to it, the unit could flip, spin, or shake my body to create the sensation that I was falling, flying, or sitting behind the wheel of a nuclear-powered rocket sled hurtling at Mach 2 through a canyon on the fourth moon of Altair VI. The chair worked in conjunction with my Shaptic Bootsuit, a full- body haptic feedback suit […] The outside of the suit was covered with an elaborate exoskeleton, a network of artificial tendons and joints that could both sense and inhibit my movements. Built into the inside of the suit was a weblike network of miniature actuators that made contact with my skin every few centimeters. These could be activated in small or large groups for the purpose of tactile simulation--to make my skin feel things that weren't really there. They could convincingly simulate the sensation of a tap on the shoulder, a kick to the shin, or a gunshot in the chest. (Built-in safety software prevented my rig from actually causing me any physical harm, so a simulated gunshot actually felt more like a weak punch.) I had an identical backup suit hanging in the MoshWash cleaning unit in the corner of the room. These two haptic suits made up my entire wardrobe. My old street clothes were buried somewhere in the closet, collecting dust. On my hands, I wore a pair of state-of-the-art Okagami IdleHands haptic datagloves. Special tactile feedback pads covered both palms, allowing the gloves to create the illusion that I was touching objects and surfaces that didn't actually exist.”

-Ernest Cline, Ready Player One, Pages 191-192

As mentioned before, there is nothing quite like this on the market yet, and much of what comes close is not readily available for mass consumption, either because of high prices or because production is focused mainly on high-end business use. The cost of producing high-end haptic technology puts it well out of the price range of the average consumer. One example of this comes from the company HaptX.

HaptX specializes in industrial-grade haptic technology and is far ahead of other companies thanks to its patented microfluidic systems. With a combination of microfluidic skin, force-feedback exoskeletons, magnetic motion tracking, and powerful pneumatics, it is arguably the only company currently capable of providing true haptics (for now!). The company’s main focus is its haptic gloves, which are used in a variety of industries, from aircraft manufacturing to firefighting. While these gloves are an impressive technological achievement, most of the other technologies mentioned in the novel still lie in the future for the real world.

Image of HaptX Glove on left-hand side. Text in Right bottom corner that reads HaptX Gloves DK2 True-contact haptics for virtual reality and robotics.

While we don’t yet have any high-end haptic chairs suspended from the ceiling like the one in Ready Player One, some basic haptic suits are on the market. The current main provider of haptic suits is bHaptics, which offers a range of haptic technologies: vests with varying numbers of feedback points, haptic sleeves, haptics for VR HMDs, and haptics for hands (not gloves) and feet. And they’re pretty pricey. The “cheapest” haptic vest, with only 16 feedback points, comes in at $299, and the most expensive, with PC compatibility and 40 feedback points, is $549. If you’re interested in the full experience, you’re looking at a price of about $1,400+, and that still isn’t a proper full-body haptic suit.

Another company, TESLASUIT (not affiliated with Tesla), has developed a full-body haptic suit, though it is only for industry use. Considering the cost and inaccessibility of these technologies, it’s not hard to see why discussions about haptics haven’t broken the internet yet.

Image of man wearing a haptic vest from bHaptics called the TactSuit
bHaptics TactSuit

Haptic immersion uses these technologies in coordination with VR headsets to give users a more physically engaging VR experience. Some future applications could include military training or physical therapy. Military training in VR might expose soldiers to more dangerous scenarios without the worry of facing actual danger; the haptic technologies would let them feel the consequences of their actions without the setback of a lasting injury. Physical therapy in VR would allow for a more engaging experience, for example by letting the patient choose a fun environment that transports them away from the doctor’s office for the duration of their visit. If haptic suits could be combined with some form of hard robotic exoskeleton, physical therapy could become more exciting and engaging still: the exoskeleton could assist with stretches a patient cannot complete on their own, potentially helping them toward a quicker recovery. These possibilities give this branch of VR a future that may change the way we approach physical activity and training.

Full Dive

Full dive is a branch of VR that has had far more media exposure. At its core, full dive is VR for your mind. It relies primarily on a brain-computer interface (BCI) to allow a user to control the virtual world with their thoughts. The technology is still in the research stage, and no true full-dive system exists yet. In theory, a full dive would let a user put on a VR headset with BCI capabilities that intercepts their brain signals, allowing them to become their virtual avatar inside the simulation. Imagine the possibilities this technology could present: reduced travel costs, more immersive educational experiences, the ability to visit any place or time period, or working a job you would never be able to do otherwise. Ernest Cline engages with this concept in his book Ready Player Two:

“The device had a segmented central spine that appeared to stretch from a wearer's forehead to the nape of their neck, with a row of ten C-shaped metal bands attached to it. Each band was comprised of jointed, retractable segments, and each segment had a row of circular sensor pads on its underside. This made the whole sensor array adjustable, so that it could fit around heads of all shapes and sizes. A long fiber-optic cable stretched from the base of the headset, with a standard OASIS console plug at the end of it. […] ‘The device you now hold in your hands is an OASIS Neural Interface, or ONI.’ He pronounced it Oh-En-Eye. ‘It is the world's first fully functional noninvasive brain-computer interface. It allows an OASIS user to see, hear, smell, taste, and feel their avatar's virtual environment, via signals transmitted directly into their cerebral cortex. The headset's sensor array also monitors and interprets its wearer's brain activity, allowing them to control their OASIS avatar just as they do their physical body--simply by thinking about it’”

-Ernest Cline, Ready Player Two, Pages 5-6

The term “full dive” was coined in the anime franchise Sword Art Online, though the concept reached mass media much earlier in The Matrix, released in 1999. The Matrix revolves around the main character, Neo, learning that the entire world is a simulation created to keep humans complacent while AI harvests their bodies for energy. In Sword Art Online, the main character and all the other players are trapped in a virtual MMO (massively multiplayer online game) by the creator of the device, who removed the log-out option; the only way to escape the simulation is to beat the game.

“Its telescoping bands retracted automatically, pressing the array of sensor and transmitter pads mounted on them firmly against the unique contours of my cranium. Then its metal joints tightened up and the whole spiderlike device locked itself onto my skull so that its pads couldn't be jostled or removed while the device was interfacing with my brain. According to the ONI documentation, forcibly removing the headset while it was in operation could severely damage the wearer's brain and/or leave them in a permanent coma. So the titanium-reinforced safety bands made certain this couldn't happen. I found this little detail comforting instead of unsettling. Riding in an automobile was risky, too, if you didn't wear your seatbelt … The ONI documentation also noted that a sudden power loss to the headset could also cause potential harm to the wearer's brain, which was why it had an internal backup battery that could power the device long enough to complete an emergency logout sequence and safely awaken the wearer from the artificially induced sleeplike state it placed them in while the headset was in use.”

-Ernest Cline, Ready Player Two, Page 9

It is this aspect of the technology that Sword Art Online, The Matrix, and Ready Player Two all highlight over the course of their respective narratives, and it is what makes some people (myself included) skeptical about the progression toward this technology. These stories show the dangers the technology could pose if a maniacal creator or AI were to take advantage of the BCI to trap users inside the simulation.

The potential ethical issues that arise with any type of VR, especially BCI-enhanced VR, are innumerable. When approaching full dive, and really any technology, both creators and users must keep in mind the potential negative outcomes that could follow from its creation. The application of full dive to accessibility research is presented at the beginning of Ready Player Two.

"A few months after GSS launched the OASIS, Halliday set up an R&D division at the company called the Accessibility Research Lab. Ostensibly, its mission had been to create a line of neuroprosthetic hardware that would allow people with severe physical disabilities to use the OASIS more easily. Halliday hired the best and brightest minds in the field of neuroscience to staff the ARL, then he gave them all the funding they would ever need to conduct their research. The ARI's work over the next few decades was certainly no secret. To the contrary, their breakthroughs had created a new line of medical implants that became widely used. I’d read about several of them in my high school textbooks. First, they developed a new type of cochlear implant that- for those who chose to use it--allowed the hearing impaired to perceive sound with perfect clarity, both in the real world and inside the OASIS. A few years later, they unveiled a new retinal implant that allowed any blind people who wished to be sighted to "see" perfectly inside the OASIS. And by linking two head-mounted mini cameras to the same implant, their real-world sight could be restored as well. The ARI's next invention was a brain implant that allowed paraplegics to control the movements of their OASIS avatar simply by thinking about it. It worked in conjunction with a separate implant that allowed them to feel simulated sensory input. And the very same implants gave these individuals the ability to regain control of their lower extremities while restoring their sense of touch. They also allowed amputees to control robotic replacement limbs, and to receive sensory input through them as well. To accomplish this, the researchers devised a method of "recording" the sensory information transmitted to the human brain by the nervous system in reaction to all manner of external stimuli, then compiled these assets into a massive digital library of sensations that could be "played back" inside the OASIS to perfectly simulate anything a person could experience through their senses of touch, taste, sight, smell, balance, temperature, vibration--you name it."

-Ernest Cline, Ready Player Two, Pages 15-16

Being able to give people with disabilities the ability to participate in VR would be revolutionary, as the industry today still struggles to create inclusive designs. With full dive and BCIs, there would be little to nothing preventing anyone from experiencing the technology. Despite these positive applications, it is still important to think about the negative side effects. BCIs come in direct contact with a user’s brain, giving the device the ability not only to gather all sorts of information about the user but also, potentially, to take control of their mind, a prospect most people find frightening.

The closest thing we have to this technology so far is the NextMind BCI. While this device is mainly for developers at this time, it comes with some sample games, several for the computer and one for VR. The demos show off the capabilities of the NextMind: a user can move simple components of these games with their mind. One example lets you control the paddle in a game of Pong; another lets you push obstacles away from your character in all directions. These simple applications are nowhere near as advanced as what the media predicts, but they are a stepping stone toward what this technology could provide.
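For a sense of how such a control loop might be structured, here is a purely conceptual sketch in which a decoded “focus” signal nudges a Pong paddle. It is not based on NextMind's SDK; the decoder output, confidence threshold, and normalized playfield are all assumptions for illustration.

```typescript
// Purely conceptual sketch (not NextMind's SDK): a decoded "focus" signal
// from a BCI driving something as simple as a Pong paddle.

interface DecodedIntent {
  target: "left" | "right" | "none"; // which on-screen trigger the user focuses on
  confidence: number;                // 0..1, how sure the decoder is
}

function updatePaddle(paddleX: number, intent: DecodedIntent, speed = 0.02): number {
  // Ignore low-confidence decodes so noise doesn't jitter the paddle.
  if (intent.confidence < 0.6 || intent.target === "none") return paddleX;
  const direction = intent.target === "left" ? -1 : 1;
  // Clamp to the playfield, here normalized to [0, 1].
  return Math.min(1, Math.max(0, paddleX + direction * speed));
}

// Example frame: a confident "right" decode nudges the paddle rightward.
console.log(updatePaddle(0.5, { target: "right", confidence: 0.9 })); // 0.52
```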

“My vision went black for a moment as the headset instructed my brain to place my body into a harmless sleeplike state, while my conscious mind remained active inside what was basically a computer-controlled lucid dream. Then the OASIS slowly materialized into existence all around me, and I found myself standing back inside Anorak's study, where I’d last logged out. Everything looked the same as before, but it felt completely different. I was actually here, physically inside the OASIS. It no longer felt like I was using an avatar. Now I felt like I was my avatar. There was no visor on my face, none of the faint numbness and constriction you always felt wearing a haptic suit or gloves. I didn't even feel the ONI headset my real body was actually wearing. When I reached up to scratch my head, the device wasn't there.”

-Ernest Cline, Ready Player Two, Page 12

Another example of a BCI currently in development is Neuralink, a neurotechnology company founded by Elon Musk in 2016. Its main goal is to create an implantable BCI that will help people with paralysis, and the company hopes one day to create a full dive VR system to better the lives of people living with disabilities.

In February 2021, Musk released a recording of a monkey playing video games with its mind using a Neuralink computer chip in its skull. Musk claims that one day Neuralink could allow humans to send concepts to one another telepathically, or even allow people to exist in what he calls a “save state,” meaning that after they die, their consciousness could be transferred to a robot or another human.

While the concept of full dive is a lot scarier than that of haptic immersion, its applications are endless. One potential use case is for those who are physically disabled, as the makers of Neuralink suggest. Most VR experiences today are not very accessible to those who do not have full control over their body’s range of motion. With full dive, however, there is no need for physical movement at all, potentially making it the most accessible form of the technology. Another ethical concern for full dive VR is the data it collects, a major issue for all the technology we use, and one that is especially concerning here because of the direct connection to a user’s neural activity.

Conclusion

Image of the ready player one book and two oculus quest controllers set on top of another open book

Although full dive has a lot more controversy surrounding it due to its depiction in mass media, both full dive and haptic immersion have their benefits and drawbacks.

Haptic immersion allows for a more realistic physical experience while also allowing the user to be immersed in other worlds. The user can run, jump, walk, and feel every sensation, every punch, every tap on the shoulder. Full dive connects directly with the user’s mind, making for a more out-of-this-world experience. The user doesn’t need to move a single muscle to experience the simulation, and depending on the programming, they could enable or disable various sensations.

Haptic immersion allows for applications in physical therapy, strength training, and virtual sports, while full dive would allow disabled users to experience VR without the limitations they face in the real world.

Haptic immersion is most likely the first of the two technologies to become mainstream, given the wide variety of resources already available, and it will likely remain the one that is most readily accessible and accepted by society. Full dive will provide a far more novel experience for users but will ultimately face the most criticism, and rightfully so.

Personal Take

I am personally very excited to see where these two technologies go as we progress even further in innovation. The two paths are unique in their own way, and I look forward to watching as the paths diverge even further. Of the two, I am more interested in the haptics side of things, due to all of the negative media surrounding the concept of full dive. I’m not quite ready to give up my consciousness to a machine, but maybe by the time we get there, my opinion will have changed.

At the end of the day, knowing the difference between these two types of VR is important as the technology progresses into the future.

Ayiana Crabtree

Karp Library Fellow, XR Research

Anxiety Cues Found in the Brain Despite Safe Environment

3D nature scene. Shows a field, a wide sky, and a mountain in the background.

Imagine you are in a meadow picking flowers. You know that some flowers are safe, while others have a bee inside that will sting you. How would you react to this environment and, more importantly, how would your brain react? This is the scene in a virtual-reality environment used by researchers to understand the impact anxiety has on the brain and how brain regions interact with one another to shape behavior.

“These findings tell us that anxiety disorders might be more than a lack of awareness of the environment or ignorance of safety, but rather that individuals suffering from an anxiety disorder cannot control their feelings and behavior even if they wanted to,” said Benjamin Suarez-Jimenez, Ph.D., assistant professor in the Del Monte Institute for Neuroscience at the University of Rochester and first author of the study published in Communications Biology. “The patients with an anxiety disorder could rationally say – I’m in a safe space – but we found their brain was behaving as if it was not.”

Read more.