New grant will use virtual reality to understand trauma and the brain

A ball of energy with electricity beaming all over the place.

Understanding how experience and exposure to trauma changes the brain could improve diagnosis and targeted care for conditions like anxiety and post-traumatic stress disorder (PTSD). Benjamin Suarez-Jimenez, Ph.D., assistant professor of Neuroscience, has been studying this topic for the past several years and was awarded a new $3.5 million grant to use virtual reality and MRI to look into the circuitry of threat, reward, and cognitive mapping in PTSD, trauma, and resilience.

For the next five years, this funding from the National Institute of Mental Health will allow the ZVR lab to build upon work that investigates brain areas that build spatial maps, specifically to discriminate between areas of an environment associated with emotions. Suarez-Jimenez’s most recent research identified changes in the salience network – a mechanism in the brain used for learning and survival – in people exposed to trauma (with and without psychopathologies, including PTSD, depression, and anxiety). His prior research revealed that people with anxiety have increased insula and dorsomedial prefrontal cortex activation – indicating their brains were associating a known safe area with danger or threat.

“The project this R01 will support will probe whether the neural processes we have identified in the past are specific to threat or if they expand to reward processing,” Suarez-Jimenez said. “We are also looking at how attention allocation to some visual cues of the virtual reality tasks changes from pre- to post-task experience. We are hoping that understanding these brain processes can help us identify better ways to diagnose PTSD and to improve treatment.”

Suarez-Jimenez came to the University in January 2021. He is an active member of the Neuroscience Diversity Commission and has served as a mentor for the NEUROCITY program.

Learn more.

Seed funding reflects how data science, AR/VR transform research at Rochester

professor Mujdat Cetin standing in front of Wegmans Hall.

The University’s Goergen Institute for Data Science supports collaborative projects across all disciplines.

professor Mujdat Cetin standing in front of Wegmans Hall.
“I’m very excited about the wide range of collaborative projects we are able to support this year,” says Mujdat Cetin, the Robin and Tim Wentworth Director of the Goergen Institute for Data Science. “These projects tackle important and timely problems on data science methods and applications, and I am confident they will lead to significant research contributions and attract external funding.” (University of Rochester photo / Bob Marcotte)

Ten projects supported with seed funding from the Goergen Institute for Data Science this year demonstrate how machine learning, artificial intelligence (AI), and augmented and virtual reality (AR/VR) are transforming the way University of Rochester researchers—across all disciplines—address challenging problems.

“I’m very excited about the wide range of collaborative projects we are able to support this year,” says Mujdat Cetin, the Robin and Tim Wentworth Director of the institute. “These projects tackle important and timely problems on data science methods and applications, and I am confident they will lead to significant research contributions and attract external funding.”

The awards, approximately $20,000 each, help researchers generate sufficient proof-of-concept findings to then attract major external funding.

This year’s projects involve collaborations among engineers, computer scientists, a historian, a biostatistician, and experts in brain and cognitive sciences, earth and environmental science, and palliative care. Their projects include a totally new kind of computing platform, new virtual reality technologies to improve doctor-patient conversations and help people overcome color vision deficiency, and machine learning techniques to make it easier for people to add music to their videos and to enhance AR/VR immersive experiences based on the unique geometry of each user’s anatomy.

The 2022–23 funded projects and their principal investigators are:

  • Ising Boltzmann Substrate for Energy-Based Models
    Co-PIs: Michael Huang, professor of electrical and computer engineering and of computer science, and Gonzalo Mateos, associate professor of electrical and computer engineering and of computer science and the Asaro Biggar Family Fellow in Data Science
  • A Data-Driven, Virtual Reality-based Approach to Enhance Deficient Color Vision
    Co-PIs: Yuhao Zhu, assistant professor of computer science, and Gaurav Sharma, professor of electrical and computer engineering, of computer science, and of biostatistics and computational biology
  • Audiovisual Integration in Virtual Reality Renderings of Real Physical Spaces
    Co-PIs: Duje Tadin, professor and chair of brain and cognitive sciences and professor of ophthalmology and of neuroscience; Ming-Lun Lee, associate professor of electrical and computer engineering; and Michael Jarvis, associate professor of history
  • Personalized Immersive Spatial Audio with Physics Informed Neural Field
    Co-PIs: Zhiyao Duan, associate professor of electrical and computer engineering and of computer science, and Mark Bocko, Distinguished Professor of Electrical and Computer Engineering and professor of physics and astronomy
  • Computational Earth Imaging with Machine Learning
    Co-PIs: Tolulope Olugboji, assistant professor of earth and environmental sciences, and Mujdat Cetin, professor of electrical and computer engineering and of computer science, and the Robin and Tim Wentworth Director of the Goergen Institute for Data Science
  • Improving Deconvolution Estimates through Bayesian Shrinkage
    PI: Matthew McCall, associate professor of biostatistics
  • Building a Multi-Step Commonsense Reasoning System for Story Understanding
    Co-PIs: Zhen Bai, assistant professor of computer science, and Lenhart Schubert, professor of computer science
  • Versatile and Customizable Virtual Patients to Improve Doctor-Patient Communication
    Co-PIs: Ehsan Hoque, associate professor of computer science, and Ronald Epstein, professor of family medicine and palliative care
  • Machine Learning Assisted Femtosecond Laser Fabrication of Efficient Solar Absorbers
    Co-PIs: Chunlei Guo, professor of optics, and Jiebo Luo, Albert Arendt Hopeman Professor of Engineering
  • Rhythm-Aware and Emotion-Aware Video Background Music Generation
    PI: Jiebo Luo, Albert Arendt Hopeman Professor of Engineering

Read the full story.

XR Research in the Summer

photogrammetry model of the mural in Kodak Hall.

There is a strong emphasis on fostering cross-disciplinary collaboration in extended reality (XR) at Studio X. Over 50 researchers across the UR use XR technology in their research and teaching, and many come to Studio X for consultation and advice on program development or engineering. As an XR Specialist at Studio X, I got the opportunity to work on two XR-related research projects this past summer, one in collaboration with the Brain and Cognitive Science Department (BCS) and the other with the Computer Science Department (CS). These projects were supported by a Discover Grant from the Office of Undergraduate Research, which supports immersive, full-time summer research experiences for undergraduate students at the UR.

The research with BCS involved digitizing Kodak Hall at the Eastman School of Music and bringing it into VR. The result will be used to provide a more realistic environment for user testing, to better study how humans combine and process light and sound. The visit to Kodak Hall was scheduled back in March. Much preparation went into the visit, including figuring out the power supply and cable management, stage arrangement, clearance, etc. We also discussed which techniques would be used to scan and capture the hall. Three object-scanning techniques were tested before and during the visit: photogrammetry, 360-image, and time-of-flight (ToF).

Photogrammetry creates 3D models of physical objects by processing photographic images or video recordings. By taking images of an object from many different angles and processing them with software like Agisoft Metashape, the algorithm can locate and map key points across multiple images and combine them into a 3D model. I first learned about this technique by attending a photogrammetry workshop at Studio X led by Professor Michael Jarvis. This technique has been very helpful for the research, since we were able to capture great detail on the mural in Kodak Hall, where other techniques had failed.
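
The geometric idea underneath photogrammetry is triangulation: the same point seen from two known camera positions shifts by a disparity that encodes its depth. The toy sketch below (simple numbers, nothing from Metashape's actual pipeline) illustrates the principle with two idealized pinhole cameras a known baseline apart:

```python
# Triangulation principle behind photogrammetry (toy example):
# two cameras a known baseline apart see the same point at slightly
# different image positions; the disparity between those positions
# determines the point's depth.

def project(point_x, point_z, cam_x, focal_px):
    """Project a point at (point_x, point_z) onto a camera at (cam_x, 0)."""
    return focal_px * (point_x - cam_x) / point_z

def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Recover depth from the disparity between the two projections."""
    return focal_px * baseline_m / disparity_px

focal = 1000.0       # focal length in pixels
baseline = 0.5       # distance between the two camera positions (meters)
point = (2.0, 10.0)  # a point on the mural: 2 m to the right, 10 m away

u_left = project(point[0], point[1], cam_x=0.0, focal_px=focal)
u_right = project(point[0], point[1], cam_x=baseline, focal_px=focal)

print(depth_from_disparity(focal, baseline, u_left - u_right))  # 10.0
```

Real photogrammetry software repeats this for thousands of matched key points across dozens of photos, which is why shooting the mural from many overlapping angles mattered.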

photogrammetry model of the mural in Kodak Hall.
Photogrammetry model of the mural in Kodak Hall

360-image, as its name suggests, is a 360-degree panoramic image taken from a fixed location. With the Insta360 camera borrowed from Studio X, a capture session requires almost no setup and can be quickly previewed using the app on a phone or smart device.

360 image of Kodak Hall, captured from the stage.
360 image of Kodak Hall, captured from the stage

The time-of-flight (ToF) technique emits light and measures the time it takes for the light to reflect back in order to recover depth information. Hardware using the ToF technique is easily found on modern devices, such as iPhones and iPads with Face ID. I tested the ToF scanner on the iPad Pro at Studio X. It provides a great sense of spatial orientation and has a fairly short processing time.
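
The depth math behind ToF is compact: the measured interval covers the trip to the surface and back, so depth is half the round-trip distance. A minimal sketch:

```python
# Time-of-flight depth: depth = (speed of light x round-trip time) / 2,
# halved because the pulse travels to the surface and back.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_depth(round_trip_seconds):
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A surface about 3 m away returns the pulse in roughly 20 nanoseconds:
print(tof_depth(20e-9))  # ~3.0 (meters)
```

The nanosecond scale of these intervals is why ToF needs dedicated sensor hardware rather than an ordinary camera.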

3D capture of Studio X from an iPad Pro.

We used the Faro Laser Scanner to get a scan with higher accuracy and resolution. Each scan took 20 minutes, and we conducted 8 scans to cover the entire hall. The result was a 20+ GB model with billions of points. To load the scene onto the Meta Quest 2 VR headset, we shrank the size and resolution of the model dramatically using tools such as gradual selection, adjusting the Poisson distribution, material paint, etc. We also deleted excess points and replaced flat surfaces, such as the stage and mural, with better-quality images. The end result is a nice-looking model with decent detail at around 250 MB, light enough for the headset to run.
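
One standard way to decimate a huge point cloud like this (a simplified stand-in for the tools named above, not the exact workflow we used) is voxel-grid downsampling: snap points into a coarse 3D grid and keep one averaged point per occupied cell, so dense regions collapse while coverage is preserved:

```python
# Voxel-grid downsampling: bucket points into cubic cells of side
# voxel_size and replace each occupied cell with the average of its points.
from collections import defaultdict

def voxel_downsample(points, voxel_size):
    cells = defaultdict(list)
    for x, y, z in points:
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        cells[key].append((x, y, z))
    # One representative (centroid) point per occupied voxel.
    return [
        tuple(sum(coord) / len(pts) for coord in zip(*pts))
        for pts in cells.values()
    ]

dense = [(0.01 * i, 0.0, 0.0) for i in range(1000)]  # 1,000 points on a 10 m line
sparse = voxel_downsample(dense, voxel_size=1.0)
print(len(dense), "->", len(sparse))  # 1000 -> 10
```

The same idea, applied with a small voxel size over billions of scanner points, is how a multi-gigabyte scan becomes something a standalone headset can render.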

partial 3D model of Kodak Hall.

The model was handed over to Shui’er Han from BCS as a Unity package; she will implement the audio recording and spatial visualization before conducting the user testing. It is amazing to see so many people bringing together their experience and knowledge to make this cross-disciplinary project a reality. I would like to thank Dr. Duje Tadin, Shui’er Han, Professor Michael Jarvis, Dr. Emily Sherwood, Blair Tinker, Lisa Wright, Meaghan Moody, and many more who gave me the amazing opportunity to work on this fun research and all the help they provided along the way. I can’t wait to see what they can achieve beyond this model and research project.

You can read more about this cross-disciplinary collaboration here.

Hao Zeng
Hao Zeng

XR Specialist

My Summer At Studio X

a group of pre-college students posing for a group photo.

This summer, I worked full time at Studio X. Even though the campus felt pretty empty with almost all the other undergrads home for the summer, there was a lot going on in Studio X! For example, for two weeks in July, we held a pre-college program called “XR: Content Creation and World Building.” In this program, high schoolers came from all across the country to learn about the world of extended reality, or XR.

“Learn how XR (the umbrella term for augmented and virtual reality) experiences are created! Students will study the history of immersive technologies and gain technical skills by exploring both the basics of 3D graphics for asset creation and how to develop XR environments with Unity, a popular game engine. We will also discuss the applications and impact of XR across humanities, social science, and STEM fields. All learning levels welcome.”

It was really exciting to be a part of this program to teach passionate students about XR creation. As we prepared for the students’ arrival, we asked ourselves, “How can we introduce a dozen high school students to the complex and technically challenging world of XR development, all within two weeks of half-day sessions?” This was a challenge indeed. We knew that we wanted the students to walk away with a basic understanding of the fundamentals of Blender, a 3D modeling and content creation tool, and Unity, a game engine commonly used for VR development, but we did not want to overwhelm them with too much new material all at once. We decided that we would have to create a highly detailed plan, carefully crafting how we would use the two weeks that we had with the students.

Over the course of June and early July, we worked to create this plan, taking every little detail into consideration. The first major obstacle we faced was ensuring that each student would have the necessary hardware and software to complete the activities we were planning. Blender and Unity can both be very taxing on computers, and folks often don’t have the necessary hardware, even among our undergraduates. It was very important that this program be open to anyone who was interested and that technical experience or personal hardware not be a limitation. We decided that instead of having each student bring in their own computer, we would use the high-powered workstations that we already have in Studio X. This, however, raised the question of how to organize a dozen PCs in our space, each of which draws a very large amount of power. With 12 high-powered PCs running at the same time in the same place, we actually ended up blowing a circuit and had to re-think our plans. We considered several options, including using another space or splitting the group into different rooms, but we eventually decided to completely reorganize Studio X in order to keep the group together in one space. I really liked the way we eventually configured the space, as it allowed us to keep the whole group together and helped us build a stronger community as we worked.

An image showing Studio X configured to have all 12 PCs in the same place
Studio X configured to have all 12 PCs in the same place

After solving the issue of how to organize the computers, we could focus our energy entirely on planning how to best use the two weeks with the students. The first week was focused on learning Blender. We wanted to give an introduction to 3D concepts, Blender basics, and character modeling. We felt that this would give our students a foundational understanding of how to navigate Blender while still being realistic about the time that we had. Blender can be a very challenging program to learn. There are many different things you can do with the software, and it can be very overwhelming the first time you try it out. Although we felt like we were trying to introduce a lot in a short amount of time, we were very excited to see what the students could make. At the end of the week, each student had their very own 3D-modeled character. The students did an amazing job creating their characters in Blender. It was so impressive how fast they were able to learn, and it felt so good to see our planning pay off.

Image showing an example of Blender's UI
An example of Blender’s UI

The second week of our program was focused on learning Unity. We wanted to teach the basics of Unity, get the students thinking about core game design principles, and introduce the world of VR development. The end goal for the week was for each student to create their very own VR mini game, using the 3D character they modeled as the antagonist in their experience.

With so little time, it was really important that we had milestones to reach each day to make sure we stayed on track. On the first day working on their games, the students got an introduction to a template VR Unity project. I created this template using a beginner VR asset from the Unity Asset Store, a place where you can find free or paid packages to help you create games. The asset I used is linked here: VR Escape Room. This package handled a lot of the initial setup for a VR project, which can be very complex, allowing the students to focus on their game concepts without being bogged down by too much coding. I also created a full VR mini game myself, giving the students an example of what their final project could look like. My game was called Jellyfishin, a game where the player goes around catching jellyfish. It highlighted some of the main mechanics of the template and was also fun for the students to play around with.

Image showing a screenshot from the template project provided to the students
Screenshot from the template project provided to the students

After being introduced to the template project, day 2 was all about environmental design. The students learned how to find resources to create their game world using a combination of free models, primitive objects, and the 3D characters they had made the week prior. By the end of day 2, the games really came together. I was amazed at how much detail and care each student put into their project, especially considering how little time they had. The final development day was used to polish and finalize the games. We made sure that each student’s game was playable start to finish and that there were no major problems with the experience. Each project was really unique despite coming from the same template. It was so rewarding to see the tools we had created be used so well to build these awesome experiences.

On our final day with the students, it was time for the showcase. Staff members from all over the library came to Studio X, and each student had the opportunity to present their game. One-by-one they gave a quick introduction to their concept and then showed off some gameplay. In the world of game development, you never know if something is going to go wrong. One minor bug could throw off an entire demonstration. Thankfully, these students did an amazing job finalizing their games, and everything went off without a hitch. After two challenging weeks, our students left with a complete VR game, a 3D modeled character, and a set of skills they can continue to grow and use on their journey with XR.

XR Content Creation & World Building – Final Showcase

Being a part of this pre-college program throughout the summer has been an amazing learning experience for me. Through all of the preparation and thinking that went into making our goals possible, I really had to put my technical skills to the test. In the end, our planning really made all the difference and is what made the program run so smoothly. It was a great challenge to think about how to teach so much information to the students in such a short amount of time, and I’m really proud of what we all accomplished. I can’t wait to see how this program continues to evolve and finds more ways to lower the barrier to entry to the world of XR. Overall, it was a pretty great summer in Studio X.

Liam O'Leary
Liam O’Leary

Karp Library Fellow, XR Developer

In a World Full of 3D Models, Researchers Build a New One for Leukemia

hand holding the bone-marrow-on-chip device.

A Wilmot Cancer Institute scientist has published data showing that a new microchip-like device his lab developed can reliably model changes in the bone marrow as leukemia takes root and spreads.

hand holding the bone-marrow-on-chip device.
Ben Frisch, PhD, holds the bone-marrow-on-chip device in his lab.

Ben Frisch, Ph.D., assistant professor of Pathology and Laboratory Medicine and Biomedical Engineering at the University of Rochester, and colleagues have been building what is known as a modular bone-marrow-on-chip to enhance the investigation of leukemia stem cells. The tiny device recapitulates the entire human bone marrow microenvironment and its complex network of cellular and molecular components involved in blood cancers.  

Similar tissue-chip systems have been developed by others, but they lack two key features contained in Frisch’s product: osteoblast cells, which are crucial to fuel leukemia, and a readily available platform.

The fact that Frisch’s 3D model has been published in Frontiers in Bioengineering and Biotechnology and is not a one-off fabrication will allow others in the field to adopt a similar approach using the available microfluidics system, he said.

Read more.

Sensory Processing – in a Virtual Kodak Hall

a binaural microphone set up with a dummy head.

Rochester researchers will harness the immersive power of virtual reality to study how the brain processes light and sound.

A cross-disciplinary team of researchers from the University of Rochester is collaborating on a project to use virtual reality (VR) to study how humans combine and process light and sound. The first project will be a study of multisensory integration in autism, motivated by prior work showing that children with autism have atypical multisensory processing.

The project was initially conceived by Shui’er Han, a postdoctoral research associate, and Victoire Alleluia Shenge ’19, ’20 (T5), a lab manager, in the lab of Duje Tadin, a professor of brain and cognitive sciences.

“Most people in my world—including most of my work—conduct experiments using artificial types of stimuli, far from the natural world,” Tadin says. “Our goal is to do multisensory research not using beeps and flashes, but real sounds and virtual reality objects presented in realistic-looking VR rooms.”

UR students working on the project are looking at information on a laptop with Kodak Hall in the background.
Members of the team begin the setup for audio and visual data collection. From left to right are Shui’er Han, a postdoctoral research fellow in Duje Tadin’s lab; brain and cognitive sciences major Betty Wu ’23; computer science and business major and e5 student Haochen Zeng ’23, who works in River Campus Libraries’s Studio X; and Victoire Alleluia Shenge ’19, ’20 (Take Five), who earned her degree in brain and cognitive sciences and is a manager in Tadin’s lab.

A cognitive scientist, a historian, and an electrical engineer walk into a room . . .

Tadin’s partners in the study include Emily Knight, an incoming associate professor of pediatrics, who is an expert on brain development and multisensory processing in autism. But in creating the virtual reality environment the study participants will use—a virtual version of Kodak Hall at Eastman Theatre in downtown Rochester—Tadin formed collaborations well outside his discipline.

Faculty members working on this initial step in the research project include Ming-Lun Lee, an associate professor of electrical and computer engineering, and Michael Jarvis, an associate professor of history. Several graduate and undergraduate students are also participating.

Many of the tools they’ll use come from River Campus Libraries—in particular, Studio X, the University’s hub for extended reality projects, as well as the Digital Scholarship department. Emily Sherwood, director of Studio X and Digital Scholarship, is leading the effort to actually construct the virtual replica of Kodak Hall.

The group recently gathered in the storied performance space to collect the audio and visual data that Studio X will rely on. University photographer J. Adam Fenster followed along to document the group’s work.

Read more.

Exploring Extended Reality in the Libraries with Studio X

Senior Creative Writing major and Karp Library Fellow Ayiana Crabtree '22 was featured in this post for the UR admissions blog! Link to original post at the end.

Located on the first floor of Carlson Library, as the hub for extended reality at the University of Rochester, Studio X fosters a community of cross-disciplinary collaboration, exploration, and peer-to-peer learning that lowers barriers to entry, inspires experimentation, and drives innovative research and teaching in immersive technologies.

Studio X runs tons of fun workshops and events that aim to make XR fun and easier to understand. For example, I run an Intro to XR workshop every semester that teaches participants, no matter their skill level, all about the basics of XR with a fun hands-on learning experience. There are other workshops too, like Blender and Unity tutorials to teach you the basics of 3D modeling and game development. If workshops aren’t your thing, we also have events like our Beat Saber competition and a speaker series called Voices of XR, where you can learn about XR directly from professionals in the field.

Studio X has a wide range of XR technologies that students, faculty, and staff can use both inside and outside of the space. Our most popular attractions are the Meta Quest 2 VR headsets, which can be borrowed and taken back to your dorm for up to three days at a time. Our VR headsets come with a bunch of fun pre-downloaded games and experiences for you to play, like Beat Saber, Walkabout Mini Golf, Job Simulator, and more! In addition to the VR headsets, we have 360 cameras and 360 audio recorders, which can also be taken back to your dorm for a three-day period. If you don’t mind staying in the space, you can ask to try one of our Microsoft HoloLens 2’s (MR headsets) or use one of our high-end workstations for homework. You can also use any of the aforementioned technology in the space if you don’t want to take it back to your room.

Studio X’s main goal is to break down any barriers that may be preventing students from getting into XR technologies. Whether that be making resources readily available, or giving introductory tutorials, Studio X is here to help!

Read the full article here!

First and Lasting Impressions of VR

Personal Experience

I first encountered the idea of virtual reality (VR) when I read the book Ready Player One by Ernest Cline. As an avid reader of science fiction books, I loved the idea of being able to escape to some virtual world through a VR headset. Soon after I read the book, the movie was released and seeing the concept executed in a visual form only increased my interest in the subject. Despite my fascination, I took it as the book genre labeled it. Fiction. I believed that there were no VR headsets, as I had never seen or heard of anyone having them.

In the fall of 2020, I happened across an advertisement for the Oculus Quest 1. My interest in the novel had not wavered, but nevertheless, I was shocked. I hadn’t realized that the concept introduced to me through a science fiction novel was real in the form of a readily available, and relatively affordable technology. I had been saving money for a while and prepared to make my purchase. Luckily, a friend encouraged me to wait a few months, as in October 2020, the Oculus Quest 2 was released. I ordered the headset and eagerly awaited its arrival.

Photo of the author of the article, a young woman, with a virtual reality headset resting on top of her head

When I finally got my hands on it, I was over the moon. It may not have looked like the vision Cline painted in his novel, nor like the version in the movie, but it was virtual reality nonetheless. In the time between ordering and receiving the headset, I researched various games and experiences that I wanted to try upon its arrival. Beat Saber, a rhythm game, was top of my list and was my first purchase on the device. I’d never been one to read instructions for consoles, games, or anything at all, so I dove right in and set up my account.

As soon as I began playing, I was hooked. Whether it was the idea of actually experiencing VR or the catchy songs of Beat Saber, I absolutely fell in love with my Quest 2. I played it every moment I had time. As I danced around my living room slashing to the beat of the songs, my parents asked me what I was doing.

I excitedly explained to them what VR was, and how it worked. It was at this point I had my first experience sharing VR with someone else. After a long tutorial on how to wear the headset, how to navigate the menu, and how to play the game, my parents tried out VR for the first time.

This was all back over the winter break of 2020-2021, just before I interviewed to join the Studio X team as a Karp Library Fellow to do XR research. This was during the time that the pandemic was still pretty bad, and VR provided an escape from the harsh reality around me. It helped my anxiety and allowed me to relax, even if just a little bit while I was immersed in the world of VR.

Ever since that initial experience, I made an effort to introduce as many of my family members as I could to virtual reality.

Family Experiences


It was the various times I had my mum try VR that really inspired me to explore the topic of user interaction with VR further. Her reactions to the experiences I had her try made me understand the impact that VR can have on people’s lives. Her first experience was with Beat Saber, which she thoroughly enjoyed due to its catchy songs, but it was Job Simulator that really captured her attention. “I thought it was going to be dumb,” she said. “When I saw you doing it, it looked silly, but when I tried it, it blew my mind. It was a strange experience because it really made me feel like I was in the room.” For me, it was especially funny watching her play Job Simulator. I had to make sure she wouldn’t forget about the guardian boundary, as she kept trying to walk down the virtual hallway when, in reality, she was about to crash into the coffee table. Another interesting thing was how she worried about dropping the virtual coffee cup, because she didn’t want it to break or make a mess on the floor.

a screen grab from the game job simulator


While he didn’t try Job Simulator, my dad tried Walkabout Mini Golf. He’s not much of a mini-golf fan, but he was blown away by how realistic the physics were in the game. He said he kept feeling like he was going to fall off the edge of the map and even tried walking from hole to hole (which would have required a lot more space than we have in our living room). “You really don’t know what it’s like until you try it, and when you do, you can see all kinds of applications this technology may have in the future.”


I, of course, wasn’t going to have my grandparents play Beat Saber, and I didn’t have Walkabout Mini Golf at the time, so I had them watch a few Oculus TV videos instead.

Having my 96-year-old Great Grandma try VR was quite an interesting experience. She was in awe at the capabilities of the technology and loved the fly-over nature documentary about the ice caps.

My Gran tried a few different parachuting and paragliding videos. “It was amazing to feel like I was there. I feel like I could do paragliding now!”

My Grandpa watched a few shorter space documentaries and was thrilled to be immersed in the galactic environment.

a photo of an elderly woman with a vr headset on
Photo of my 96-year-old Great Grandma trying VR for the first time

Running a Survey

After seeing the unique reactions from all my relatives, I was curious to know how others felt about their experiences with VR. I had joined several VR-focused Facebook groups to see the kinds of conversations people were having about VR and then decided to run a survey to directly ask the community about their experiences.

With the survey titled “How do Users Experience VR,” I asked a range of questions about age, their perception of VR, what they used it for, and if they had any stories they wanted to share. After about a week and a half of running the survey, I had 282 responses to go through.

One of the things that interested me the most was the age distribution of the respondents. This may partly reflect the survey being run on Facebook and the demographics of Facebook users, but it doesn’t feel like a misrepresentation: the group was specifically for Oculus Quest users, and Oculus, now Meta Quest, is owned by the same company.

a photo of a histogram of the age range distribution of people who participated in the survey


The following information is taken directly from the survey results. Some of the quotes have been lightly reformatted for clarity.

This first quote is from a friend who had some previous experience with VR. I had them try VR on my headset before asking them to fill out the survey. Their unique perspective on the potential threat VR poses to society is one that I haven’t seen discussed much elsewhere, which is why I believe it is important to include it here.

Jenna, 21, Non-Binary

“At first, I thought it was super cool, but a little bit scary. As I’ve had more experience with VR games, I still think it is an awesome technology with a lot of potential uses, but I fear that VR video game violence will further desensitize users to violence in the real world. I was at an arcade once and played a VR zombie game and had to ask the worker to stop the game because it felt too much like I was killing real living things. Hence my fear of it desensitizing people to violence.”

I included the next quote because it shows VR being used as a tool for long-distance interaction and also how people in your environment perceive you as you play VR.

Tracie, 49, F

“I saw it as an opportunity to stay connected and play with my friend who lives thousands of miles away from me. I used to live in my RV where there was very little space to play, so I would take my headset to the laundromat and play while doing laundry. Can't tell you how many times people were freaked out by what I was doing. I always tap the headset to bring up passthrough when someone came in and when I started interacting with one guy he was totally weirded out ... ‘you can see me!!!’ ... lol. Yes. Yes, I can. (I was in a closed RV community laundry place with the offices and rec center in the same building -- it was completely safe).”

These next two quotes are particularly interesting as they show the potential uses for the elderly and the health benefits of using VR.

Bonnie, 79+, F

“I became so enthusiastic, it was fun, and I moved my body. I bought a headset and began to realize all the possibilities. I finally got my Quest2 in September and found Fitxr and SuperNatural. I have continued to use my quest 2 every day and have barely explored all the apps. My enthusiasm prompted five other sales among my friends as they noted my weight loss and toning of my body. I never thought that an old person could gain strength and balance. Just thought we went backwards physically. I had given away my cross-country skis and now wish I had them back as I have gained strength in my entire body. My balance has improved so much and although I have “bat wings” on one side of my arms I actually have muscle “bumps” on the other side. I can do step ups - more and more each week or so. I can do squats, as many as 40 at a time. Every day my muscles ache, but I LOVE IT as I realize it is a good ache and I earned it. The technology allows me to socialize with others, visit sites I had traveled to previously and brings back happy memories. The technology allows me not to travel to a gym (not that I would have) and to have privacy.”

M, 75, F

“I am seeing more uses for homebound, elderly... seeing it as a way of connecting friends and family scattered around the country. wonderful experience taking my 85-year-old brother to the top of mount everest! Getting to play golf with a group of women every week. exploring worlds in altspace”
an elderly man with a vr headset on

The following quote shows the emotional impact VR can have on people through the experiences it allows one to have.

Sherry, 57, F

“I bought an Oculus for my 11 year old granddaughter for Christmas. She brought me to the kitchen and told me to stand in a spot. She put it on me and told me to close my eyes. When it was on, she told me to open them. I was in a beautiful mountain lodge. Out the window were mountains. I was overwhelmed and began to cry. It was as if I had been transported to my home in the mountains 25 years ago. I could not believe my eyes. Literally! I just kept saying … is this real? I knew immediately that I must get my own Oculus and pretty much immediately ordered one for myself based on that 5 minutes of standing in a room looking at the mountains.”

These next quotes show an optimistic perspective for the future of VR technology.

a photo of a man with a vr headset on in front of a tv screen
Anonymous, 59, F

“At first, it was just a music game that I played. I've since added more experiences with various game types, the Multiverse, and more. Rather than this just being a gaming system, I can see a future for business, education, research, social interacting (that can actual involve talking to one another vs just texting), shopping, and so much more!”

Susan, 61, F

“I came to see it could be used for exercise and education and other non-gaming applications. I have come to see that it is a powerful educational tool, particularly for people who are limited, either physically or not able to travel to other parts of the world. Also, I believe it could be used to deepen educational experiences in a variety of ways. I also continue to believe it is probably pretty addictive and should not be used many hours of the day as it is basically an escape and not particularly productive in general. I think it’s a great tool for people who are disabled or otherwise housebound. I have a concern that entering a VR world takes away from the time that people spend outdoors, which in the end is far more important.”

Skye, 45, F

“It was more real than I thought it would be. And I immediately saw the potential applications to things that I cared about - like art, exercise, and experience with others. When my brother bought everyone in the family an Oculus for Christmas, that was a game-changer. I was SHOCKED with how much further the technology had come and am a total convert and trying to get others to get a headset so we can hang out in virtual worlds and other experiences. I now see VR as being something that is relevant for my life now and into the future. I see how it can improve my interactions with family and friends (we spend more time together...especially since Covid and distance limits our in-person opportunities), and it has given me new ideas for how to approach and use it for engagement for my wellness company and clients.”

The next several quotes show the potential therapeutic and mental benefits of using VR.

Audrey, 38, F

“I have ADHD. I was diagnosed at 37 years old, and I have found that the exercise component for VR allows me to keep engaged in a way no other exercises have previously. I still do other types of exercise (such as strength training or hiking) but when I’m not in the mood to workout, the menu of options in VR still brings excitement for me.”

Anonymous, 42, F

“I live in the Midwest, and it is dark by 4 o'clock in winter. In vr, I can hop into real fishing vr and spend time on the lake in sunshine. It doesn't matter that it's not real, your body still relaxes, endorphins are released. It has helped a lot with seasonal affective disorder this year.”
a photo of a woman surrounded by lights with a vr headset on
Anonymous, 41, F

“My mom passed away June 2020, she had prefrontal dementia, she slowly lost all her motor skills and eventually mobility. One of the last happy memories that I have with her was me putting the oculus quest on her face and guiding her through a tour of the African Sahara. She actually reacted and reached out to try to touch lions and I swear I saw her smile when she saw elephants. Looking forward to seeing what VR therapy for people with dementia, Alzheimer's and other debilitation can bring in the future.”

Susie, 46, F

“I bought a VR to study the exercise game Supernatural and its effect on learning and motivation for Neurodiverse individuals. (Specifically adhd) I realized pretty quickly that this is the platform of the future. Way beyond games. I see it used for mental health/therapy, exercise, social connection, work interaction, performance/skills enhancing (like public speaking) etc. I’m literally applying for a PhD program so I can study VR some more. It’s changed my life! Supernatural daily has decreased my adhd symptoms tremendously. I feel my brain starting to work better. I can see this tool being an alternative to meds for those who can’t take them.”
a photo of a woman with a vr headset on

This next quote shows how giving VR a second chance can completely change your perspective on the technology.

Anonymous, 24, F

“My first experience was poor. I tried it at the mall when it was fairly new, and it was a video simulation of an amusement park ride. Sitting down, I got a very intense feeling of motion sickness and did not enjoy the video at all. It was a very bland video simulation. Although my 1st experience was bad, I gave it another try at a friend's house. This was a totally different experience compared to my first. I played Beat Saber and it was an overwhelming, awe-inspiring time. From that point forward, I began thinking of VR as the future and one of the most advanced types of technology to exist yet. Almost every experience I have had after that has been incredibly immersive and entertaining. I look at VR as an opportunity to take a break from our physical world and enter another world.”

The last quote I leave you with is a pretty cool perspective on how the technology has changed over time, and how it has impacted this person’s life and social interactions.

Gnossos, 65, F

“Early 90s I was hired by a Space Museum to consult on a VR exhibit and traveled to Boston, Chicago and LA to test drive early concepts. First experiences were so bad that I told the Space Museum to hold off on purchasing VR until it was more developed. Oculus Quest's first experience did not disappoint. My perceptions shift with the technology development, of course. I still see it in its infancy - it’s the Pong Era of VR meaning it sucks but we don’t realize it yet. It’s going to be 100 times better in 10 years. I was surprised by having a crush on a guy in Rec Room who played Paint Ball like he was a trained assassin. Crushes are a distant experience for me, so having one with only a voice and a cartoon avatar really surprised me. I think the safety of my anonymous state helped create an openness to flirting that’s not my normal way. It inspired me to wonder more about the potential for intimacy in VR - especially if these spaces were developed by women.”


VR is rapidly growing to be one of the most popular forms of XR. It is estimated that in 2020 nearly one in five people in the US (19% of consumers) used VR. Due to this increasing demand, nearly 15 million AR and VR devices are expected to ship to customers worldwide in 2022. Source

The quotes above shine a small spotlight on the many ways that people are impacted by VR every day. From new ways of socializing to new methods of staying physically and mentally fit, VR is versatile enough to benefit everyone in some way, shape, or form. It is this social and emotional impact that has made VR so popular, as people feel directly connected to the experiences they try while in VR. The oft-repeated description of VR as the ultimate empathy machine grows more accurate as the technology progresses and the range of possibilities expands.

Education from the sciences to the humanities, job training, interpersonal relationships, concerts, work meetings: all of these areas can benefit from VR technologies, and many already do. More people are being exposed to VR every day, and soon enough it will become a household staple, much like cellphones and TVs. And why? Because of the ways we as users experience VR. It is the consumer perspective that shapes the industry, which is why it is so important to understand why people react the way they do to these technologies.

Some of the experiences I have tried in VR have shaped my perspective on the world in ways I couldn’t have imagined. VR has opened my mind to new ways of thinking about personal space, human interaction, disabilities, and even the way I view myself as a person existing in the real world versus the digital one.

VR has a unique ability to change perspectives and influence emotions, and it is up to the people using it to decide what path VR ultimately goes down.

a photo of a child with a vr headset on sitting in a field of grass
Ayiana Crabtree

Karp Library Fellow, XR Research

Is Extended Reality Shaping the Future of Academic Libraries? This Dean Thinks So.

Studio X salon area. Shows students sitting or standing near the entrance of Studio X.
Mary Ann Mavrinac, vice provost and dean of the University of Rochester Libraries, shares insight into how the campus community directed the development of Studio X, the library’s new extended reality hub featuring advanced technology and expert training 

“I don’t believe in ‘if you build it, they will come.’ You can build something, but they won’t come if you don’t know what your users want,” Mavrinac said. It’s the guiding principle she and her team followed throughout the ideation and planning of the library’s new high-tech hub, Studio X. Located on the first floor of the Carlson Science and Engineering Library, the 3,000-square-foot space allows students and faculty to participate in immersive learning experiences.

Equipped with technology that supports virtual reality (VR), augmented reality (AR) and everything in between (extended reality or XR), Studio X allows researchers to perform tasks such as visualizing large data sets and safely experimenting with hazardous materials by creating a virtual environment. Studio X broadens the range of possibilities for discovery and instruction, but what makes it truly special is its source of inspiration. CannonDesign collaborated with the university to design a facility that the campus community not only requested but also intimately shaped. From inception to completion, student and faculty preferences were integrated with expert knowledge to deliver a space tailored to serve the entire campus community.

We spoke with Dean Mavrinac to learn more about the process and impact of the project. She wanted to underscore that the success, to date, of Studio X is a team effort, much of it led by Digital Scholarship and Studio X director, Emily Sherwood.


There aren’t many academic libraries that offer a space like Studio X. What is it, and how did the project begin?

The project began in fall 2017 when Lauren Di Monte joined our team and learned from the faculty that there was a lot of research activity in extended reality and other immersive technologies. We thought it was something the library could get involved in since we had close to 50 researchers engaged in XR technologies. So, we set out to better understand that landscape and how the researchers would engage with any initiative we developed, whether it was a space or specialized expertise. We knew a generic CAVE wouldn’t work for them, so we thought about what we may be able to do to help them tackle specific research questions. As it turned out, we pivoted to a space and service that would provide an easy on-ramp to those less familiar with these technologies and related needs.

Today, Studio X is a collaborative hub for extended reality where students and faculty are immersed in learning and teaching in ways that just aren’t possible without advanced technology. It’s a high-tech space that allows exploration, experimentation and experience that truly brings education to life.

What was the goal of Studio X? Who is it for?

The overall goal was to offer physical space, a program, services, technology and expertise that students and faculty needed—and expertise was really big. The user research told us that they wanted a space and experts in the space to teach them how to use and apply the technologies. We approached this goal by providing an on-ramp that made it easy for people to gain access to and experience with XR technologies.

Whether a person is an advanced researcher or a novice user, we’re good at helping people feel comfortable to explore their questions. The library is an interdisciplinary crossroads at the university, so it could be someone studying history, biomedical engineering, neuroscience, religion, ethics—whatever it is—if they’re interested in using XR technologies, we provide the support they need to feel welcome.

Read the full interview.

XR and Accessibility Resources

Person in a VR headset using assistive VR hardware.

Summary: This post reviews resources on XR (extended reality) and accessibility and summarizes best practices for centering accessibility when engaging with these technologies.

Technology in general creates many barriers for disabled users, and as XR technologies rapidly grow in popularity, they exacerbate these challenges. When creating an XR product, whether that be a VR (virtual reality) headset or an AR (augmented reality) game, people tend to think more about the product’s aesthetic or its usability for the average user. What they fail to remember is that not every user will be “the average user.” The world is a diverse place, with people of all ages, genders, races, and abilities, and it is important to keep this diversity in mind when creating XR. XR accessibility is itself a new and fast-moving area: many developments are in the works, so these resources may be outdated in just a year’s time.

Before we dive into XR, let’s first define some terms: What are Accessibility and Universal Design?

Accessibility is the ability to access something and be able to benefit from its intended purpose. It sometimes refers to specific characteristics that products, services, and facilities have that can be used by people with a variety of disabilities.

Accessible Design is a design process that specifically considers the needs of people with disabilities.

Universal Design is the process of creating products that are accessible to people with a wide range of abilities, disabilities, and other unique circumstances.

Usability, Accessibility, and Ethical Design from San Diego State University
What is the difference between accessible, usable, and universal design? from University of Washington

The following resources are divided into 5 categories:
Design, UI/UX


XR Access

woman showing a man how to use a vr headset while an audience watches

Link to Webpage
Education, Teaching, Research, Organization, Conferences, Resources

XR Access is a community committed to making virtual, augmented, and mixed reality accessible to people with disabilities. Their mission is to modernize, innovate, and expand XR technologies, products, content and assistive technologies by promoting inclusive design in a diverse community that connects stakeholders, catalyzes shared and sustained action, and provides valuable, informative resources. 

The site provides a plethora of materials for those interested in their efforts. Their research network offers valuable information about the accessibility research happening across its member institutions. They have workstreams, which are community-led efforts to inform the design, development, and production of accessible XR. They also offer a wide variety of other resources to aid people in their own research, including their annual XR Access Symposium reports (see below for more about the symposium). XR Access also curates stories of disabled folks who have used technology both successfully and unsuccessfully, to help advocate for accessible XR technology. Those interested can sign up for their newsletter or join their robust Slack community.

Accessibility Needs of Extended Reality Hardware: A Mixed Academic-Industry Reflection

a man looking at his hands while wearing a vr headset with a tv screen behind him

Link to Article
Education, Hardware

This journal publication walks the reader through the reasoning behind the need for accessible XR hardware and software. The authors start with an explanation of the benefits of XR, then show why the accessibility movement should start with hardware: if a user cannot wear a headset, they cannot experience its software. The 2019 XR Access Symposium allowed many people to connect and build upon their individual ideas, establishing shared goals for XR hardware accessibility. They identified a need to understand related fields’ accessibility guidelines, determine the most pressing obstacles, consider industry guidelines, and increase public awareness of the issues at hand. With those needs in mind and a community-centered approach, they believe the lack of accessible XR hardware can be overcome.

Barriers to Supporting Accessible VR in Academic Libraries

Link to Article
Education, Libraries

Although XR technologies offer new opportunities to engage students, they also present additional challenges for disabled students. Technology in general already tends to exclude these users, and XR’s rapid rate of development further complicates things. The article shares 2019 statistics from the U.S. Department of Education’s National Center for Education Statistics: “19.4% of undergraduates and 11.9% of graduate students have some form of disability.” The authors argue that academic libraries, as leaders in supporting and sharing new technologies, are well poised to address accessibility challenges for XR and must create clear policies and service models that support all users. While no clear accessibility guidelines currently exist, several promising initiatives, such as the XR Access Symposium, are working towards this goal. The authors detail two accessibility initiatives, at Temple University and the University of Oklahoma, and conclude with a list of key takeaways:

Plan for Accessibility from the Beginning: Libraries can save time and resources by thinking about accessibility issues at the start of a program or project.

Lack of Standards: As of 2020, there are no standards for accessible VR design, but there are related standards that could lay the groundwork for their development.

Developer Support is Essential: Libraries that intend to develop VR experiences need to have sufficient developer support with accessibility expertise.

Importance of Auditing and Reporting: Out-of-the-box VR experiences will pose different accessibility challenges from one person to the next and should be audited to better understand these barriers to access. If a library lacks a developer to modify software or create new software, at the very least, available software needs to be audited and have a corresponding accessibility report produced.

VR is Not the Pedagogy: VR should be another tool in an educator’s arsenal, not the sole focus of a class (unless VR is the course subject). As Fabris et al. (2019) suggest “Having VR for the sake of having VR won’t fly; the VR learning resources need to be built with learning outcomes in mind and the appropriate scaffolds in place to support the learning experience” (74).

Acknowledge the Limits of VR Accessibility: There are limits to making VR accessible. The reality is that there will be students who are unable to use VR for a variety of reasons. Therefore, there should always be an alternative access plan developed so that students have access to non-VR learning methods as well.

XR Accessibility Initiatives in Academic Libraries

cover of the asis&t proceedings booklet

Link to Article
Education, Survey, Libraries

Because libraries traditionally take the lead in accessibility initiatives, the authors surveyed academic libraries to examine the accessibility of their digital resources. Three questions were sent to various academic libraries, and 30 universities responded:

  • Question 1: What is the level of development of accessibility support for XR technologies in academic libraries?
    The majority of institutions surveyed did not have policies or dedicated staff to support accessibility for XR resources.
  • Question 2: What XR accessibility knowledge do library staff and administrators currently have?
    Nearly all participating spaces had some awareness of the challenges that XR presents and were able to find resources to assist when needed.
  • Question 3: What are the main barriers to developing accessibility support for XR technologies in academic libraries?
    The top three barriers to developing accessibility policies and processes were lack of staff knowledge, lack of funding, and lack of time.

The survey concluded that XR accessibility support in academic libraries is still developing: policies and dedicated staff are not yet in place, though many institutions plan to begin implementing strategies soon.

DLFteach Toolkit: Lesson Plans on Immersive Pedagogy

man in a wheelchair with his hands in the air wearing a vr headset

Link to Toolkit
Education, Libraries, Teaching

The Digital Library Federation (DLF) has put together a toolkit of lesson plans that facilitate interdisciplinary work engaged with XR technologies. The toolkit is focused on a decolonial, anti-ableist, and feminist pedagogical framework for collaboratively developing and curating humanities content for emerging technologies.

The introductory materials section of the toolkit contains three particularly useful resources: recommendations for accessible pedagogy with immersive technology, an immersive technology auditing checklist, and instructions on how to create an equally effective alternative action plan for immersive technologies.

Recommendations for Accessible Pedagogy with Immersive Technology – serves to provide a background for the increasing need for creating educational resources for disabled learners. The list of materials provided are intended to guide educators on how to incorporate immersive technologies into their teaching while also keeping disabled learners in mind. It is split into three sections: accessibility and disability, readings on the accessibility of immersive technologies, and recommended administrative considerations. It ends with a series of questions to keep in mind when teaching.

Immersive Technology Auditing Checklist – serves to identify and document the various challenges of making immersive technologies accessible. It divides the workflow into three steps: purchasing software and hardware, providing technical support for software and hardware, and ensuring user access to software and hardware. The checklist then walks you through a series of important questions when considering each phase of the process, posing questions such as “What hardware is required?” and “Is there an accessibility page for the software?” It also dives into questions about ease of operation and perception, asks about the robustness of the technology, and asks about any documentation about the technology. 

Creating an Equally Effective Alternative Action Plan for Immersive Technologies – serves to instruct the reader on how to create an Equally Effective Alternative Action Plan (EEAAP). An EEAAP is a document used when there is an accessibility barrier in a technology (i.e., when a technology cannot be used by a person or group with a disability). The components of an EEAAP are a description of the issue, the person or group affected, the responsible faculty, how the EEAA will be provided, the additional EEAA resources required, repair information, and a timeline for unforeseen events. Some examples of EEAAPs are listed at the end of the resource.
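As a reading aid, the EEAAP components listed above can be pictured as fields of a simple record. This is not an official template; the interface and field names below are my own paraphrase, and the example entry is entirely hypothetical:

```typescript
// Illustrative structure mirroring the EEAAP components described above.
// Field names are invented for this sketch, not taken from the DLF toolkit.
interface EEAAP {
  issueDescription: string;      // description of the accessibility barrier
  affectedGroup: string;         // the person or group affected
  responsibleParty: string;      // the responsible faculty or staff
  alternativeProvided: string;   // how the equally effective alternative is delivered
  additionalResources: string[]; // additional EEAA resources required
  repairInfo: string;            // information on repairing the underlying barrier
  contingencyTimeline: string;   // timeline covering unforeseen events
}

// A hypothetical entry for a VR experience that lacks captions.
const example: EEAAP = {
  issueDescription: "VR documentary has no captions",
  affectedGroup: "Deaf and hard-of-hearing students",
  responsibleParty: "Course instructor with library staff",
  alternativeProvided: "Captioned 2D video version of the documentary",
  additionalResources: ["transcript", "caption file"],
  repairInfo: "Request a captioned build from the vendor",
  contingencyTimeline: "Alternative available within one week of report",
};

console.log(example.affectedGroup);
```

Writing the plan down in a fixed shape like this makes it easy to check that none of the required components has been skipped.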

Exploring Virtual Reality Through the Lens of Disability

young girl wearing a vr headset with her hands in the air

Link to Article
Education, Teaching Resource

This resource comes directly from the DLF Toolkit. It provides a lesson in an interdisciplinary approach to introducing VR immersions through the lens of disability studies. The authors are not aiming to represent how all people experience disability; rather, they are trying to create an activity that includes discipline-specific theory and criticism. They then discuss the different types of VR: cinematic VR uses filmmaking techniques; simulation VR simulates the real and the fictional while the user is an active participant; representational VR creates immersive experiences through sensory embodiment; and therapeutic VR is designed for various treatments.

The resource then becomes an instructional guide on how to try several disability-related experiences. They specify the intended audience, curricular context, learning outcomes, materials needed, and how to prepare for the experiences, and they provide a long list of sample instructions. Following this, they list several important applications they recommend trying: Notes on Blindness, The Party, and InMind VR. Each experience is paired with a plethora of questions and other external resources they found to be relevant.

  • Notes on Blindness – This experience tells the story of a man who lost his sight and how he coped by keeping an audio diary. For three years, he recorded over sixteen hours of material.
  • The Party – A VR film by The Guardian that allows you to enter the world of an autistic teenager who is at a surprise birthday party. You will hear internal thoughts about how the experience affects her and share the sensory overload that leads to a meltdown.
  • InMind VR – A short adventure that allows the user to journey into a patient’s brain and search for the neurons that cause mental disorder.

Design, UI/UX

Designing XR for Accessibility and Inclusion

diagram of a vr application called SeeingVR

Link to Article
Design, UI/UX 

When you are in the beginning stages of creating something in an XR medium, whether a device or an experience, it is important to keep in mind the factors that can make it less accessible. Accessibility concerns span differences in motor function and sensory ability as well as wealth and social standing.

VR has a plethora of features that could benefit differently abled users, such as the ability to enhance spatial sound on one side of the body, render visuals with higher contrast, and enable those in wheelchairs to experience what it would feel like to “walk around” in VR. However, as with any technology, VR also presents many accessibility challenges, such as the heavy emphasis on motion controls, the use of the body to control many experiences, and the requirement to stand during some VR experiences.

Considering these and other challenges, here are some things to keep in mind while trying to make XR design more inclusive: 

  • Hardware – What equipment do people need to participate in a VR environment? Is a standalone headset and controllers all that’s required? Or is there some form of special equipment or a computer to run the experiences also needed?
  • Navigation and Interfaces – How understandable is the XR environment? If a user had no context or guidebook upon entering the space, would they know what to do and how to interact? Make things either clearly labeled or have a guide or some form of instructions available. This could involve an avatar that appears to give instructions along the way, an instruction dialog box, or a guidebook with your product.
  • Communication – How are speech and body language communicated? Do you have an avatar that represents you in an environment? Is there full body tracking, or does your avatar just float from place to place? Do you speak using a microphone, or are there pre-written text options to choose from? Is captioning available? 
  • Customization and Interoperability – Allow users to customize the XR environment to their needs. Can they adjust color contrast? Can they toggle captioning on and off when needed? Are there a variety of sound options?
  • Avatars and Embodiment – Make sure that there are a wide range of options so people can feel accurately represented. Is there a wide range of skin tones, hair colors, hairstyles, clothing, etc. that will enable any person from anywhere in the world to feel as if they are properly represented in the VR space?
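The customization options above can be modeled as a small settings object that an experience reads at startup and whenever the user changes a preference. This is an illustrative sketch, not code from any particular engine; all names here are made up for the example.

```python
from dataclasses import dataclass, field

@dataclass
class AccessibilitySettings:
    """Illustrative user-facing accessibility preferences for an XR experience."""
    captions_enabled: bool = False
    high_contrast: bool = False
    seated_mode: bool = False          # re-centers interactions at seated height
    spatial_audio_boost: str = "none"  # "none", "left", or "right"
    listeners: list = field(default_factory=list)

    def toggle_captions(self):
        self.captions_enabled = not self.captions_enabled
        self._notify("captions_enabled")

    def _notify(self, name):
        # Let the UI and renderer react immediately when a setting changes.
        for listener in self.listeners:
            listener(name, getattr(self, name))

settings = AccessibilitySettings()
settings.listeners.append(lambda name, value: print(f"{name} -> {value}"))
settings.toggle_captions()  # prints "captions_enabled -> True"
```

The listener pattern matters for accessibility: a user who toggles captions mid-session should see the change take effect immediately, without restarting the experience.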

Try out the space yourself and see if it works from several perspectives of ability: seated, standing, with sound, without sound, and so on. Think about the users you want to be able to access the device and try to see it from their perspective. Another approach is to run testing sessions in which differently abled people try out your device or program and offer feedback.

An Accessible Future – XR: Considerations for Virtual, Mixed, and Augmented Reality

woman wearing a vr headset with her arms in the air

Link to Article
UI/UX, Metaverse, Conferences

There are many XR applications for the workplace, such as virtual orientation events and training sessions. Imagine attending a conference with people from all over the world using VR: you could still get the experience of being among professionals in your field without ever leaving your home or office. For example, the XR Access Initiative used VR during its annual symposium to foster engagement. They created virtual rooms where conference participants could explore and interact with their surroundings, held virtual demonstrations, and provided captioned rooms and rooms with ASL interpreters.

The XR Access Initiative emphasizes three key accessibility factors for virtual conferences: captions, sign language communication, and keyboard and screen reader usage. 

  • Captions – Captions should follow a user and remain legible regardless of the angle from which the environment is viewed.
  • Sign Language – Sign language interpreters should be located in high visibility areas, and those who need interpreters should be able to get easy access to them. 
  • Screen Reader/Keyboard – Those who are unable or prefer not to attend in VR should be able to interact with the space in the same way a person in VR could, though with simplified controls. Cross-platform capability is important.

This virtual symposium showcases how VR can make conferences and other virtual events accessible to many people.
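One common way to satisfy the captions requirement above is to “billboard” the caption panel: each frame, place it a fixed distance along the viewer’s gaze direction and rotate it to face the viewer head-on. The following is a minimal sketch of the placement math using plain vectors, not tied to any particular engine; the function name and coordinate convention are made up for the example.

```python
import math

def caption_transform(head_pos, head_yaw_deg, distance=1.5):
    """Place a caption panel `distance` meters in front of the viewer
    and return the yaw it needs in order to face the viewer head-on."""
    yaw = math.radians(head_yaw_deg)
    # Forward direction on the horizontal plane (x = right, z = forward).
    forward = (math.sin(yaw), math.cos(yaw))
    panel_pos = (head_pos[0] + distance * forward[0],
                 head_pos[1],                      # keep at eye height
                 head_pos[2] + distance * forward[1])
    # Rotate the panel 180 degrees from the view direction so its
    # front face points back at the viewer.
    panel_yaw_deg = (head_yaw_deg + 180.0) % 360.0
    return panel_pos, panel_yaw_deg

pos, yaw = caption_transform((0.0, 1.6, 0.0), 0.0)
# Panel sits 1.5 m straight ahead at eye height, rotated to face the viewer.
```

Because the transform is recomputed every frame from the current head pose, the captions stay in view and legible no matter where the user looks.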

Why VR/AR Developers Should Prioritize Accessibility in UX/UI Design

image of a man's hands holding vr controllers

Link to Article
UI/UX, Development, Inclusive Design, Accessibility Settings

An important point this article touches on is how a lack of accessibility in VR can make people feel left out or ignored. For example, the easier it is for people to understand a game, the more likely they are to play it. Some factors you might not think about for inclusive design are different hair types or conditions such as arthritis. If you have long hair in a ponytail or buns, or even fluffy hair, putting on a headset can be difficult because you have to rearrange your hair to get the headset on. People with arthritis may need to sit down in the middle of a game, or their fingers and hands may get sore over time; making controls easy to change mid-game would be very helpful in these cases. Ways to make VR more accessible for glasses wearers could include adjustable vision settings or better glasses adapters for current headsets.

It is hugely important to have a diverse testing group so that people of all genders, ethnicities, abilities, socio-economic backgrounds, and other identities can interact with your product with ease. It may be impossible to accommodate every unique circumstance, but taking diverse voices into consideration while making your product will ensure a better end result. While it may take a little more time to make sure everyone is included, the end design will be more profitable and, most importantly, beneficial to a larger community.

Computers Helping People with Special Needs

Link to Resource
Conference, Resource

This link leads to the proceedings of the International Conference on Computers Helping People with Special Needs (ICCHP). The 2020 proceedings include a section on XR and accessibility, with several articles covering subjects ranging from vocational training for students with disabilities to AR for people with low vision to guidelines for inclusive avatars.

Unity UI Accessibility Plugin

image with the following text: UAP make your game accessible to visually impaired players

Link to Store
Development, UX/UI 

This is a plugin offered on the Unity Asset Store that makes the UI for a Unity project accessible to blind and visually impaired players with just a few clicks.


Introducing the Accessibility VRCs

Link to Article
Developers, Oculus, Game Development

This is Oculus’ guide for developers on how to create with accessibility in mind. The Accessibility VRCs (Virtual Reality Check guidelines) cover audio, visuals, interactions, locomotion/movement, and other aspects of accessible design. By enforcing these guidelines, Oculus ensures that every application officially available on its platform meets certain accessibility requirements, which may make the platform usable for more people.
Link to the VRC Webpage:

Initiative aims to make virtual, augmented, and mixed reality accessible

Link to Article

This article links to a webinar about a new initiative to make XR accessible to more people. Larry Goldberg, Senior Director and Head of Accessibility at Verizon Media, discusses emerging technologies and how his company handles this technological growth. The webinar highlights how existing technologies can serve as a jumping-off point for creating new technologies that are accessible from the beginning, or, as Goldberg puts it, “born accessible.”

W3C Accessibility User Requirements

Link to W3C
Development, UI/UX

This guide from the World Wide Web Consortium provides a plethora of technical guidelines and considerations for developing accessible products.

XRA’s Developer Guide: Accessibility & Inclusive Design in Immersive Experiences

XRA’S DEVELOPERS GUIDE, CHAPTER THREE: Accessibility & Inclusive Design in Immersive Experiences

Link to Guide
Development, UI/UX

The XRA’s (XR Association’s) developer guide serves as a starter resource for developers looking to create XR experiences. The guide offers a series of industry-backed best practices for developing accessible platforms.

Oculus’ Designing Accessible VR

Link to Guide
Development, Production

This is Oculus’ guide for those wishing to develop accessible content for their platform. They note the importance of accessibility as it pertains to widening the potential customer base.

Accessible Mixed Reality

Link to Webpage
Development, News

This is Microsoft’s project that considers how to design mixed reality technologies in a way that makes them usable and useful to people of all abilities. This webpage links to those involved with the project, publications, and other news surrounding their efforts.


WalkinVR Add-on Makes VR More Accessible to Disabled Gamers

image of a woman in a wheelchair using a vr headset with her arm outstretched

Link to Article
Software, Gaming, Accessibility Settings

A custom locomotion driver for SteamVR applications introduces four new features for those with disabilities. The four features – virtual move, motion range boost, hand tracking, and Xbox controller move – can be adjusted to an individual user’s needs on the fly.

  • Virtual move allows players to use their controllers’ joystick to move, rather than having to physically move their arms.
  • Motion range boost changes the origin point of motion controllers to amplify movement. It translates a small movement into a large one. 
  • Hand tracking emulates the position of motion controllers based on hand movements, so users do not need to hold physical controllers.
  • Xbox controller move allows users to use a gamepad to emulate VR controller inputs. 
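The motion range boost feature described above can be thought of as scaling controller motion about a chosen origin point, so a small physical movement maps to a larger virtual one. Below is a rough sketch of that transform; the function name, origin, and scale factor are illustrative, and this is not WalkinVR’s actual implementation.

```python
def boost_motion(raw_pos, origin, scale=2.0):
    """Amplify controller movement about `origin`:
    the displacement from the origin is multiplied by `scale`."""
    return tuple(o + scale * (p - o) for p, o in zip(raw_pos, origin))

# A 10 cm reach from the origin becomes a 20 cm virtual reach.
virtual = boost_motion((0.1, 0.0, 0.0), (0.0, 0.0, 0.0), scale=2.0)
# virtual == (0.2, 0.0, 0.0)
```

Because the scaling is relative to an adjustable origin, a user with limited range of motion can keep their hand near a comfortable rest position while still reaching anywhere in the virtual scene.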

This driver is free to download and is available only to users of SteamVR headsets and applications. You must also have a Steam account to download it.
Link to the Steam store:

Using AI, people who are blind are able to find familiar faces in a room

man holding a laptop with a camera attached pointing at a woman

Link to Article
Microsoft, Developers, Software, HoloLens, AI

Project Tokyo is a Microsoft initiative that aims to help members of the blind and low-vision community with intelligent personal agent technology that leverages AI to extend their capabilities. The long-term goal of the project is to show that this XR technology can be used by anyone, including those with disabilities. The focus is to create a way for people who are blind or have low vision to see the world, or at least perceive it in a way similar to how sighted people do.

The article provides several examples. For instance, it demonstrates the device’s ability to notify a user that someone is looking at them; if the wearer turns toward another person, the AI identifies that person by name for the wearer. An individual working on the project states, “Whenever I am in a situation with more than two or three people, especially if I don’t know some of them, it becomes exponentially more difficult to deal with because people use more and more eye contact and body language to signal that they want to talk to such-and-such a person, that they want to speak now. It is really very difficult as a blind person.” Social cues, whether conveyed verbally or physically, are essential for interaction. Rather than starting from scratch, the team uses a modified Microsoft HoloLens, which provides the AI with the information it needs to read the environment.


Accessibility, Disabilities, and Virtual Reality Solutions

an image of the Microsoft canetroller with its parts labeled: brake, slider, tracker, voice coil, controller

Link to Article
Education, Healthcare, Assistive Hardware

Accessibility is a major priority in education. Approximately 15% of the world’s population has some form of disability, and one in four adults in the US has a disability that affects “major life activities.” As VR evolves, it provides a whole new range of opportunities and experiences for many people. For example, many visually impaired users can actually see better in VR due to the depth perception headsets provide. Moving forward, VR creators should consider the wide-ranging needs of users from the beginning of the development process.

Microsoft has developed several XR products with accessibility in mind:

  • Canetroller [Link] – The Canetroller, a Microsoft patented haptic device, works as a white cane that visually impaired people can use to experience a virtual environment. 
  • Seeing VR [Link] – SeeingVR is a series of tools to make VR more accessible to those with low vision. The tools include a magnification lens, a bifocal lens, a brightness lens, a contrast lens, edge enhancement, peripheral remapping, text augmentation, text to speech, depth measurement, and more.
  • Braille Controller [Link] – This Microsoft-patented, braille-displaying attachment fastens to the back of an Xbox controller, offering visually impaired players an alternative way to experience games. The project was inspired by the goal of making text-heavy video games more accessible.
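A brightness or contrast lens like those in SeeingVR is, at its core, a per-pixel transform applied to the rendered frame. Here is a minimal sketch of the contrast idea over a single channel of normalized pixel values; a real implementation would run on the GPU, and the function name and parameters here are illustrative only.

```python
def apply_contrast(pixels, contrast=1.5, pivot=0.5):
    """Increase contrast by scaling each pixel's distance from the pivot,
    clamping results to the valid [0, 1] range."""
    def clamp(v):
        return max(0.0, min(1.0, v))
    return [clamp(pivot + contrast * (p - pivot)) for p in pixels]

out = apply_contrast([0.2, 0.5, 0.8], contrast=1.5)
# Dark pixels are pushed toward 0 and bright pixels toward 1,
# while mid-gray values near the pivot are left nearly unchanged.
```

Exposing `contrast` as a user-adjustable setting, rather than a fixed value, is what turns a rendering trick like this into an accessibility feature.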

Hospitals are beginning to use VR to find new ways of relieving pain and offering palliative care to patients. While no current technology can restore someone’s sight, tools such as the IrisVision [] can assist those living with visual impairments by providing vision-aid features, a personal voice command assistant, a text-to-speech reader, and high-contrast fonts. AR is also being studied to determine whether such devices could help those with age-related macular degeneration.

The article also links to a variety of informational videos and links to accessibility groups and associations.

Inclusivity of VR and AR Accessibility for the Visually and Hearing Impaired

image from the London National Theatre with the following text: Just Enjoy Cinema: diverse audio versions and subtitles absolutely wherever you want - simply from your own smart device

Link to Article
Assistive Hardware

There are a plethora of companies creating applications to enhance the experiences of differently abled users, and this article highlights a small sample of those projects. Microsoft has created the Canetroller, which allows a blind or visually impaired person to access virtual reality through a controller that resembles a white cane and uses haptic and audio feedback. Nearsighted VR Augmented Aid is an Android application that uses a mobile device’s camera to display images in stereoscopic view. London’s National Theatre did something similar with the help of Epson’s latest smart glasses, which display subtitles in the user’s field of vision so that even if a viewer looks away, they can still see the subtitles. Many more projects are linked in the article.

Ayiana Crabtree

Karp Library Fellow, XR Research