Senior Creative Writing major and Karp Library Fellow Ayiana Crabtree '22 was featured in this post for the UR admissions blog! Link to original post at the end.
Located on the first floor of Carlson Library, Studio X is the hub for extended reality at the University of Rochester. It fosters a community of cross-disciplinary collaboration, exploration, and peer-to-peer learning that lowers barriers to entry, inspires experimentation, and drives innovative research and teaching in immersive technologies.
Studio X runs tons of workshops and events that aim to make XR fun and easy to understand. For example, I run an Intro to XR workshop every semester that teaches participants, no matter their skill level, the basics of XR through a fun hands-on learning experience. There are other workshops too, like Blender and Unity tutorials that teach you the basics of 3D modeling and game development. If workshops aren’t your thing, we also have events like our Beat Saber competition and a speaker series called Voices of XR, where you can learn about XR directly from professionals in the field.
Studio X has a wide range of XR technologies that students, faculty, and staff can use both inside and outside the space. Our most popular attractions are the Meta Quest 2 VR headsets, which can be borrowed and taken back to your dorm for up to three days at a time. The headsets come with a bunch of fun pre-downloaded games and experiences, like Beat Saber, Walkabout Mini Golf, Job Simulator, and more! In addition to the VR headsets, we have 360 cameras and 360 audio recorders, which can also be taken back to your dorm for a three-day period. If you don’t mind staying in the space, you can ask to try one of our Microsoft HoloLens 2 headsets (MR headsets) or use one of our high-end workstations for homework. You can also use any of the aforementioned technology in the space if you don’t want to take it back to your room.
Studio X’s main goal is to break down any barriers that may be preventing students from getting into XR technologies. Whether that means making resources readily available or offering introductory tutorials, Studio X is here to help!
I first encountered the idea of virtual reality (VR) when I read the book Ready Player One by Ernest Cline. As an avid reader of science fiction books, I loved the idea of being able to escape to some virtual world through a VR headset. Soon after I read the book, the movie was released and seeing the concept executed in a visual form only increased my interest in the subject. Despite my fascination, I took it as the book genre labeled it. Fiction. I believed that there were no VR headsets, as I had never seen or heard of anyone having them.
In the fall of 2020, I happened across an advertisement for the Oculus Quest 1. My interest in the novel had not wavered, but nevertheless, I was shocked. I hadn’t realized that the concept introduced to me through a science fiction novel was real in the form of a readily available and relatively affordable technology. I had been saving money for a while and prepared to make my purchase. Luckily, a friend encouraged me to wait a few months, as in October 2020, the Oculus Quest 2 was released. I ordered the headset and eagerly awaited its arrival.
When I finally got my hands on it, I was over the moon. It may not have looked like the vision Cline painted in his novel, nor like the version in the movie, but it was virtual reality nonetheless. In the time between ordering and receiving the headset, I researched various games and experiences that I wanted to try upon its arrival. Beat Saber, a rhythm game, was top of my list and was my first purchase on the device. I’d never been one to read instructions for consoles, games, or anything at all, so I dove right in and set up my account.
As soon as I began playing, I was hooked. Whether it was the idea of actually experiencing VR or the catchy songs of Beat Saber, I absolutely fell in love with my Quest 2. I played it every moment I had time. As I danced around my living room slashing to the beat of the songs, my parents asked me what I was doing.
I excitedly explained to them what VR was, and how it worked. It was at this point I had my first experience sharing VR with someone else. After a long tutorial on how to wear the headset, how to navigate the menu, and how to play the game, my parents tried out VR for the first time.
This was all back over the winter break of 2020-2021, just before I interviewed to join the Studio X team as a Karp Library Fellow to do XR research. This was during the time that the pandemic was still pretty bad, and VR provided an escape from the harsh reality around me. It helped my anxiety and allowed me to relax, even if just a little bit while I was immersed in the world of VR.
Ever since that initial experience, I made an effort to introduce as many of my family members as I could to virtual reality.
It was watching my mum try VR on several occasions that really inspired me to explore the topic of user interaction with VR further. Her reactions to the various experiences I had her try made me understand the impact that VR can have on people’s lives. Her first experience was with Beat Saber, which she thoroughly enjoyed due to its catchy songs, but it was Job Simulator that really captured her attention. “I thought it was going to be dumb,” she said. “When I saw you doing it, it looked silly, but when I tried it, it blew my mind. It was a strange experience because it really made me feel like I was in the room.” For me, it was especially funny watching her play Job Simulator. I had to make sure she wouldn’t forget about the guardian boundary, as she kept trying to walk down the virtual hallway when, in reality, she was about to crash into the coffee table. Another interesting thing was how she worried about dropping the virtual coffee cup, because she didn’t want it to break or make a mess on the floor.
While he didn’t try Job Simulator, my dad tried Walkabout Mini Golf. He’s not much of a mini-golf fan, but he was blown away by how realistic the physics were in the game. He said he kept feeling like he was going to fall off the edge of the map and even tried walking from hole to hole (which would have required a lot more space than we have in our living room). “You really don’t know what it’s like until you try it, and when you do, you can see all kinds of applications this technology may have in the future.”
For my grandparents, I of course wasn’t going to have them play Beat Saber, and I didn’t have Walkabout Mini Golf at the time, so I had them watch a few Oculus TV videos.
Having my 96-year-old Great Grandma try VR was quite an interesting experience. She was in awe at the capabilities of the technology and loved the fly-over nature documentary about the ice caps.
My Gran tried a few different parachuting and paragliding videos. “It was amazing to feel like I was there. I feel like I could do paragliding now!”
My Grandpa watched a few shorter space documentaries and was thrilled to be immersed in the galactic environment.
Running a Survey
After seeing the unique reactions from all my relatives, I was curious to know how others felt about their experiences with VR. I had joined several VR-focused Facebook groups to see the kinds of conversations people were having about VR and then decided to run a survey to directly ask the community about their experiences.
With the survey titled “How do Users Experience VR,” I asked a range of questions about age, their perception of VR, what they used it for, and if they had any stories they wanted to share. After about a week and a half of running the survey, I had 282 responses to go through.
One of the things that interested me the most was the age distribution of those who responded. This may be related to the survey being run on Facebook and the demographics of Facebook users, but it doesn’t feel like a misrepresentation: the group was specifically for Oculus Quest users, and Oculus (now Meta Quest) is owned by the same company as Facebook.
The following information is taken directly from the survey results. Some of the quotes have been reformatted a bit for coherency.
This first quote is from a friend who had some previous experience with VR. I had them try VR on my headset before asking them to fill out the survey. Their unique perspective on the potential threat VR poses to society is one that I haven’t seen discussed much elsewhere, which is why I believe it is important to include it here.
Jenna, 21, Non-Binary
“At first, I thought it was super cool, but a little bit scary. As I’ve had more experience with VR games, I still think it is an awesome technology with a lot of potential uses, but I fear that VR video game violence will further desensitize users to violence in the real world. I was at an arcade once and played a VR zombie game and had to ask the worker to stop the game because it felt too much like I was killing real living things. Hence my fear of it desensitizing people to violence.”
I included the next quote because it shows VR being used as a tool for long-distance interaction and also how people in your environment perceive you as you play VR.
Tracie, 49, F
“I saw it as an opportunity to stay connected and play with my friend who lives thousands of miles away from me. I used to live in my RV where there was very little space to play, so I would take my headset to the laundromat and play while doing laundry. Can't tell you how many times people were freaked out by what I was doing. I always tap the headset to bring up passthrough when someone came in and when I started interacting with one guy he was totally weirded out ... ‘you can see me!!!’ ... lol. Yes. Yes, I can. (I was in a closed RV community laundry place with the offices and rec center in the same building -- it was completely safe).”
These next two quotes are particularly interesting as they show the potential uses for the elderly and the health benefits of using VR.
Bonnie, 79+, F
“I became so enthusiastic, it was fun, and I moved my body. I bought a headset and began to realize all the possibilities. I finally got my Quest2 in September and found Fitxr and SuperNatural. I have continued to use my quest 2 every day and have barely explored all the apps. My enthusiasm prompted five other sales among my friends as they noted my weight loss and toning of my body. I never thought that an old person could gain strength and balance. Just thought we went backwards physically. I had given away my cross-country skis and now wish I had them back as I have gained strength in my entire body. My balance has improved so much and although I have “bat wings” on one side of my arms I actually have muscle “bumps” on the other side. I can do step ups - more and more each week or so. I can do squats, as many as 40 at a time. Every day my muscles ache, but I LOVE IT as I realize it is a good ache and I earned it. The technology allows me to socialize with others, visit sites I had traveled to previously and brings back happy memories. The technology allows me not to travel to a gym (not that I would have) and to have privacy.”
M, 75, F
“I am seeing more uses for homebound, elderly... seeing it as a way of connecting friends and family scattered around the country. wonderful experience taking my 85-year-old brother to the top of mount everest! Getting to play golf with a group of women every week. exploring worlds in altspace”
The following quote shows the emotional impact VR can have on people through the experiences it allows one to have.
Sherry, 57, F
“I bought an Oculus for my 11 year old granddaughter for Christmas. She brought me to the kitchen and told me to stand in a spot. She put it on me and told me to close my eyes. When it was on, she told me to open them. I was in a beautiful mountain lodge. Out the window were mountains. I was overwhelmed and began to cry. It was as if I had been transported to my home in the mountains 25 years ago. I could not believe my eyes. Literally! I just kept saying … is this real? I knew immediately that I must get my own Oculus and pretty much immediately ordered one for myself based on that 5 minutes of standing in a room looking at the mountains.”
These next quotes show an optimistic perspective for the future of VR technology.
Anonymous, 59, F
“At first, it was just a music game that I played. I've since added more experiences with various game types, the Multiverse, and more. Rather than this just being a gaming system, I can see a future for business, education, research, social interacting (that can actual involve talking to one another vs just texting), shopping, and so much more!”
Susan, 61, F
“I came to see it could be used for exercise and education and other non-gaming applications. I have come to see that it is a powerful educational tool, particularly for people who are limited, either physically or not able to travel to other parts of the world. Also, I believe it could be used to deepen educational experiences in a variety of ways. I also continue to believe it is probably pretty addictive and should not be used many hours of the day as it is basically an escape and not particularly productive in general. I think it’s a great tool for people who are disabled or otherwise housebound. I have a concern that entering a VR world takes away from the time that people spend outdoors, which in the end is far more important.”
Skye, 45, F
“It was more real than I thought it would be. And I immediately saw the potential applications to things that I cared about - like art, exercise, and experience with others. When my brother bought everyone in the family an Oculus for Christmas, that was a game-changer. I was SHOCKED with how much further the technology had come and am a total convert and trying to get others to get a headset so we can hang out in virtual worlds and other experiences. I now see VR as being something that is relevant for my life now and into the future. I see how it can improve my interactions with family and friends (we spend more time together...especially since Covid and distance limits our in-person opportunities), and it has given me new ideas for how to approach and use it for engagement for my wellness company and clients.”
The next several quotes show the potential therapeutic and mental benefits of using VR.
Audrey, 38, F
“I have ADHD. I was diagnosed at 37 years old, and I have found that the exercise component for VR allows me to keep engaged in a way no other exercises have previously. I still do other types of exercise (such as strength training or hiking) but when I’m not in the mood to workout, the menu of options in VR still brings excitement for me.”
Anonymous, 42, F
“I live in the Midwest, and it is dark by 4 o'clock in winter. In vr, I can hop into real fishing vr and spend time on the lake in sunshine. It doesn't matter that it's not real, your body still relaxes, endorphins are released. It has helped a lot with seasonal affective disorder this year.”
Anonymous, 41, F
“My mom passed away June 2020, she had prefrontal dementia, she slowly lost all her motor skills and eventually mobility. One of the last happy memories that I have with her was me putting the oculus quest on her face and guiding her through a tour of the African Sahara. She actually reacted and reached out to try to touch lions and I swear I saw her smile when she saw elephants. Looking forward to seeing what VR therapy for people with dementia, Alzheimer's and other debilitation can bring in the future.”
Susie, 46, F
“I bought a VR to study the exercise game Supernatural and its effect on learning and motivation for Neurodiverse individuals. (Specifically adhd) I realized pretty quickly that this is the platform of the future. Way beyond games. I see it used for mental health/therapy, exercise, social connection, work interaction, performance/skills enhancing (like public speaking) etc. I’m literally applying for a PhD program so I can study VR some more. It’s changed my life! Supernatural daily has decreased my adhd symptoms tremendously. I feel my brain starting to work better. I can see this tool being an alternative to meds for those who can’t take them.”
This next quote shows how giving VR a second chance can completely change your perspective on the technology.
Anonymous, 24, F
“My first experience was poor. I tried it at the mall when it was fairly new, and it was a video simulation of an amusement park ride. Sitting down, I got a very intense feeling of motion sickness and did not enjoy the video at all. It was a very bland video simulation. Although my 1st experience was bad, I gave it another try at a friend's house. This was a totally different experience compared to my first. I played beat Saber and it was an overwhelming, awe-inspiring time. From that point forward, I began thinking of VR as the future and one of the most advanced types of technology to exist yet. Almost all experience I have had after that has been incredibly immersive and entertaining. I look at VR as an opportunity to take a break from our physical world and enter another world.”
The last quote I leave you with is a pretty cool perspective on how the technology has changed over time, and how it has impacted this person’s life and social interactions.
Gnossos, 65, F
“Early 90s I was hired by a Space Museum to consult on a VR exhibit and traveled to Boston, Chicago and LA to test drive early concepts. First experiences were so bad that I told the Space Museum to hold off on purchasing VR until it was more developed. Oculus Quest's first experience did not disappoint. My perceptions shift with the technology development, of course. I still see it in its infancy - it’s the Pong Era of VR meaning it sucks but we don’t realize it yet. It’s going to be 100 times better in 10 years. I was surprised by having a crush on a guy in Rec Room who played Paint Ball like he was a trained assassin. Crushes are a distant experience for me, so having one with only a voice and a cartoon avatar really surprised me. I think the safety of my anonymous state helped create an openness to flirting that’s not my normal way. It inspired me to wonder more about the potential for intimacy in VR - especially if these spaces were developed by women.”
VR is rapidly growing to be one of the most popular forms of XR. It is estimated that in 2020 nearly one in five people in the US, or 19% of consumers, used VR. Due to this increasing demand, nearly 15 million AR and VR devices are expected to ship to customers worldwide in 2022 (source).
The quotes provided above shine a small spotlight on the many ways that people are being impacted by VR every day. From new ways of socializing to new methods of staying physically and mentally fit, VR has the versatility to benefit everyone in some way, shape, or form. It is this social and emotional impact that has allowed VR to become so popular, as people feel directly connected to the experiences they try while in VR. The ever-present description of VR as the ultimate empathy machine grows more accurate as the technology progresses and the range of possibilities expands.
Education from the sciences to the humanities, job training, interpersonal relationships, concerts, work meetings: all of these areas can benefit, and already are benefiting, from VR technologies. More and more people are being exposed to VR every day, and soon enough, it will become a household staple, much like cellphones and TVs. And why? Because of the ways we as users experience VR. It is the consumer perspective that shapes the industry, which is why it is so important to understand why people react the way they do to these technologies.
I personally believe that VR has shaped my perspectives on the world in ways I wouldn’t have been able to imagine due to some of the experiences I have tried. VR has opened my mind to new perspectives on personal space, human interaction, disabilities, and even the way I view myself as a person existing in the real world versus in the digital one.
VR has a unique ability to change perspectives and influence emotions, and it is up to the people using it to decide what path VR ultimately goes down.
Mary Ann Mavrinac, vice provost and dean of the University of Rochester Libraries, shares insight into how the campus community directed the development of Studio X, the library’s new extended reality hub featuring advanced technology and expert training.
“I don’t believe in ‘if you build it, they will come.’ You can build something, but they won’t come if you don’t know what your users want,” Mavrinac said. It’s the guiding principle she and her team followed throughout the ideation and planning of the library’s new high-tech hub, Studio X. Located on the first floor of the Carlson Science and Engineering Library, the 3,000-square-foot space allows students and faculty to participate in immersive learning experiences.
Equipped with technology that supports virtual reality (VR), augmented reality (AR), and everything in between (collectively, extended reality or XR), Studio X allows researchers to perform tasks such as visualizing large data sets and safely experimenting with hazardous materials in a virtual environment. Studio X broadens the range of possibilities for discovery and instruction, but what makes it truly special is its source of inspiration. CannonDesign collaborated with the university to design a facility that the campus community not only requested but also intimately shaped. From inception to completion, student and faculty preferences were integrated with expert knowledge to deliver a space tailored to serve the entire campus community.
We spoke with Dean Mavrinac to learn more about the process and impact of the project. She wanted to underscore that the success, to date, of Studio X is a team effort, much of it led by Digital Scholarship and Studio X director, Emily Sherwood.
There aren’t many academic libraries that offer a space like Studio X. What is it, and how did the project begin?
The project began in fall 2017 when Lauren Di Monte joined our team and learned from the faculty that there was a lot of research activity in extended reality and other immersive technologies. We thought it was something the library could get involved in since we had close to 50 researchers engaged in XR technologies. So, we set out to better understand that landscape and how the researchers would engage with any initiative we developed, whether it was a space or specialized expertise. We knew a generic CAVE wouldn’t work for them, so we thought about what we may be able to do to help them tackle specific research questions. As it turned out, we pivoted to a space and service that would provide an easy on-ramp to those less familiar with these technologies and related needs.
Today, Studio X is a collaborative hub for extended reality where students and faculty are immersed in learning and teaching in ways that just aren’t possible without advanced technology. It’s a high-tech space that allows exploration, experimentation and experience that truly brings education to life.
What was the goal of Studio X? Who is it for?
The overall goal was to offer physical space, a program, services, technology and expertise that students and faculty needed—and expertise was really big. The user research told us that they wanted a space and experts in the space to teach them how to use and apply the technologies. We approached this goal by providing an on-ramp that made it easy for people to gain access to and experience with XR technologies.
Whether a person is an advanced researcher or a novice user, we’re good at helping people feel comfortable to explore their questions. The library is an interdisciplinary crossroads at the university, so it could be someone studying history, biomedical engineering, neuroscience, religion, ethics—whatever it is—if they’re interested in using XR technologies, we provide the support they need to feel welcome.
Summary: This post reviews resources on XR (extended reality) and accessibility and summarizes best practices for centering accessibility when engaging with these technologies.
Technology in general creates many barriers for disabled users, and as XR technologies rapidly grow in popularity, they exacerbate these challenges. When creating an XR product, whether that be a VR (virtual reality) headset or an AR (augmented reality) game, people tend to think more about their product’s aesthetic or its usability for the average user. What people fail to remember is that not every user will be “the average user.” The world is a diverse place, with people of all ages, genders, races, and abilities, and when creating XR, it is important to keep this diversity in mind. XR accessibility is itself a new area and a moving target; many developments are in the works, so these resources may be outdated in just a year’s time.
Before we dive into XR, let’s first define some terms: What are Accessibility and Universal Design?
Accessibility is the ability to access something and be able to benefit from its intended purpose. It sometimes refers to specific characteristics that products, services, and facilities have that can be used by people with a variety of disabilities.
Accessible Design is a design process that specifically considers the needs of people with disabilities.
Universal Design is the process of creating products that are accessible to people with a wide range of abilities, disabilities, and other unique circumstances.
XR Access
XR Access is a community committed to making virtual, augmented, and mixed reality accessible to people with disabilities. Their mission is to modernize, innovate, and expand XR technologies, products, content and assistive technologies by promoting inclusive design in a diverse community that connects stakeholders, catalyzes shared and sustained action, and provides valuable, informative resources.
The site provides a plethora of materials for those interested in their efforts. Their research network shares valuable information about accessibility research happening across the network. They have workstreams, which are community-led efforts to inform the design, development, and production of accessible XR. They also offer a wide variety of other resources to aid people in their own research, including their annual XR Access Symposium reports (see below for more about the symposium). XR Access also curates stories of disabled folks who have used technology both successfully and unsuccessfully, to help advocate for accessible XR technology. Those interested can sign up for their newsletter or join their robust Slack community.
Accessibility Needs of Extended Reality Hardware: A Mixed Academic-Industry Reflection
This journal publication walks the reader through the reasoning behind the need for accessible XR hardware and software. After explaining the benefits of XR, the authors show why the accessibility movement should start with hardware: if a user cannot wear a headset, they cannot experience its software. The 2019 XR Access Symposium allowed many people to connect and build on their individual ideas, which helped them establish goals for XR hardware accessibility. They identified a need to understand related fields’ accessibility guidelines, determine the most pressing obstacles, consider industry guidelines, and increase public awareness of the issues at hand. With those needs in mind and a community-centered approach, the authors believe the lack of accessible XR hardware can be overcome.
Barriers to Supporting Accessible VR in Academic Libraries
Although XR technologies offer new opportunities to engage students, they also present more challenges for disabled students. Technology, in general, already tends to exclude these users, and XR’s rapid rate of development further complicates things. The article shares statistics as of 2019 from the U.S. Department of Education’s National Center for Education Statistics: “19.4% of undergraduates and 11.9% of graduate students have some form of disability.” The authors argue that academic libraries, as leaders in supporting and sharing new technologies, are well poised to address accessibility challenges for XR and must create clear policies and service models that support all users. While no clear accessibility guidelines currently exist, several promising initiatives, such as the XR Access Symposium, are working towards this goal. They detail two accessibility initiatives occurring at Temple University and at the University of Oklahoma. The authors then conclude with a list of key takeaways:
Plan for Accessibility from the Beginning: Libraries can save time and resources by thinking about accessibility issues at the start of a program or project.
Lack of Standards: As of 2020, there are no standards for accessible VR design, but there are related standards that could lay the groundwork for their development.
Developer Support is Essential: Libraries that intend to develop VR experiences need to have sufficient developer support with accessibility expertise.
Importance of Auditing and Reporting: Out-of-the-box VR experiences will pose different accessibility challenges from one person to the next and should be audited to better understand these barriers to access. If a library lacks a developer to modify software or create new software, at the very least, available software needs to be audited and have a corresponding accessibility report produced.
VR is Not the Pedagogy: VR should be another tool in an educator’s arsenal, not the sole focus of a class (unless VR is the course subject). As Fabris et al. (2019) suggest “Having VR for the sake of having VR won’t fly; the VR learning resources need to be built with learning outcomes in mind and the appropriate scaffolds in place to support the learning experience” (74).
Acknowledge the Limits of VR Accessibility: There are limits to making VR accessible. The reality is that there will be students who are unable to use VR for a variety of reasons. Therefore, there should always be an alternative access plan developed so that students have access to non-VR learning methods as well.
XR Accessibility Initiatives in Academic Libraries
As libraries traditionally take the lead in accessibility initiatives, the authors surveyed academic libraries to examine the accessibility of their digital resources. Three questions were sent to various academic libraries, and responses were received from 30 universities:
Question 1: What is the level of development of accessibility support for XR technologies in academic libraries?
The majority of institutions surveyed did not have policies or dedicated staff to support the accessibility of XR resources.
Question 2: What XR accessibility knowledge do library staff and administrators currently have?
Nearly all participating spaces had some awareness of the challenges that XR presents and were able to find resources to assist when needed.
Question 3: What are the main barriers to developing accessibility support for XR technologies in academic libraries?
The top three barriers to developing accessibility policies and processes were lack of staff knowledge, lack of funding, and lack of time.
The concluding result was that XR accessibility in academic libraries is still developing, so policies and staff are not yet in place. Many institutions, however, plan to begin implementing strategies soon.
DLFteach Toolkit: Lesson Plans on Immersive Pedagogy
The Digital Library Federation (DLF) has put together a toolkit of lesson plans that facilitate interdisciplinary work engaged with XR technologies. The toolkit is focused on a decolonial, anti-ableist, and feminist pedagogical framework for collaboratively developing and curating humanities content for emerging technologies.
The introductory materials section of the toolkit contains three particularly useful resources: recommendations for accessible pedagogy with immersive technology, an immersive technology auditing checklist, and instructions on how to create an equally effective alternative action plan for immersive technologies.
Recommendations for Accessible Pedagogy with Immersive Technology – serves to provide background on the increasing need to create educational resources for disabled learners. The list of materials provided is intended to guide educators on how to incorporate immersive technologies into their teaching while keeping disabled learners in mind. It is split into three sections: accessibility and disability, readings on the accessibility of immersive technologies, and recommended administrative considerations. It ends with a series of questions to keep in mind when teaching.
Immersive Technology Auditing Checklist – serves to identify and document the various challenges of making immersive technologies accessible. It divides the workflow into three steps: purchasing software and hardware, providing technical support for software and hardware, and ensuring user access to software and hardware. The checklist then walks you through a series of important questions when considering each phase of the process, posing questions such as “What hardware is required?” and “Is there an accessibility page for the software?” It also dives into questions about ease of operation and perception, asks about the robustness of the technology, and asks about any documentation about the technology.
Creating an Equally Effective Alternative Action Plan for Immersive Technologies – serves to instruct the reader on how to create an Equally Effective Alternative Action Plan (EEAAP). An EEAAP is a document that is used when there is an accessibility barrier in a technology (i.e. when a technology is unable to be used by a person or group with a disability). The components of an EEAAP are a description of the issue, the person or group affected, the responsible faculty, how the EEAA will be provided, the additional EEAA resources required, repair information, and a timeline for unforeseen events. Some examples of EEAAPs are listed at the end of the resource.
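The components above amount to a simple record structure. As a minimal sketch (the field names are my own; the DLF resource describes the components only in prose, not as a schema), an EEAAP entry could be modeled like this:

```python
from dataclasses import dataclass, field

@dataclass
class EEAAP:
    """One Equally Effective Alternative Action Plan record.

    Field names are illustrative, mirroring the components the
    DLF resource lists in prose; they are not an official schema.
    """
    issue_description: str                 # the accessibility barrier
    affected_group: str                    # the person or group affected
    responsible_faculty: str               # who is responsible for the alternative
    eeaa_delivery: str                     # how the EEAA will be provided
    additional_resources: list = field(default_factory=list)  # extra EEAA resources required
    repair_info: str = ""                  # repair information for the barrier
    contingency_timeline: str = ""         # timeline for unforeseen events
```

Keeping each component as an explicit field makes it easy to check that no part of the plan was left blank before it is filed.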
Exploring Virtual Reality Through the Lens of Disability
This resource comes directly from the DLF Toolkit. It provides a lesson in an interdisciplinary approach to introducing VR immersion through the lens of disability studies. The authors do not aim to represent how all people experience disability; rather, they aim to create an activity that includes discipline-specific theory and criticism. They then describe the different types of VR: cinematic VR uses filmmaking techniques; simulation VR simulates the real and the fictional while the user is an active participant; representational VR creates immersive experiences through sensory embodiment; and therapeutic VR is designed for various treatments.
The resource then becomes an instructional guide on how to try several disability-related experiences. They recommend the audience, curricular context, learning outcomes, materials needed, how to prepare for the experiences, and provide a long list of sample instructions. Following this, they list several important applications they recommend trying: Notes on Blindness, The Party, and InMind VR. Each experience is paired with a plethora of questions and other external resources they found to be relevant.
Notes on Blindness – This experience tells the story of a man who lost his sight and how he coped by keeping an audio diary. For three years, he recorded over sixteen hours of material.
The Party – A VR film by The Guardian that allows you to enter the world of an autistic teenager who is at a surprise birthday party. You will hear her internal thoughts about how the experience affects her and share the sensory overload that leads to a meltdown.
InMind VR – A short adventure that allows the user to journey into a patient’s brain and search for the neurons that cause mental disorder.
When you are in the beginning stages of creating something in an XR medium, whether a device or an experience, it is important to keep in mind the various factors that might make it less accessible. Accessibility barriers can stem from differences in motor function, sensory ability, or wealth and social standing.
VR has a plethora of positive features that could benefit differently abled users, such as the ability to enhance spatial sound on one side of the body, render visuals with higher contrast, and enable those in wheelchairs to experience what it would feel like to “walk around” in VR. However, as with any technology, VR also presents many accessibility challenges, such as the heavy emphasis on motion controls, the use of the body to control many experiences, and the requirement to stand during some experiences.
Considering these and other challenges, here are some things to keep in mind while trying to make XR design more inclusive:
Hardware – What equipment do people need to participate in a VR environment? Is a standalone headset and controllers all that’s required? Or is there some form of special equipment or a computer to run the experiences also needed?
Navigation and Interfaces – How understandable is the XR environment? If a user had no context or guidebook upon entering the space, would they know what to do and how to interact? Either label things clearly or make a guide or some form of instructions available. This could be an avatar that appears to give instructions along the way, an instruction dialog box, or a guidebook shipped with your product.
Communication – How are speech and body language communicated? Do you have an avatar that represents you in an environment? Is there full body tracking, or does your avatar just float from place to place? Do you speak using a microphone, or are there pre-written text options to choose from? Is captioning available?
Customization and Interoperability – Allow users to customize the XR environment to their needs. Can you enable color contrast? Can you toggle on and off captioning when needed? Are there a variety of sound options?
Avatars and Embodiment – Make sure that there are a wide range of options so people can feel accurately represented. Is there a wide range of skin tones, hair colors, hairstyles, clothing, etc. that will enable any person from anywhere in the world to feel as if they are properly represented in the VR space?
Try out the space yourself and see if it works from several perspectives of ability: seated, standing, with sound, without sound, etc. Think about the users you want to be able to access the device and try to see it from their perspective. Another approach is to run testing sessions where differently abled people try out your device or program and offer feedback.
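The customization questions above boil down to a handful of user-facing toggles. A minimal sketch, assuming hypothetical setting names (no particular engine or SDK is implied):

```python
from dataclasses import dataclass

@dataclass
class XRAccessibilitySettings:
    """Illustrative accessibility toggles for an XR experience."""
    captions_enabled: bool = False     # toggle captioning on and off
    high_contrast: bool = False        # render visuals with higher contrast
    seated_mode: bool = False          # never require the user to stand
    one_handed_controls: bool = False  # remap inputs onto one controller
    spatial_audio_boost: str = "off"   # enhance sound on one side: "off", "left", "right"

def apply_profile(settings: XRAccessibilitySettings, profile: dict) -> XRAccessibilitySettings:
    """Overlay a saved user profile onto the defaults, rejecting unknown keys."""
    for key, value in profile.items():
        if not hasattr(settings, key):
            raise KeyError(f"unknown accessibility setting: {key}")
        setattr(settings, key, value)
    return settings
```

Persisting such a profile per user means people only have to configure their needs once, rather than in every new experience.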
An Accessible Future – XR: Considerations for Virtual, Mixed, and Augmented Reality
There are many XR applications for the workplace, such as virtual orientation events and training sessions. Imagine being able to attend a conference with people from all over the world using VR: you could still get the experience of being among professionals in your field without ever having to leave your home or office. For example, the XR Access Initiative used VR during its annual symposium to foster engagement. They created virtual rooms that conference participants could explore and interact with, held virtual demonstrations, and provided captioned rooms and rooms with ASL interpreters.
The XR Access Initiative emphasizes three key accessibility factors for virtual conferences: captions, sign language communication, and keyboard and screen reader usage.
Captions – Captions should follow a user and be legible regardless of the angle from which they view the environment.
Sign Language – Sign language interpreters should be located in high visibility areas, and those who need interpreters should be able to get easy access to them.
Screen Reader/Keyboard – Those who are unable to or do not wish to attend in VR should be able to interact with the space in the same way a person in VR could, though with simplified controls. Cross-platform capability is therefore important.
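The caption requirement, staying legible from any viewing angle, is usually met by "billboarding": rotating the caption panel every frame so it faces the viewer. A minimal sketch of the underlying math (illustrative only, not any particular engine's API):

```python
import math

def caption_yaw(caption_pos, viewer_pos):
    """Return the yaw (radians, about the vertical axis) that turns a
    world-space caption panel to face the viewer, so the text stays
    legible no matter where the viewer stands. Positions are (x, y, z)
    tuples in a y-up coordinate system.
    """
    dx = viewer_pos[0] - caption_pos[0]
    dz = viewer_pos[2] - caption_pos[2]
    return math.atan2(dx, dz)  # yaw only: the panel stays upright
```

Re-evaluating this each frame, and anchoring the caption a fixed offset from the user, gives the "captions that follow you" behavior the symposium used.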
This virtual symposium showcases how VR can make conferences and other virtual events accessible to many people.
Why VR/AR Developers Should Prioritize Accessibility in UX/UI Design
An important point this article touches on is how a lack of accessibility in VR can make people feel left out or ignored. For example, the easier it is for people to understand a game, the more likely they are to play it. Some things you might not think about in inclusive design are different hair types or conditions like arthritis. If you have long hair in a ponytail or buns, or even fluffy hair, putting on a headset can be difficult, since you have to rearrange your hair to get the headset on. People with arthritis may need to sit down in the middle of a game, or their fingers and hands may get sore after a time. Making controls easy to change in the middle of a game or experience would be very helpful in these cases. Ways to make VR more accessible for glasses wearers could include adjustable vision settings or better glasses adapters for current headsets.
It is hugely important to have a diverse group of people in your testing groups to ensure that people of all genders, ethnicities, abilities, socio-economic backgrounds, and other identities are able to interact with your product with ease. It may be impossible to accommodate every unique circumstance, but taking the diverse voices of others into consideration while making your product will ensure a better end result. While it may take a little more time to make sure everyone is included, the end design will be more profitable and beneficial to a larger community, which is what matters most.
This link is to the proceedings of the International Conference on Computers Helping People with Special Needs (ICCHP). The 2020 ICCHP proceedings have a section on XR and accessibility, with several articles covering a wide range of subjects: vocational training for students with disabilities, AR for people with low vision, guidelines for inclusive avatars, and more.
This is Oculus’ guide for developers on how to create with accessibility in mind. The Accessibility VRCs (Virtual Reality Check guidelines) focus on audio, visuals, interactions, locomotion/movement, and other aspects of accessible design. By enforcing these guidelines, Oculus ensures that every application officially available on its platform meets certain accessibility requirements, something that may make the platform usable for more people. Link to the VRC Webpage: https://developer.oculus.com/resources/publish-quest-req/
Initiative aims to make virtual, augmented, and mixed reality accessible
This article links to a webinar about a new initiative to make XR accessible to more people. Larry Goldberg, Senior Director and Head of Accessibility at Verizon Media, discusses emerging technologies and how his company deals with this technological growth. The webinar highlights how existing technologies can serve as a jumping-off point for creating new technologies that are accessible from the beginning, or as Larry Goldberg says, are “born accessible.”
The XRA’s (XR Association’s) developer guide serves as a starter resource for developers looking to create XR experiences. The guide offers a series of industry-backed best practices to developing accessible platforms.
This is Microsoft’s project that considers how to design mixed reality technologies in a way that makes them usable and useful to people of all abilities. This webpage links to those involved with the project, publications, and other news surrounding their efforts.
A custom locomotion driver for Steam VR applications introduces four new features for those with disabilities. The four features – virtual move, motion range boost, hand tracking, and Xbox controller move – can be adjusted to an individual user’s needs on the fly.
Virtual move allows players to use their controllers’ joystick to move, rather than having to physically move their arms.
Motion range boost changes the origin point of motion controllers to amplify movement. It translates a small movement into a large one.
Hand tracking allows the position of motion controllers to be emulated based on hand movements rather than having to use actual controllers.
Xbox controller move allows users to use a gamepad to emulate VR controller inputs.
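Of these four features, motion range boost is the most mathematical: it rescales tracked controller positions about a chosen origin so that a small physical movement becomes a large virtual one. A sketch of that idea (illustrative only, not the driver's actual code):

```python
def motion_range_boost(raw_pos, origin, gain):
    """Amplify a tracked controller position about an origin point.

    raw_pos and origin are (x, y, z) tuples in meters; gain > 1
    turns small physical movements into larger virtual ones.
    """
    return tuple(o + gain * (r - o) for r, o in zip(raw_pos, origin))
```

With gain=2.0, a controller 0.5 m from the origin is rendered 1.0 m from it, letting users with a limited range of motion reach the whole virtual workspace.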
Project Tokyo is a Microsoft initiative that aims to help members of the blind and low-vision community with intelligent personal-agent technology that leverages AI to extend their capabilities. The long-term goal of the project is to show that this XR technology can be used by anyone and can also assist those with disabilities. The focus is to create a way for those who are blind or have low vision to perceive the world in a way similar to how sighted people do.
They provide several examples throughout the article. For instance, they demonstrate the device’s ability to notify a user that someone is looking at them: if the wearer turns in the direction of another person, the AI identifies that person by name for the wearer. One blind team member explains, “Whenever I am in a situation with more than two or three people, especially if I don’t know some of them, it becomes exponentially more difficult to deal with because people use more and more eye contact and body language to signal that they want to talk to such-and-such a person, that they want to speak now. It is really very difficult as a blind person.” Social cues, whether conveyed verbally or physically, are essential for interaction. Rather than starting from scratch, the team is using a modified Microsoft HoloLens, as the HoloLens provides the AI with the essential information for reading the environment.
Accessibility is a major priority for those in education fields. Approximately 15% of the world’s population has some form of disability, and one in four adults in the US has a disability that affects “major life activities.” As VR evolves, it provides a whole new range of opportunities and experiences for many people. For example, many visually impaired users can actually see better in VR due to the depth perception headsets provide. Moving forward, VR creators should consider the wide-ranging needs of users from the beginning of the development process.
Microsoft has developed several XR products with accessibility in mind:
Canetroller [Link] – The Canetroller, a Microsoft-patented haptic device, works as a white cane that visually impaired people can use to experience a virtual environment.
Seeing VR [Link] – SeeingVR is a series of tools to make VR more accessible to those with low vision. The tools include a magnification lens, a bifocal lens, a brightness lens, a contrast lens, edge enhancement, peripheral remapping, text augmentation, text to speech, depth measurement, and more.
Braille Controller [Link] – The Microsoft-patented, braille-displaying controller attaches to the back of an Xbox controller, allowing for an alternative way for the visually impaired to experience games. The inspiration for this particular project was to make text-heavy video games more accessible to the visually impaired.
Hospitals are beginning to use VR to find new ways of relieving pain and offering palliative care to patients. While no existing technology can restore someone’s sight, tools such as IrisVision [https://irisvision.com/] can assist those living with visual impairments by providing vision-aid features, a personal voice-command assistant, a text-to-speech reader, and high-contrast fonts. Researchers are also studying whether AR devices could help those who suffer from age-related macular degeneration.
The article also links to a variety of informational videos and links to accessibility groups and associations.
Inclusivity of VR and AR Accessibility for the Visually and Hearing Impaired
There are a plethora of companies working on applications to enhance the experiences of differently abled users, and this article highlights a small sample of those projects. Microsoft has created the “Canetroller,” which allows a blind or visually impaired person to access virtual reality through a controller that resembles a white cane and uses haptic and audio feedback. Nearsighted VR Augmented Aid is an Android application that uses a mobile device’s camera to display images in stereoscopic view. London’s National Theatre did something similar with the help of Epson’s latest smart glasses, which display subtitles in the user’s field of vision so that even if a viewer looks away, they can still see the subtitles. There are many more projects linked in the article.
While the concept of virtual reality (VR) is not new, VR in practice has only become ubiquitous in recent years. Due to an increase in media exposure, new technology developments, and an explosion of use cases, VR is swiftly becoming an in-demand medium for a wide variety of users. This article will give you a look into the past, the present, and the future of VR, using the novels Ready Player One and Ready Player Two by Ernest Cline as a framework.
Let’s start out with a definition. Virtual reality, or VR, is an immersive experience, also known as a computer-simulated reality. Wearing a VR headset, the user is immersed in an experience that has images and sounds that can either replicate the real world or create an imaginary one. Some examples of popular VR headsets are the line of Oculus headsets from Facebook (now Meta), the Quests and the Rifts. Other common VR platforms include the HTC Vive, the Valve Index, and the Pico VR.
The Ready Player duology, consisting of Ready Player One (2011) and Ready Player Two (2020), is one of the most mainstream works to discuss VR technology in great depth. The novels tell an exciting story narrated by Wade Watts, a teen living in the year 2045 on a dystopian future Earth. He and many others use the OASIS, a virtually simulated utopia, to escape a boring, impoverished life and the declining state of the planet. Within the novels, the OASIS is used to play games, watch videos, hang out with friends, attend class, go to work, and so much more. The first novel focuses mainly on haptic immersion technology, while the second shifts to full dive technology. As the books are set in a not-so-distant future centered on emerging technology we are beginning to see, they provide an interesting framework from which to consider its potential impact.
Let’s talk more in-depth about two main categories for this technology:
Haptic immersion is the branch of VR that relies heavily on haptic sensory technology to give the user a physical feeling, such as a vibration, while experiencing VR. By using haptic gloves and, in some cases, haptic vests or suits, this branch of VR allows a user to put on a headset and experience VR with more realism. Haptic immersion also relies heavily on the physical motions a player makes, sometimes incorporating a treadmill or slide pad (a slippery surface to run on with low-traction shoes).
The haptic sensory technology allows the user to feel a new dimension of sensory input from the games they are playing. Some simple applications of haptics might be allowing the user to feel as if they are actually holding objects or allowing more player interaction, such as tapping another avatar on the shoulder. Imagine you are playing a first-person shooter game and that you are attacked by an enemy from behind. If you are playing on a computer, the damage input would typically be shown around the center screen crosshair with the direction of the damage indicated with an arrow or line. By playing with haptic immersion, however, you would be able to feel the exact spot that the damage is dealt, giving you the ability to react more quickly to the attacks and providing for a more immersive experience.
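The directional damage feedback described above reduces to mapping an attack direction onto the nearest actuator in the vest. A toy sketch of that mapping (the ring layout and angle convention are my own assumptions; real vests expose vendor SDKs rather than this function):

```python
def actuator_for_direction(attack_angle_deg, num_actuators=8):
    """Pick which actuator on a ring of num_actuators (evenly spaced
    around the torso) should fire for a hit arriving from
    attack_angle_deg, where 0 degrees is directly behind the player
    and angles increase clockwise."""
    sector = 360.0 / num_actuators
    shifted = (attack_angle_deg % 360.0) + sector / 2  # center sectors on actuators
    return int(shifted // sector) % num_actuators
```

An attack from directly behind fires actuator 0; one from the player's side lands on the actuator a quarter of the way around the ring, which is what lets the player feel where the hit came from without looking.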
The first description of haptic immersion we see in Ready Player One is most similar to the technologies we currently have available today.
“The wireless one-size-fits-all OASIS visor was slightly larger than a pair of sunglasses. It used harmless low-powered lasers to draw the stunningly real environment of the OASIS right onto its wearer's retinas, completely immersing their entire field of vision in the online world. The visor was light-years ahead of the clunky virtual-reality goggles available prior to that time, and it represented a paradigm shift in virtual-reality technology--as did the lightweight OASIS haptic gloves, which allowed users to directly control the hands of their avatar and to interact with their simulated environment as if they were actually inside it. When you picked up objects, opened doors, or operated vehicles, the haptic gloves made you feel these nonexistent objects and surfaces as if they were really right there in front of you. The gloves let you, as the television ads put it, ‘reach in and touch the OASIS.’ Working together, the visor and the gloves made entering the OASIS an experience unlike anything else available, and once people got a taste of it, there was no going back.”
-Ernest Cline, Ready Player One, Page 58
Then, we have descriptions of things that still lie in our future.
“I spent the majority of my time in my Shaptic Technologies HC5000 fully adjustable haptic chair. It was suspended by two jointed robotic arms anchored to my apartment's walls and ceiling. These arms could rotate the chair on all four axes, so when I was strapped in to it, the unit could flip, spin, or shake my body to create the sensation that I was falling, flying, or sitting behind the wheel of a nuclear-powered rocket sled hurtling at Mach 2 through a canyon on the fourth moon of Altair VI. The chair worked in conjunction with my Shaptic Bootsuit, a full-body haptic feedback suit […] The outside of the suit was covered with an elaborate exoskeleton, a network of artificial tendons and joints that could both sense and inhibit my movements. Built into the inside of the suit was a weblike network of miniature actuators that made contact with my skin every few centimeters. These could be activated in small or large groups for the purpose of tactile simulation--to make my skin feel things that weren't really there. They could convincingly simulate the sensation of a tap on the shoulder, a kick to the shin, or a gunshot in the chest. (Built-in safety software prevented my rig from actually causing me any physical harm, so a simulated gunshot actually felt more like a weak punch.) I had an identical backup suit hanging in the MoshWash cleaning unit in the corner of the room. These two haptic suits made up my entire wardrobe. My old street clothes were buried somewhere in the closet, collecting dust. On my hands, I wore a pair of state-of-the-art Okagami IdleHands haptic datagloves. Special tactile feedback pads covered both palms, allowing the gloves to create the illusion that I was touching objects and surfaces that didn't actually exist.”
-Ernest Cline, Ready Player One, Pages 191-192
As mentioned before, there is nothing quite like this on the market yet, and much of what comes close is not readily available for mass consumption, either because of high prices or because production is focused mainly on high-end business use. The cost of producing high-end haptic technology puts it well out of the price range of the average consumer. One such example comes from the company HaptX.
HaptX specializes in industrial-grade haptic technology and is miles ahead of other companies thanks to its patented microfluidic systems. With a combination of microfluidic skin, force-feedback exoskeletons, magnetic motion tracking, and powerful pneumatics, it is the only company capable of providing true haptics (for now!). The company’s main focus is its haptic gloves, which are used across a variety of industries, from aircraft manufacturing to firefighting. While these haptic gloves are an impressive technological achievement, the majority of the other technologies mentioned in the novel still lie in the future for the real world.
While we don’t yet have high-end haptic chairs suspended from the ceiling like in Ready Player One, some basic haptic suits are on the market. The current main provider of haptic suits is bHaptics, which offers a range of haptic technologies: vests with varying numbers of feedback points, haptic sleeves, haptics for VR HMDs, and haptics for hands (not gloves) and feet. And they’re pretty pricey. The “cheapest” haptic vest, with only 16 feedback points, comes in at $299, and the most expensive, with PC compatibility and 40 feedback points, comes in at $549. If you’re interested in the full experience, you’re looking at a price of about $1,400+, and that’s still not a proper full haptic suit.
Another company, TESLASUIT (not affiliated with Tesla), has developed a full-body haptic suit, though it is only for industry use. Considering the cost and inaccessibility of these technologies, it’s not hard to see why discussions about haptics haven’t broken the internet yet.
Haptic immersion uses these technologies in coordination with VR headsets to bring users a more physically engaging VR experience. Some future potential applications include military training and physical therapy. Military training in VR might expose soldiers to more dangerous scenarios without the worry of facing actual danger: the haptic technologies would allow them to feel the consequences of their actions without the setback of a lasting injury. Physical therapy in VR would allow for a more engaging experience, letting the patient choose a fun environment to transport themselves away from the doctor’s office for the duration of their visit. If haptic suits could be combined with some form of hard robotic exoskeleton, physical therapy could become more exciting and engaging; the exoskeleton could assist with stretches a patient cannot complete on their own, helping them toward a potentially quicker recovery. These possible applications give this branch of VR a future that may change the way we view physical activity and training.
Full dive is a branch of VR that has had far more media exposure. At its core, full dive is VR for your mind. It relies primarily on a brain-computer interface (BCI) to allow a user to control the virtual world with their thoughts. The technology is still being researched, and there have been no real instances of it yet. In theory, a true full dive would allow a user to put on a VR headset with BCI capabilities; the headset would intercept their brain signals, allowing the user to become their virtual avatar while in the simulation. Imagine the possibilities this technology could present: reduced travel costs, more immersive educational experiences, visiting any place or any time period, or working a job you would never be able to do otherwise. Ernest Cline engages with this concept in his book Ready Player Two:
“The device had a segmented central spine that appeared to stretch from a wearer's forehead to the nape of their neck, with a row of ten C-shaped metal bands attached to it. Each band was comprised of jointed, retractable segments, and each segment had a row of circular sensor pads on its underside. This made the whole sensor array adjustable, so that it could fit around heads of all shapes and sizes. A long fiber-optic cable stretched from the base of the headset, with a standard OASIS console plug at the end of it. […] ‘The device you now hold in your hands is an OASIS Neural Interface, or ONI.’ He pronounced it Oh-En-Eye. ‘It is the world's first fully functional noninvasive brain-computer interface. It allows an OASIS user to see, hear, smell, taste, and feel their avatar's virtual environment, via signals transmitted directly into their cerebral cortex. The headset's sensor array also monitors and interprets its wearer's brain activity, allowing them to control their OASIS avatar just as they do their physical body--simply by thinking about it’”
-Ernest Cline, Ready Player Two, Pages 5-6
The term “full dive” was popularized by the anime franchise Sword Art Online (2012), though the concept was brought to mass media much earlier in The Matrix, released in 1999. The Matrix revolves around the main character, Neo, learning that the entire world is a simulation created to keep humans complacent while AI harvests their bodies for energy. In Sword Art Online, the main character is trapped in a virtual MMO (massively multiplayer online game) by the creator of the device, along with all the other players: the creator removed the log-out option, and the only way to escape the simulation is to beat the game.
“Its telescoping bands retracted automatically, pressing the array of sensor and transmitter pads mounted on them firmly against the unique contours of my cranium. Then its metal joints tightened up and the whole spiderlike device locked itself onto my skull so that its pads couldn't be jostled or removed while the device was interfacing with my brain. According to the ONI documentation, forcibly removing the headset while it was in operation could severely damage the wearer's brain and/or leave them in a permanent coma. So the titanium-reinforced safety bands made certain this couldn't happen. I found this little detail comforting instead of unsettling. Riding in an automobile was risky, too, if you didn't wear your seatbelt … The ONI documentation also noted that a sudden power loss to the headset could also cause potential harm to the wearer's brain, which was why it had an internal backup battery that could power the device long enough to complete an emergency logout sequence and safely awaken the wearer from the artificially induced sleeplike state it placed them in while the headset was in use.”
-Ernest Cline, Ready Player Two, Page 9
It is this aspect of the technology that Sword Art Online, The Matrix, and Ready Player Two all highlight over the course of their respective narratives, and it is what makes some people (myself included) skeptical about the progression toward this technology. These narratives show the dangers the technology could pose if a maniacal creator or AI were to take advantage of the BCI to trap users inside the simulation.
The potential ethical issues that arise with any type of VR, especially BCI-enhanced VR, are innumerable. When approaching full dive, or really any technology, both creators and users must keep in mind the potential negative outcomes that may arise from its creation. The applications of full dive in conjunction with accessibility research are presented at the beginning of Ready Player Two:
"A few months after GSS launched the OASIS, Halliday set up an R&D division at the company called the Accessibility Research Lab. Ostensibly, its mission had been to create a line of neuroprosthetic hardware that would allow people with severe physical disabilities to use the OASIS more easily. Halliday hired the best and brightest minds in the field of neuroscience to staff the ARL, then he gave them all the funding they would ever need to conduct their research. The ARL's work over the next few decades was certainly no secret. To the contrary, their breakthroughs had created a new line of medical implants that became widely used. I’d read about several of them in my high school textbooks. First, they developed a new type of cochlear implant that--for those who chose to use it--allowed the hearing impaired to perceive sound with perfect clarity, both in the real world and inside the OASIS. A few years later, they unveiled a new retinal implant that allowed any blind people who wished to be sighted to "see" perfectly inside the OASIS. And by linking two head-mounted mini cameras to the same implant, their real-world sight could be restored as well. The ARL's next invention was a brain implant that allowed paraplegics to control the movements of their OASIS avatar simply by thinking about it. It worked in conjunction with a separate implant that allowed them to feel simulated sensory input. And the very same implants gave these individuals the ability to regain control of their lower extremities while restoring their sense of touch. They also allowed amputees to control robotic replacement limbs, and to receive sensory input through them as well.
To accomplish this, the researchers devised a method of "recording" the sensory information transmitted to the human brain by the nervous system in reaction to all manner of external stimuli, then compiled these assets into a massive digital library of sensations that could be "played back" inside the OASIS to perfectly simulate anything a person could experience through their senses of touch, taste, sight, smell, balance, temperature, vibration--you name it."
-Ernest Cline, Ready Player Two, Pages 15-16
Being able to give people with disabilities the ability to participate in VR would be revolutionary, as the industry today struggles to create inclusive designs. With full dive and BCIs, there would be little to nothing preventing anyone from experiencing the technology. Despite these positive applications, it is still important to think about the negative side effects. BCIs come in direct contact with a user’s brain, giving the device the ability not only to gather all sorts of information about users but also, potentially, to take control of their minds, a prospect most people find frightening.
The closest thing we have to this technology so far is the NextMind BCI. While this device is mainly for developers at this time, it does come with some sample games: several for the computer and one for VR. The demos show off the capabilities of the NextMind: a user can move simple components of these games with their mind. One example lets you control the paddle in a game of Pong, and another lets you push obstacles away from your character in all directions. These simple applications are nowhere near as advanced as what the media predicts, but they are a stepping stone toward what this technology could one day provide.
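From a software point of view, these demos treat the decoded "intent" value coming out of the BCI as just another input axis, like a joystick. The sketch below is my own toy illustration in plain Python, not the NextMind SDK; the function names, signal values, and play-field bounds are all hypothetical. It nudges a Pong paddle by a decoded intent signal and clamps it to the play field.

```python
# Toy sketch (not the NextMind SDK): how a decoded "focus" signal
# from a BCI could drive a Pong paddle. All names and values here
# are hypothetical illustrations.

def update_paddle(paddle_y: float, decoded_intent: float,
                  speed: float = 5.0, top: float = 0.0,
                  bottom: float = 100.0) -> float:
    """Move the paddle by the decoded intent (-1.0 .. 1.0),
    clamped to the play field."""
    new_y = paddle_y + decoded_intent * speed
    return max(top, min(bottom, new_y))

# Simulate a short stream of decoded intents arriving from the headset.
y = 50.0
for intent in [0.8, 0.8, -0.2, 1.0]:
    y = update_paddle(y, intent)
```

The hard part, of course, is everything this sketch leaves out: turning raw neural signals into that single clean intent number is where the real research lives.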
“My vision went black for a moment as the headset instructed my brain to place my body into a harmless sleeplike state, while my conscious mind remained active inside what was basically a computer-controlled lucid dream. Then the OASIS slowly materialized into existence all around me, and I found myself standing back inside Anorak's study, where I’d last logged out. Everything looked the same as before, but it felt completely different. I was actually here, physically inside the OASIS. It no longer felt like I was using an avatar. Now I felt like I was my avatar. There was no visor on my face, none of the faint numbness and constriction you always felt wearing a haptic suit or gloves. I didn't even feel the ONI headset my real body was actually wearing. When I reached up to scratch my head, the device wasn't there.”
-Ernest Cline, Ready Player Two, Page 12
Another example of a BCI technology in current development is Neuralink. Neuralink is a neurotechnology company that was founded by Elon Musk in 2016. Their main goal is to create an implantable BCI that will help people with paralysis. They hope to one day be able to create a full dive VR system to better the lives of people living with disabilities.
In February 2021, Musk released a recording of a monkey playing video games with its mind via a Neuralink computer chip in its skull. Musk claims that one day Neuralink could allow humans to send concepts to one another through telepathy, or even allow people to exist in what he calls a “save state,” meaning that after they die, their consciousness could be transferred to a robot or another human.
While the concept of full dive is a lot scarier than that of haptic immersion, its applications are endless. One potential use case is for people with physical disabilities, as Neuralink’s creators suggest. Most VR experiences today are not very accessible to those who do not have control over their body’s full range of motion. With full dive, however, there is no need for physical movement at all, making this potentially the most accessible form of the technology. On the other hand, a major ethical concern for full dive VR is the data it collects. This is an issue for all the technology we use, but it is especially concerning for full dive VR because of its direct connection to a user’s neural activity.
Although full dive has a lot more controversy surrounding it due to its depiction in mass media, both full dive and haptic immersion have their benefits and drawbacks.
Haptic immersion allows for a more realistic physical experience while also allowing the user to be immersed in other worlds. The user can run, jump, walk, and feel every sensation, every punch, every tap on the shoulder. Full dive connects directly with the user’s mind, making for a more out-of-this-world experience. The user doesn’t need to move a single muscle to experience the simulation, and depending on the programming, they could enable or disable various sensations.
Haptic immersion allows for applications in physical therapy, strength training, and virtual sports, while full dive would allow disabled users to experience VR without the limitations they face in the real world.
Haptic immersion is most likely the first of the two technologies we will see become mainstream, due to the wide variety of resources already available, and it will likely remain the one that is most readily accessible and accepted by society. Full dive will provide a much more unique experience for users but will ultimately be the one that undergoes the most criticism, and rightfully so.
I am personally very excited to see where these two technologies go as we progress even further in innovation. The two paths are unique in their own way, and I look forward to watching as the paths diverge even further. Of the two, I am more interested in the haptics side of things, due to all of the negative media surrounding the concept of full dive. I’m not quite ready to give up my consciousness to a machine, but maybe by the time we get there, my opinion will have changed.
At the end of the day, knowing the difference between these two types of VR is important as the technology progresses into the future.
Imagine you are in a meadow picking flowers. You know that some flowers are safe, while others have a bee inside that will sting you. How would you react to this environment and, more importantly, how would your brain react? This is the scene in a virtual-reality environment used by researchers to understand the impact anxiety has on the brain and how brain regions interact with one another to shape behavior.
“These findings tell us that anxiety disorders might be more than a lack of awareness of the environment or ignorance of safety, but rather that individuals suffering from an anxiety disorder cannot control their feelings and behavior even if they wanted to,” said Benjamin Suarez-Jimenez, Ph.D., assistant professor in the Del Monte Institute for Neuroscience at the University of Rochester and first author of the study published in Communications Biology. “The patients with an anxiety disorder could rationally say – I’m in a safe space – but we found their brain was behaving as if it was not.”
“Black Past Lives Matter: Digital Kormantin,” funded with a $99,874 NEH Digital Humanities grant, will create a website with meticulously detailed virtual tours of a 1632 English fort on the coast of Ghana that was among the earliest to send enslaved Africans to the American colonies.
Sustained Black Lives Matter protests have focused national attention on persisting racial inequalities in the United States. Because this racism “has been centuries in the making, reconciliation depends upon all Americans understanding a Black history extending back four centuries temporally and across the Atlantic world spatially,” says Jarvis, a history professor who also infuses archaeology and digital media studies in his teaching and research.
Moreover, the website will be accessible to millions of people who, even without the travel barriers raised by COVID-19, would never have the means or opportunity to visit the coast of Ghana.
“Although no substitute for an actual visit, this project will make virtual visitation possible for an historic site every bit as important to American history as Jamestown or Plymouth Rock,” says Jarvis.
3D modeling can seem like something for just the professionals with fancy equipment, expensive programs, and hours upon hours of time. What if I told you that you could do 3D modeling using Blender on your own laptop, for free, and create a starter project in under an hour? Blender is a free and open-source 3D computer graphics software tool set used for creating a wide variety of things. With the models you create in Blender, you could go on to create animated films, 3D printed models, computer games, and even things in virtual reality.
I started learning Blender in Fall of 2020 through a class offered at the University. As a Creative Writing major, I never would have guessed that I would one day learn to use a 3D modeling software. After being so used to creating things with words on paper and making the occasional doodle with pen on paper, the idea of creating in 3D seemed so foreign to me. I was initially taking the class to fill a requirement but realized that I enjoyed using Blender and 3D modeling quite a bit. I already enjoyed drawing and doing other forms of digital art, so Blender soon began to feel like the next natural progression for me once I was introduced to the software.
After taking the class and getting the hang of things, I proceeded to remove it from my computer’s home screen and didn’t think about it again. The class had been stressful, and I thought that I would never need to use the program again. When the Spring rolled around, however, I got my position at Studio X. As part of my training, I was asked to attend one of Studio X’s workshops to better understand our programming goals and approach to engaging new XR users. I chose to attend one on Blender, since I was at least familiar with the software. During this workshop, I created a frog:
This frog got me interested in Blender again, and before I knew it, I was making all sorts of little projects in my free time. I realized that Blender was more fun than the class had led me to believe; in Blender, you can create anything you can imagine! Now I hope to inspire others to introduce themselves to the software so they can get creative in their free time.
If you are interested in getting started with Blender, you’re in the right place. This article provides six beginner Blender tutorials, ranging from easy, for those getting started for the very first time, to harder, for those who want a challenge or are working their way through this list!
Tips to keep in mind as we go:
When rendering your final projects, most of the tutorials recommend that you use Cycles. If you are doing the tutorial on a high-end desktop computer or at an innovation station in Studio X, that should be fine. If you are working from your own laptop, however, I would recommend sticking to Eevee unless you have a very powerful machine; otherwise, you’ll be sitting there for hours waiting for it to render!
Most, but not all, of these tutorials have some version of a shortcut viewer in the bottom left- or right-hand corner of the Blender viewport. The shortcut viewer lets you see which buttons the instructor is pressing in case they forget to mention them.
The software may feel overwhelming, but as long as you remember to pause the video to follow along with the steps, it becomes quite manageable!
Blender 3D – Easy Lowpoly Car – Beginners Tutorial
This tutorial is great for people who want an easy start in Blender. I was able to complete it in under an hour, including all the times I paused to follow along. The main things you might find a little hard to follow are the many shortcuts he uses without taking much time to explain them:
Adding Objects to the scene. He uses the shortcut Shift + A. If you find yourself forgetting this or would rather do it the long-handed way, you can instead go to the top left-hand corner of the screen where you will find the “Add” button. Hover over this and you will see the same menu.
Moving objects along an axis after duplication. This first happens when he duplicates the wheels; the command is Shift + D to duplicate. I initially struggled with the “then press Y to move along the Y-axis” step, as what he doesn’t mention is that you release “Shift + D” before hitting “Y.”
Other notes to keep in mind: I was having a little trouble with Blender not showing me the colours I was applying to my car. To check, you can render out the image by going to the top left-hand corner, hovering over “Render,” and selecting “Render Image.” You may have to reposition the camera first, though, as the render comes from what the camera sees [he explains cameras and positioning starting around 19:50]. The shortcut for rendering is F12.
The first part of the video [0:00-14:00] covers the modeling, [14:00-18:51] covers the coloration of the car, and from [18:51-end] he plays around with the scene and lighting.
Here is the outcome I got when following along with the tutorial!
[2.8] Blender Tutorial: Simple Animation For Beginners
This tutorial provides the simplest introduction to animation that I’ve seen, and I think it’s perfect for those just getting started. The instructor sticks to very simple shapes, nothing more than several cubes and a plane. He doesn’t have a shortcut viewer on, but he explains everything step by step in a super easy-to-follow fashion. When something might be a little confusing, text appears on the screen with instructions for which buttons to click. There were only a few instances in which I had to pause and squint at the screen to see which tab he’d clicked into. I was able to complete this tutorial in under 30 minutes, so it’s great if you only have a little bit of time!
Here is my animation result!
Rig your Own Ghost in Blender 3D for Halloween – EASY
This tutorial leans a little more to the medium-easy side. I would say that it’s pretty good overall, though the instructor goes through the steps quite fast, and I found myself pausing a lot more often than I did in the “easy” tutorials. There are a few things he neglects to explain, so here are the ones that stuck out to me the most!
[5:57] He magically transitions from Edit Mode to having the object covered in a pretty rainbow gradient. What he actually does is switch from Edit Mode to Weight Paint mode. He was able to Tab into the mode thanks to his own personal settings, but here’s how you can do it: go to the upper left-hand corner of the screen, where you will likely see either “Object” or “Edit” mode. If you click this drop-down menu, you will see “Weight Paint” as an option.
[6:59] When pinning the ghost to the circle, make sure you click the ghost, then the circle, then Ctrl + P. (Order matters for a lot of things in Blender!)
[7:10] When he says he’s going to grab the keyframe: there’s a small bar at the bottom center of the screen with a little circle button (next to the play buttons). That button will grab a keyframe. You can also use the shortcut “I,” which will prompt you to select which type of keyframe; for the purposes of this tutorial, just select “Location.” When he moves the ghost over to pin a second keyframe, notice that you have to move the time bar across the bottom first, before moving the circle, otherwise it will just override the previous keyframe.
Note! At around timestamp 10:17, the video gets a little harder. You have completed the ghost’s body at this point, but the video then moves on to texture painting. Though the video itself has only run for about 10 minutes by this point, a beginner should stop here, as this part took me about 40 minutes to complete.
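If you’re wondering what those keyframes at 7:10 actually are: a keyframe simply stores a value (like the ghost’s location) at a particular frame, and the software fills in every frame between two keyframes by interpolation. Here is a minimal sketch of that idea in plain Python; this is my own illustration, not Blender’s API, and the frame numbers and locations are made up.

```python
# A keyframe stores a value at a frame; the animation software
# interpolates the frames in between. This is a plain-Python sketch
# of linear interpolation, not Blender's actual API.

def interpolate_location(keyframes, frame):
    """keyframes: list of (frame, (x, y, z)) tuples sorted by frame.
    Returns the linearly interpolated location at `frame`."""
    if frame <= keyframes[0][0]:
        return keyframes[0][1]
    if frame >= keyframes[-1][0]:
        return keyframes[-1][1]
    for (f0, loc0), (f1, loc1) in zip(keyframes, keyframes[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)  # 0.0 at f0, 1.0 at f1
            return tuple(a + (b - a) * t for a, b in zip(loc0, loc1))

# Two hypothetical location keyframes, like pinning the ghost at
# frame 1 and again (moved along Y) at frame 41.
keys = [(1, (0.0, 0.0, 0.0)), (41, (0.0, 4.0, 0.0))]
midpoint = interpolate_location(keys, 21)  # halfway between the keys
```

Blender’s default interpolation is actually Bézier easing rather than linear, which is why objects ease in and out of each keyframe instead of moving at a constant speed; the principle is the same.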
Here is the outcome of my ghost! (I chose to simply colour in two squares for the eyes):
Low Poly Island | Beginner | Blender 2.8 Tutorial
This tutorial does a good job of explaining each step of the process. The reason I would rate it a bit harder is its use of nodes for textures, which can be tricky to follow along with at times. If you want the tutorial to remain at more of a beginner level, I would recommend stopping around timestamp 19:30; at this point you have an island with the water, and everything is already shaded. From 19:30 onwards, he begins to add more details, like trees, to the island. Going past 19:30 would also probably put you well past an hour of work. I looked at the comment section below the video and saw many people saying that the entire tutorial took them between 4 and 5 hours. I was able to do the island and the water alone (with shading) in about an hour.
Things that might be confusing explained:
If you have the latest version of Blender, set the resolution in Dyntopo to 0.5 instead of 6! This is the new setting for low poly. If you set it to 6, it will be way too smooth!
10:34 – Make sure you set the sun strength to something around 5-10! This is equivalent to the energy setting.
13:15 – He says to look at the blend settings. If you have a more up to date version of Blender, you will have to open the settings tab on the node editor.
14:21 – After you’ve selected “Vertex Paint,” go to the Shader Editor window (where the nodes are). You should have already added a base material. Press Shift + A, search for “Vertex Color,” and add that node. Connect its yellow color dot to the base color’s yellow dot (on the large green material node). Then set the node to the vertex color layer’s name (the default is Col, so select that), and your color should start appearing. NOTE: He explains this later, around 15:50, but you will have already completed it, as the different versions work slightly differently.
Here is my result!
Create Satisfying 3D Animations | Easy Blender Tutorial
This tutorial is simple, though a little confusing to follow. There were times when I struggled to keep up with what he was saying (there was a lot of pausing and restarting). Up until he starts adding the textures, a strong-willed beginner could complete this tutorial. Once he gets to the textures around timestamp 8:00, however, I would wait until you know a little more about Blender before tackling that part! The only point at which I was completely baffled was timestamp 3:25: in order to extrude the way he does, you have to hit “E” and then “S” for a scaled extrusion.
Here is my result!
My journey through learning Blender may be different from yours, but there’s no doubt that you’ll be able to get the hang of it in no time! You don’t need an entire semester-long class to learn to master the basics of Blender thanks to the many resources available on the internet. As long as you have a computer, at least an hour of time, and the motivation to create, I’m sure you will power through these beginner instructions.
As I followed through these tutorials for myself, I began relearning the things I had forgotten from the class that seemed to have happened so long ago. Now, I’m working to improve my skills a little bit every week. With my increasing experience with the software, I am hoping to lead an Advanced Blender workshop for those daring to take on the challenge! I definitely wouldn’t call myself an expert at the software, but I’m certainly more confident than I was when I started. You too could learn these skills, and maybe even one day become a Blender master!
After being a concept in our minds for so long, Studio X’s physical space is finally beginning to come to life! Construction started on the first floor of Carlson the week after graduation. Some of our staff had the opportunity to visit the space and see how the progress is coming along. We were amazed by how large the space felt after staring at the rendering provided by our architects, CannonDesign, for over a year. The journey has been documented on the University of Rochester River Campus Libraries Instagram: rclibraries.
In anticipation of our soft opening in the fall, our team has been busy planning workshops, drafting policies, and ordering and organizing equipment.
Ten of these headsets will be available for use within Studio X, while the other ten will be lendable for patrons to take outside the library. We also received five Insta360 One X2 cameras, which are versatile, beginner-friendly 360 cameras. You can also watch us unbox one of the Oculus Quests!
To stay up to date on the latest and greatest from Studio X, be sure to stay connected with us on social media! Follow us on Instagram and Twitter, join our Discord server, and subscribe to our mailing list for our biannual newsletter.
Augmented reality (AR) is an overlay of computer-generated images onto the real world. AR uses our existing reality as the basis of the experience and, with the help of a device, creates an interactive experience for users. AR has snuck into mainstream media right under everyone’s noses. However, some people still feel intimidated by the idea of AR technology! If you’re new to AR and are looking for some experiences to try, here are some free apps to get you started!
Travel/Adventure: Pokémon Go
Pokémon Go is the game that brought AR mainstream like no other, drawing old-school Pokémon fans and new gamers alike into its addicting world of Pokémon catching. When you create your avatar and log in for the first time, you join Trainers across the globe in discovering Pokémon as you explore the world around you. The AR aspect comes into direct effect when you are catching Pokémon: not only do you have to travel around to find them, but you also see them appear on the ground right in front of you! Its unique mechanics get players up and moving because, in order to catch new Pokémon, you have to be actively moving around. When you travel to well-known locations in your town, you may even find exciting in-game landmarks such as Gyms and gift locations. You don’t have to play alone, either. You can link up with your friends to share items, collect rare Pokémon, fight in Raid Battles, and more!
This app populates your surroundings with AR flamingos that you can interact with through several functions on your screen. The FLARMINGOS are virtual representations of wild flamingos that the creator developed based on scientific research. They were animated using human motion and a dynamic flocking algorithm that influences their collective behavior. If you have your friends download this informational app, you can work together to create a flock and watch as they dance to the music!
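A “dynamic flocking algorithm” like the one the app describes is usually some variant of Craig Reynolds’ classic boids model, where each bird steers by a few simple local rules and complex group behavior emerges. The sketch below is my own minimal 2D illustration of that idea in plain Python, not the FLARMINGOS app’s actual code; all the weights are made-up tuning values.

```python
# Minimal boids-style flocking sketch (an illustration of the idea,
# not the FLARMINGOS app's actual code). Each bird steers by three
# rules: cohesion (toward the flock's center), alignment (match the
# flock's average velocity), and separation (avoid crowding).

def flock_step(positions, velocities, dt=1.0,
               cohesion=0.01, alignment=0.05, separation=0.5,
               min_dist=1.0):
    n = len(positions)
    center = [sum(p[i] for p in positions) / n for i in (0, 1)]
    avg_vel = [sum(v[i] for v in velocities) / n for i in (0, 1)]
    new_pos, new_vel = [], []
    for p, v in zip(positions, velocities):
        # Cohesion and alignment steer toward the group averages.
        steer = [cohesion * (center[i] - p[i]) +
                 alignment * (avg_vel[i] - v[i]) for i in (0, 1)]
        # Separation: push away from any neighbor that is too close.
        for q in positions:
            if q is not p:
                d = [p[i] - q[i] for i in (0, 1)]
                dist = (d[0] ** 2 + d[1] ** 2) ** 0.5
                if 0 < dist < min_dist:
                    steer = [steer[i] + separation * d[i] / dist
                             for i in (0, 1)]
        nv = [v[i] + steer[i] for i in (0, 1)]
        new_vel.append(nv)
        new_pos.append([p[i] + nv[i] * dt for i in (0, 1)])
    return new_pos, new_vel

# Three flamingos drifting together over a few dozen steps.
pos = [[0.0, 0.0], [10.0, 0.0], [5.0, 8.0]]
vel = [[0.0, 0.0], [0.0, 0.0], [0.0, 0.0]]
for _ in range(50):
    pos, vel = flock_step(pos, vel)
```

Running this, the three birds gradually pull together into a loose cluster while the separation rule keeps them from piling onto a single point, which is the basic effect that makes a simulated flock look alive.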
Social Justice: Breonna’s Garden
“She who plants a garden plants hope.” This is the message you first see upon opening the app. Soon, music cues up, and you watch a slideshow filled with images of Breonna from her lifetime. After watching the moving slideshow, you briefly scan your environment with your camera and watch as Breonna’s Garden blooms. In the center of the garden stands a 3D model of Breonna. This AR app evolved from a tool to help her family grieve into a tool used by the nation. It was created with the intention of being a peaceful refuge where her name can be said without negation and where you can give yourself a moment to surrender and let go. Brimming with art, life, and beauty, the Garden is not only a sacred space to honor Breonna Taylor, but also a place to celebrate someone you miss. This app is best experienced in a quiet place where you feel comfortable sharing your feelings.
If you are a plant lover, Candide is the app for you! Not only can this app identify your plants for you through a simple photo, but it also serves as a form of social media app for plant lovers! If you have a plant you’ve long forgotten the name of, a quick snap and scan using Candide’s AR filter will do its best to identify the plant for you. If you’re unsure whether or not its identification is correct, you can post the plant to your feed and ask others for their opinion. The app also provides extensive information on how to care for various plants, what plants go well together, and how to make your plants the healthiest they can be!
Kinfolk empowers users to bring AR monuments of underrepresented icons into any home, school, or public space, allowing the user to dive deeper into Black history. The app comes preloaded with stories of six different historical figures. Once you choose a figure, their statue appears right in front of you through the power of AR. While you gaze at their life-size visage, you can read or listen to their bios, explore their playlists, and even explore additional resources through the Kinfolk web portal. The app starts with monuments of people who should be common knowledge, and the creators are working to build an archive that tells history from a perspective you won’t find in schools.
Game: Angry Birds AR: Isle of Pigs
Whether you are a fan of the classic game or looking into AR games for the first time, Angry Birds AR: Isle of Pigs is a fun way to introduce you and your family to the world of AR games. This game brings a remote island right into your own home! Each of the 40 fun-filled levels appears on your floor, wherever you may be. The more space you have, the easier it is to play, as you can even walk around the structures to destroy them from better angles! You are able to see the incredibly realistic characters and the structures of each level overlaying your own environment in real time, making this Angry Birds game immersive like no other.
AR may seem daunting at first, but once you dip your toes into the world of AR apps, you’ll definitely come to see how easy it is to use this technology! People may think that they are not the right kind of person to use AR, but with the technology becoming so mainstream, there is bound to be an AR app out there for everyone! All the apps mentioned in this article are free to download, and there are many more out there to try. If you would like more recommendations of AR apps, or simply want to know more about AR technology, stop by Studio X and we can help you learn more!