Metaverse Reading Group

illustration of a woman wearing a VR headset reading a book.

Join Studio X for a casual reading group this spring in which we will discuss the metaverse and try out virtual reality (VR) experiences together. We’ll be reading The Metaverse Handbook to gain familiarity with the concept. Funds have generously been made available through the Humanities Center to purchase books for participants.

The group will meet biweekly from 12 to 1pm on Fridays beginning February 3rd. If you are interested in joining the group, please fill out this short form.

New grant will use virtual reality to understand trauma and the brain

A ball of energy with electricity beaming all over the place.

Understanding how experience and exposure to trauma change the brain could improve diagnosis and targeted care for conditions like anxiety and post-traumatic stress disorder (PTSD). Benjamin Suarez-Jimenez, Ph.D., assistant professor of Neuroscience, has been studying this topic for the past several years and was awarded a new $3.5 million grant to use virtual reality and MRI to examine the circuitry of threat, reward, and cognitive mapping in PTSD, trauma, and resilience.

For the next five years, this funding from the National Institute of Mental Health will allow the ZVR lab to build upon work investigating the brain areas that build spatial maps, specifically how they discriminate between areas of an environment associated with different emotions. Suarez-Jimenez’s most recent research identified changes in the salience network – a brain mechanism used for learning and survival – in people exposed to trauma (with and without psychopathologies, including PTSD, depression, and anxiety). His prior research revealed that people with anxiety have increased insula and dorsomedial prefrontal cortex activation – indicating their brains were associating a known safe area with danger or threat.

“The project this R01 will support will probe whether the neural processes we have identified in the past are specific to threat or if they expand to reward processing,” Suarez-Jimenez said. “We are also looking at how attention allocation to some visual cues of the virtual reality tasks changes from pre- to post-task experience. We are hoping that understanding these brain processes can help us identify better ways to diagnose PTSD and to improve treatment.”

Suarez-Jimenez came to the University in January 2021. He is an active member of the Neuroscience Diversity Commission and has served as a mentor for the NEUROCITY program.

Learn more.

Seed funding reflects how data science, AR/VR transform research at Rochester

professor Mujdat Cetin standing in front of Wegmans Hall.

The University’s Goergen Institute for Data Science supports collaborative projects across all disciplines.

Mujdat Cetin, the Robin and Tim Wentworth Director of the Goergen Institute for Data Science, in front of Wegmans Hall. (University of Rochester photo / Bob Marcotte)

Ten projects supported with seed funding from the Goergen Institute for Data Science this year demonstrate how machine learning, artificial intelligence (AI), and augmented and virtual reality (AR/VR) are transforming the way University of Rochester researchers—across all disciplines—address challenging problems.

“I’m very excited about the wide range of collaborative projects we are able to support this year,” says Mujdat Cetin, the Robin and Tim Wentworth Director of the institute. “These projects tackle important and timely problems on data science methods and applications, and I am confident they will lead to significant research contributions and attract external funding.”

The awards, approximately $20,000 each, help researchers generate sufficient proof-of-concept findings to then attract major external funding.

This year’s projects involve collaborations among engineers, computer scientists, a historian, a biostatistician, and experts in brain and cognitive sciences, earth and environmental science, and palliative care. Their projects include a totally new kind of computing platform, new virtual reality technologies to improve doctor-patient conversations and help people overcome color vision deficiency, and machine learning techniques to make it easier for people to add music to their videos and to enhance AR/VR immersive experiences based on the unique geometry of each user’s anatomy.

The 2022–23 funded projects and their principal investigators are:

  • Ising Boltzmann Substrate for Energy-Based Models
    Co-PIs: Michael Huang, professor of electrical and computer engineering and of computer science, and Gonzalo Mateos, associate professor of electrical and computer engineering and of computer science and the Asaro Biggar Family Fellow in Data Science
  • A Data-Driven, Virtual Reality-based Approach to Enhance Deficient Color Vision
    Co-PIs: Yuhao Zhu, assistant professor of computer science, and Gaurav Sharma, professor of electrical and computer engineering, of computer science, and of biostatistics and computational biology
  • Audiovisual Integration in Virtual Reality Renderings of Real Physical Spaces
    Co-PIs: Duje Tadin, professor and chair of brain and cognitive sciences and professor of ophthalmology and of neuroscience; Ming-Lun Lee, associate professor of electrical and computer engineering; and Michael Jarvis, associate professor of history
  • Personalized Immersive Spatial Audio with Physics Informed Neural Field
    Co-PIs: Zhiyao Duan, associate professor of electrical and computer engineering and of computer science, and Mark Bocko, Distinguished Professor of Electrical and Computer Engineering and professor of physics and astronomy
  • Computational Earth Imaging with Machine Learning
    Co-PIs: Tolulope Olugboji, assistant professor of earth and environmental sciences, and Mujdat Cetin, professor of electrical and computer engineering and of computer science, and the Robin and Tim Wentworth Director of the Goergen Institute for Data Science
  • Improving Deconvolution Estimates through Bayesian Shrinkage
    PI: Matthew McCall, associate professor of biostatistics
  • Building a Multi-Step Commonsense Reasoning System for Story Understanding
    Co-PIs: Zhen Bai, assistant professor of computer science, and Lenhart Schubert, professor of computer science
  • Versatile and Customizable Virtual Patients to Improve Doctor-Patient Communication
    Co-PIs: Ehsan Hoque, associate professor of computer science, and Ronald Epstein, professor of family medicine and palliative care
  • Machine Learning Assisted Femtosecond Laser Fabrication of Efficient Solar Absorbers
    Co-PIs: Chunlei Guo, professor of optics, and Jiebo Luo, Albert Arendt Hopeman Professor of Engineering
  • Rhythm-Aware and Emotion-Aware Video Background Music Generation
    PI: Jiebo Luo, Albert Arendt Hopeman Professor of Engineering

Read the full story.

Beat Saber Battle 2022

promotional image for beat saber battle. Shows two light sabers intersecting.

Like dancing to fun music, light sabers, and virtual reality? We have the perfect competition for you! Compete against your peers, and if you dominate, you will be crowned the Beat Saber champion. If you are the ultimate, numero uno, top dog Beat Saberer, you will win the fanciest of prizes. The kickoff will be during a Drop-In Friday event on Friday, November 11th at 1pm.


Where: Studio X, Carlson Library First Floor
Kick Off: Friday, November 11 at 1pm

But what’s Beat Saber, you say? Only the most popular VR game of all time! Beat Saber is a VR rhythm game in which you use gigantic light sabers to slash floating boxes to the beat of the music as they fly toward you.

The Rules

  • Participants must be affiliated with the University of Rochester; students, faculty, and staff are all welcome to participate.
  • Participants must show a UR ID with the name under which they registered.
  • Participants must complete all three rounds and final battle to be eligible for prizes.
  • There will be two competition brackets (1. easy/normal level 2. hard, expert, expert+ level). Once a participant selects a bracket, they must stay in that bracket throughout the entire competition.
  • Once the participant chooses their bracket, they are welcome to select any level within it.
  • Participants must complete each round in Studio X.
  • Scores must be verified by a Studio X staff member.
  • Participants will get three attempts for each of the three rounds but not for the final battle.
  • Participants are not allowed to use score multipliers.
  • The songs will be chosen by Studio X staff and will be presented upon arrival for each round.
  • Once you have your score, you will be added to our scoreboard.
  • At the end of each round, a selection of the lowest-scoring participants will be eliminated; the exact number depends on how many participants we have.

The Structure

The competition is divided between two brackets:

BEGINNER

Players who are new to the game on the easy/normal levels.

Winning prize: Projector!

EXPERIENCED

Players who have experience with the game on the hard, expert, and expert+ levels.

Winning prize: Meta Quest 2 VR headset!

You can choose which bracket you would like to participate in.

The Schedule

Kick Off: Friday, 11/11 @1pm in Studio X
Round 1: Participants must complete this round by 11/18.
Round 2: Participants must complete this round by 11/22.
Round 3: Participants must complete this round by 12/2.
Final Showdown: Three finalists from each bracket will participate in the final competition event on Friday, 12/9 @1pm.


promotional graphic for drop-in fridays at Studio X with geometric design. Reads "Drop-in Fridays. Fall 2022 series. Join us Fridays at 1pm for informal XR talks, tech demos, workshops, and more."

Drop by Studio X every Friday at 1pm for informal workshops, talks, demos, and more! View the full schedule.

XR Game Night

promotional image for XR game night. Shows people in VR headsets.

Take a break from studying and unwind at XR Game Night at Studio X! The night will begin with a brief headset tutorial, and you can reserve the headsets after the event to keep playing later. We will have snacks, beats, and games to relax, have fun, and vibe!


Join Studio X, UR’s hub for immersive technologies, and learn more about the digital world of extended reality (XR). All levels welcome. No experience necessary!

Instructor: Nefle Nesli Oruç
Where: Studio X, Carlson Library First Floor
When: Tuesday, December 6th @7:30pm
Register: libcal.lib.rochester.edu/event/9662693

Make Your Own AR Mini Driving Game

illustration of a person using a cell phone that shows an augmented reality car.

Learn how to create your own AR mini driving game with Apple ARKit, a mobile platform that makes it easy to create all kinds of AR experiences. In this workshop, participants will use Reality Composer, a tool within ARKit, to create simple 3D models, add physics and behaviors, and deploy their creation on an iPhone or iPad.

Join Studio X, UR’s hub for immersive technologies, and learn more about the digital world of extended reality (XR). All levels welcome. No experience necessary!

Note: In order to participate, you will need to complete the pre-workshop instructions, which will be sent by email prior to the event. Need assistance with this process? Ask for help on the Studio X Discord (Quick Questions Channel). 

Instructor: Hao Zeng
Where: Studio X, Carlson Library First Floor
When: Tuesday, November 15th from 6 to 7:30pm
Register: libcal.lib.rochester.edu/event/9662565

XR Research in the Summer

photogrammetry model of the mural in Kodak Hall.

There is a strong emphasis at Studio X on fostering cross-disciplinary collaboration in extended reality (XR). Over 50 researchers across the University use XR technology in their research and teaching, and many come to Studio X for consultation and advice on program development or engineering. As an XR Specialist at Studio X, I had the opportunity to work on two XR-related research projects this past summer, one in collaboration with the Brain and Cognitive Sciences Department (BCS) and the other with the Computer Science Department (CS). These projects were supported by a Discover Grant through the Office of Undergraduate Research, which supports immersive, full-time summer research experiences for undergraduate students at the UR.

The research with BCS involved digitizing Kodak Hall at the Eastman School of Music and bringing it into VR. The result will be used to provide a more realistic environment for user testing, helping researchers better study how humans combine and process light and sound. The visit to Kodak Hall was scheduled back in March. Preparations before the visit included figuring out power supply and cable management, stage arrangement, clearance, and more. We also discussed which techniques would be used to scan and capture the hall. Three object-scanning techniques were tested before and during the visit: photogrammetry, 360-image, and time-of-flight (ToF).

Photogrammetry creates 3D models of physical objects by processing photographic images or video recordings. By taking images of an object from many different angles and processing them with software like Agisoft Metashape, the algorithm can locate and map key points across multiple images and combine them into a 3D model. I first learned about this technique by attending a photogrammetry workshop at Studio X led by Professor Michael Jarvis. It has been very helpful for this research: it captured great detail on the mural in Kodak Hall, where other techniques had failed.

Photogrammetry model of the mural in Kodak Hall

360-image, as its name suggests, is a 360-degree panoramic image taken from a fixed location. With the Insta360 camera borrowed from Studio X, the capture session required almost no setup, and results could be quickly previewed using the companion app on a phone or smart device.

360 image of Kodak Hall, captured from the stage

The time-of-flight (ToF) technique emits light and measures how long the reflection takes to return, using that round-trip time to compute depth. ToF hardware is common on modern devices, such as iPhones and iPads with Face ID. I tested the ToF scanner on the iPad Pro at Studio X. It provides a great sense of spatial orientation and has a fairly short processing time.
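The arithmetic behind ToF is simple: depth is half the distance light travels on its round trip. Here is a minimal sketch of that principle (an illustration only; real sensors measure pulse timing or phase shifts in dedicated hardware):

```python
# Time-of-flight depth: light travels to the surface and back,
# so the one-way depth is half the round-trip distance.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_depth(round_trip_seconds: float) -> float:
    """Depth in meters from a measured round-trip travel time."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A reflection returning after roughly 13.3 nanoseconds implies
# a surface about 2 meters away.
print(round(tof_depth(13.34e-9), 2))
```

The nanosecond timescales involved are why ToF depth sensing needs specialized hardware rather than a general-purpose clock.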

3D capture of Studio X from an iPad Pro.

We used the Faro laser scanner to get a scan with higher accuracy and resolution. Each scan took 20 minutes, and we conducted eight scans to cover the entire hall. The result was a 20+ GB model with billions of points. To load the scene onto the Meta Quest 2 VR headset, we dramatically reduced the model’s size and resolution using tools such as gradual selection, Poisson distribution adjustment, and material paint. We also deleted excess points and replaced flat surfaces, such as the stage and mural, with higher-quality images. The end result is a nice-looking model of around 250 MB with decent detail, small enough for the headset to run.
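The kind of reduction described above is, at its core, point-cloud downsampling: divide space into small cubes (voxels) and keep one averaged point per occupied cube. A minimal pure-Python sketch of the idea (the actual processing used dedicated point-cloud tools, not this code):

```python
from collections import defaultdict

def voxel_downsample(points, voxel_size):
    """Collapse a point cloud by averaging all points that fall
    into the same cubic voxel of side length `voxel_size`."""
    buckets = defaultdict(list)
    for x, y, z in points:
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        buckets[key].append((x, y, z))
    # One representative point (the centroid) per occupied voxel.
    return [
        tuple(sum(coord) / len(pts) for coord in zip(*pts))
        for pts in buckets.values()
    ]

# Four nearby points collapse to one point per occupied voxel.
cloud = [(0.01, 0.02, 0.0), (0.03, 0.01, 0.0), (1.5, 1.5, 1.5), (1.52, 1.48, 1.5)]
print(len(voxel_downsample(cloud, voxel_size=1.0)))
```

Larger voxels shrink the cloud more aggressively at the cost of fine detail, which is the same trade-off made when preparing a scan for a standalone headset.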

partial 3D model of Kodak Hall.

The model was handed over to Shui’er Han from BCS as a Unity package; she will implement the audio recording and spatial visualization before conducting the user testing. It is amazing to see so many people working together and bringing their experience and knowledge to make this cross-disciplinary project a reality. I would like to thank Dr. Duje Tadin, Shui’er Han, Professor Michael Jarvis, Dr. Emily Sherwood, Blair Tinker, Lisa Wright, Meaghan Moody, and many more who gave me this amazing opportunity and helped along the way. I can’t wait to see what they achieve beyond this model and research project.


You can read more about this cross-disciplinary collaboration here.

Hao Zeng

XR Specialist

Create Your Own Selfie Filters for Instagram

person holding a smart phone high in the air.

Ever wondered how the fun selfie filters on Instagram are created? It’s easier than you think!

Create your own custom effects with Spark AR Studio, a free AR tool. With this platform, you can incorporate images and 3D models, animation, interaction, and more into filters without knowing how to code. During this hands-on workshop, participants will learn the basics of creating effects and how to publish them to Instagram. We’ll also provide some custom UR assets, so you can spread that Meliora spirit!

Join Studio X, UR’s hub for immersive technologies, and learn more about the digital world of extended reality (XR). All levels welcome. No experience necessary!

Note: Attendees will need a Facebook account and to download Spark AR Studio (Mac and PC) as well as the Spark AR app (iOS and Android) for testing ahead of time. Need assistance with this process? Ask for help on the Studio X Discord (Quick Questions Channel).

Instructor: Mila Paymukhina
Where: Studio X, Carlson Library First Floor
When: Friday, 11/4/2022 @1PM
Register: libcal.lib.rochester.edu/event/9662190



VR Games Pop-Up @iZone

promotional image for pop up at izone. Shows a polar bear in a vr headset.

Hang out with friends and play fun virtual reality games at a cozy Studio X pop-up at iZone! There will be donuts, cider, and swag up for grabs (warm Studio X beanies). 


Games we’ll be playing…

  • Acron: Attack of the Squirrels
  • Beat Saber
  • First Steps
  • Epic Roller Coasters
  • Job Simulator
  • iB Cricket
  • Vader Immortal: A Star Wars VR series

Where: iZone Forum
When: Thursday, 11/3 from 4 to 6pm