New grant will use virtual reality to understand trauma and the brain

A ball of energy with electricity beaming all over the place.

Understanding how experience and exposure to trauma change the brain could improve diagnosis and targeted care for conditions like anxiety and post-traumatic stress disorder (PTSD). Benjamin Suarez-Jimenez, Ph.D., assistant professor of Neuroscience, has been studying this topic for the past several years and was awarded a new $3.5 million grant to use virtual reality and MRI to look into the circuitry of threat, reward, and cognitive mapping in PTSD, trauma, and resilience.

For the next five years, this funding from the National Institute of Mental Health will allow the ZVR lab to build upon work investigating the brain areas that construct spatial maps, specifically to discriminate between areas of an environment associated with emotions. Suarez-Jimenez’s most recent research identified changes in the salience network – a mechanism in the brain used for learning and survival – in people exposed to trauma (with and without psychopathologies, including PTSD, depression, and anxiety). His prior research revealed that people with anxiety have increased insula and dorsomedial prefrontal cortex activation – indicating that their brains were associating a known safe area with danger or threat.

“The project this R01 will support will probe whether the neural processes we have identified in the past are specific to threat or if they extend to reward processing,” Suarez-Jimenez said. “We are also looking at how attention allocation to some visual cues of the virtual reality tasks changes from pre- to post-task experience. We are hoping that understanding these brain processes can help us identify better ways to diagnose PTSD and to improve treatment.”

Suarez-Jimenez came to the University in January 2021. He is an active member of the Neuroscience Diversity Commission and has served as a mentor for the NEUROCITY program.

Learn more.

Seed funding reflects how data science, AR/VR transform research at Rochester

Professor Mujdat Cetin standing in front of Wegmans Hall.

The University’s Goergen Institute for Data Science supports collaborative projects across all disciplines.

Mujdat Cetin, the Robin and Tim Wentworth Director of the Goergen Institute for Data Science, stands in front of Wegmans Hall. (University of Rochester photo / Bob Marcotte)

Ten projects supported with seed funding from the Goergen Institute for Data Science this year demonstrate how machine learning, artificial intelligence (AI), and augmented and virtual reality (AR/VR) are transforming the way University of Rochester researchers—across all disciplines—address challenging problems.

“I’m very excited about the wide range of collaborative projects we are able to support this year,” says Mujdat Cetin, the Robin and Tim Wentworth Director of the institute. “These projects tackle important and timely problems on data science methods and applications, and I am confident they will lead to significant research contributions and attract external funding.”

The awards, approximately $20,000 each, help researchers generate sufficient proof-of-concept findings to then attract major external funding.

This year’s projects involve collaborations among engineers, computer scientists, a historian, a biostatistician, and experts in brain and cognitive sciences, earth and environmental science, and palliative care. Their projects include a totally new kind of computing platform, new virtual reality technologies to improve doctor-patient conversations and help people overcome color vision deficiency, and machine learning techniques to make it easier for people to add music to their videos and to enhance AR/VR immersive experiences based on the unique geometry of each user’s anatomy.

The 2022–23 funded projects and their principal investigators are:

  • Ising Boltzmann Substrate for Energy-Based Models
    Co-PIs: Michael Huang, professor of electrical and computer engineering and of computer science, and Gonzalo Mateos, associate professor of electrical and computer engineering and of computer science and the Asaro Biggar Family Fellow in Data Science
  • A Data-Driven, Virtual Reality-based Approach to Enhance Deficient Color Vision
    Co-PIs: Yuhao Zhu, assistant professor of computer science, and Gaurav Sharma, professor of electrical and computer engineering, of computer science, and of biostatistics and computational biology
  • Audiovisual Integration in Virtual Reality Renderings of Real Physical Spaces
    Co-PIs: Duje Tadin, professor and chair of brain and cognitive sciences and professor of ophthalmology and of neuroscience; Ming-Lun Lee, associate professor of electrical and computer engineering; and Michael Jarvis, associate professor of history
  • Personalized Immersive Spatial Audio with Physics Informed Neural Field
    Co-PIs: Zhiyao Duan, associate professor of electrical and computer engineering and of computer science, and Mark Bocko, Distinguished Professor of Electrical and Computer Engineering and professor of physics and astronomy
  • Computational Earth Imaging with Machine Learning
    Co-PIs: Tolulope Olugboji, assistant professor of earth and environmental sciences, and Mujdat Cetin, professor of electrical and computer engineering and of computer science, and the Robin and Tim Wentworth Director of the Goergen Institute for Data Science
  • Improving Deconvolution Estimates through Bayesian Shrinkage
    PI: Matthew McCall, associate professor of biostatistics
  • Building a Multi-Step Commonsense Reasoning System for Story Understanding
    Co-PIs: Zhen Bai, assistant professor of computer science, and Lenhart Schubert, professor of computer science
  • Versatile and Customizable Virtual Patients to Improve Doctor-Patient Communication
    Co-PIs: Ehsan Hoque, associate professor of computer science, and Ronald Epstein, professor of family medicine and palliative care
  • Machine Learning Assisted Femtosecond Laser Fabrication of Efficient Solar Absorbers
    Co-PIs: Chunlei Guo, professor of optics, and Jiebo Luo, Albert Arendt Hopeman Professor of Engineering
  • Rhythm-Aware and Emotion-Aware Video Background Music Generation
    PI: Jiebo Luo, Albert Arendt Hopeman Professor of Engineering

Read the full story.

In a World Full of 3D Models, Researchers Build a New One for Leukemia

Hand holding the bone-marrow-on-chip device.

A Wilmot Cancer Institute scientist has published data showing that a new microchip-like device his lab developed can reliably model changes in the bone marrow as leukemia takes root and spreads.

Ben Frisch, PhD, holds the bone-marrow-on-chip device in his lab.

Ben Frisch, Ph.D., assistant professor of Pathology and Laboratory Medicine and Biomedical Engineering at the University of Rochester, and colleagues have been building what is known as a modular bone-marrow-on-chip to enhance the investigation of leukemia stem cells. The tiny device recapitulates the entire human bone marrow microenvironment and its complex network of cellular and molecular components involved in blood cancers.  

Similar tissue-chip systems have been developed by others, but they lack two key features contained in Frisch’s product: osteoblast cells, which are crucial to fuel leukemia, and a readily available platform.

Because Frisch’s 3D model has been published in Frontiers in Bioengineering and Biotechnology and is not a one-off fabrication, others in the field will be able to adopt a similar approach using the available microfluidics system, he said.

Read more.

Sensory Processing – in a Virtual Kodak Hall

A binaural microphone setup with a dummy head.

Rochester researchers will harness the immersive power of virtual reality to study how the brain processes light and sound.

A cross-disciplinary team of researchers from the University of Rochester is collaborating on a project to use virtual reality (VR) to study how humans combine and process light and sound. The first project will be a study of multisensory integration in autism, motivated by prior work showing that children with autism have atypical multisensory processing.

The project was initially conceived by Shui’er Han, a postdoctoral research associate, and Victoire Alleluia Shenge ’19, ’20 (T5), a lab manager, in the lab of Duje Tadin, a professor of brain and cognitive sciences.

“Most people in my world—including most of my work—conduct experiments using artificial types of stimuli, far from the natural world,” Tadin says. “Our goal is to do multisensory research not using beeps and flashes, but real sounds and virtual reality objects presented in realistic-looking VR rooms.”

UR students working on the project are looking at information on a laptop with Kodak Hall in the background.
Members of the team begin the setup for audio and visual data collection. From left to right are Shui’er Han, a postdoctoral research fellow in Duje Tadin’s lab; brain and cognitive sciences major Betty Wu ’23; computer science and business major and e5 student Haochen Zeng ’23, who works in River Campus Libraries’ Studio X; and Victoire Alleluia Shenge ’19, ’20 (Take Five), who earned her degree in brain and cognitive sciences and is a manager in Tadin’s lab.

A cognitive scientist, a historian, and an electrical engineer walk into a room . . .

Tadin’s partners in the study include Emily Knight, an incoming associate professor of pediatrics, who is an expert on brain development and multisensory processing in autism. But in creating the virtual reality environment the study participants will use—a virtual version of Kodak Hall at Eastman Theatre in downtown Rochester—Tadin formed collaborations well outside his discipline.

Faculty members working on this initial step in the research project include Ming-Lun Lee, an associate professor of electrical and computer engineering, and Michael Jarvis, an associate professor of history. Several graduate and undergraduate students are also participating.

Many of the tools they’ll use come from River Campus Libraries—in particular, Studio X, the University’s hub for extended reality projects, as well as the Digital Scholarship department. Emily Sherwood, director of Studio X and Digital Scholarship, is leading the effort to actually construct the virtual replica of Kodak Hall.

The group recently gathered in the storied performance space to collect the audio and visual data that Studio X will rely on. University photographer J. Adam Fenster followed along to document the group’s work.

Read more.

Anxiety Cues Found in the Brain Despite Safe Environment

3D nature scene. Shows a field, a wide sky, and a mountain in the background.

Imagine you are in a meadow picking flowers. You know that some flowers are safe, while others have a bee inside that will sting you. How would you react to this environment and, more importantly, how would your brain react? This is the scene in a virtual-reality environment used by researchers to understand the impact anxiety has on the brain and how brain regions interact with one another to shape behavior.
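The logic of the task reduces to a simple contingency between location and outcome: picks in the safe zone never lead to a sting, while picks in the threat zone sometimes do. The sketch below is a minimal, hypothetical illustration of that trial structure in Python; the zone labels, sting probability, and trial count are invented for illustration and are not parameters from the study.

```python
import random

# Illustrative contingency: flowers in the "threat" zone sting with some
# probability; flowers in the "safe" zone never do. Values are assumptions.
STING_PROBABILITY = {"safe": 0.0, "threat": 0.5}

def pick_flower(zone: str) -> bool:
    """Simulate picking one flower; return True if a bee stings."""
    return random.random() < STING_PROBABILITY[zone]

def run_session(n_trials: int = 20) -> dict:
    """Tally stings per zone over a session of random flower picks."""
    stings = {"safe": 0, "threat": 0}
    for _ in range(n_trials):
        zone = random.choice(list(stings))
        if pick_flower(zone):
            stings[zone] += 1
    return stings

if __name__ == "__main__":
    print(run_session())  # e.g. {'safe': 0, 'threat': 4}
```

In the actual experiment, the interesting measurement is not the tally itself but how participants’ brains respond to the safe zone once the contingency has been learned.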

“These findings tell us that anxiety disorders might be more than a lack of awareness of the environment or ignorance of safety, but rather that individuals suffering from an anxiety disorder cannot control their feelings and behavior even if they wanted to,” said Benjamin Suarez-Jimenez, Ph.D., assistant professor in the Del Monte Institute for Neuroscience at the University of Rochester and first author of the study published in Communications Biology. “The patients with an anxiety disorder could rationally say – I’m in a safe space – but we found their brain was behaving as if it was not.”

Read more.

Black Past Lives Matter: Digital Kormantin

Aerial image of Fort Amsterdam in May 2019 before start of excavations.

Michael Jarvis’ latest digital history project at the University of Rochester couldn’t come at a better time.

Aerial image of Fort Amsterdam in May 2019, before the start of excavations. (Photo courtesy of Michael Jarvis)

“Black Past Lives Matter: Digital Kormantin,” funded with a $99,874 NEH Digital Humanities grant, will create a website with meticulously detailed virtual tours of a 1632 English fort on the coast of Ghana that was among the earliest to send enslaved Africans to the American colonies.

Sustained Black Lives Matter protests have focused national attention on persisting racial inequalities in the United States. Because this racism “has been centuries in the making, reconciliation depends upon all Americans understanding a Black history extending back four centuries temporally and across the Atlantic world spatially,” says Jarvis, a history professor who also infuses archaeology and digital media studies in his teaching and research.

Unity software was used to superimpose a model of the 1790 reconstruction of Fort Amsterdam onto an image of the current ruins compiled with photogrammetry. (Courtesy of Michael Jarvis)
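At a high level, superimposing a reconstruction onto a photogrammetry scan is a registration problem: the model’s vertices must be scaled, rotated, and translated into the scan’s coordinate frame. The following sketch shows that kind of similarity transform in Python with NumPy; it is a hypothetical illustration of the geometry involved, not the project’s actual Unity workflow, and all names and values are invented.

```python
import numpy as np

def similarity_transform(vertices: np.ndarray, scale: float,
                         yaw_deg: float, translation: np.ndarray) -> np.ndarray:
    """Scale, rotate (about the vertical axis), and translate Nx3 vertices."""
    theta = np.radians(yaw_deg)
    rotation = np.array([
        [np.cos(theta), -np.sin(theta), 0.0],
        [np.sin(theta),  np.cos(theta), 0.0],
        [0.0,            0.0,           1.0],
    ])
    return scale * vertices @ rotation.T + translation

# Hypothetical model footprint (meters) aligned into scan coordinates.
model_vertices = np.array([[0.0, 0.0, 0.0],
                           [10.0, 0.0, 0.0],
                           [10.0, 6.0, 0.0],
                           [0.0, 6.0, 0.0]])
aligned = similarity_transform(model_vertices, scale=1.02, yaw_deg=14.0,
                               translation=np.array([482.5, 910.3, 0.0]))
print(aligned.round(2))
```

In practice, a tool like Unity lets the modeler adjust this transform interactively until the reconstruction’s walls line up with the scanned ruins.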

Moreover, the website will be accessible to millions of people who, even without the travel barriers raised by COVID-19, would never have the means or opportunity to visit the coast of Ghana.

“Although no substitute for an actual visit, this project will make virtual visitation possible for an historic site every bit as important to American history as Jamestown or Plymouth Rock,” says Jarvis.

Read more.

A New Way to Make AR/VR Glasses

Person using equipment to work with AR/VR lenses.
A metaform is a new optical component that Rochester researchers say can combine with freeform optics to create the next generation of AR/VR headsets and eyewear. (University of Rochester illustration / Michael Osadciw)

University of Rochester researchers combine freeform optics and a metasurface to avoid ‘bug eyes’

“Image” is everything in the $20 billion market for AR/VR glasses. Consumers are looking for glasses that are compact and easy to wear, delivering high-quality imagery with socially acceptable optics that don’t look like “bug eyes.”

University of Rochester researchers at the Institute of Optics have come up with a novel technology to deliver those attributes with maximum effect. In a paper in Science Advances, they describe imprinting freeform optics with a nanophotonic optical element called a “metasurface.”

The metasurface is a veritable forest of tiny, silver, nanoscale structures on a thin metallic film that conforms, in this advance, to the freeform shape of the optics—realizing a new optical component the researchers call a metaform.

Read the full article from the University of Rochester’s Newscenter.

Faculty Interview Findings

During the spring semester and into the summer of 2020, Studio X staff conducted eight half-hour interviews with faculty members currently working with immersive technologies to inform our fall 2020 pilot programming. We spoke with faculty across the following disciplines: engineering, history, digital media studies, and education. We view these initial conversations as ongoing, and we hope to expand beyond the limitations of the small sample.

Q1: How have you engaged XR through your own research and/or in your classes and other student-centered work?

Faculty described their research projects as highly interdisciplinary, requiring diverse perspectives and expertise. Several faculty members said they were more interested in what XR can facilitate than in the tools and methods themselves, especially when it comes to teaching and learning. They want to address real-world problems and leverage XR for active learning opportunities. Faculty also discussed generating content for future research, such as assessing tools to guide future development.

Q2: What platforms and skills do your students require to participate in your courses and/or research that leverage XR?

Nearly every faculty member mentioned Unity 3D and its steep learning curve.

Q3: How do students become involved with XR?

Q4: What does a community of practice look like for XR@UR?

Q5: Where do you find inspiration for new ways of teaching, innovative tools, or exciting projects?

Q6: Imagine you have enough funding to work on an XR project with a small group of students. What projects might you choose?

Half of the faculty described expanding on current research projects, such as generating more content for assessment or making projects more usable. Several faculty members also discussed creating specific XR experiences, such as developing AR walking tours centered on social justice topics and designing machines virtually so that users can see how they operate from the inside.

Q7: What challenges do you encounter when engaging with XR?

Faculty discussed their frustrations with the steep learning curve of XR tools and the need to get students acquainted at an early stage so they are prepared for more advanced coursework. Faculty find they often must teach students the basics themselves or rely on their graduate student collaborators, who might have no other reason to learn the tool or method. Several participants emphasized the value of resident expertise and introductory, low-stakes trainings.

Access to enough of the same equipment in the same space is also a barrier. Faculty discussed running experiments and struggling to locate the same versions of VR and MR headsets, which are cost-prohibitive. Their research also often requires dedicated, long-term space, and setting up these unique environments can take hours of work before development can even begin. The technology is also rapidly evolving, requiring users to constantly relearn it, to say nothing of maintaining cross-platform compatibility and addressing storage issues.

XR also has a PR problem: most people do not understand its value or see themselves as users, let alone creators. One faculty member mentioned that XR seems overly complicated, unrelatable, and not something that everyone is ready to integrate into their courses. Faculty, staff, and students need to see more use cases to pique their interest, as well as access to the costly equipment. Moreover, the timeless debate between theory and practice endures. At a theory-driven institution such as UR, hands-on making and skill building remain a challenge.

Q8: Is there anything we should keep in mind?

icon of a teacher and students.

Beginner Friendly
Provide introductory workshops and early onboarding opportunities for students

icon to represent connection. Shows a large circle connected to three smaller circles.

Facilitate Interdisciplinary Work
Support all disciplines & collapse departmental silos

icon of graduation cap

Faculty Development
Create new opportunities, space, and time for faculty to experiment

icon of a lightbulb hovering over an open box.

Think Outside the Box
Push boundaries, take risks, & make challenging interventions. Studio X is a cross-unit initiative that can help to balance theory and practice.

icon of a satellite

Be the Hub for Immersive Technologies @UR
Stay up to date on XR news @UR and beyond and share out

icon of a wrench.

Practical Advice
Host group events and classes, etc.

New XR training for UR doctoral students

Biomedical engineering graduate student Tom Stoll, right, adjusts a virtual reality head-mounted display on assistant professor Ross Maddox. The array of speakers in Maddox's lab allows researchers to simulate realistic listening environments.

A $1.5 million grant from the National Science Foundation will provide additional impetus to a University of Rochester initiative applying augmented and virtual reality in health, education, product design, remote communication, entertainment, and other fields.

The grant will enable 62 doctoral students to be trained in the skills needed to advance AR/VR technologies and will also help them gain an appreciation for the broader cultural and societal implications of the technologies, says Mujdat Cetin, the principal investigator behind the grant. Other Rochester faculty supporting this initiative are Jannick Rolland, the Brian J. Thompson Professor of Optical Engineering; Michele Rucci, professor of brain and cognitive sciences; and Zhen Bai, assistant professor of computer science.

Biomedical engineering graduate student Tom Stoll, right, adjusts a virtual reality head-mounted display on assistant professor Ross Maddox. The array of speakers in Maddox’s lab allows researchers to simulate realistic listening environments. (University photo / J. Adam Fenster)

Read the full article on the University of Rochester’s Newscenter.

Training Brains with Virtual Reality

Brenna James '20 wearing a virtual reality headset with a computer in the foreground of the photo

Duje Tadin, associate professor of brain and cognitive sciences; Jeffrey Bazarian, professor of emergency medicine; and Feng (Vankee) Lin, assistant professor in the School of Nursing, are working together to see how VR can help treat people with Alzheimer’s disease and those suffering from concussions. Through access to technology and training, Studio X will prepare students to collaborate on and conduct cutting-edge research.

Brenna James ’20 suffered a concussion in high school. Rochester researchers are using VR to create therapeutic treatments that can be used at home by patients like her.

Read the full article via the University of Rochester’s Newscenter.