Belle2VR

2017

Credits:

• Immersive Environment, Programming - Zach Duer
• Physics Lead, Programming - Leo Piilonen
• Educational Design - George Glasson
• Students - Jesse Barber, Samantha Spytek, Christopher Dobson

Publications:

• Proceedings of the 24th International Conference on Computing in High Energy and Nuclear Physics, Adelaide, South Australia, 2020
• IEEE Computer Graphics and Applications, Vol. 38, No. 3 - "Belle2VR: A Virtual Reality Visualization of Subatomic Particle Physics in the Belle II Experiment," 2018
• IEEE Vis 2018 - "Belle2VR - A Virtual Reality Visualization of Subatomic Particle Physics in the Belle II Experiment," Berlin, Germany, October 25, 2018
• Virginia Association of Science Teachers, "Particle Physics Virtual Reality Project at Virginia Tech," November 16, 2017
• Belle II in Virtual Reality - Steam publication - https://store.steampowered.com/app/810020/Belle_II_in_Virtual_Reality/

Sponsored by:

• VT Institute for Creativity, Arts, and Technology - Major SEAD Grant

Belle2VR is an interactive virtual reality visualization of subatomic particle physics, designed by an interdisciplinary team as an educational tool for learning about and exploring subatomic particle collisions. It was created to fill the need for a highly interactive, educational subatomic particle physics tool in which students can take advantage of the learning benefits of both virtual reality and natural locomotion in a multi-user collaborative environment. Users wear backpack computers and virtual reality head-mounted displays (HMDs), which are tracked by an array of external motion capture cameras. In this environment, physics students participate in active learning lesson plans by freely exploring the visualization to form their own spatial understanding of subatomic particle physics.

Belle2VR is unique within the field in providing an animation of the detailed event history that leads up to and results in the final state, a capability that previous visualizations have not offered and one that benefits students studying these interactions in an educational setting. Belle2VR also offers finer interactive temporal control over the visualization of a particle collision event than existing tools. Additionally, the potential application of natural locomotion in a multi-user virtual environment designed for education extends beyond the domain of physics.

Belle2VR is intended to help students learn about the Belle II detector and construct a deep understanding of subatomic particle physics. Students need to be able to identify detector components, trace particle movements in 3D space, and determine how particles interact with the detector. Assuming that students in a required lab session will have several hours to interact with the visualization, our design strategy strikes a balance between immediate impact and exploratory depth, implementing tools for educational investigation.

Belle2VR was developed in the Unity videogame engine. Unity was chosen for rapid development with a small team, quick integration with VR and optical motion capture systems, and application portability to other platforms (mobile, PC) for later public outreach. In the Cube virtual environment facility, users wear an MSI VR One backpack computer and an Oculus Rift CV1 VR HMD with an attached constellation of motion capture markers, and hold an Xbox One controller (see Figure 2). Each VR backpack runs an independent instance of the visualization, rendering to the attached VR HMD. A twenty-four-camera Qualisys optical motion capture system tracks the motion capture markers, recognizing each constellation of markers attached to an HMD as a unique object.
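Conceptually, each instance only needs to route the streamed rigid-body poses: the pose matching the local HMD's marker constellation drives that user's camera, and every other pose drives a remote user's avatar. The following Python sketch is a minimal illustration of that routing logic, assuming poses arrive as simple (name, position, rotation) records from the tracking stream; it is not the Qualisys SDK or the actual Unity implementation.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    name: str        # rigid-body label assigned to one HMD marker constellation
    position: tuple  # (x, y, z) in Cube coordinates
    rotation: tuple  # orientation quaternion (x, y, z, w)

@dataclass
class TrackedObject:
    """Stand-in for the local camera rig or a remote user's avatar."""
    label: str
    position: tuple = (0.0, 0.0, 0.0)
    rotation: tuple = (0.0, 0.0, 0.0, 1.0)

    def set_pose(self, position, rotation):
        self.position, self.rotation = position, rotation

def route_poses(poses, local_body, camera, avatars):
    """Route streamed poses: the rigid body matching the local HMD drives the
    camera; every other rigid body drives that remote user's avatar."""
    for pose in poses:
        if pose.name == local_body:
            camera.set_pose(pose.position, pose.rotation)
        else:
            avatar = avatars.setdefault(pose.name, TrackedObject(pose.name))
            avatar.set_pose(pose.position, pose.rotation)
```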

In the virtual environment, the visualization takes place within a model of the Cube facility. The visualization’s virtual Cube environment aligns correctly with the physical Cube so users are aware of their actual location and are less likely to walk into walls. Within the virtual Cube is a model of the Belle II detector. The model is an accurate representation of the detector, with its tens of thousands of crystals and wires each represented as individual objects.

The particle visualization is driven by data sets, each representing a collision event, exported from basf2 as comma-separated value (CSV) files. Each row of data represents one time step of one particle within the GEANT4 particle-physics simulation package, and provides the particle's current time, position, momentum, energy, unique ID, type, parent ID, and other information. Every GEANT4 simulation time step is recorded as a row in the event file, including every discontinuous event in the underlying physics simulation. Thousands of events can be rapidly generated, each showing a different kind of collision with dramatically different outcomes.
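A natural first step in playing back such a file is to group the per-step rows by particle ID and sort each particle's steps by time. The Python sketch below shows one way this could be done; the column names (time, x, y, z, px, py, pz, energy, type, id, parent_id) are assumptions for illustration and may not match the actual basf2 export schema.

```python
import csv
from collections import defaultdict

def load_event(path):
    """Group the per-step rows of one collision-event CSV by particle ID and
    sort each particle's steps by time so they can be played back in order."""
    tracks = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            tracks[int(row["id"])].append({
                "t":      float(row["time"]),
                "pos":    (float(row["x"]), float(row["y"]), float(row["z"])),
                "mom":    (float(row["px"]), float(row["py"]), float(row["pz"])),
                "energy": float(row["energy"]),
                "type":   row["type"],          # particle species
                "parent": int(row["parent_id"]),
            })
    for steps in tracks.values():
        steps.sort(key=lambda s: s["t"])
    return tracks
```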

As the visualization of the pre-generated simulation progresses, particle sprites move through the detector, leaving behind particle trails drawn as bold lines matching the color of each particle. These trails show where a particle has been recently, giving a sense of direction and relative speed. Particle paths, the entire path that each particle will traverse over the course of the event, are also drawn as faint colored lines that persist throughout the visualization.
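Scrubbing to an arbitrary visualization time means estimating each particle's position between its recorded steps. A minimal sketch of linear interpolation between the two bracketing steps follows, reusing the hypothetical load_event() records above; the actual visualization may interpolate differently, for example along the curved tracks that charged particles follow in the detector's magnetic field.

```python
from bisect import bisect_right

def position_at(steps, t):
    """Linearly interpolate a particle's position at visualization time t.
    `steps` is a time-sorted list of step records (as built by load_event);
    returns None if the particle does not exist at time t."""
    if not steps:
        return None
    times = [s["t"] for s in steps]
    if t < times[0] or t > times[-1]:
        return None
    i = bisect_right(times, t) - 1
    if i >= len(steps) - 1:
        return steps[-1]["pos"]
    a, b = steps[i], steps[i + 1]
    frac = (t - a["t"]) / (b["t"] - a["t"]) if b["t"] > a["t"] else 0.0
    return tuple(pa + frac * (pb - pa) for pa, pb in zip(a["pos"], b["pos"]))
```

Trails could then be drawn by sampling position_at() over a short window behind the current time, while the persistent particle paths simply connect all of a particle's recorded positions.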

In the Cube, Belle2VR supports multiple concurrent users, each wearing a headset and VR backpack. Each user with a headset is represented in all other active visualizations as an animated avatar. This enables users to locate and interact with each other. For example, when a user notices something interesting, they can ask the other users to come to the same location to investigate, or to collaboratively examine the same event from different points of view. Even if users are looking at different events, they will always be able to see each other’s avatars. In this case, users might compare the differences between various events that occur within a region of the detector.

The interdisciplinary collaborative process to build Belle2VR required expertise in physics, education, and the development of VR visualizations. The project was spearheaded by Leo Piilonen in physics and George Glasson in science education, who obtained a grant from ICAT beginning July 2016. This grant was used for access to the Cube facility, development time from Zach Duer, and undergraduate student assistance. Duer was primarily responsible for both project management and development of the visualization in Unity. At the outset, Piilonen outlined a vision for the project and formatted the existing detector model assets and event files to be compatible with Unity. A physics undergraduate student, Jesse Barber, assisted directly with the initial development by interpreting the CSV files, properly representing the detector model, and ensuring that the visualization was accurate. Barber and Piilonen also created the sprites used to represent the particles in the visualization.

Concurrent with visualization development, Glasson coordinated the creation of a series of lesson plans for students to explore subatomic interactions. He was assisted by undergraduate physics students Samantha Spytek and Christopher Dobson, both of whom were enrolled in Virginia Tech's teacher licensure program. The lesson plans were inquiry-based and followed the 5-E instructional model lesson sequence. In the first lesson, students explored the basics of VR technology and interacting with controls in the visualization. In the second lesson, students used their observations of particle interactions to identify the detector components. In the third lesson, students gathered information from the particles in multiple events and used that data to verify the conservation of energy and momentum of relativistic subatomic particles.
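As an illustration of the kind of calculation the third lesson calls for, the sketch below sums relativistic four-momenta over a set of particles and compares the totals for an initial and a final state; the field names reuse the hypothetical per-step records from the earlier sketches, and consistent units (e.g., GeV) are assumed.

```python
def total_four_momentum(particles):
    """Sum (E, px, py, pz) over a list of step records.  Energy, momentum,
    and rest mass of each relativistic particle are related by
    E^2 = |p|^2 c^2 + m^2 c^4; consistent units (e.g., GeV) are assumed."""
    E  = sum(p["energy"] for p in particles)
    px = sum(p["mom"][0] for p in particles)
    py = sum(p["mom"][1] for p in particles)
    pz = sum(p["mom"][2] for p in particles)
    return E, px, py, pz

def is_conserved(initial, final, tol=1e-3):
    """Check that total energy and momentum agree between the initial-state
    and final-state particle sets to within a numerical tolerance."""
    return all(abs(a - b) <= tol
               for a, b in zip(total_four_momentum(initial),
                               total_four_momentum(final)))
```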

Video trailer

Video documentation detailing interactive control