Augmented reality for CERN test beam
Jennifer Toes, 23/08/2017


Demonstration of wearable technology for AR in the EDUSAFE project (Image: CERN)

In an effort to upgrade European detector testing facilities, the AIDA-2020 project is developing an augmented reality (AR) display framework at the GIF++ facility, located at the CERN Super Proton Synchrotron (SPS).

AR blurs the line between real and digital information by superimposing computer-generated content on images of real environments. AR lends itself well to mobile games and apps, but it is also a growing area of interest for several professional fields. In these settings, AR must offer a high degree of precision, realism and reliability, and must react quickly to changes in the environment.

The GIF++ AR display framework combines a camera or image-capable device (such as a tablet), a cosmic ray tracker, data acquisition software, and AR software.

In standard use, the cosmic ray tracker acts as a sensor that detects cosmic rays, which strike the Earth at almost the speed of light, and records the traces the particles leave as they pass through it. The data acquisition (DAQ) software processes the information from the sensor to discern which data is useful and reconstructs each particle's trajectory.
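For illustration only, here is a minimal sketch of the kind of reconstruction involved: fitting a straight line through the hit positions recorded by successive tracker planes. The function name, hit values and units are invented for the example and are not part of the GIF++ DAQ software.

```python
# Hypothetical sketch of straight-line track reconstruction from tracker hits.
import numpy as np

def fit_track(hits):
    """Fit x(z) and y(z) as straight lines through (x, y, z) hit positions.

    Returns (slope, intercept) pairs for the x-z and y-z projections,
    i.e. the reconstructed trajectory x = ax*z + bx, y = ay*z + by.
    """
    hits = np.asarray(hits, dtype=float)
    x, y, z = hits[:, 0], hits[:, 1], hits[:, 2]
    ax, bx = np.polyfit(z, x, 1)  # degree-1 fit returns [slope, intercept]
    ay, by = np.polyfit(z, y, 1)
    return (ax, bx), (ay, by)

# Example: three hits (in cm) recorded by successive tracker planes
hits = [(0.0, 0.0, 0.0), (1.1, 0.4, 50.0), (2.0, 0.9, 100.0)]
x_line, y_line = fit_track(hits)
print("x(z) slope, intercept:", x_line)
print("y(z) slope, intercept:", y_line)
```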

Using the AR framework, this process can be visualised in real time on a mobile device, from the user's perspective. The camera provides a live view of the GIF++ facility, and the AR software uses the information acquired by the DAQ software to digitally recreate the cosmic rays, superimposing them on the device's screen as visual content in real time.
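Conceptually, the overlay step amounts to projecting the reconstructed 3D track into the camera's image plane and drawing it over the live frame. The sketch below is a hypothetical illustration using OpenCV's standard projection and drawing functions; the camera pose (rvec, tvec) it assumes is recovered from reference markers, as described below, and the function names are invented for the example.

```python
# Hypothetical sketch: overlaying a reconstructed track on a live camera frame.
import numpy as np
import cv2

def track_to_pixels(track, z_range, rvec, tvec, camera_matrix, dist_coeffs, n=50):
    """Sample the fitted line x = ax*z + bx, y = ay*z + by at n depths
    and project the resulting 3D points into the image plane."""
    (ax, bx), (ay, by) = track
    z = np.linspace(z_range[0], z_range[1], n)
    points_3d = np.column_stack([ax * z + bx, ay * z + by, z]).astype(np.float32)
    pixels, _ = cv2.projectPoints(points_3d, rvec, tvec, camera_matrix, dist_coeffs)
    return pixels.reshape(-1, 2)

def draw_track(frame, pixels, colour=(0, 255, 0)):
    """Draw the projected track over the camera frame as a polyline."""
    pts = pixels.astype(np.int32).reshape(-1, 1, 2)
    cv2.polylines(frame, [pts], isClosed=False, color=colour, thickness=2)
    return frame
```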

To do this accurately, the AR software must correctly identify both the rays' location and the user's field of view, as represented by the mobile device's camera:

“The computer must know where you are and what you are looking at to be able to create the augmented content, so it needs reference points,” said Giulio Aielli, responsible for the GIF++ AR framework under AIDA-2020.

Reference points act as markers, allowing the software to reconstruct the position and orientation of the camera in six degrees of freedom. This is crucial for placing digital content, such as 3D animations representing particle tracks, in the correct location and orientation. At GIF++, these reference points are provided by specific markers, similar to QR codes, printed onto removable stickers.
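As a rough illustration of how a single printed marker can yield those six degrees of freedom, the sketch below recovers the camera's position and orientation from the four marker corners, assuming the corners have already been detected in the image (for instance with OpenCV's ArUco module) and the camera has been calibrated. The marker size and function names are assumptions for the example, not the GIF++ implementation.

```python
# Hypothetical sketch: camera pose (six degrees of freedom) from one marker.
import numpy as np
import cv2

MARKER_SIZE = 0.10  # marker side length in metres (assumed)

# 3D positions of the marker's corners in the marker's own coordinate frame
OBJECT_POINTS = np.array([
    [-MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2,  MARKER_SIZE / 2, 0.0],
    [ MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
    [-MARKER_SIZE / 2, -MARKER_SIZE / 2, 0.0],
], dtype=np.float32)

def camera_pose(image_corners, camera_matrix, dist_coeffs):
    """Return the camera's rotation (3x3 matrix) and translation (3-vector)
    relative to the marker: three angles plus three offsets, i.e. the six
    degrees of freedom needed to place digital content correctly."""
    ok, rvec, tvec = cv2.solvePnP(OBJECT_POINTS,
                                  np.asarray(image_corners, dtype=np.float32),
                                  camera_matrix, dist_coeffs)
    if not ok:
        raise RuntimeError("pose estimation failed")
    rotation, _ = cv2.Rodrigues(rvec)  # axis-angle vector -> rotation matrix
    return rotation, tvec.ravel()
```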

Recreating cosmic rays or particle beams may also be useful to science educators and museums. More practically, this framework can help scientists and engineers to set up and test complex machines whilst minimising commissioning or debugging time. In the future, Giulio envisions connecting this technology to a Detector Control System (DCS), which comprises a huge network of sensors inside detectors:

“You could get information about the detector immediately, without installing a specific user interface on each piece of equipment. It would be a mix between AR and the Internet of Things, used for tasks like maintenance, testing or debugging.”

Of course, augmented reality systems still require further research and development to reach their full potential. At GIF++, the next step is to create the reference framework needed to visualise the bunker, and AR systems may just help push the frontier of detector science forward.

 
