Scientists at the National Center for Supercomputing Applications (NCSA), located at the University of Illinois at Urbana-Champaign, used GPUs and deep learning to rapidly detect and characterize gravitational waves. This new approach enables astronomers to study gravitational waves using minimal computational resources, reducing time to discovery and increasing the scientific reach of gravitational wave astrophysics.
Using the NVIDIA DGX-1 and the cuDNN-accelerated MXNet deep learning framework, the researchers trained their convolutional neural networks on nearly 2,500 waveform templates generated with the Einstein Toolkit on the Blue Waters supercomputer, together with data from the LIGO Open Science Center.
Their deep learning technique, named Deep Filtering, achieves sensitivities comparable to, and errors lower than, those of established gravitational wave detection algorithms, while being far more computationally efficient and more resilient to noise anomalies. The method allows faster-than-real-time processing of gravitational waves in LIGO's raw data, and also enables new physics, since it can detect new classes of gravitational wave sources that may go unnoticed by existing detection algorithms.
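To illustrate the core idea behind a CNN-based detector of this kind, the following is a minimal NumPy sketch, not the authors' actual network: a single 1-D convolution with a toy chirp-like kernel, followed by a ReLU and a global max-pool, scores how strongly a filter responds anywhere in a noisy strain time series. All names, the template shape, and the amplitudes here are illustrative assumptions; a real Deep Filtering network learns many such kernels from training data rather than using a fixed template.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, kernel):
    """Valid-mode 1-D cross-correlation (the core op of a conv layer)."""
    k = len(kernel)
    return np.array([np.dot(x[i:i + k], kernel) for i in range(len(x) - k + 1)])

# Toy chirp-like kernel (illustrative only, not a real waveform template):
# a sinusoid whose frequency sweeps upward, loosely mimicking a binary inspiral.
t = np.linspace(0.0, 1.0, 64)
template = np.sin(2 * np.pi * (4 + 12 * t) * t)

def detect_score(strain, kernel=template):
    """ReLU then global max-pool: the peak filter response over the series."""
    return np.maximum(conv1d(strain, kernel), 0.0).max()

# A signal buried in Gaussian noise vs. pure noise.
noise = rng.normal(0.0, 1.0, 1024)
signal = noise.copy()
signal[400:464] += 3.0 * template  # inject the chirp at an arbitrary offset

print("score with signal:", detect_score(signal))
print("score noise only :", detect_score(noise))
```

Because the convolution slides the kernel across the whole series, the detector is shift-invariant: the injected chirp raises the peak response no matter where it occurs, which is what lets a trained CNN scan raw strain data in a single fast forward pass.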
NCSA Gravity Group researchers Daniel George and Eliu Huerta are extending this method to identify in real-time electromagnetic counterparts to gravitational wave events in future LSST data.
This work was awarded first place at the ACM Student Research Competition at SC17, and also received the Best Poster Award at the 24th IEEE International Conference on HPC, Data, and Analytics.