Published in

Oxford University Press, Monthly Notices of the Royal Astronomical Society, 504(2), pp. 3084–3091, 2021

DOI: 10.1093/mnras/stab1082

A machine learning approach for GRB detection in AstroSat CZTI data

This paper is made freely available by the publisher.

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving allowed
Data provided by SHERPA/RoMEO

Abstract

We present a machine learning (ML) based method for automated detection of Gamma-Ray Burst (GRB) candidate events in the 60–250 keV range from AstroSat Cadmium Zinc Telluride Imager data. We use density-based spatial clustering to detect excess power and carry out an unsupervised hierarchical clustering across all such events to identify the different light curves present in the data. This representation helps us understand the instrument's sensitivity to the various GRB populations and identify the major non-astrophysical noise artefacts present in the data. We use Dynamic Time Warping (DTW) to carry out template matching, which ensures the morphological similarity of the detected events with known typical GRB light curves. DTW alleviates the need for a dense template repository often required in matched-filtering-like searches. The use of a similarity metric facilitates outlier detection suitable for capturing previously unmodelled events. We briefly discuss the characteristics of 35 long GRB candidates detected using the pipeline and show that with minor modifications, such as adaptive binning, the method is also sensitive to short GRB events. Augmenting the existing data analysis pipeline with such ML capabilities alleviates the need for extensive manual inspection, enabling quicker response to alerts received from other observatories, such as gravitational-wave detectors.
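The DTW template matching described in the abstract can be illustrated with a minimal sketch. This is not the authors' pipeline code: the function name, the toy light curves, and the use of a plain dynamic-programming DTW (rather than any particular optimised library) are assumptions made purely for illustration. The key property it demonstrates is the one the abstract relies on: a time-stretched copy of a template still scores as highly similar, so a single template can match a family of warped light curves.

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-programming DTW distance between two 1-D series.

    D[i, j] holds the minimal accumulated cost of aligning a[:i] with
    b[:j]; each step may repeat a sample in either series, which is
    what makes the measure insensitive to time stretching.
    """
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Move from a match (diagonal), an insertion, or a deletion.
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Hypothetical example: a FRED-like template and a time-stretched copy.
template = np.array([0.0, 1.0, 3.0, 1.0, 0.0])
stretched = np.array([0.0, 1.0, 1.0, 3.0, 3.0, 1.0, 0.0])

print(dtw_distance(template, stretched))  # warped copy: distance 0.0
print(dtw_distance(np.zeros(3), np.ones(3)))  # dissimilar: distance 3.0
```

A candidate light curve would then be accepted if its DTW distance to the nearest template falls below a threshold, with unusually distant events flagged as outliers for inspection.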