Anomaly Detection with Machine Learning on Time Series: Unveiling Lost Transients Data
Crupi R.; Longo F.
2025-01-01
Abstract
Many modern scientific fields deal with phenomena so complex that the resulting data cannot be accurately described by most standard statistical techniques. In fields such as Astrophysics and Multi-Messenger Astronomy, Machine Learning offers researchers many tools, with techniques capable of capturing the dependencies between observed phenomena and the underlying physical models. A common problem is the detection of anomalies in multivariate time series whose behavior depends on the physics that characterizes the context. We present here a tool that models the expected evolution of a time series in order to discriminate and identify transient events against the normal behavior of data evolving in time. The software employs Machine Learning techniques and was developed around the use case of identifying high-energy transients, such as Gamma-Ray Bursts, in data from the Fermi Gamma-ray Space Telescope, using its background rejection system, the Anti-Coincidence Detector. The latter had not previously been used for scientific purposes, so it offers fertile ground for the development of this software. In this paper, we introduce our framework starting from the concept of time series, specifically the count rates of particles in the Anti-Coincidence Detector, and the orbital configuration of the Fermi satellite. The software implements Machine Learning models, such as Feed-Forward and Recurrent Neural Networks, trained on Fermi's orbital configuration to predict the background. We then implement a triggering algorithm called FOCuS, which identifies significant deviations from the predicted background, signaling the presence of anomalies and, possibly, astrophysical transients in the dataset. This tool can be adapted to various signals, making it applicable across different contexts and research fields.
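To illustrate the background-prediction step described above, the sketch below trains a small feed-forward regressor in plain NumPy on synthetic data. The "orbital" features (latitude- and longitude-like inputs) and the target rate function are invented for this example; they are not the actual Fermi inputs or the paper's trained models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical orbital features in [-1, 1], standing in for the
# satellite configuration; the "true" background rate is an arbitrary
# smooth function of them, chosen only for illustration.
X = rng.uniform(-1.0, 1.0, size=(512, 2))
y = 100.0 + 20.0 * np.sin(2.0 * X[:, 0]) + 5.0 * X[:, 1]
yt = ((y - y.mean()) / y.std()).reshape(-1, 1)  # normalized target

# Single-hidden-layer feed-forward network trained with full-batch
# gradient descent on a mean-squared-error loss.
W1 = rng.normal(0.0, 0.5, size=(2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 0.5, size=(16, 1)); b2 = np.zeros(1)
lr, n = 0.05, X.shape[0]

losses = []
for _ in range(300):
    Z = X @ W1 + b1          # hidden pre-activation
    H = np.maximum(Z, 0.0)   # ReLU
    pred = H @ W2 + b2
    losses.append(float(np.mean((pred - yt) ** 2)))
    # Backpropagation of the MSE gradient.
    g = 2.0 * (pred - yt) / n
    gW2, gb2 = H.T @ g, g.sum(axis=0)
    gZ = (g @ W2.T) * (Z > 0.0)
    gW1, gb1 = X.T @ gZ, gZ.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2
```

The fitted network plays the role of the background model: given the current orbital features, it returns the expected (normalized) count rate, against which deviations can later be tested.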
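The triggering step can be sketched with a simplified Poisson-FOCuS-style change detector: each candidate anomaly interval is tracked as a pair (observed counts, expected background counts), and the detector fires when the best interval's Poisson log-likelihood ratio exceeds a threshold. This is a minimal illustration, not the paper's implementation; the function name, the pruning strategy (dropping intervals with no excess), and the threshold value are assumptions made for the example.

```python
import math

def poisson_focus(counts, background, threshold):
    """Simplified FOCuS-style detector for a Poisson count stream.

    `counts[t]` is the observed count in bin t and `background[t]` the
    expected background count. Each candidate interval is a pair
    (a, b): accumulated observed and expected counts. Its peak Poisson
    log-likelihood ratio is a*log(a/b) - (a - b) when a > b. Returns
    (trigger_time, statistic) on the first exceedance, else (None, 0.0).
    """
    curves = []  # one (a, b) pair per candidate interval start time
    for t, (x, lam) in enumerate(zip(counts, background)):
        # extend every open interval and open a new one at bin t
        curves = [(a + x, b + lam) for a, b in curves]
        curves.append((x, lam))
        # prune intervals showing no excess (a <= b): never optimal
        curves = [(a, b) for a, b in curves if a > b]
        stat = max((a * math.log(a / b) - (a - b) for a, b in curves),
                   default=0.0)
        if stat > threshold:
            return t, stat
    return None, 0.0
```

For example, with a flat background of 10 counts per bin and a short injected excess of 15 extra counts per bin, the statistic stays at zero on the quiescent stretch and crosses a threshold of 10 within the excess.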


