Description
The light curves (LCs) of long gamma-ray bursts (GRBs) show a wide variety of morphologies, which current LC simulation models based on the internal shock paradigm still fail to fully reproduce. The reason is that, despite recent significant advances in understanding the energetics and dynamics of long GRBs, the nature of their inner engine, how the relativistic outflow is powered, and the dissipation mechanisms are still not understood. This limits our ability to describe and simulate these transients properly. A promising way to gain insight is to model GRB LCs as the result of a common stochastic process. In the BATSE era, a stochastic pulse avalanche model was proposed by Stern et al. (1996) and tested by comparing ensemble-average properties of simulated and real LCs. We revived this model by applying it to two independent and complementary datasets, BATSE and Swift/BAT, and optimised its parameters with a machine-learning approach, exploiting a genetic algorithm's capability to explore the parameter space thoroughly. In this contribution, we describe our optimisation algorithm and show the results obtained on both datasets. Such a technique could be extended to different and more physically grounded GRB light-curve models. Moreover, the model allows us to simulate realistic LCs as they will be seen by upcoming detectors, which is fundamental for testing light-curve trigger algorithms realistically and for properly characterising next-generation high-energy instruments.
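To illustrate the stochastic pulse avalanche idea, here is a minimal Python sketch, not the implementation of Stern et al. (1996) or of this work: each pulse is modelled as a two-sided exponential, and every pulse spawns a Poisson-distributed number of child pulses whose delays and timescales are tied to the parent's. All parameter names and numerical values (MU, ALPHA, the lognormal widths, the 64 ms binning) are illustrative placeholders; the actual model uses its own pulse shape and distributions.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative parameters (names are ours, not the original notation):
MU = 0.95          # mean number of child pulses per parent (near-critical)
ALPHA = 2.0        # child-delay scale, in units of the parent decay time
N_PRIMARY = 3      # spontaneous (first-generation) pulses per burst
MAX_GEN = 20       # hard cap on avalanche depth

def pulse(t, t0, amp, tau_r, tau_d):
    """Two-sided exponential pulse: fast rise (tau_r), slower decay (tau_d)."""
    dt = t - t0
    arg = np.where(dt < 0.0, dt / tau_r, -dt / tau_d)   # always <= 0
    return amp * np.exp(arg)

def avalanche(t0, amp, tau, gen=0):
    """Return a pulse and, recursively, its Poisson-distributed children."""
    pulses = [(t0, amp, tau)]
    if gen >= MAX_GEN:
        return pulses
    for _ in range(rng.poisson(MU)):
        dt = rng.exponential(ALPHA * tau)            # delay w.r.t. parent peak
        child_tau = tau * rng.lognormal(0.0, 0.3)    # timescale tied to parent
        child_amp = amp * rng.lognormal(-0.2, 0.5)   # amplitude tied to parent
        pulses += avalanche(t0 + dt, child_amp, child_tau, gen + 1)
    return pulses

def simulate_lc(t):
    """Sum all pulses from N_PRIMARY independent avalanches."""
    lc = np.zeros_like(t)
    for _ in range(N_PRIMARY):
        tau0 = np.exp(rng.uniform(np.log(0.05), np.log(5.0)))  # log-uniform (s)
        for t0, amp, tau in avalanche(rng.uniform(0.0, 20.0), 1.0, tau0):
            lc += pulse(t, t0, amp, 0.5 * tau, tau)
    return lc

t = np.arange(0.0, 100.0, 0.064)   # 64 ms bins, the BATSE time resolution
lc = simulate_lc(t)
```

Keeping the mean branching ratio MU just below 1 makes the avalanche near-critical, so a single spontaneous pulse can trigger anything from an isolated spike to a long, highly structured burst, which is what gives the model its morphological variety.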
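The genetic optimisation can likewise be sketched in a few lines. Below is a minimal, self-contained GA loop (tournament selection, uniform crossover, Gaussian mutation, with elitism), again only a sketch under stated assumptions rather than our actual pipeline: the parameter names, bounds, and the fitness function are stand-ins. In practice, the fitness would simulate an ensemble of LCs for each candidate parameter set and return the (negated) distance between ensemble-average properties of simulated and observed LCs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameter vector (mu, alpha, amp_sigma); bounds illustrative.
BOUNDS = np.array([[0.5, 1.2], [0.5, 5.0], [0.1, 1.0]])
TARGET = np.array([0.95, 2.0, 0.5])   # stand-in optimum for this demo only

def fitness(params):
    """Stand-in objective: in the real pipeline, simulate many LCs with
    `params` and compare ensemble-average properties (e.g. the mean
    autocorrelation function) against those of the observed sample."""
    return -np.sum((params - TARGET) ** 2)

def genetic_search(pop_size=40, n_gen=50, mut_sigma=0.05):
    dim = len(BOUNDS)
    pop = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(pop_size, dim))
    for _ in range(n_gen):
        fit = np.array([fitness(ind) for ind in pop])
        new_pop = [pop[np.argmax(fit)].copy()]              # elitism: keep best
        while len(new_pop) < pop_size:
            # Tournament selection: best of two random individuals, twice.
            parents = []
            for _ in range(2):
                i, j = rng.integers(pop_size, size=2)
                parents.append(pop[i] if fit[i] > fit[j] else pop[j])
            mask = rng.random(dim) < 0.5                    # uniform crossover
            child = np.where(mask, parents[0], parents[1])
            child = child + rng.normal(0.0, mut_sigma, dim) # Gaussian mutation
            new_pop.append(np.clip(child, BOUNDS[:, 0], BOUNDS[:, 1]))
        pop = np.array(new_pop)
    fit = np.array([fitness(ind) for ind in pop])
    return pop[np.argmax(fit)]

best = genetic_search()
print("best parameters found:", best)
```

Because each fitness evaluation only needs forward simulations, this population-based search needs no gradients, which is what makes it attractive for a stochastic, non-differentiable generator like the pulse avalanche model and, as noted above, for more physically grounded LC models as well.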