Oxford University Press, Monthly Notices of the Royal Astronomical Society, 443(4), pp. 3578-3585, 2014
The physical origin of the $>0.1$ GeV emission detected from Gamma-Ray Bursts (GRBs) by the Fermi satellite is not yet completely understood. In this work we consider the GeV light curves of ten GRBs with measured redshift detected by the Fermi-LAT. These light curves are characterised by long-lived ($\gtrsim 10^2$ seconds) emission, whose luminosity decays in time as a power law. While the decay rate is similar for all GRBs (i.e. $L_{\rm LAT} \propto t^{-1.2}$), the normalisation spans about two orders of magnitude in luminosity. However, after re-normalising the luminosities to the prompt energetics $E_{\rm iso}$, the light curves overlap. We consider the scenario in which the temporally extended LAT emission is dominated by synchrotron radiation from electrons accelerated at the forward external shock. According to this model, at high energies (i.e. above the typical synchrotron frequencies) a small dispersion of the $E_{\rm iso}$-normalised light curves is expected. The fact that the LAT temporally extended emission follows this behaviour reinforces its interpretation as afterglow radiation from external shocks. Assuming this scenario, we argue that the parameters $\epsilon_e$ and $\eta_\gamma$ (i.e., the fraction of shock-dissipated energy gained by the electrons and the efficiency of the mechanism producing the prompt radiation, respectively) must be narrowly distributed.

Comment: 9 pages, 4 figures, submitted to MNRAS
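The weak parameter dependence invoked above can be illustrated with the standard external-shock synchrotron scaling for observing frequencies above both the injection and cooling frequencies (a textbook afterglow result, not a formula quoted in this abstract; the electron spectral index $p$, magnetic energy fraction $\epsilon_B$, and blast-wave kinetic energy $E_{\rm k}$ are introduced here for illustration):

\[
L_\nu \;\propto\; \epsilon_e^{\,p-1}\,\epsilon_B^{\,(p-2)/4}\,E_{\rm k}^{\,(p+2)/4}\,t^{-(3p-2)/4}\,\nu^{-p/2},
\qquad
E_{\rm k} \;=\; \frac{1-\eta_\gamma}{\eta_\gamma}\,E_{\rm iso}.
\]

In this regime the luminosity is independent of the circumburst density and, for $p \approx 2$, nearly independent of $\epsilon_B$; for $p \approx 2.3$ the predicted decay is $t^{-1.2}$, consistent with the observed slope. Once the light curves are re-normalised to $E_{\rm iso}$, the residual dispersion is controlled essentially by $\epsilon_e$ and $\eta_\gamma$ alone, which is why their observed clustering implies narrow distributions of these two parameters.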