Oxford University Press, Monthly Notices of the Royal Astronomical Society, 495(4), pp. 3819-3838, 2020
ABSTRACT Metallicity gradients are important diagnostics of galaxy evolution, because they record the history of events such as mergers, gas inflow, and star formation. However, the accuracy with which gradients can be measured is limited by spatial resolution and noise, and measurements must therefore be corrected for these effects. We use a high-resolution (∼20 pc) simulation of a face-on, Milky Way-mass galaxy, coupled with photoionization models, to produce a suite of synthetic high-resolution integral field spectroscopy (IFS) datacubes. We then degrade the datacubes with a range of realistic models for spatial resolution (2−16 beams per galaxy scale length) and noise, in order to investigate and quantify how well the input metallicity gradient can be recovered as a function of resolution and signal-to-noise ratio (SNR), with the aim of comparing with modern IFS surveys such as MaNGA and SAMI. Given appropriate propagation of uncertainties and pruning of low-SNR pixels, we show that a resolution of 3–4 telescope beams per galaxy scale length is sufficient to recover the gradient to ∼10–20 per cent uncertainty; at lower resolutions the uncertainty rises to ∼60 per cent. Including the low-SNR pixels further degrades the accuracy of the inferred gradient. Our results can inform future IFS surveys regarding the resolution and SNR required to achieve a desired accuracy in metallicity gradient measurements.
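The recovery procedure the abstract describes (beam smearing, noise, low-SNR pruning, and a weighted gradient fit with propagated uncertainties) can be illustrated with a minimal Python sketch. This is not the authors' pipeline: the map size, gradient, noise model, and SNR cut below are all illustrative assumptions, and the "datacube" is reduced to a single metallicity map for brevity.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.optimize import curve_fit

# ---- Toy parameters (illustrative assumptions, not values from the paper) ----
npix = 200                 # map size in pixels
r_scale = 40.0             # galaxy scale length in pixels
true_grad = -0.05          # input gradient, dex per scale length
Z0 = 9.0                   # central metallicity, 12 + log(O/H)
beams_per_rscale = 4       # resolution: telescope beams per scale length
snr_cut = 5.0              # prune pixels below this SNR

rng = np.random.default_rng(0)
y, x = np.indices((npix, npix))
r = np.hypot(x - npix / 2, y - npix / 2)

# Ideal face-on map with a linear radial metallicity gradient.
Z_true = Z0 + true_grad * r / r_scale

# Mock emission-line surface brightness: exponential disc, so SNR drops outward.
flux = np.exp(-r / r_scale)
snr = flux / 0.01          # constant flux noise -> radially declining SNR

# Degrade the map: beam smearing plus SNR-dependent metallicity noise.
sigma_beam = (r_scale / beams_per_rscale) / 2.355   # beam FWHM -> Gaussian sigma
Z_err = 0.5 / np.maximum(snr, 0.1)                  # toy per-pixel error model
Z_obs = gaussian_filter(Z_true, sigma_beam) \
        + rng.normal(0.0, 1.0, Z_true.shape) * Z_err

# Prune low-SNR pixels before fitting, as the abstract recommends.
good = snr > snr_cut

# Weighted linear fit, propagating the per-pixel uncertainties into the
# uncertainty on the recovered gradient via the fit covariance matrix.
def model(rr, z0, grad):
    return z0 + grad * rr / r_scale

popt, pcov = curve_fit(model, r[good], Z_obs[good],
                       sigma=Z_err[good], absolute_sigma=True)
print(f"input gradient:     {true_grad:+.4f} dex / scale length")
print(f"recovered gradient: {popt[1]:+.4f} +/- {np.sqrt(pcov[1, 1]):.4f}"
      " dex / scale length")
```

Rerunning the sketch with fewer beams per scale length or a lower SNR cut shows the qualitative behaviour the abstract quantifies: beam smearing flattens the recovered gradient, and keeping low-SNR pixels inflates its error.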