Oxford University Press, Monthly Notices of the Royal Astronomical Society, 494(4), pp. 4751-4770, 2020
ABSTRACT The attenuation of light from star-forming galaxies is correlated with a multitude of physical parameters including star formation rate, metallicity and total dust content. This variation in attenuation is even more evident on kiloparsec scales, the relevant size for many current spectroscopic integral field unit surveys. To understand the cause of this variation, we present and analyse Swift/UVOT near-UV (NUV) images and SDSS/MaNGA emission-line maps of 29 nearby (z < 0.084) star-forming galaxies. We resolve kiloparsec-sized star-forming regions within the galaxies and compare their optical nebular attenuation (i.e. the Balmer emission-line optical depth, $\tau^{l}_{B} \equiv \tau_{\mathrm{H}\beta} - \tau_{\mathrm{H}\alpha}$) and NUV stellar continuum attenuation (via the NUV power-law index, $\beta$) to the attenuation law described by Battisti et al. We show that the data agree with that model, albeit with significant scatter. We explore how the scatter in the $\beta$–$\tau^{l}_{B}$ measurements from the star-forming regions depends on different physical parameters, including distance from the nucleus, star formation rate and total dust content. Finally, we compare the measured $\tau^{l}_{B}$ and $\beta$ values for the individual star-forming regions with those of the integrated galaxy light. We find a strong variation in $\beta$ between the kiloparsec scale and the larger galaxy scale that is not seen in $\tau^{l}_{B}$. We conclude that the sightline dependence of UV attenuation and the reddening of $\beta$ by light from older stellar populations could contribute to the scatter in the $\beta$–$\tau^{l}_{B}$ relation.
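As an illustrative aside (not part of the abstract itself), the Balmer optical depth defined above can be derived from an observed H$\alpha$/H$\beta$ flux ratio: since each line flux is attenuated as $F_{\rm obs} = F_{\rm int}\,e^{-\tau}$, one finds $\tau^{l}_{B} = \ln[(\mathrm{H}\alpha/\mathrm{H}\beta)_{\rm obs}/2.86]$, where 2.86 is the standard Case B intrinsic ratio (an assumption imported here, not stated in this abstract). The function name and sample fluxes below are hypothetical:

```python
import math

# Intrinsic Case B recombination ratio Halpha/Hbeta (~2.86 for
# T_e ~ 10^4 K); a standard assumption, not taken from this abstract.
INTRINSIC_RATIO = 2.86

def balmer_optical_depth(f_halpha, f_hbeta):
    """Balmer emission-line optical depth:
    tau_B^l = tau_Hbeta - tau_Halpha = ln[(Halpha/Hbeta)_obs / 2.86],
    following from F_obs = F_int * exp(-tau) at each wavelength.
    """
    return math.log((f_halpha / f_hbeta) / INTRINSIC_RATIO)

# Hypothetical line fluxes (arbitrary units): an observed decrement of 3.5
tau = balmer_optical_depth(3.5, 1.0)
print(f"tau_B^l = {tau:.3f}")  # dustier sightlines give larger tau_B^l
```

A ratio equal to the intrinsic value gives $\tau^{l}_{B}=0$ (no nebular reddening); larger observed decrements give positive $\tau^{l}_{B}$.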