Published in: NAR Genomics and Bioinformatics (Oxford University Press), 5(2), 2023

DOI: 10.1093/nargab/lqad060

Pervasive effects of RNA degradation on Nanopore direct RNA sequencing

This paper is made freely available by the publisher.

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving allowed
(Data provided by SHERPA/RoMEO)

Abstract

Oxford Nanopore direct RNA sequencing (DRS) is capable of sequencing complete RNA molecules and accurately measuring gene and isoform expression. However, as DRS is designed to profile intact RNA, expression quantification may be more heavily dependent upon RNA integrity than alternative RNA sequencing methodologies. It is currently unclear how RNA degradation impacts DRS or whether it can be corrected for. To assess the impact of RNA integrity on DRS, we performed a degradation time series using SH-SY5Y neuroblastoma cells. Our results demonstrate that degradation is a significant and pervasive factor that can bias DRS measurements, including a reduction in library complexity resulting in an overrepresentation of short genes and isoforms. Degradation also biases differential expression analyses; however, we find that explicit correction can almost fully recover meaningful biological signal. In addition, DRS provided less biased profiling of partially degraded samples than Nanopore PCR-cDNA sequencing. Overall, we find that samples with RNA integrity number (RIN) > 9.5 can be treated as undegraded and samples with RIN > 7 can be utilized for DRS with appropriate correction. These results establish the suitability of DRS for a wide range of samples, including partially degraded in vivo clinical and post-mortem samples, while limiting the confounding effect of degradation on expression quantification.
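
The abstract notes that explicit correction for degradation can largely recover biological signal in differential expression analysis. As a minimal sketch of that general idea only (not the paper's actual pipeline), the Python example below includes RIN as a covariate in a per-gene negative-binomial GLM; the simulated data, column names, and use of statsmodels are illustrative assumptions.

# Sketch: correcting for RNA degradation in a per-gene differential expression
# test by adding RIN as an explicit covariate. All data here are simulated for
# illustration; this is not the method used in the paper.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Toy data: read counts for one gene across 12 samples in two conditions,
# with RIN varying independently of condition.
samples = pd.DataFrame({
    "condition": ["control"] * 6 + ["treated"] * 6,
    "rin": rng.uniform(7.0, 10.0, 12),
})

# Simulate counts in which degradation (lower RIN) depresses the observed count.
mu = np.exp(5.0
            + 0.8 * (samples["condition"] == "treated")
            + 0.3 * (samples["rin"] - 9.5))
samples["counts"] = rng.poisson(mu)

# Naive model: condition only, leaving degradation as an uncorrected confounder.
naive = smf.glm("counts ~ condition", data=samples,
                family=sm.families.NegativeBinomial()).fit()

# Corrected model: condition plus RIN as a covariate, so the condition
# coefficient is estimated after accounting for degradation.
corrected = smf.glm("counts ~ condition + rin", data=samples,
                    family=sm.families.NegativeBinomial()).fit()

print(naive.params)
print(corrected.params)

In practice the same design-formula approach (adding RIN, or a derived degradation score, as a covariate) can be used with standard RNA-seq differential expression frameworks; the point of the sketch is only to show where the correction term enters the model.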