Massachusetts Institute of Technology Press, Data Intelligence, 1-2(2), p. 108-121, 2020
DOI: 10.1162/dint_a_00033
Zenodo, 2019
Computational workflows are an increasingly important part of the research landscape and a key tool for instrumentation data capture, data processing pipelines, data analytics, predictive modelling, and simulation suites. Properly designed workflows contribute to the FAIR data principles [FAIR principles explained in this issue]: they provide the metadata and provenance necessary to describe their data products, and they describe the data realms involved in a formalized, completely traceable way. Workflows are method digital objects in their own right and should be FAIR too; however, they are not data but software, so the FAIR principles for data are not directly applicable and need to be adapted and extended. Workflows take the FAIR principles to a new level to cater for their composite and living nature, their dependencies on their environment and their components, and their need for robust and portable execution.
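The claim that well-designed workflows attach metadata and provenance to their data products can be illustrated with a minimal sketch. The helper below is purely hypothetical (the function name, record fields, and structure are assumptions for illustration, not an interface described in the paper): each step returns its output together with a provenance record identifying the step, its inputs, a content hash of the product, and the execution time.

```python
import hashlib
import json
from datetime import datetime, timezone

def run_step(step_name, inputs, func):
    """Run one workflow step and return (output, provenance record).

    Illustrative sketch only: names and record structure are
    assumptions, not an API from the paper.
    """
    output = func(inputs)
    record = {
        "step": step_name,
        "inputs": inputs,
        # A content hash lets the data product be identified and verified,
        # supporting findability and reproducibility of the result.
        "output_sha256": hashlib.sha256(
            json.dumps(output, sort_keys=True).encode()
        ).hexdigest(),
        "executed_at": datetime.now(timezone.utc).isoformat(),
    }
    return output, record

# Example: a trivial "analytics" step that averages captured readings.
readings = [2.0, 4.0, 6.0]
result, prov = run_step("mean_reading", readings, lambda xs: sum(xs) / len(xs))
```

In practice this bookkeeping is done by the workflow management system itself, and the records follow community models such as W3C PROV rather than an ad hoc dictionary.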