Published in

Massachusetts Institute of Technology Press, Data Intelligence, 2(1-2), pp. 108-121, 2020

DOI: 10.1162/dint_a_00033

Zenodo, 2019

DOI: 10.5281/zenodo.3528076

Zenodo, 2019

DOI: 10.5281/zenodo.3268653

FAIR Computational Workflows

This paper is made freely available by the publisher.

Preprint: archiving restricted
Postprint: archiving restricted
Published version: archiving allowed
Data provided by SHERPA/RoMEO

Abstract

Computational workflows are an increasingly important part of the research landscape, and a key tool for instrumentation data capture, data processing pipelines, data analytics, and predictive modelling and simulation. Properly designed workflows contribute to the FAIR data principles [FAIR principles explained in this issue], since they provide the metadata and provenance necessary to describe their data products, and they describe the involved data in a formalized, completely traceable way. Workflows are digital method objects in their own right that should be made FAIR too; however, they are not data but software, so the FAIR principles for data are not directly applicable and need to be adapted and extended. Workflows take the FAIR principles to a new level, to cater for their composite and living nature, their dependencies on their environment and their components, and their need for robust and portable execution.
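The abstract notes that well-designed workflows generate the metadata and provenance that make their data products traceable. As a minimal sketch of that idea (not taken from the paper; the function names and record fields are hypothetical), a workflow step can emit a provenance record alongside each product, linking output checksums back to input checksums and the executing step:

```python
import hashlib
import json
from datetime import datetime, timezone


def _checksum(obj):
    """Content-addressed identifier for a JSON-serializable object."""
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()


def run_step(step_name, inputs, transform):
    """Run one workflow step and emit a provenance record for its product.

    The record ties the output back to its inputs and the step that
    produced it, giving the data product the traceability the FAIR
    principles call for. (Illustrative sketch only.)
    """
    output = transform(inputs)
    record = {
        "step": step_name,
        "input_checksums": {name: _checksum(value) for name, value in inputs.items()},
        "output_checksum": _checksum(output),
        "executed_at": datetime.now(timezone.utc).isoformat(),
    }
    return output, record


# Example: a trivial analysis step whose product carries its own provenance.
data = {"measurements": [1.0, 2.5, 3.5]}
result, provenance = run_step(
    "mean_measurement",
    data,
    lambda d: {"mean": sum(d["measurements"]) / len(d["measurements"])},
)
```

Real workflow systems record far richer provenance (tool versions, container images, execution environments), typically in standard forms such as W3C PROV, but the principle is the same: the workflow, not the researcher, produces the metadata describing each data product.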