Merged
6 changes: 6 additions & 0 deletions conf/solr/8.11.1/schema.xml
@@ -261,6 +261,9 @@
<field name="cleaningOperations" type="text_en" multiValued="false" stored="true" indexed="true"/>
<field name="collectionMode" type="text_en" multiValued="true" stored="true" indexed="true"/>
<field name="collectorTraining" type="text_en" multiValued="false" stored="true" indexed="true"/>
<field name="workflowType" type="text_en" multiValued="true" stored="true" indexed="true"/>
<field name="workflowCodeRepository" type="text_en" multiValued="true" stored="true" indexed="true"/>
<field name="workflowDocumentation" type="text_en" multiValued="true" stored="true" indexed="true"/>
<field name="contributor" type="text_en" multiValued="true" stored="true" indexed="true"/>
<field name="contributorName" type="text_en" multiValued="true" stored="true" indexed="true"/>
<field name="contributorType" type="text_en" multiValued="true" stored="true" indexed="true"/>
@@ -498,6 +501,9 @@
<copyField source="cleaningOperations" dest="_text_" maxChars="3000"/>
<copyField source="collectionMode" dest="_text_" maxChars="3000"/>
<copyField source="collectorTraining" dest="_text_" maxChars="3000"/>
<copyField source="workflowType" dest="_text_" maxChars="3000"/>
<copyField source="workflowCodeRepository" dest="_text_" maxChars="3000"/>
<copyField source="workflowDocumentation" dest="_text_" maxChars="3000"/>
<copyField source="contributor" dest="_text_" maxChars="3000"/>
<copyField source="contributorName" dest="_text_" maxChars="3000"/>
<copyField source="contributorType" dest="_text_" maxChars="3000"/>
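The pattern in the two hunks above is that every searchable metadata field appears twice in ``schema.xml``: once as a ``<field>`` declaration and once as a ``<copyField>`` feeding the catch-all ``_text_`` field. A small illustrative sketch of that pairing (the field names come from this diff; the generator itself is not part of Dataverse):

```python
# Each new metadata field needs a <field> declaration plus a matching
# <copyField> into the catch-all _text_ field, mirroring the diff above.
NEW_FIELDS = ["workflowType", "workflowCodeRepository", "workflowDocumentation"]

def field_decl(name: str) -> str:
    return (f'<field name="{name}" type="text_en" multiValued="true" '
            f'stored="true" indexed="true"/>')

def copyfield_decl(name: str) -> str:
    return f'<copyField source="{name}" dest="_text_" maxChars="3000"/>'

for name in NEW_FIELDS:
    print(field_decl(name))
    print(copyfield_decl(name))
```

After editing the schema, the Solr core must be reloaded and the Dataverse index rebuilt for the new fields to take effect.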
6 changes: 6 additions & 0 deletions doc/release-notes/8639-computational-workflow.md
@@ -0,0 +1,6 @@
## Adding Computational Workflow Metadata
The new Computational Workflow metadata block will allow depositors to effectively tag datasets as computational workflows.

To add the new metadata block, follow the instructions in the user guide: <https://guides.dataverse.org/en/latest/admin/metadatacustomization.html>

The location of the new metadata block tsv file is: `dataverse/scripts/api/data/metadatablocks/computational_workflow.tsv`
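The metadata customization guide linked above describes loading a block TSV through the admin API. As a hedged sketch of that step (endpoint and content type per that guide; this only builds the request rather than sending it):

```python
import urllib.request

def load_block_request(base_url: str, tsv_path: str) -> urllib.request.Request:
    """Build (without sending) the admin API request that loads a metadata block TSV."""
    with open(tsv_path, "rb") as f:
        body = f.read()
    return urllib.request.Request(
        f"{base_url}/api/admin/datasetfield/load",
        data=body,
        headers={"Content-Type": "text/tab-separated-values"},
        method="POST",
    )

# Example (not executed here):
# urllib.request.urlopen(
#     load_block_request("http://localhost:8080", "computational_workflow.tsv"))
```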
Comment on lines +1 to +6
@poikilotherm (Contributor) — Jul 19, 2022:
suggestion: May I request we declare this an experimental thing as usually done with new things like these?

PR author (Contributor) replied:
I am not sure it should be. Is this something you can make a decision on @jggautier?

Or @scolapasta / @pdurbin , should this feature be labeled as experimental?

Contributor replied:

@poikilotherm, the effect of labelling this experimental would be to let installations know that it's likely to change (more likely than other changes in the update) as more work is done to support the publication of workflows and software. Is that the reasoning?

Since this is considered an MVP, I agree, and the group that worked on this metadata block shares the same concerns you raise in your comments in this PR (and in your talks during the Dataverse Community Meeting). I'd like to confirm with Mahmood that this should be labelled as experimental. Does anyone know Mahmood's GitHub username?

Member replied:

It seems to be @mmshad.

As @scolapasta and @jggautier know, we just talked about this at standup. That is, should this feature be enabled for all new installations or not? We're going to talk about this a bit more during our sprint planning meeting in a few hours.

As Gustavo pointed out, for embargo (which isn't on by default) we turned it on for the demo server to play around with it. (I'm not even sure if embargo is enabled in Harvard Dataverse or not.) So we could try this... enable the workflow block on the demo server first.

If we do flag the workflow block as experimental and not enabled by default, we should probably remove it from the appendix of the User Guide, or at least add a note saying that your admin needs to manually enable it.

@mmshad if you happen to know any Harvard Dataverse users that would like to try out this workflow block, please let us know.

Reply:

@pdurbin , I can ask a few labs to start uploading their workflow. My team can use it as well.

30 changes: 22 additions & 8 deletions doc/sphinx-guides/source/user/appendix.rst
@@ -8,23 +8,37 @@ Additional documentation complementary to the User Guide.
.. contents:: |toctitle|
:local:

.. _metadata-references:

Metadata References
======================

The Dataverse Project is committed to using standard-compliant metadata to ensure that a Dataverse installation's
metadata can be mapped easily to standard metadata schemas and be exported into JSON
format (XML for tabular file metadata) for preservation and interoperability.

Supported Metadata
~~~~~~~~~~~~~~~~~~

Detailed below are the metadata schemas we support for Citation and Domain Specific Metadata in the Dataverse Project:

- `Citation Metadata <https://docs.google.com/spreadsheet/ccc?key=0AjeLxEN77UZodHFEWGpoa19ia3pldEFyVFR0aFVGa0E#gid=0>`__: compliant with `DDI Lite <http://www.ddialliance.org/specification/ddi2.1/lite/index.html>`_, `DDI 2.5 Codebook <http://www.ddialliance.org/>`__, `DataCite 3.1 <http://schema.datacite.org/meta/kernel-3.1/doc/DataCite-MetadataKernel_v3.1.pdf>`__, and Dublin Core's `DCMI Metadata Terms <http://dublincore.org/documents/dcmi-terms/>`__ (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/citation.tsv>`__). Language field uses `ISO 639-1 <https://www.loc.gov/standards/iso639-2/php/English_list.php>`__ controlled vocabulary.
- `Geospatial Metadata <https://docs.google.com/spreadsheet/ccc?key=0AjeLxEN77UZodHFEWGpoa19ia3pldEFyVFR0aFVGa0E#gid=4>`__: compliant with DDI Lite, DDI 2.5 Codebook, DataCite, and Dublin Core (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/geospatial.tsv>`__). Country / Nation field uses `ISO 3166-1 <http://en.wikipedia.org/wiki/ISO_3166-1>`_ controlled vocabulary.
- `Social Science & Humanities Metadata <https://docs.google.com/spreadsheet/ccc?key=0AjeLxEN77UZodHFEWGpoa19ia3pldEFyVFR0aFVGa0E#gid=1>`__: compliant with DDI Lite, DDI 2.5 Codebook, and Dublin Core (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/social_science.tsv>`__).
- `Astronomy and Astrophysics Metadata <https://docs.google.com/spreadsheet/ccc?key=0AjeLxEN77UZodHFEWGpoa19ia3pldEFyVFR0aFVGa0E#gid=3>`__
: These metadata elements can be mapped/exported to the International Virtual Observatory Alliance’s (IVOA)
- `Citation Metadata <https://docs.google.com/spreadsheet/ccc?key=0AjeLxEN77UZodHFEWGpoa19ia3pldEFyVFR0aFVGa0E#gid=0>`__ (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/citation.tsv>`__): compliant with `DDI Lite <http://www.ddialliance.org/specification/ddi2.1/lite/index.html>`_, `DDI 2.5 Codebook <http://www.ddialliance.org/>`__, `DataCite 3.1 <http://schema.datacite.org/meta/kernel-3.1/doc/DataCite-MetadataKernel_v3.1.pdf>`__, and Dublin Core's `DCMI Metadata Terms <http://dublincore.org/documents/dcmi-terms/>`__ . Language field uses `ISO 639-1 <https://www.loc.gov/standards/iso639-2/php/English_list.php>`__ controlled vocabulary.
- `Geospatial Metadata <https://docs.google.com/spreadsheet/ccc?key=0AjeLxEN77UZodHFEWGpoa19ia3pldEFyVFR0aFVGa0E#gid=4>`__ (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/geospatial.tsv>`__): compliant with DDI Lite, DDI 2.5 Codebook, DataCite, and Dublin Core. Country / Nation field uses `ISO 3166-1 <http://en.wikipedia.org/wiki/ISO_3166-1>`_ controlled vocabulary.
- `Social Science & Humanities Metadata <https://docs.google.com/spreadsheet/ccc?key=0AjeLxEN77UZodHFEWGpoa19ia3pldEFyVFR0aFVGa0E#gid=1>`__ (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/social_science.tsv>`__): compliant with DDI Lite, DDI 2.5 Codebook, and Dublin Core.
- `Astronomy and Astrophysics Metadata <https://docs.google.com/spreadsheet/ccc?key=0AjeLxEN77UZodHFEWGpoa19ia3pldEFyVFR0aFVGa0E#gid=3>`__ (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/astrophysics.tsv>`__): These metadata elements can be mapped/exported to the International Virtual Observatory Alliance’s (IVOA)
`VOResource Schema format <http://www.ivoa.net/documents/latest/RM.html>`__ and is based on
`Virtual Observatory (VO) Discovery and Provenance Metadata <http://perma.cc/H5ZJ-4KKY>`__ (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/astrophysics.tsv>`__).
- `Life Sciences Metadata <https://docs.google.com/spreadsheet/ccc?key=0AjeLxEN77UZodHFEWGpoa19ia3pldEFyVFR0aFVGa0E#gid=2>`__: based on `ISA-Tab Specification <https://isa-specs.readthedocs.io/en/latest/isamodel.html>`__, along with controlled vocabulary from subsets of the `OBI Ontology <http://bioportal.bioontology.org/ontologies/OBI>`__ and the `NCBI Taxonomy for Organisms <http://www.ncbi.nlm.nih.gov/Taxonomy/taxonomyhome.html/>`__ (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/biomedical.tsv>`__).
- `Journal Metadata <https://docs.google.com/spreadsheets/d/13HP-jI_cwLDHBetn9UKTREPJ_F4iHdAvhjmlvmYdSSw/edit#gid=8>`__: based on the `Journal Archiving and Interchange Tag Set, version 1.2 <https://jats.nlm.nih.gov/archiving/tag-library/1.2/chapter/how-to-read.html>`__ (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/journals.tsv>`__).
`Virtual Observatory (VO) Discovery and Provenance Metadata <http://perma.cc/H5ZJ-4KKY>`__.
- `Life Sciences Metadata <https://docs.google.com/spreadsheet/ccc?key=0AjeLxEN77UZodHFEWGpoa19ia3pldEFyVFR0aFVGa0E#gid=2>`__ (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/biomedical.tsv>`__): based on `ISA-Tab Specification <https://isa-specs.readthedocs.io/en/latest/isamodel.html>`__, along with controlled vocabulary from subsets of the `OBI Ontology <http://bioportal.bioontology.org/ontologies/OBI>`__ and the `NCBI Taxonomy for Organisms <http://www.ncbi.nlm.nih.gov/Taxonomy/taxonomyhome.html/>`__.
- `Journal Metadata <https://docs.google.com/spreadsheets/d/13HP-jI_cwLDHBetn9UKTREPJ_F4iHdAvhjmlvmYdSSw/edit#gid=8>`__ (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/journals.tsv>`__): based on the `Journal Archiving and Interchange Tag Set, version 1.2 <https://jats.nlm.nih.gov/archiving/tag-library/1.2/chapter/how-to-read.html>`__.

Experimental Metadata
~~~~~~~~~~~~~~~~~~~~~

Unlike supported metadata, experimental metadata is not enabled by default in a new Dataverse installation. Feedback via any `channel <https://dataverse.org/contact>`_ is welcome!

- `Computational Workflow Metadata <https://docs.google.com/spreadsheets/d/13HP-jI_cwLDHBetn9UKTREPJ_F4iHdAvhjmlvmYdSSw/edit#gid=447508596>`__ (`see .tsv version <https://github.com/IQSS/dataverse/blob/master/scripts/api/data/metadatablocks/computationalworkflow.tsv>`__): adapted from `Bioschemas Computational Workflow Profile, version 1.0 <https://bioschemas.org/profiles/ComputationalWorkflow/1.0-RELEASE>`__ and `Codemeta <https://codemeta.github.io/terms/>`__.

See Also
~~~~~~~~

See also the `Dataverse Software 4.0 Metadata Crosswalk: DDI, DataCite, DC, DCTerms, VO, ISA-Tab <https://docs.google.com/spreadsheets/d/10Luzti7svVTVKTA-px27oq3RxCUM-QbiTkm8iMd5C54/edit?usp=sharing>`__ document and the :doc:`/admin/metadatacustomization` section of the Admin Guide.
97 changes: 97 additions & 0 deletions doc/sphinx-guides/source/user/dataset-management.rst
@@ -38,6 +38,8 @@ Once a dataset has been published, its metadata can be exported in a variety of

Each of these metadata exports contains the metadata of the most recently published version of the dataset.

.. _adding-new-dataset:

Adding a New Dataset
====================

@@ -62,6 +64,8 @@ We currently only support the following HTML tags for any of our textbox metadat
<br>, <code>, <del>, <dd>, <dl>, <dt>, <em>, <hr>, <h1>-<h3>, <i>, <img>, <kbd>, <li>, <ol>, <p>, <pre>, <s>, <sup>, <sub>,
<strong>, <strike>, <u>, <ul>.

.. _dataset-file-upload:

File Upload
===========

@@ -153,6 +157,19 @@ Beginning with Dataverse Software 5.0, the way a Dataverse installation handles
- If a user attempts to replace a file with another file that has the same checksum, an error message will be displayed and the file will not be able to be replaced.
- If a user attempts to replace a file with a file that has the same checksum as a different file in the dataset, a warning will be displayed.

BagIt Support
-------------

BagIt is a set of hierarchical file system conventions designed to support disk-based storage and network transfer of arbitrary digital content. It offers several benefits such as integration with digital libraries, easy implementation, and transfer validation. See `the Wikipedia article <https://en.wikipedia.org/wiki/BagIt>`__ for more information.

If the Dataverse installation you are using has enabled BagIt file handling, then when you upload BagIt files the repository will validate the checksum values listed in each BagIt’s manifest file against the uploaded files and generate errors for any mismatches. To keep reports manageable, the repository collects only a limited number of errors, such as the first five mismatches in each BagIt file, before reporting them.

|bagit-image1|

You can fix the errors and reupload the BagIt files.

More information on how your admin can enable and configure the BagIt file handler can be found in the :ref:`Installation Guide <BagIt File Handler>`.
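The validation the handler performs is essentially a checksum comparison against each bag's manifest. A minimal sketch of that idea (illustrative only — not the actual Dataverse implementation; assumes an MD5 payload manifest as in the BagIt spec):

```python
import hashlib
from pathlib import Path

def validate_bag(bag_root: str, max_errors: int = 5) -> list:
    """Compare payload files to manifest-md5.txt; stop after max_errors mismatches."""
    errors = []
    root = Path(bag_root)
    for line in (root / "manifest-md5.txt").read_text().splitlines():
        expected, relpath = line.split(maxsplit=1)
        actual = hashlib.md5((root / relpath).read_bytes()).hexdigest()
        if actual != expected:
            errors.append(relpath)
            if len(errors) >= max_errors:  # report only the first few errors
                break
    return errors
```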

.. _file-handling:

File Handling
@@ -211,6 +228,72 @@ Finally, automating your code can be immensely helpful to the code and research

**Note:** Capturing code dependencies and automating your code will create new files in your directory. Make sure to include them when depositing your dataset.

Computational Workflow
----------------------

Computational Workflow Definition
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Computational workflows precisely describe a multi-step process that coordinates multiple computational tasks and their data dependencies to produce data products in a scientific application. The computational tasks can take different forms, such as running code (e.g. Python, C++, MATLAB, R, Julia), invoking a service, calling a command-line tool, accessing a database (e.g. SQL, NoSQL), submitting a job to a compute cloud (e.g. an on-premises cloud, AWS, GCP, Azure), or executing data processing scripts or workflows. The following diagram shows an example of a computational workflow with multiple computational tasks.

|cw-image1|


FAIR Computational Workflow
~~~~~~~~~~~~~~~~~~~~~~~~~~~

The FAIR Principles (Findable, Accessible, Interoperable, Reusable) apply to computational workflows (https://doi.org/10.1162/dint_a_00033) in two areas: as FAIR data and as FAIR criteria for workflows as digital objects. In the FAIR data area, "*properly designed workflows contribute to FAIR data principles since they provide the metadata and provenance necessary to describe their data products, and they describe the involved data in a formalized, completely traceable way*" (https://doi.org/10.1162/dint_a_00033). Regarding the FAIR criteria for workflows as digital objects, "*workflows are research products in their own right, encapsulating methodological know-how that is to be found and published, accessed and cited, exchanged and combined with others, and reused as well as adapted*" (https://doi.org/10.1162/dint_a_00033).

How to Create a Computational Workflow
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

There are multiple approaches to creating computational workflows. You may consider standard frameworks and tools such as Common Workflow Language (CWL), Snakemake, Galaxy, Nextflow or Ruffus, or *ad hoc* methods using different programming languages (e.g. Python, C++, MATLAB, Julia, R), notebooks (e.g. Jupyter Notebook, R Notebook, and MATLAB Live Script) and command-line interpreters (e.g. Bash). Each computational task is defined differently, but all meet the definition of a computational workflow and all result in data products. You can find a few examples of computational workflows in the following GitHub repositories, each of which follows several aspects of the FAIR principles:

- Common Workflow Language (`GitHub Repository URL <https://github.com/fasrc/epa_cwl_airflow>`__)
- R Notebook (`GitHub Repository URL <https://github.com/fasrc/R_computational_workflow>`__)
- Jupyter Notebook (`GitHub Repository URL <https://github.com/fasrc/python-computational-workflow>`__)
- MATLAB Script (`GitHub Repository URL <https://github.com/fasrc/Matlab_computational_workflow>`__)

You are encouraged to review these examples when creating a computational workflow and publishing it in a Dataverse repository.

At https://workflows.community, the Workflows Community Initiative offers resources for computational workflows, such as a list of workflow systems (https://workflows.community/systems) and other workflow registries (https://workflows.community/registries). The initiative also helps organize working groups related to workflows research, development and application.
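As a toy illustration of the *ad hoc* approach mentioned above, a workflow is simply a pipeline of computational steps whose data dependencies are explicit and whose final output is a data product (purely illustrative; a real workflow would typically use one of the frameworks or notebooks listed):

```python
def clean(raw):
    """Step 1: drop missing values from the raw input."""
    return [x for x in raw if x is not None]

def transform(rows):
    """Step 2: a computational task (here, simple scaling)."""
    return [2 * x for x in rows]

def summarize(rows):
    """Step 3: produce the data product."""
    return sum(rows)

def run_workflow(raw):
    # Chaining the steps makes each task's data dependency explicit.
    return summarize(transform(clean(raw)))

print(run_workflow([1, None, 2, 3]))  # prints 12
```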

How to Upload Your Computational Workflow
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

After you :ref:`upload your files <dataset-file-upload>`, you can apply a "Workflow" tag to your workflow files, such as your Snakemake or R Notebooks files, so that you and others can find them more easily among your deposit’s other files.

|cw-image3|

|cw-image4|

How to Describe Your Computational Workflow
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The Dataverse installation you are using may have enabled Computational Workflow metadata fields for your use. If so, when :ref:`editing your dataset metadata <adding-new-dataset>`, you will see the fields described below.

|cw-image2|

As described in the :ref:`metadata-references` section of the :doc:`/user/appendix`, the three fields are adapted from `Bioschemas Computational Workflow Profile, version 1.0 <https://bioschemas.org/profiles/ComputationalWorkflow/1.0-RELEASE>`__ and `Codemeta <https://codemeta.github.io/terms/>`__:

- **Workflow Type**: The kind of Computational Workflow, which is designed to compose and execute a series of computational or data manipulation steps in a scientific application
- **External Code Repository URL**: A link to another public repository where the un-compiled, human-readable code and related code is also located (e.g., GitHub, GitLab, SVN)
- **Documentation**: A link (URL) to the documentation or text describing the Computational Workflow and its use
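If you prefer the API to the web UI, these fields can also be supplied in a dataset's native JSON metadata. A hedged sketch of what a ``computationalworkflow`` block fragment might look like (the ``typeName`` values come from the TSV in this PR; the exact block structure your installation's API expects may differ, so check the Native API guide):

```python
import json

def workflow_block(types, repo_urls, docs):
    """Sketch of a computationalworkflow metadata block in Dataverse native JSON."""
    return {
        "displayName": "Computational Workflow Metadata",
        "fields": [
            {"typeName": "workflowType", "multiple": True,
             "typeClass": "controlledVocabulary", "value": list(types)},
            {"typeName": "workflowCodeRepository", "multiple": True,
             "typeClass": "primitive", "value": list(repo_urls)},
            {"typeName": "workflowDocumentation", "multiple": True,
             "typeClass": "primitive", "value": list(docs)},
        ],
    }

print(json.dumps(workflow_block(
    ["Snakemake"], ["https://github.com/example/workflow"], ["See README.md"]),
    indent=2))
```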


How to Search for Computational Workflows
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

If the search page of the Dataverse repository you are using includes a "Dataset Feature" facet with a Computational Workflows link, you can follow that link to find only datasets that contain computational workflows.

You can also search on the "Workflow Type" facet, if the Dataverse installation has the field enabled, to find datasets that contain certain types of computational workflows, such as workflows written in Common Workflow Language or as Jupyter Notebooks.

|cw-image5|

You can also search for files within datasets that have been tagged as "Workflow" files by clicking the Files checkbox to show only files and using the File Tag facet to show only files tagged as "Workflow".

|cw-image6|
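Programmatically, the same filtering can be sketched with the Search API's ``fq`` parameter (this assumes ``workflowType`` is exposed as a facet field in your installation, which is not guaranteed; verify against your installation's Search API documentation):

```python
from urllib.parse import urlencode

def workflow_search_url(base_url: str, workflow_type: str) -> str:
    """Build a Search API URL filtered on the workflowType facet (illustrative)."""
    params = urlencode({
        "q": "*",
        "type": "dataset",
        "fq": f'workflowType:"{workflow_type}"',
    })
    return f"{base_url}/api/search?{params}"

print(workflow_search_url("https://demo.dataverse.org", "Snakemake"))
```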

Astronomy (FITS)
----------------

@@ -622,6 +705,20 @@ If you deaccession the most recently published version of the dataset but not al
:class: img-responsive
.. |image-file-tree-view| image:: ./img/file-tree-view.png
:class: img-responsive
.. |cw-image1| image:: ./img/computational-workflow-diagram.png
:class: img-responsive
.. |cw-image2| image:: ./img/computational-workflow-metadata.png
:class: img-responsive
.. |cw-image3| image:: ./img/file-tags-link.png
:class: img-responsive
.. |cw-image4| image:: ./img/file-tags-options.png
:class: img-responsive
.. |cw-image5| image:: ./img/computational-workflow-facets.png
:class: img-responsive
.. |cw-image6| image:: ./img/file-tags-facets.png
:class: img-responsive
.. |bagit-image1| image:: ./img/bagit-handler-errors.png
:class: img-responsive

.. _Make Data Count: https://makedatacount.org
.. _Crossref: https://crossref.org
Binary file modified doc/sphinx-guides/source/user/img/DatasetDiagram.png
100755 → 100644
(Binary image files in doc/sphinx-guides/source/user/img/ cannot be displayed in the diff.)
21 changes: 21 additions & 0 deletions scripts/api/data/metadatablocks/computational_workflow.tsv
@@ -0,0 +1,21 @@
#metadataBlock name dataverseAlias displayName
computationalworkflow Computational Workflow Metadata
#datasetField name title description watermark fieldType displayOrder displayFormat advancedSearchField allowControlledVocabulary allowmultiples facetable displayoncreate required parent metadatablock_id termURI
workflowType Computational Workflow Type The kind of Computational Workflow, which is designed to compose and execute a series of computational or data manipulation steps in a scientific application text 0 TRUE TRUE TRUE TRUE TRUE FALSE computationalworkflow
workflowCodeRepository External Code Repository URL A link to the repository where the un-compiled, human readable code and related code is located (e.g. GitHub, GitLab, SVN) https://... url 1 FALSE FALSE TRUE FALSE TRUE FALSE computationalworkflow
workflowDocumentation Documentation A link (URL) to the documentation or text describing the Computational Workflow and its use textbox 2 FALSE FALSE TRUE FALSE TRUE FALSE computationalworkflow
#controlledVocabulary DatasetField Value identifier displayOrder
workflowType Common Workflow Language (CWL) workflowtype_cwl 1
workflowType Workflow Description Language (WDL) workflowtype_wdl 2
workflowType Nextflow workflowtype_nextflow 3
workflowType Snakemake workflowtype_snakemake 4
workflowType Ruffus workflowtype_ruffus 5
workflowType DAGMan workflowtype_dagman 6
workflowType Jupyter Notebook workflowtype_jupyter 7
workflowType R Notebook workflowtype_rstudio 8
workflowType MATLAB Script workflowtype_matlab 9
workflowType Bash Script workflowtype_bash 10
workflowType Makefile workflowtype_makefile 11
workflowType Other Python-based workflow workflowtype_otherpython 12
workflowType Other R-based workflow workflowtype_otherrbased 13
workflowType Other workflowtype_other 100
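A small sanity check over a block TSV like the one above — counting controlled-vocabulary terms per field — can be sketched as follows (illustrative; it assumes only the ``#controlledVocabulary`` section layout shown above, with a blank first column):

```python
from collections import Counter

def count_vocab_terms(tsv_text: str) -> Counter:
    """Count controlled-vocabulary entries per DatasetField in a metadata block TSV."""
    counts = Counter()
    in_vocab = False
    for line in tsv_text.splitlines():
        if line.startswith("#controlledVocabulary"):
            in_vocab = True
            continue
        if line.startswith("#"):  # a new section header ends the vocab section
            in_vocab = False
            continue
        if in_vocab and line.strip():
            cols = line.split("\t")
            # layout: blank first column, then DatasetField, Value, identifier, displayOrder
            counts[cols[1]] += 1
    return counts
```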