Merged
2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/api/dataaccess.rst
@@ -87,7 +87,7 @@ original "Saved Original", the proprietary (SPSS, Stata, R, etc.) file fr


"All Formats" bundled download for Tabular Files.
- -----------------------------------------------
+ -------------------------------------------------

``/api/access/datafile/bundle/$id``
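As a sketch, a client could construct and fetch that bundle URL like this (the base URL and the numeric file id are hypothetical examples, not values from this change):

```python
# Hedged sketch: builds the URL for the "All Formats" bundled download of a
# tabular file; fetching it returns a zip containing the tab-delimited data,
# the saved original, variable metadata, and the citation.
from urllib.parse import urljoin

def bundle_url(base_url: str, file_id: int) -> str:
    """URL of the bundled download for the tabular file with this id."""
    return urljoin(base_url, f"/api/access/datafile/bundle/{file_id}")

print(bundle_url("https://demo.dataverse.org", 42))
# https://demo.dataverse.org/api/access/datafile/bundle/42
```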

6 changes: 5 additions & 1 deletion doc/sphinx-guides/source/api/search.rst
@@ -85,7 +85,11 @@ https://demo.dataverse.org/api/search?q=trees
"file_content_type":"image/png",
"size_in_bytes":8361,
"md5":"0386269a5acb2c57b4eade587ff4db64",
- "dataset_citation":"Spruce, Sabrina, 2016, \"Spruce Goose\", http://dx.doi.org/10.5072/FK2/NFSEHG, Root Dataverse, V1"
+ "file_persistent_id": "doi:10.5072/FK2/XTT5BV/PCCHV7",
+ "dataset_name": "Dataset One",
+ "dataset_id": "32",
+ "dataset_persistent_id": "doi:10.5072/FK2/XTT5BV",
+ "dataset_citation":"Spruce, Sabrina, 2016, \"Spruce Goose\", http://dx.doi.org/10.5072/FK2/XTT5BV, Root Dataverse, V1"
},
{
"name":"Birds",
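For file-type results, the fields added in this change can be read straight off each item in the response. A minimal sketch (the dict literal abridges the example response above):

```python
# Abridged file-type search result, using values from the example response.
item = {
    "name": "trees.png",
    "type": "file",
    "file_persistent_id": "doi:10.5072/FK2/XTT5BV/PCCHV7",
    "dataset_name": "Dataset One",
    "dataset_id": "32",
    "dataset_persistent_id": "doi:10.5072/FK2/XTT5BV",
}

def file_pids(items):
    """Collect persistent IDs of file results, skipping other result types."""
    return [i["file_persistent_id"] for i in items if i.get("type") == "file"]

print(file_pids([item]))  # ['doi:10.5072/FK2/XTT5BV/PCCHV7']
```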
4 changes: 2 additions & 2 deletions doc/sphinx-guides/source/developers/workflows.rst
@@ -8,7 +8,7 @@ Dataverse has a flexible workflow mechanism that can be used to trigger actions


Introduction
- ---------
+ ------------

Dataverse can perform two sequences of actions when datasets are published: one prior to publishing (marked by a ``PrePublishDataset`` trigger), and one after the publication has succeeded (``PostPublishDataset``). The pre-publish workflow is useful for having an external system prepare a dataset for being publicly accessed (a possibly lengthy activity that requires moving files around, uploading videos to a streaming server, etc.), or to start an approval process. A post-publish workflow might be used for sending notifications about the newly published dataset.

@@ -104,7 +104,7 @@ Available variables are:
* ``releaseStatus``

archiver
- +++++++
+ ++++++++

A step that sends an archival copy of a Dataset Version to a configured archiver, e.g. the DuraCloud interface of Chronopolis. See the `DuraCloud/Chronopolis Integration documentation <http://guides.dataverse.org/en/latest/admin/integrations.html#id15>`_ for further detail.
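As an illustration of wiring this step into a post-publish workflow, a definition could look roughly like the following (a sketch: the step format follows this guide, but the workflow name is invented and the archiver's parameters, which mirror the \:ArchiverSettings configuration, are left empty):

```json
{
  "name": "Archive on publication",
  "steps": [
    {
      "provider": ":internal",
      "stepType": "archiver",
      "parameters": {}
    }
  ]
}
```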

2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/installation/config.rst
@@ -606,7 +606,7 @@ In the Chronopolis case, since the transfer from the DuraCloud front-end to arch

**PostPublication Workflow**

- To automate the submission of archival copies to an archive as part of publication, one can set up a Dataverse Workflow using the "archiver" workflow step - see the :doc:`developers/workflows` guide.
+ To automate the submission of archival copies to an archive as part of publication, one can set up a Dataverse Workflow using the "archiver" workflow step - see the :doc:`/developers/workflows` guide.
The archiver step uses the configuration information discussed above, including the :ArchiverClassName setting. The workflow step definition should include the set of properties defined in \:ArchiverSettings in the workflow definition.

To activate this workflow, one must first install a workflow that uses the archiver step. A simple workflow whose only action is to invoke the archiver step, configured to submit to DuraCloud, is included in Dataverse at /scripts/api/data/workflows/internal-archiver-workflow.json.
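For orientation, the admin endpoints involved can be sketched like this (the base URL is an assumption for a local installation, and the endpoint paths follow the workflow guide; the actual POST of the JSON file and PUT of the workflow id are left to curl or an HTTP client):

```python
# Hedged sketch of the admin API URLs for installing and activating the
# archiver workflow; no requests are sent here, only URLs constructed.
BASE = "http://localhost:8080"  # assumption: local installation

def register_url(base: str) -> str:
    # POST scripts/api/data/workflows/internal-archiver-workflow.json here
    return f"{base}/api/admin/workflows"

def default_trigger_url(base: str, trigger: str) -> str:
    # PUT the numeric id of the registered workflow to this endpoint
    return f"{base}/api/admin/workflows/default/{trigger}"

print(register_url(BASE))
print(default_trigger_url(BASE, "PostPublishDataset"))
```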
@@ -576,6 +576,10 @@ public SolrQueryResponse search(DataverseRequest dataverseRequest, List&lt;Datavers
Collections.sort(tabularDataTags);
solrSearchResult.setTabularDataTags(tabularDataTags);
}
String filePID = (String) solrDocument.getFieldValue(SearchFields.FILE_PERSISTENT_ID);
// Skip missing values and the literal "null" string that may be stored in Solr:
if (filePID != null && !"".equals(filePID) && !"null".equals(filePID)) {
    solrSearchResult.setFilePersistentId(filePID);
}
}
/**
* @todo store PARENT_ID as a long instead and cast as such
@@ -112,6 +112,8 @@ public class SolrSearchResult {
private String identifierOfDataverse = null;
private String nameOfDataverse = null;

private String filePersistentId = null;

public String getDvTree() {
return dvTree;
}
@@ -449,8 +451,13 @@ public JsonObjectBuilder json(boolean showRelevance, boolean showEntityIds, bool

String identifierLabel = null;
String datasetCitation = null;
String datasetName = null;
String datasetId = null;
String datasetPersistentId = null;
String filePersistentId = null;
String preferredUrl = null;
String apiUrl = null;

if (this.type.equals(SearchConstants.DATAVERSES)) {
displayName = this.name;
identifierLabel = "identifier";
@@ -471,6 +478,9 @@ public JsonObjectBuilder json(boolean showRelevance, boolean showEntityIds, bool
* title of the dataset it belongs to.
*/
datasetCitation = parent.get("citation");
datasetName = parent.get("name");
datasetId = parent.get("id");
datasetPersistentId = parent.get(SolrSearchResult.PARENT_IDENTIFIER);
}

//displayName = null; // testing NullSafeJsonBuilder
@@ -521,6 +531,10 @@ public JsonObjectBuilder json(boolean showRelevance, boolean showEntityIds, bool
.add("md5", getFileMd5())
.add("checksum", JsonPrinter.getChecksumTypeAndValue(getFileChecksumType(), getFileChecksumValue()))
.add("unf", getUnf())
.add("file_persistent_id", this.filePersistentId)
.add("dataset_name", datasetName)
.add("dataset_id", datasetId)
.add("dataset_persistent_id", datasetPersistentId)
.add("dataset_citation", datasetCitation)
.add("deaccession_reason", this.deaccessionReason)
.add("citationHtml", this.citationHtml)
@@ -537,6 +551,7 @@ public JsonObjectBuilder json(boolean showRelevance, boolean showEntityIds, bool
nullSafeJsonBuilder.add("entity_id", this.entityId);
}
}

if (showApiUrls) {
/**
* @todo We should probably have a metadata_url or api_url concept
@@ -954,7 +969,14 @@ public String getFileParentIdentifier() {
return null;
//if (entity)
}

public String getFilePersistentId() {
return filePersistentId;
}

public void setFilePersistentId(String pid) {
filePersistentId = pid;
}
public String getFileUrl() {
// Nothing special needs to be done for harvested file URLs:
// simply directing these to the local dataset.xhtml for this dataset