Merged
20 commits
0ead2c2
incomplete metadata label visibility setting
ErykKul Dec 1, 2023
bda39cc
merged develop
ErykKul Dec 4, 2023
0cd23fb
added documentation
ErykKul Dec 5, 2023
02f2edc
typo fix
ErykKul Dec 7, 2023
99a4b25
Merge branch 'IQSS:develop' into 10116_incomplete_matadata_label_setting
ErykKul Dec 7, 2023
7248fdd
Merge branch '10116_incomplete_matadata_label_setting' of https://git…
ErykKul Dec 19, 2023
f6e5db2
option renamed: show-label-for-incomplete-when-published -> show-vali…
ErykKul Dec 19, 2023
7a3ee97
Merge branch 'IQSS:develop' into 10116_incomplete_matadata_label_setting
ErykKul Jan 3, 2024
3ff4183
Merge branch 'IQSS:develop' into 10116_incomplete_matadata_label_setting
ErykKul Apr 17, 2024
d97697b
merged develop
ErykKul May 2, 2024
afcbdfe
dataset is always checked for validity while indexing, even when alre…
ErykKul May 2, 2024
eef2416
fix for collections not showing up when both validity facets (valid a…
ErykKul May 2, 2024
75e87e4
removed unused method
ErykKul May 2, 2024
d4c7196
reverted removing method that is used by the frontend
ErykKul May 2, 2024
2ee5bff
fixed incomplete metadata being indexed as complete in some cases
ErykKul May 2, 2024
232029e
fix for permission wrapper not available in mydata -> if it is your d…
ErykKul May 6, 2024
ff4742a
unused variable cleanup
ErykKul May 7, 2024
8525c9a
cleaned up file page logic for incomplete metadata
ErykKul May 7, 2024
724f238
refactored isValid in DatasetVersion
ErykKul May 7, 2024
c40838c
change the default to false for UI_SHOW_VALIDITY_LABEL_WHEN_PUBLISHED
ErykKul May 14, 2024
@@ -0,0 +1 @@
Fixed a bug where the ``incomplete metadata`` label was shown for published datasets with incomplete metadata in certain scenarios. The label is now shown for draft versions of such datasets and for published datasets that the user can edit. The label can also be hidden for published datasets (regardless of edit rights) by setting the new option ``dataverse.ui.show-validity-label-when-published`` to ``false``.
20 changes: 20 additions & 0 deletions doc/sphinx-guides/source/installation/config.rst
@@ -2895,6 +2895,24 @@ Defaults to ``false``.
Can also be set via any `supported MicroProfile Config API source`_, e.g. the environment variable
``DATAVERSE_API_ALLOW_INCOMPLETE_METADATA``. Will accept ``[tT][rR][uU][eE]|1|[oO][nN]`` as "true" expressions.

.. _dataverse.ui.show-validity-label-when-published:

dataverse.ui.show-validity-label-when-published
+++++++++++++++++++++++++++++++++++++++++++++++

Even when you do not allow incomplete metadata to be saved in Dataverse, some metadata may end up incomplete, e.g., after a metadata field is made mandatory. Datasets in which that field is
not filled out become incomplete and can therefore be labeled with the ``incomplete metadata`` label. By default, this label is shown only for draft datasets and for published datasets that the
user can edit. When this option is set to ``false``, only draft datasets with incomplete metadata will carry the label; published datasets will never show it. Note that you need to reindex the
datasets after changing the metadata definitions; reindexing updates the labels and other dataset information to reflect the new situation.

When enabled (the default), published datasets with incomplete metadata will have an ``incomplete metadata`` label attached to them, but only for users who can edit those datasets.
You can list such datasets, for example, with the metadata validity filter on the "My Data" page, which can be turned on by enabling the :ref:`dataverse.ui.show-validity-filter` option.

Defaults to ``true``.

Can also be set via any `supported MicroProfile Config API source`_, e.g. the environment variable
``DATAVERSE_UI_SHOW_VALIDITY_LABEL_WHEN_PUBLISHED``. Will accept ``[tT][rR][uU][eE]|1|[oO][nN]`` as "true" expressions.
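As a rough illustration of the lookup-with-default behavior described above — a sketch only, using a local stand-in for Dataverse's ``JvmSettings`` wrapper (``lookupOptional`` here is a hypothetical helper, not the real API):

```java
import java.util.Optional;

public class ShowValidityLabelSetting {
    // Stand-in for JvmSettings.UI_SHOW_VALIDITY_LABEL_WHEN_PUBLISHED.lookupOptional(Boolean.class):
    // an unset option yields Optional.empty(), so the default is applied via orElse.
    static Optional<Boolean> lookupOptional(String rawConfigValue) {
        return rawConfigValue == null ? Optional.empty() : Optional.of(Boolean.parseBoolean(rawConfigValue));
    }

    public static void main(String[] args) {
        boolean unset = lookupOptional(null).orElse(true);       // option absent -> default
        boolean disabled = lookupOptional("false").orElse(true); // explicitly disabled
        System.out.println(unset);    // true
        System.out.println(disabled); // false
    }
}
```

The key point is that ``orElse(true)`` is what makes the label visible by default; only an explicit ``false`` hides it for published datasets.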

.. _dataverse.signposting.level1-author-limit:

dataverse.signposting.level1-author-limit
@@ -3092,6 +3110,8 @@ Defaults to ``false``.
Can also be set via any `supported MicroProfile Config API source`_, e.g. the environment variable
``DATAVERSE_UI_ALLOW_REVIEW_FOR_INCOMPLETE``. Will accept ``[tT][rR][uU][eE]|1|[oO][nN]`` as "true" expressions.

.. _dataverse.ui.show-validity-filter:

dataverse.ui.show-validity-filter
+++++++++++++++++++++++++++++++++

8 changes: 3 additions & 5 deletions src/main/java/edu/harvard/iq/dataverse/DatasetPage.java
@@ -2296,13 +2296,11 @@ private void displayPublishMessage(){

     public boolean isValid() {
         if (valid == null) {
-            DatasetVersion version = dataset.getLatestVersion();
-            if (!version.isDraft()) {
+            if (workingVersion.isDraft() || (canUpdateDataset() && JvmSettings.UI_SHOW_VALIDITY_LABEL_WHEN_PUBLISHED.lookupOptional(Boolean.class).orElse(true))) {
+                valid = workingVersion.isValid();
+            } else {
                 valid = true;
             }
-            DatasetVersion newVersion = version.cloneDatasetVersion();
-            newVersion.setDatasetFields(newVersion.initDatasetFields());
-            valid = newVersion.isValid();
         }
         return valid;
     }
31 changes: 30 additions & 1 deletion src/main/java/edu/harvard/iq/dataverse/DatasetVersion.java
@@ -1728,7 +1728,36 @@ public List<ConstraintViolation<DatasetField>> validateRequired() {
}

     public boolean isValid() {
-        return validate().isEmpty();
+        // first clone to leave the original untouched
+        final DatasetVersion newVersion = this.cloneDatasetVersion();
+        // initDatasetFields
+        newVersion.setDatasetFields(newVersion.initDatasetFields());
+        // remove special "N/A" values and empty values
+        newVersion.removeEmptyValues();
+        // check validity of present fields and detect missing mandatory fields
+        return newVersion.validate().isEmpty();
     }
+
+    private void removeEmptyValues() {
+        if (this.getDatasetFields() != null) {
+            for (DatasetField dsf : this.getDatasetFields()) {
+                removeEmptyValues(dsf);
+            }
+        }
+    }
+
+    private void removeEmptyValues(DatasetField dsf) {
+        if (dsf.getDatasetFieldType().isPrimitive()) { // primitive
+            final Iterator<DatasetFieldValue> i = dsf.getDatasetFieldValues().iterator();
+            while (i.hasNext()) {
+                final String v = i.next().getValue();
+                if (StringUtils.isBlank(v) || DatasetField.NA_VALUE.equals(v)) {
+                    i.remove();
+                }
+            }
+        } else {
+            dsf.getDatasetFieldCompoundValues().forEach(cv -> cv.getChildDatasetFields().forEach(v -> removeEmptyValues(v)));
+        }
+    }
 
     public Set<ConstraintViolation> validate() {
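The clone-then-validate approach in the new ``DatasetVersion.isValid()`` above can be sketched in isolation as follows (a simplified model with made-up types; the real code operates on `DatasetField` trees, not plain strings):

```java
import java.util.ArrayList;
import java.util.List;

public class CloneThenValidate {
    /** Toy stand-in for a dataset version: just a list of required field values. */
    record Version(List<String> requiredFields) {
        Version deepCopy() { return new Version(new ArrayList<>(requiredFields)); }
    }

    static boolean isValid(Version original) {
        // Validation is destructive (it strips blanks and "N/A" sentinels),
        // so it runs on a copy and the original stays untouched.
        Version copy = original.deepCopy();
        copy.requiredFields().removeIf(v -> v == null || v.isBlank() || "N/A".equals(v));
        return !copy.requiredFields().isEmpty();
    }

    public static void main(String[] args) {
        Version v = new Version(new ArrayList<>(List.of("Title", "N/A")));
        System.out.println(isValid(v));                // true: "Title" survives stripping
        System.out.println(v.requiredFields().size()); // 2: original list unmodified
    }
}
```

The design choice this mirrors: callers of ``isValid()`` must never observe the normalization side effects, which is why the PR clones before initializing and stripping fields.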
16 changes: 11 additions & 5 deletions src/main/java/edu/harvard/iq/dataverse/FilePage.java
@@ -34,6 +34,7 @@
import edu.harvard.iq.dataverse.makedatacount.MakeDataCountLoggingServiceBean;
import edu.harvard.iq.dataverse.makedatacount.MakeDataCountLoggingServiceBean.MakeDataCountEntry;
import edu.harvard.iq.dataverse.privateurl.PrivateUrlServiceBean;
import edu.harvard.iq.dataverse.settings.JvmSettings;
import edu.harvard.iq.dataverse.settings.SettingsServiceBean;
import edu.harvard.iq.dataverse.util.BundleUtil;
import edu.harvard.iq.dataverse.util.FileUtil;
@@ -314,13 +315,18 @@ private void displayPublishMessage(){
}
}

+    Boolean valid = null;
+
     public boolean isValid() {
-        if (!fileMetadata.getDatasetVersion().isDraft()) {
-            return true;
+        if (valid == null) {
+            final DatasetVersion workingVersion = fileMetadata.getDatasetVersion();
+            if (workingVersion.isDraft() || (canUpdateDataset() && JvmSettings.UI_SHOW_VALIDITY_LABEL_WHEN_PUBLISHED.lookupOptional(Boolean.class).orElse(true))) {
+                valid = workingVersion.isValid();
+            } else {
+                valid = true;
+            }
         }
-        DatasetVersion newVersion = fileMetadata.getDatasetVersion().cloneDatasetVersion();
-        newVersion.setDatasetFields(newVersion.initDatasetFields());
-        return newVersion.isValid();
+        return valid;
     }

private boolean canViewUnpublishedDataset() {
@@ -3,6 +3,7 @@
*/
package edu.harvard.iq.dataverse.mydata;

import edu.harvard.iq.dataverse.DatasetServiceBean;
import edu.harvard.iq.dataverse.DataverseRoleServiceBean;
import edu.harvard.iq.dataverse.DataverseServiceBean;
import edu.harvard.iq.dataverse.DataverseSession;
@@ -63,7 +64,7 @@ public class DataRetrieverAPI extends AbstractApiBean {
private static final String retrieveDataPartialAPIPath = "retrieve";

@Inject
    DataverseSession session;

@EJB
DataverseRoleServiceBean dataverseRoleService;
@@ -81,6 +82,8 @@ public class DataRetrieverAPI extends AbstractApiBean {
//MyDataQueryHelperServiceBean myDataQueryHelperServiceBean;
@EJB
GroupServiceBean groupService;
@EJB
DatasetServiceBean datasetService;

private List<DataverseRole> roleList;
private DataverseRolePermissionHelper rolePermissionHelper;
@@ -491,7 +494,8 @@ private JsonArrayBuilder formatSolrDocs(SolrQueryResponse solrResponse, RoleTagR
// -------------------------------------------
// (a) Get core card data from solr
// -------------------------------------------
-            myDataCardInfo = doc.getJsonForMyData();
+            myDataCardInfo = doc.getJsonForMyData(isValid(doc));

if (!doc.getEntity().isInstanceofDataFile()){
String parentAlias = dataverseService.getParentAliasString(doc);
Expand All @@ -514,4 +518,8 @@ private JsonArrayBuilder formatSolrDocs(SolrQueryResponse solrResponse, RoleTagR
return jsonSolrDocsArrayBuilder;

}

private boolean isValid(SolrSearchResult result) {
return result.isValid(x -> true);
}
}
@@ -292,7 +292,7 @@ public String getSolrFragmentForPublicationStatus(){
}

public String getSolrFragmentForDatasetValidity(){
-        if ((this.datasetValidities == null) || (this.datasetValidities.isEmpty())){
+        if ((this.datasetValidities == null) || (this.datasetValidities.isEmpty()) || (this.datasetValidities.size() > 1)){
return "";
}

@@ -835,16 +835,7 @@ public SolrInputDocuments toSolrDocs(IndexableDataset indexableDataset, Set<Long
solrInputDocument.addField(SearchFields.DATASET_PERSISTENT_ID, dataset.getGlobalId().toString());
solrInputDocument.addField(SearchFields.PERSISTENT_URL, dataset.getPersistentURL());
solrInputDocument.addField(SearchFields.TYPE, "datasets");

-        boolean valid;
-        if (!indexableDataset.getDatasetVersion().isDraft()) {
-            valid = true;
-        } else {
-            DatasetVersion version = indexableDataset.getDatasetVersion().cloneDatasetVersion();
-            version.setDatasetFields(version.initDatasetFields());
-            valid = version.isValid();
-        }
-        solrInputDocument.addField(SearchFields.DATASET_VALID, valid);
+        solrInputDocument.addField(SearchFields.DATASET_VALID, indexableDataset.getDatasetVersion().isValid());

final Dataverse dataverse = dataset.getDataverseContext();
final String dvIndexableCategoryName = dataverse.getIndexableCategoryName();
@@ -355,8 +355,7 @@ The real issue here (https://github.com/IQSS/dataverse/issues/7304) is caused
* https://github.com/IQSS/dataverse/issues/84
*/
int numRows = 10;
-        HttpServletRequest httpServletRequest = (HttpServletRequest) FacesContext.getCurrentInstance().getExternalContext().getRequest();
-        DataverseRequest dataverseRequest = new DataverseRequest(session.getUser(), httpServletRequest);
+        DataverseRequest dataverseRequest = getDataverseRequest();
List<Dataverse> dataverses = new ArrayList<>();
dataverses.add(dataverse);
solrQueryResponse = searchService.search(dataverseRequest, dataverses, queryToPassToSolr, filterQueriesFinal, sortField, sortOrder.toString(), paginationStart, onlyDataRelatedToMe, numRows, false, null, null, !isFacetsDisabled(), true);
@@ -1489,9 +1488,14 @@ public boolean isRetentionExpired(SolrSearchResult result) {
return false;
}
}

+    private DataverseRequest getDataverseRequest() {
+        final HttpServletRequest httpServletRequest = (HttpServletRequest) FacesContext.getCurrentInstance().getExternalContext().getRequest();
+        return new DataverseRequest(session.getUser(), httpServletRequest);
+    }
 
     public boolean isValid(SolrSearchResult result) {
-        return result.isValid();
+        return result.isValid(x -> permissionsWrapper.canUpdateDataset(getDataverseRequest(), datasetService.find(x.getEntityId())));
     }

public enum SortOrder {
@@ -7,6 +7,7 @@
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Predicate;
import java.util.logging.Logger;

import edu.harvard.iq.dataverse.*;
@@ -19,6 +20,7 @@

import edu.harvard.iq.dataverse.api.Util;
import edu.harvard.iq.dataverse.dataset.DatasetThumbnail;
import edu.harvard.iq.dataverse.settings.JvmSettings;
import edu.harvard.iq.dataverse.util.DateUtil;
import edu.harvard.iq.dataverse.util.json.JsonPrinter;
import edu.harvard.iq.dataverse.util.json.NullSafeJsonBuilder;
@@ -402,15 +404,15 @@ public JsonArrayBuilder getRelevance() {
*
* @return
*/
-    public JsonObjectBuilder getJsonForMyData() {
+    public JsonObjectBuilder getJsonForMyData(boolean isValid) {

JsonObjectBuilder myDataJson = json(true, true, true);// boolean showRelevance, boolean showEntityIds, boolean showApiUrls)

myDataJson.add("publication_statuses", this.getPublicationStatusesAsJSON())
.add("is_draft_state", this.isDraftState()).add("is_in_review_state", this.isInReviewState())
.add("is_unpublished_state", this.isUnpublishedState()).add("is_published", this.isPublishedState())
.add("is_deaccesioned", this.isDeaccessionedState())
-                .add("is_valid", this.isValid())
+                .add("is_valid", isValid)
.add("date_to_display_on_card", getDateToDisplayOnCard());

// Add is_deaccessioned attribute, even though MyData currently screens any deaccessioned info out
@@ -1256,7 +1258,19 @@ public void setDatasetValid(Boolean datasetValid) {
this.datasetValid = datasetValid == null || Boolean.valueOf(datasetValid);
}

-    public boolean isValid() {
-        return datasetValid;
+    public boolean isValid(Predicate<SolrSearchResult> canUpdateDataset) {
+        if (this.datasetValid) {
+            return true;
+        }
+        if (!this.getType().equals("datasets")) {
+            return true;
+        }
+        if (this.isDraftState()) {
+            return false;
+        }
+        if (!JvmSettings.UI_SHOW_VALIDITY_LABEL_WHEN_PUBLISHED.lookupOptional(Boolean.class).orElse(true)) {
+            return true;
+        }
+        return !canUpdateDataset.test(this);
     }
}
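The decision chain in the new ``SolrSearchResult.isValid(Predicate)`` above can be restated as a standalone sketch (parameter names and the string-keyed `canUpdate` predicate are illustrative, not the real signatures):

```java
import java.util.function.Predicate;

public class ValidityDecision {
    static boolean isValid(boolean datasetValid, String type, boolean draft,
                           boolean showLabelWhenPublished,
                           Predicate<String> canUpdate, String entityId) {
        if (datasetValid) return true;              // complete metadata: no label
        if (!"datasets".equals(type)) return true;  // only dataset cards carry the label
        if (draft) return false;                    // invalid drafts always show it
        if (!showLabelWhenPublished) return true;   // setting off: hide for published
        return !canUpdate.test(entityId);           // published: label only for editors
    }

    public static void main(String[] args) {
        Predicate<String> canUpdate = id -> id.equals("ds1");
        // invalid published dataset the user can edit -> labeled (isValid == false)
        System.out.println(isValid(false, "datasets", false, true, canUpdate, "ds1")); // false
        // same dataset for a user without edit rights -> treated as valid, no label
        System.out.println(isValid(false, "datasets", false, true, canUpdate, "ds2")); // true
    }
}
```

Passing the permission check in as a predicate keeps `SolrSearchResult` free of any dependency on the permission service, which is why the MyData endpoint can supply `x -> true` while the search page supplies a real permission lookup.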
@@ -230,6 +230,7 @@ public enum JvmSettings {
SCOPE_UI(PREFIX, "ui"),
UI_ALLOW_REVIEW_INCOMPLETE(SCOPE_UI, "allow-review-for-incomplete"),
UI_SHOW_VALIDITY_FILTER(SCOPE_UI, "show-validity-filter"),
UI_SHOW_VALIDITY_LABEL_WHEN_PUBLISHED(SCOPE_UI, "show-validity-label-when-published"),

// NetCDF SETTINGS
SCOPE_NETCDF(PREFIX, "netcdf"),
2 changes: 1 addition & 1 deletion src/main/webapp/file.xhtml
@@ -77,7 +77,7 @@
<h:outputText value="#{bundle['dataset.versionUI.unpublished']}" styleClass="label label-warning" rendered="#{!FilePage.fileMetadata.datasetVersion.dataset.released}"/>
<h:outputText value="#{bundle['dataset.versionUI.deaccessioned']}" styleClass="label label-danger" rendered="#{FilePage.fileMetadata.datasetVersion.deaccessioned}"/>
<h:outputText value="#{FilePage.fileMetadata.datasetVersion.externalStatusLabel}" styleClass="label label-info" rendered="#{FilePage.fileMetadata.datasetVersion.externalStatusLabel!=null and FilePage.canPublishDataset()}"/>
-            <h:outputText value="#{bundle['incomplete']}" styleClass="label label-danger" rendered="#{FilePage.fileMetadata.datasetVersion.draft and !FilePage.fileMetadata.datasetVersion.valid}"/>
+            <h:outputText value="#{bundle['incomplete']}" styleClass="label label-danger" rendered="#{!FilePage.valid}"/>
<!-- DATASET VERSION NUMBER -->
<h:outputText styleClass="label label-default" rendered="#{FilePage.fileMetadata.datasetVersion.released and !(FilePage.fileMetadata.datasetVersion.draft or FilePage.fileMetadata.datasetVersion.inReview)}"
value="#{bundle['file.DatasetVersion']} #{FilePage.fileMetadata.datasetVersion.versionNumber}.#{FilePage.fileMetadata.datasetVersion.minorVersionNumber}"/>