Merge remote-tracking branch 'dataverse/develop' into tc-citationdate-harvested-dataset

tcoupin committed Sep 13, 2022
2 parents 7254c0c + c7b8b82 commit 6a2ec03
Showing 179 changed files with 5,925 additions and 796 deletions.
6 changes: 6 additions & 0 deletions conf/solr/8.11.1/schema.xml
@@ -261,6 +261,9 @@
<field name="cleaningOperations" type="text_en" multiValued="false" stored="true" indexed="true"/>
<field name="collectionMode" type="text_en" multiValued="true" stored="true" indexed="true"/>
<field name="collectorTraining" type="text_en" multiValued="false" stored="true" indexed="true"/>
<field name="workflowType" type="text_en" multiValued="true" stored="true" indexed="true"/>
<field name="workflowCodeRepository" type="text_en" multiValued="true" stored="true" indexed="true"/>
<field name="workflowDocumentation" type="text_en" multiValued="true" stored="true" indexed="true"/>
<field name="contributor" type="text_en" multiValued="true" stored="true" indexed="true"/>
<field name="contributorName" type="text_en" multiValued="true" stored="true" indexed="true"/>
<field name="contributorType" type="text_en" multiValued="true" stored="true" indexed="true"/>
@@ -498,6 +501,9 @@
<copyField source="cleaningOperations" dest="_text_" maxChars="3000"/>
<copyField source="collectionMode" dest="_text_" maxChars="3000"/>
<copyField source="collectorTraining" dest="_text_" maxChars="3000"/>
<copyField source="workflowType" dest="_text_" maxChars="3000"/>
<copyField source="workflowCodeRepository" dest="_text_" maxChars="3000"/>
<copyField source="workflowDocumentation" dest="_text_" maxChars="3000"/>
<copyField source="contributor" dest="_text_" maxChars="3000"/>
<copyField source="contributorName" dest="_text_" maxChars="3000"/>
<copyField source="contributorType" dest="_text_" maxChars="3000"/>
72 changes: 72 additions & 0 deletions doc/release-notes/5.11.1-release-notes.md
@@ -0,0 +1,72 @@
# Dataverse Software 5.11.1

This is a bug fix release of the Dataverse Software. The .war file for v5.11 will no longer be made available, and installations should upgrade directly from v5.10.1 to v5.11.1. To do so, you will need **to follow the instructions for installing release 5.11 using the v5.11.1 war file**. (Note specifically upgrade steps 6-9 in the 5.11 release notes, most importantly the ones related to the citation block and the Solr schema.) **If you had previously installed v5.11** (no longer available), follow the simplified instructions below.

## Release Highlights

Dataverse Software 5.11 contains two critical issues that are fixed in this release.

First, if you delete a file from a published version of a dataset that has restricted files, the file will be deleted from the file system (or S3) and lose its "owner id" in the database. For details, see Issue #8867.

Second, if you are a superuser, it's possible to click "Delete Draft" and delete a published dataset if it has restricted files. For details, see #8845 and #8742.

## Notes for Dataverse Installation Administrators

### Identifying Datasets with Deleted Files

If you have been running 5.11, check if any files show "null" for the owner id. The "owner" of a file is the parent dataset:

```
select * from dvobject where dtype = 'DataFile' and owner_id is null;
```

For any of these files, change the owner id to the database id of the parent dataset. In addition, the file on disk (or in S3) is likely gone. Look at the "storageidentifier" field from the query above to determine the location of the file, then restore the file from backup.
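
For example, a minimal sketch of the fix for a single file (assuming a PostgreSQL database named dvndb, an orphaned DataFile with database id 1234, and a parent dataset with database id 5678; substitute the real values from your installation):

```
# Hypothetical ids shown; replace them with the values found via the query above.
psql dvndb -c "UPDATE dvobject SET owner_id = 5678 WHERE id = 1234 AND dtype = 'DataFile';"
```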

### Identifying Datasets Superusers May Have Accidentally Destroyed

Check the "actionlogrecord" table for DestroyDatasetCommand. While these "destroy" entries are normal when a superuser uses the API to destroy datasets, an entry is also created if a superuser has accidentally deleted a published dataset in the web interface with the "Delete Draft" button.

## Complete List of Changes

For the complete list of code changes in this release, see the [5.11.1 Milestone](https://github.com/IQSS/dataverse/milestone/105?closed=1) in GitHub.

For help with upgrading, installing, or general questions please post to the [Dataverse Community Google Group](https://groups.google.com/forum/#!forum/dataverse-community) or email support@dataverse.org.

## Installation

If this is a new installation, please see our [Installation Guide](https://guides.dataverse.org/en/5.11.1/installation/). Please also contact us to get added to the [Dataverse Project Map](https://guides.dataverse.org/en/5.11.1/installation/config.html#putting-your-dataverse-installation-on-the-map-at-dataverse-org) if you have not done so already.

## Upgrade Instructions

0\. These instructions assume that you've already successfully upgraded from Dataverse Software 4.x to Dataverse Software 5 following the instructions in the [Dataverse Software 5 Release Notes](https://github.com/IQSS/dataverse/releases/tag/v5.0). After upgrading from the 4.x series to 5.0, you should progress through the other 5.x releases before attempting the upgrade to 5.11.1. **To upgrade from 5.10.1, follow the instructions for installing release 5.11 using the v5.11.1 war file**. If you had previously installed v5.11 (no longer available), follow the simplified instructions below.

If you are running Payara as a non-root user (and you should be!), **remember not to execute the commands below as root**. Use `sudo` to change to that user first. For example, `sudo -i -u dataverse` if `dataverse` is your dedicated application user.

In the following commands we assume that Payara 5 is installed in `/usr/local/payara5`. If not, adjust as needed.

`export PAYARA=/usr/local/payara5`

(or `setenv PAYARA /usr/local/payara5` if you are using a `csh`-like shell)

1\. Undeploy the previous version.

- `$PAYARA/bin/asadmin list-applications`
- `$PAYARA/bin/asadmin undeploy dataverse-<version>`

2\. Stop Payara and remove the generated directory

- `service payara stop`
- `rm -rf $PAYARA/glassfish/domains/domain1/generated`

3\. Start Payara

- `service payara start`

4\. Deploy this version.

- `$PAYARA/bin/asadmin deploy dataverse-5.11.1.war`

5\. Restart Payara

- `service payara stop`
- `service payara start`
6 changes: 6 additions & 0 deletions doc/release-notes/8535-metadata-types-static-facet.md
@@ -0,0 +1,6 @@
## Adding new static search facet: Metadata Types
A new static search facet has been added to the search side panel. This facet, called "Metadata Types", is driven by metadata blocks: when a metadata field value is entered in a dataset, an entry for the metadata block that field belongs to is added to the facet.

This new facet must be configured before it appears on the search side panel. The configuration specifies which metadata blocks to show for a dataverse collection and is inherited by child dataverses.

To configure the new facet, use the Metadata Block Facet API: <https://guides.dataverse.org/en/latest/api/native-api.html#set-metadata-block-facet-for-a-dataverse-collection>
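
For example, a sketch of a call to that API (the collection alias "root" and the block names below are illustrative; see the linked guide for the exact endpoint and payload):

```
export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
export SERVER_URL=https://demo.dataverse.org

# Show the "Metadata Types" facet for the socialscience and geospatial blocks
# on the collection with alias "root"; child collections inherit this setting.
curl -X POST -H "X-Dataverse-key:$API_TOKEN" -H "Content-Type: application/json" \
  "$SERVER_URL/api/dataverses/root/metadatablockfacets" \
  -d '["socialscience", "geospatial"]'
```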
6 changes: 6 additions & 0 deletions doc/release-notes/8639-computational-workflow.md
@@ -0,0 +1,6 @@
## Adding Computational Workflow Metadata
The new Computational Workflow metadata block will allow depositors to effectively tag datasets as computational workflows.

To add the new metadata block, follow the instructions in the Admin Guide: <https://guides.dataverse.org/en/latest/admin/metadatacustomization.html>

The location of the new metadata block tsv file is: `dataverse/scripts/api/data/metadatablocks/computational_workflow.tsv`
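
For example, a sketch of loading the block on a local installation, run from the root of the Dataverse source tree (the load endpoint is described in the guide linked above; afterwards, update the Solr schema as that guide describes):

```
curl http://localhost:8080/api/admin/datasetfield/load -X POST \
  -H "Content-type: text/tab-separated-values" \
  --upload-file scripts/api/data/metadatablocks/computational_workflow.tsv
```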
1 change: 1 addition & 0 deletions doc/release-notes/8715-importddi-termofuse.md
@@ -0,0 +1 @@
Terms of Use is now imported when using DDI format through harvesting or the native API. (Issue #8715, PR #8743)
7 changes: 7 additions & 0 deletions doc/release-notes/8868-fix-json-import.md
@@ -0,0 +1,7 @@
Under "bug fixes":

Small bugs have been fixed in dataset export in the JSON and DDI formats, eliminating the export of "undefined" as a metadata language in the former and a duplicate keyword tag in the latter.

Run ReExportAll to update the exports, following the directions in the [Admin Guide](http://guides.dataverse.org/en/5.12/admin/metadataexport.html#batch-exports-through-the-api).
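
For example (a sketch; it assumes a local installation and the batch export endpoint described on the Admin Guide page above):

```
# Re-export metadata for all published datasets in all formats; runs asynchronously.
curl http://localhost:8080/api/admin/metadata/reExportAll
```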
4 changes: 4 additions & 0 deletions doc/release-notes/8882-shib-affiliation.md
@@ -0,0 +1,4 @@
## New DB Settings
The following DB settings have been added:
- `:ShibAffiliationOrder` - Select whether the first or the last entry in the Shibboleth Affiliation array is used
- `:ShibAffiliationSeparator` (default: ";") - Set the separator used to split the Affiliation array
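
For example, these settings can be set with the Database Settings API (a sketch; the values shown are illustrative assumptions, not recommendations):

```
# Use the last entry of the Shibboleth Affiliation array...
curl -X PUT -d lastAffiliation http://localhost:8080/api/admin/settings/:ShibAffiliationOrder
# ...and split the Affiliation array on ";" (shown here only as an example value).
curl -X PUT -d ';' http://localhost:8080/api/admin/settings/:ShibAffiliationSeparator
```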
@@ -1,5 +1,5 @@
Tool Type Scope Description
Data Explorer explore file A GUI which lists the variables in a tabular data file allowing searching, charting and cross tabulation analysis. See the README.md file at https://github.com/scholarsportal/dataverse-data-explorer-v2 for the instructions on adding Data Explorer to your Dataverse.
Whole Tale explore dataset A platform for the creation of reproducible research packages that allows users to launch containerized interactive analysis environments based on popular tools such as Jupyter and RStudio. Using this integration, Dataverse users can launch Jupyter and RStudio environments to analyze published datasets. For more information, see the `Whole Tale User Guide <https://wholetale.readthedocs.io/en/stable/users_guide/integration.html>`_.
File Previewers explore file A set of tools that display the content of files - including audio, html, `Hypothes.is <https://hypothes.is/>`_ annotations, images, PDF, text, video, tabular data, and spreadsheets - allowing them to be viewed without downloading. The previewers can be run directly from github.io, so the only required step is using the Dataverse API to register the ones you want to use. Documentation, including how to optionally brand the previewers, and an invitation to contribute through github are in the README.md file. Initial development was led by the Qualitative Data Repository and the spreadsheet previewer was added by the Social Sciences and Humanities Open Cloud (SSHOC) project. https://github.com/GlobalDataverseCommunityConsortium/dataverse-previewers
File Previewers explore file A set of tools that display the content of files - including audio, html, `Hypothes.is <https://hypothes.is/>`_ annotations, images, PDF, text, video, tabular data, spreadsheets, and GeoJSON - allowing them to be viewed without downloading. The previewers can be run directly from github.io, so the only required step is using the Dataverse API to register the ones you want to use. Documentation, including how to optionally brand the previewers, and an invitation to contribute through github are in the README.md file. Initial development was led by the Qualitative Data Repository and the spreadsheet previewer was added by the Social Sciences and Humanities Open Cloud (SSHOC) project. https://github.com/gdcc/dataverse-previewers
Data Curation Tool configure file A GUI for curating data by adding labels, groups, weights and other details to assist with informed reuse. See the README.md file at https://github.com/scholarsportal/Dataverse-Data-Curation-Tool for the installation instructions.
1 change: 1 addition & 0 deletions doc/sphinx-guides/source/_static/api/dataverse-facets.json
@@ -0,0 +1 @@
["authorName", "authorAffiliation"]
1 change: 1 addition & 0 deletions doc/sphinx-guides/source/_static/api/ddi_dataset.xml
@@ -142,6 +142,7 @@
</method>
<dataAccs>
<notes type="DVN:TOA" level="dv">Terms of Access</notes>
<notes type="DVN:TOU" level="dv">Terms of Use</notes>
<setAvail>
<accsPlac>Data Access Place</accsPlac>
<origArch>Original Archive</origArch>
@@ -0,0 +1 @@
["socialscience", "geospatial"]
4 changes: 2 additions & 2 deletions doc/sphinx-guides/source/api/apps.rst
@@ -28,9 +28,9 @@ https://github.com/scholarsportal/Dataverse-Data-Curation-Tool
File Previewers
~~~~~~~~~~~~~~~

File Previewers are tools that display the content of files - including audio, html, Hypothes.is annotations, images, PDF, text, video - allowing them to be viewed without downloading.
File Previewers are tools that display the content of files - including audio, html, Hypothes.is annotations, images, PDF, text, video, GeoJSON - allowing them to be viewed without downloading.

https://github.com/GlobalDataverseCommunityConsortium/dataverse-previewers
https://github.com/gdcc/dataverse-previewers

Python
------
1 change: 1 addition & 0 deletions doc/sphinx-guides/source/api/index.rst
@@ -21,5 +21,6 @@ API Guide
client-libraries
external-tools
curation-labels
linkeddatanotification
apps
faq
65 changes: 65 additions & 0 deletions doc/sphinx-guides/source/api/linkeddatanotification.rst
@@ -0,0 +1,65 @@
Linked Data Notification API
============================

Dataverse has a limited, experimental API implementing a Linked Data Notification inbox, allowing it to receive messages indicating a link between an external resource and a Dataverse dataset.
The motivating use case is one in which Dataverse administrators wish to create back-links to the remote resource (e.g. as a Related Publication, Related Material, etc.).

Upon receipt of a relevant message, Dataverse will create Announcement Received notifications for superusers who can edit the dataset involved. (In the motivating use case, these users may then add an appropriate relationship and use the Update Current Version publishing option to add it to the most recently published version of the dataset.)

The ``:LDNMessageHosts`` setting is a comma-separated whitelist of hosts from which Dataverse will accept and process messages. By default, no hosts are allowed. ``*`` can be used in testing to indicate all hosts are allowed.
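
For example, the setting can be configured via the Database Settings API (a sketch; the host value shown is only an illustration):

.. code-block:: bash

  # Accept LDN messages only from inbox.example.org (illustrative host value).
  curl -X PUT -d 'inbox.example.org' http://localhost:8080/api/admin/settings/:LDNMessageHosts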

Messages can be sent via POST, using the ``application/ld+json`` Content-Type:

.. code-block:: bash

  export SERVER_URL=https://demo.dataverse.org

  curl -X POST -H 'Content-Type: application/ld+json' $SERVER_URL/api/inbox --upload-file message.jsonld

The supported message format is described by `our preliminary specification <https://docs.google.com/document/d/1dqj8_vEcIBeyDIZCaPQvp0FM1eSGO_5CSNCdXOpoUz0/edit?usp=sharing>`_. The format is expected to change in the near future to match the standard for relationship announcements being developed as part of `the COAR Notify Project <https://notify.coar-repositories.org/>`_.

An example message is shown below. It indicates that a resource with the name "An Interesting Title" exists and "IsSupplementedBy" the dataset with DOI https://doi.org/10.5072/FK2/GGCCDL. If this dataset is managed in the receiving Dataverse, a notification will be sent to users with the relevant permissions (as described above).

.. code:: json

  {
    "@context": [
      "https://www.w3.org/ns/activitystreams",
      "https://purl.org/coar/notify"
    ],
    "id": "urn:uuid:94ecae35-dcfd-4182-8550-22c7164fe23f",
    "actor": {
      "id": "https://research-organisation.org/dspace",
      "name": "DSpace Repository",
      "type": "Service"
    },
    "context": {
      "IsSupplementedBy": {
        "id": "http://dev-hdc3b.lib.harvard.edu/dataset.xhtml?persistentId=doi:10.5072/FK2/GGCCDL",
        "ietf:cite-as": "https://doi.org/10.5072/FK2/GGCCDL",
        "type": "sorg:Dataset"
      }
    },
    "object": {
      "id": "https://research-organisation.org/dspace/item/35759679-5df3-4633-b7e5-4cf24b4d0614",
      "ietf:cite-as": "https://research-organisation.org/authority/resolve/35759679-5df3-4633-b7e5-4cf24b4d0614",
      "sorg:name": "An Interesting Title",
      "type": "sorg:ScholarlyArticle"
    },
    "origin": {
      "id": "https://research-organisation.org/dspace",
      "inbox": "https://research-organisation.org/dspace/inbox/",
      "type": "Service"
    },
    "target": {
      "id": "https://research-organisation.org/dataverse",
      "inbox": "https://research-organisation.org/dataverse/inbox/",
      "type": "Service"
    },
    "type": [
      "Announce",
      "coar-notify:ReleaseAction"
    ]
  }