Merge pull request #7 from IQSS/develop
Update develop from IQSS
lubitchv authored Mar 7, 2019
2 parents ea44ba6 + 6d72ccf commit 118d3cd
Showing 60 changed files with 1,774 additions and 415 deletions.


5 changes: 5 additions & 0 deletions doc/sphinx-guides/source/admin/dataverses-datasets.rst
@@ -72,3 +72,8 @@ Send Dataset metadata to PID provider
Forces an update of the metadata provided to the PID provider for a published dataset. Only accessible to superusers. ::

curl -H "X-Dataverse-key: $API_TOKEN" -X POST http://$SERVER/api/datasets/$dataset-id/modifyRegistrationMetadata

Make Metadata Updates Without Changing Dataset Version
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

As a superuser, click "Update Current Version" when publishing. (This option is only available when a 'Minor' update would be allowed.)
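The Native API exposes the same operation: superusers can pass ``type=updatecurrent`` to the publish call to update metadata without changing the version number. A minimal sketch that assembles the call (the token, server name, and dataset id below are placeholders; run the printed command against a real installation):

```shell
# Sketch: update the current published version's metadata without a
# version bump. All values are placeholders; the token must belong to a
# superuser, and the option only applies where a minor update would be allowed.
API_TOKEN="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"
SERVER="demo.dataverse.org"
ID="12345"
echo "curl -H \"X-Dataverse-key: $API_TOKEN\" -X POST \"https://$SERVER/api/datasets/$ID/actions/:publish?type=updatecurrent\""
```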
2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/api/dataaccess.rst
@@ -87,7 +87,7 @@ original "Saved Original", the proprietary (SPSS, Stata, R, etc.) file fr


"All Formats" bundled download for Tabular Files.
-----------------------------------------------
-------------------------------------------------

``/api/access/datafile/bundle/$id``

39 changes: 31 additions & 8 deletions doc/sphinx-guides/source/api/metrics.rst
@@ -24,7 +24,7 @@ Example: ``curl https://demo.dataverse.org/api/info/metrics/downloads``
To-Month
--------

Returns a count of various objects in dataverse up to a specified month ``$YYYY-DD`` in YYYY-MM format (i.e. ``2018-01``)::
Returns a count of various objects in dataverse up to a specified month ``$YYYY-MM`` in YYYY-MM format (e.g. ``2018-01``)::

GET https://$SERVER/api/info/metrics/$type/toMonth/$YYYY-MM

@@ -36,7 +36,7 @@ Example: ``curl https://demo.dataverse.org/api/info/metrics/dataverses/toMonth/2
Past Days
---------

Returns a count of various objects in dataverse for the past ``$days`` (i.e. ``30``)::
Returns a count of various objects in dataverse for the past ``$days`` (e.g. ``30``)::

GET https://$SERVER/api/info/metrics/$type/pastDays/$days

@@ -45,8 +45,8 @@ Returns a count of various objects in dataverse for the past ``$days`` (i.e. ``3
Example: ``curl https://demo.dataverse.org/api/info/metrics/datasets/pastDays/30``


Dataverse Specific Commands
---------------------------
Dataverse Specific Metrics
--------------------------

By Subject
~~~~~~~~~~~~~~~
@@ -64,18 +64,41 @@ Returns the number of dataverses by each category::
GET https://$SERVER/api/info/metrics/dataverses/byCategory


Dataset Specific Commands
-------------------------
Dataset Specific Metrics
------------------------

By Subject
~~~~~~~~~~~~~~~
~~~~~~~~~~

Returns the number of datasets by each subject::

GET https://$SERVER/api/info/metrics/datasets/bySubject


By Subject, and to Month
~~~~~~~~~~~~~~~~~~~~~~~~

Returns the number of datasets by each subject, up to a specified month ``$YYYY-MM`` in YYYY-MM format (e.g. ``2018-01``)::

GET https://$SERVER/api/info/metrics/datasets/bySubject/toMonth/$YYYY-MM

Example: ``curl https://demo.dataverse.org/api/info/metrics/datasets/bySubject/toMonth/2018-01``

.. |CORS| raw:: html

    <span class="label label-success pull-right">
    CORS
    </span>


Metric Query Parameters
-----------------------

To further tailor a metric, you can provide query parameters.

dataLocation
~~~~~~~~~~~~

Specifies whether the metric should query ``local`` data, ``remote`` data (e.g. harvested), or ``all`` data when getting results. Only works for dataset metrics.

Example: ``curl https://demo.dataverse.org/api/info/metrics/datasets/?dataLocation=remote``
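Combining ``dataLocation`` with the toMonth pattern above, a sketch that prints one URL per location value (the server name and month are illustrative placeholders):

```shell
# Sketch: build the toMonth dataset-count URL for each dataLocation value.
# demo.dataverse.org and 2018-01 are illustrative placeholders.
SERVER="demo.dataverse.org"
for LOC in local remote all; do
  echo "https://$SERVER/api/info/metrics/datasets/toMonth/2018-01?dataLocation=$LOC"
done
```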
2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/api/native-api.rst
@@ -372,7 +372,7 @@ For these deletes your JSON file must include an exact match of those dataset fi
Publish a Dataset
~~~~~~~~~~~~~~~~~

Publishes the dataset whose id is passed. If this is the first version of the dataset, its version number will be set to ``1.0``. Otherwise, the new dataset version number is determined by the most recent version number and the ``type`` parameter. Passing ``type=minor`` increases the minor version number (2.3 is updated to 2.4). Passing ``type=major`` increases the major version number (2.3 is updated to 3.0). ::
Publishes the dataset whose id is passed. If this is the first version of the dataset, its version number will be set to ``1.0``. Otherwise, the new dataset version number is determined by the most recent version number and the ``type`` parameter. Passing ``type=minor`` increases the minor version number (2.3 is updated to 2.4). Passing ``type=major`` increases the major version number (2.3 is updated to 3.0). Superusers can pass ``type=updatecurrent`` to update metadata without changing the version number::

POST http://$SERVER/api/datasets/$id/actions/:publish?type=$type&key=$apiKey

6 changes: 5 additions & 1 deletion doc/sphinx-guides/source/api/search.rst
@@ -85,7 +85,11 @@ https://demo.dataverse.org/api/search?q=trees
"file_content_type":"image/png",
"size_in_bytes":8361,
"md5":"0386269a5acb2c57b4eade587ff4db64",
"dataset_citation":"Spruce, Sabrina, 2016, \"Spruce Goose\", http://dx.doi.org/10.5072/FK2/NFSEHG, Root Dataverse, V1"
"file_persistent_id": "doi:10.5072/FK2/XTT5BV/PCCHV7",
"dataset_name": "Dataset One",
"dataset_id": "32",
"dataset_persistent_id": "doi:10.5072/FK2/XTT5BV",
"dataset_citation":"Spruce, Sabrina, 2016, \"Spruce Goose\", http://dx.doi.org/10.5072/FK2/XTT5BV, Root Dataverse, V1"
},
{
"name":"Birds",
4 changes: 2 additions & 2 deletions doc/sphinx-guides/source/conf.py
@@ -65,9 +65,9 @@
# built documents.
#
# The short X.Y version.
version = '4.10.1'
version = '4.11'
# The full version, including alpha/beta/rc tags.
release = '4.10.1'
release = '4.11'

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
4 changes: 2 additions & 2 deletions doc/sphinx-guides/source/developers/workflows.rst
@@ -8,7 +8,7 @@ Dataverse has a flexible workflow mechanism that can be used to trigger actions


Introduction
---------
------------

Dataverse can perform two sequences of actions when datasets are published: one prior to publishing (marked by a ``PrePublishDataset`` trigger), and one after the publication has succeeded (``PostPublishDataset``). The pre-publish workflow is useful for having an external system prepare a dataset for being publicly accessed (a possibly lengthy activity that requires moving files around, uploading videos to a streaming server, etc.), or to start an approval process. A post-publish workflow might be used for sending notifications about the newly published dataset.

@@ -104,7 +104,7 @@ Available variables are:
* ``releaseStatus``

archiver
+++++++
++++++++

A step that sends an archival copy of a Dataset Version to a configured archiver, e.g. the DuraCloud interface of Chronopolis. See the `DuraCloud/Chronopolis Integration documentation <http://guides.dataverse.org/en/latest/admin/integrations.html#id15>`_ for further detail.
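For orientation, a workflow definition invoking this step might look roughly like the sketch below. The field names are assumptions, not a verbatim schema; the authoritative example ships with Dataverse at /scripts/api/data/workflows/internal-archiver-workflow.json.

```json
{
  "name": "Archiver workflow (sketch; field names are assumptions)",
  "steps": [
    {
      "provider": ":internal",
      "stepType": "archiver",
      "parameters": {
        "stepName": "archive submission"
      }
    }
  ]
}
```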

2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/installation/config.rst
@@ -606,7 +606,7 @@ In the Chronopolis case, since the transfer from the DuraCloud front-end to arch

**PostPublication Workflow**

To automate the submission of archival copies to an archive as part of publication, one can setup a Dataverse Workflow using the "archiver" workflow step - see the :doc:`developers/workflows` guide.
To automate the submission of archival copies to an archive as part of publication, one can set up a Dataverse Workflow using the "archiver" workflow step - see the :doc:`/developers/workflows` guide. The archiver step uses the configuration information discussed above, including the :ArchiverClassName setting. The workflow step definition should include the set of properties defined in \:ArchiverSettings.

To activate this workflow, one must first install a workflow that uses the archiver step. A simple workflow that invokes the archiver step, configured to submit to DuraCloud as its only action, is included in Dataverse at /scripts/api/data/workflows/internal-archiver-workflow.json.
2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/installation/prerequisites.rst
@@ -223,7 +223,7 @@ Solr will warn about needing to increase the number of file descriptors and max
solr soft nofile 65000
solr hard nofile 65000

On operating systems which use systemd such as RHEL or CentOS 7, you may then add a line like LimitNOFILE=65000 to the systemd unit file, or adjust the limits on a running process using the prlimit tool::
On operating systems which use systemd, such as RHEL or CentOS 7, you may then add LimitNOFILE=65000 (for the number of open file descriptors) and LimitNPROC=65000 (for the max processes) to the systemd unit file, or adjust the limits on a running process using the prlimit tool::

# sudo prlimit --pid pid --nofile=65000:65000
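The systemd variant of the limits above is a unit-file fragment rather than a command. A sketch, assuming the Solr unit lives somewhere like /etc/systemd/system/solr.service (the path varies by installation):

```ini
[Service]
# Raise the open-file-descriptor and process limits for the Solr process
LimitNOFILE=65000
LimitNPROC=65000
```

After editing the unit file, systemd must reload its configuration and the service must be restarted for the new limits to take effect.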

3 changes: 2 additions & 1 deletion doc/sphinx-guides/source/versions.rst
@@ -6,8 +6,9 @@ Dataverse Guides Versions

This list provides a way to refer to previous versions of the Dataverse guides, which we still host. In order to learn more about the updates delivered from one version to another, visit the `Releases <https://github.com/IQSS/dataverse/releases>`__ page in our GitHub repo.

- 4.10.1
- 4.11

- `4.10.1 </en/4.10.1/>`__
- `4.10 </en/4.10/>`__
- `4.9.4 </en/4.9.4/>`__
- `4.9.3 </en/4.9.3/>`__
Binary file not shown.

This file was deleted.

This file was deleted.

This file was deleted.

This file was deleted.

Binary file not shown.
@@ -4,5 +4,6 @@
<modelVersion>4.0.0</modelVersion>
<groupId>net.handle</groupId>
<artifactId>handle</artifactId>
<version>2006-06-16-generated</version>
<version>8.1.1</version>
<description>POM was created from install:install-file</description>
</project>
12 changes: 10 additions & 2 deletions pom.xml
@@ -7,7 +7,7 @@
-->
<groupId>edu.harvard.iq</groupId>
<artifactId>dataverse</artifactId>
<version>4.10.1</version>
<version>4.11</version>
<packaging>war</packaging>

<name>dataverse</name>
@@ -328,7 +328,7 @@
<dependency>
<groupId>net.handle</groupId>
<artifactId>handle</artifactId>
<version>2006-06-16-generated</version>
<version>8.1.1</version>
</dependency>
<!-- UNF v5 (buggy), (temporarily) added for testing ingest against DVN v3 - L.A. -->
<dependency>
@@ -558,6 +558,10 @@
<groupId>org.slf4j</groupId>
<artifactId>log4j-over-slf4j</artifactId>
</exclusion>
<exclusion>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
</exclusion>
</exclusions>
</dependency>
<dependency>
@@ -573,6 +577,10 @@
<groupId>com.amazonaws</groupId>
<artifactId>aws-java-sdk-sqs</artifactId>
</exclusion>
<exclusion>
<groupId>ch.qos.logback</groupId>
<artifactId>logback-classic</artifactId>
</exclusion>
</exclusions>

</dependency>