diff --git a/doc/sphinx-guides/source/admin/apis.rst b/doc/sphinx-guides/source/admin/apis.rst new file mode 100644 index 00000000000..e69de29bb2d diff --git a/doc/sphinx-guides/source/admin/troubleshooting.rst b/doc/sphinx-guides/source/admin/troubleshooting.rst index 3e8cfbfa62f..1b22a58555b 100644 --- a/doc/sphinx-guides/source/admin/troubleshooting.rst +++ b/doc/sphinx-guides/source/admin/troubleshooting.rst @@ -3,11 +3,35 @@ Troubleshooting =============== -This new (as of v.4.6) section of the Admin guide is for tips on how to diagnose and fix system problems. +Sometimes Dataverse users get into trouble. Sometimes Dataverse itself gets into trouble. If something has gone wrong, this section is for you. .. contents:: Contents: :local: +Using Dataverse APIs to Troubleshoot and Fix Problems +----------------------------------------------------- + +See the :doc:`/api/intro` section of the API Guide for a high-level overview of Dataverse APIs. Below is a list of problems that support teams might encounter that can be handled via API (sometimes only via API). + +A Dataset Is Locked And Cannot Be Edited or Published +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +It's normal for the ingest process described in the :doc:`/user/tabulardataingest/ingestprocess` section of the User Guide to take some time, but if hours or days have passed and the dataset is still locked, you might want to inspect the locks and consider deleting some or all of them. + +See :doc:`dataverses-datasets`. + +Someone Created Spam Datasets and I Need to Delete Them +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +If your installation of Dataverse is open enough that the general public can create datasets, you may sometimes need to deal with spam datasets. + +Look for "destroy" in the :doc:`/api/native-api` section of the API Guide. 
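+As a hedged sketch of what handling the two cases above looks like on the command line (assuming a superuser API token and a hypothetical dataset database ID of 42 — substitute your own values), the lock and destroy endpoints can be called with curl:

```shell
# Placeholder values: substitute a real superuser API token, your server URL,
# and a real dataset database ID before running anything.
export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
export SERVER_URL=https://demo.dataverse.org
export ID=42

# Inspect any locks on the dataset
curl -H X-Dataverse-key:$API_TOKEN $SERVER_URL/api/datasets/$ID/locks

# Delete all locks on the dataset (superuser only)
curl -H X-Dataverse-key:$API_TOKEN -X DELETE $SERVER_URL/api/datasets/$ID/locks

# Permanently destroy a spam dataset (superuser only; this is irreversible)
curl -H X-Dataverse-key:$API_TOKEN -X DELETE $SERVER_URL/api/datasets/$ID/destroy
```

+Note that "destroy" is irreversible, so double-check the dataset ID before running it.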
+ +A User Needs Their Account to Be Converted From Institutional (Shibboleth), ORCID, Google, or GitHub to Something Else +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +See :ref:`converting-shibboleth-users-to-local` and :ref:`converting-oauth-users-to-local`. + Glassfish --------- diff --git a/doc/sphinx-guides/source/api/apps.rst b/doc/sphinx-guides/source/api/apps.rst index 5e26261ca71..f9d8d4c9b02 100755 --- a/doc/sphinx-guides/source/api/apps.rst +++ b/doc/sphinx-guides/source/api/apps.rst @@ -1,9 +1,9 @@ Apps ==== -The introduction of Dataverse APIs has fostered the development of apps that are listed at http://dataverse.org/integrations and the :doc:`/admin/integrations` section of the Admin Guide. +The introduction of Dataverse APIs has fostered the development of a variety of software applications that are listed in the :doc:`/admin/integrations` and :doc:`/admin/reporting-tools` sections of the Admin Guide and the :doc:`/installation/external-tools` section of the Installation Guide. -The apps below are open source, demonstrating how to use Dataverse APIs. Some of these apps (and others) are built on :doc:`/api/client-libraries` that are available for Dataverse APIs. +The apps below are open source and demonstrate how to use Dataverse APIs. Some of these apps are built on :doc:`/api/client-libraries` that are available for Dataverse APIs in Python, R, and Java. .. contents:: |toctitle| :local: @@ -11,6 +11,27 @@ The apps below are open source, demonstrating how to use Dataverse APIs. Some of Javascript ---------- +Data Explorer +~~~~~~~~~~~~~ + +Data Explorer is a GUI that lists the variables in a tabular data file, allowing searching, charting, and cross-tabulation analysis. 
+ +https://github.com/scholarsportal/Dataverse-Data-Explorer + +Data Curation Tool +~~~~~~~~~~~~~~~~~~ + +Data Curation Tool is a GUI for curating data by adding labels, groups, weights and other details to assist with informed reuse. + +https://github.com/scholarsportal/Dataverse-Data-Curation-Tool + +File Previewers +~~~~~~~~~~~~~~~ + +File Previewers are tools that display the content of files - including audio, html, Hypothes.is annotations, images, PDF, text, video - allowing them to be viewed without downloading. + +https://github.com/QualitativeDataRepository/dataverse-previewers + TwoRavens ~~~~~~~~~ @@ -18,23 +39,31 @@ TwoRavens is a system of interlocking statistical tools for data exploration, an https://github.com/IQSS/TwoRavens -PHP ---- +Python +------ -OJS -~~~ +Please note that there are multiple Python modules for Dataverse APIs listed in the :doc:`client-libraries` section. -The Open Journal Systems (OJS) Dataverse Plugin adds data sharing and preservation to the OJS publication process. +dataverse-sample-data +~~~~~~~~~~~~~~~~~~~~~ -https://github.com/pkp/ojs/tree/ojs-stable-2_4_8/plugins/generic/dataverse +dataverse-sample-data allows you to populate your Dataverse installation with sample data. It makes use of pyDataverse, which is listed in the :doc:`client-libraries` section. -Python ------- +https://github.com/IQSS/dataverse-sample-data + +Texas Digital Library dataverse-reports +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +Dataverse Reports for Texas Digital Library generates and emails statistical reports for an installation of Dataverse using the native API and database queries. 
+ +https://github.com/TexasDigitalLibrary/dataverse-reports OSF ~~~ -Allows you to view, download, and upload files to and from a Dataverse dataset from an Open Science Framework (OSF) project: https://github.com/CenterForOpenScience/osf.io/tree/develop/addons/dataverse +OSF allows you to view, download, and upload files to and from a Dataverse dataset from an Open Science Framework (OSF) project. + +https://github.com/CenterForOpenScience/osf.io/tree/develop/addons/dataverse GeoConnect ~~~~~~~~~~ @@ -46,22 +75,49 @@ https://github.com/IQSS/geoconnect dataverse-metrics ~~~~~~~~~~~~~~~~~ -dataverse-metrics aggregates and visualizes metrics across multiple Dataverse installations but can also be used with a single installation: https://github.com/IQSS/dataverse-metrics +dataverse-metrics aggregates and visualizes metrics across multiple Dataverse installations but can also be used with a single installation. + +https://github.com/IQSS/dataverse-metrics + +Whole Tale +~~~~~~~~~~ + +Whole Tale enables researchers to analyze data using popular tools including Jupyter and RStudio with the ultimate goal of supporting publishing of reproducible research packages. + +https://github.com/whole-tale/girder_wholetale/tree/v0.7/server/lib/dataverse + +Archivematica +~~~~~~~~~~~~~ + +Archivematica is an integrated suite of open-source tools for processing digital objects for long-term preservation. + +https://github.com/artefactual/archivematica/tree/v1.9.2/src/MCPClient/lib/clientScripts Java ---- +Please note that there is a Java library for Dataverse APIs listed in the :doc:`client-libraries` section. + DVUploader -~~~~~~~~~~~~~~~~~~~~~ +~~~~~~~~~~ The open-source DVUploader tool is a stand-alone command-line Java application that uses the Dataverse API to upload files to a specified Dataset. Files can be specified by name, or the DVUploader can upload all files in a directory or recursively from a directory tree. 
The DVUploader can also verify that uploaded files match their local sources by comparing the local and remote fixity checksums. Source code, release 1.0.0- jar file, and documentation are available on GitHub. DVUploader's creation was supported by the Texas Digital Library. https://github.com/IQSS/dataverse-uploader - Dataverse for Android ~~~~~~~~~~~~~~~~~~~~~ -For now this is only a proof of concept. +Dataverse for Android makes use of Dataverse's Search API. https://github.com/IQSS/dataverse-android + +PHP +--- + +OJS +~~~ + +The Open Journal Systems (OJS) Dataverse Plugin adds data sharing and preservation to the OJS publication process. + +https://github.com/pkp/ojs/tree/ojs-stable-2_4_8/plugins/generic/dataverse diff --git a/doc/sphinx-guides/source/api/auth.rst b/doc/sphinx-guides/source/api/auth.rst new file mode 100644 index 00000000000..21e38424549 --- /dev/null +++ b/doc/sphinx-guides/source/api/auth.rst @@ -0,0 +1,63 @@ +API Tokens and Authentication +============================= + +An API token is similar to a password and allows you to authenticate to Dataverse APIs to perform actions as you. Many Dataverse APIs require the use of an API token. + +.. contents:: |toctitle| + :local: + +How to Get an API Token +----------------------- + +Your API token is unique to the server you are using. You cannot use your API token from one server on another server. + +Instructions for getting a token are described in the :doc:`/user/account` section of the User Guide. + +How Your API Token Is Like a Password +------------------------------------- + +Anyone who has your API Token can add and delete data as you so you should treat it with the same care as a password. + +Passing Your API Token as an HTTP Header (Preferred) or a Query Parameter +------------------------------------------------------------------------- + +See :ref:`curl-examples-and-environment-variables` if you are unfamiliar with the use of ``export`` below. 
+ +There are two ways to pass your API token to Dataverse APIs. The preferred method is to send the token in the ``X-Dataverse-key`` HTTP header, as in the following curl example. + +.. code-block:: bash + + export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx + export SERVER_URL=https://demo.dataverse.org + export ALIAS=root + + curl -H X-Dataverse-key:$API_TOKEN $SERVER_URL/api/dataverses/$ALIAS/contents + +Here's how it looks without the environment variables: + +.. code-block:: bash + + curl -H X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx https://demo.dataverse.org/api/dataverses/root/contents + +The second way to pass your API token is via a query parameter called ``key`` in the URL like below. + +.. code-block:: bash + + export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx + export SERVER_URL=https://demo.dataverse.org + export ALIAS=root + + curl $SERVER_URL/api/dataverses/$ALIAS/contents?key=$API_TOKEN + +Here's how it looks without the environment variables: + +.. code-block:: bash + + curl https://demo.dataverse.org/api/dataverses/root/contents?key=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx + +Use of the ``X-Dataverse-key`` HTTP header form is preferred to passing ``key`` in the URL because query parameters like ``key`` appear in URLs and might accidentally get shared, exposing your API token. (Again it's like a password.) Additionally, URLs are often logged on servers while it's less common to log HTTP headers. + +Resetting Your API Token +------------------------ + +You can reset your API Token from your account page in Dataverse as described in the :doc:`/user/account` section of the User Guide. 
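+One quick way to confirm that a token is valid is to ask the server who you are. Below is a sketch using placeholder values; note as an assumption that the ``:me`` endpoint may not be available on older installations of Dataverse, so verify it against your server's version:

```shell
# Placeholder token and server URL: substitute your own before running.
export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
export SERVER_URL=https://demo.dataverse.org

# Returns a JSON description of the user the token belongs to;
# an expired or revoked token yields an error response instead.
curl -H X-Dataverse-key:$API_TOKEN $SERVER_URL/api/users/:me
```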
diff --git a/doc/sphinx-guides/source/api/faq.rst b/doc/sphinx-guides/source/api/faq.rst new file mode 100644 index 00000000000..0f0d71d775b --- /dev/null +++ b/doc/sphinx-guides/source/api/faq.rst @@ -0,0 +1,95 @@ +Frequently Asked Questions +========================== + +APIs are less intuitive than graphical user interfaces (GUIs), so questions are expected! + +.. contents:: |toctitle| + :local: + +What is an API? +--------------- + +See "What is an API?" in the :doc:`intro` section. + +What Are Common Use Cases for Dataverse APIs? +--------------------------------------------- + +See the :doc:`getting-started` section for common use cases for researchers and curators. Other types of API users should find starting points at :ref:`types-of-api-users`. + +Where Can I Find Examples of Using Dataverse APIs? +-------------------------------------------------- + +The :doc:`getting-started` section links to examples using curl. + +For examples in Javascript, Python, R, Java, and PHP, see the :doc:`apps` and :doc:`client-libraries` sections. + +When Should I Use the Native API vs. the SWORD API? +--------------------------------------------------- + +The :doc:`sword` is based on a standard, works fine, and is fully supported, but much more development effort has been going into the :doc:`native-api`, which is not based on a standard. It is specific to Dataverse. + +SWORD uses XML. The Native API uses JSON. + +SWORD only supports a dozen or so operations. The Native API supports many more. + +To Operate on a Dataset Should I Use Its DOI (or Handle) or Its Database ID? +---------------------------------------------------------------------------- + +It's fine to target a dataset using either its Persistent ID (PID, such as a DOI or Handle) or its database ID. + +Here's an example from :ref:`publish-dataset-api` of targeting a dataset using its DOI: + +.. 
code-block:: bash + + curl -H X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx -X POST "https://demo.dataverse.org/api/datasets/:persistentId/actions/:publish?persistentId=doi:10.5072/FK2/J8SJZB&type=major" + +You can target the same dataset with its database ID ("42" in the example below), like this: + +.. code-block:: bash + + curl -H X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx -X POST "https://demo.dataverse.org/api/datasets/42/actions/:publish?type=major" + +Note that when multiple query parameters are used (such as ``persistentId`` and ``type`` above), there is a question mark (``?``) before the first query parameter and ampersands (``&``) before each of the subsequent query parameters. Also, ``&`` has special meaning in Unix shells such as Bash, so you must put quotes around the entire URL. + +Where is the Comprehensive List of All API Functionality? +--------------------------------------------------------- + +There are so many Dataverse APIs that a single page in this guide would probably be overwhelming. See :ref:`list-of-dataverse-apis` for links to various pages. + +It's possible to get a complete list of API functionality in Swagger/OpenAPI format if you deploy Dataverse to Payara 5+. For details, see https://github.com/IQSS/dataverse/issues/5794 + +Is There a Changelog of API Functionality That Has Been Added Over Time? +------------------------------------------------------------------------ + +No, but there probably should be. If you have suggestions for how it should look, please create an issue at https://github.com/IQSS/dataverse/issues + +.. _no-api: + +What Functionality is GUI Only and Not Available Via API? +--------------------------------------------------------- + +The following tasks cannot currently be automated via API because no API exists for them. The web interface should be used instead for these GUI-only features: + +- Setting a logo image, URL, and tagline when creating a dataverse. 
+- Editing properties of an existing dataverse. +- Setting "Enable Access Request" for Terms of Use: https://groups.google.com/d/msg/dataverse-community/oKdesT9rFGc/qM6wrsnnBAAJ +- Downloading a guestbook. +- Setting guestbook_id for a dataset: https://groups.google.com/d/msg/dataverse-community/oKdesT9rFGc/qM6wrsnnBAAJ +- Filling out a guestbook. See also https://groups.google.com/d/msg/dataverse-dev/G9FNGP_bT0w/dgE2Fk4iBQAJ +- Seeing why a file failed ingest. +- Dataset templates. +- Deaccessioning datasets. + +If you would like APIs for any of the features above, please open a GitHub issue at https://github.com/IQSS/dataverse/issues + +You are also welcome to open an issue to add to the list above, or to make a pull request. Please see the :doc:`/developers/documentation` section of the Developer Guide for instructions. + +Why Aren't the Return Values (HTTP Status Codes) Documented? +------------------------------------------------------------ + +They should be. Please consider making a pull request to help. The :doc:`/developers/documentation` section of the Developer Guide should help you get started. :ref:`create-dataverse-api` has an example you can follow, or you can come up with a better way. + +What If My Question Isn't Answered Here? +---------------------------------------- + +Please ask! For information on where to ask, please see :ref:`getting-help-with-apis`. diff --git a/doc/sphinx-guides/source/api/getting-started.rst b/doc/sphinx-guides/source/api/getting-started.rst new file mode 100644 index 00000000000..a1e957de24f --- /dev/null +++ b/doc/sphinx-guides/source/api/getting-started.rst @@ -0,0 +1,146 @@ +Getting Started with APIs +========================= + +If you are a researcher or curator who wants to automate parts of your workflow, this section should help you get started. The :doc:`intro` section lists resources for other groups who may be interested in Dataverse APIs, such as developers of integrations and support teams. + +.. 
contents:: |toctitle| + :local: + +Servers You Can Test With +------------------------- + +Rather than using a production installation of Dataverse, API users are welcome to use http://demo.dataverse.org for testing. You can email support@dataverse.org if you have any trouble with this server. + +If you would rather have full control over your own test server, deployments to AWS, Docker, Vagrant, and more are covered in the :doc:`/developers/index` and the :doc:`/installation/index`. + +Getting an API Token +-------------------- + +Many Dataverse APIs require an API token. + +Once you have identified a server to test with, create an account, click on your name, and get your API token. For more details, see the :doc:`auth` section. + +.. _curl-examples-and-environment-variables: + +curl Examples and Environment Variables +--------------------------------------- + +The examples in this guide use `curl`_ for the following reasons: + +- curl commands are succinct. +- curl commands can be copied and pasted into a terminal. +- This guide is programming language agnostic. It doesn't prefer any particular programming language. + +You'll find curl examples that look like this: + +.. code-block:: bash + + export SERVER_URL=https://demo.dataverse.org + export QUERY=data + + curl $SERVER_URL/api/search?q=$QUERY + +What's going on above is the declaration of "environment variables" that are substituted into a curl command. You should run the "export" commands but change the value for the server URL or the query (or whatever options the command supports). Then you should be able to copy and paste the curl command and it should "just work", substituting the variables like this: + +.. code-block:: bash + + curl https://demo.dataverse.org/api/search?q=data + +If you ever want to check an environment variable, you can "echo" it like this: + +.. 
code-block:: bash + + echo $SERVER_URL + +If you don't like curl, don't have curl, or want to use a different programming language, you are encouraged to check out the Python, R, and Java options in the :doc:`client-libraries` section. + +.. _curl: https://curl.haxx.se + +Depositing Data +--------------- + +Creating a Dataverse +~~~~~~~~~~~~~~~~~~~~ + +See :ref:`create-dataverse-api`. + +Creating a Dataset +~~~~~~~~~~~~~~~~~~ + +See :ref:`create-dataset-command`. + +Uploading Files +~~~~~~~~~~~~~~~ + +See :ref:`add-file-api`. + +Publishing a Dataverse +~~~~~~~~~~~~~~~~~~~~~~ + +See :ref:`publish-dataverse-api`. + +Publishing a Dataset +~~~~~~~~~~~~~~~~~~~~ + +See :ref:`publish-dataset-api`. + +Finding and Downloading Data +---------------------------- + +Finding Datasets +~~~~~~~~~~~~~~~~ + +A quick example search for the word "data" is https://demo.dataverse.org/api/search?q=data + +See the :doc:`search` section for details. + +Downloading Files +~~~~~~~~~~~~~~~~~ + +The :doc:`dataaccess` section explains how to download files. + +In order to download files, you must know their database IDs, which you can get from the ``dataverse_json`` metadata at the dataset level. See :ref:`export-dataset-metadata-api`. + +Downloading Metadata +~~~~~~~~~~~~~~~~~~~~ + +Dataset metadata is available in a variety of formats listed at :ref:`metadata-export-formats`. + +See :ref:`export-dataset-metadata-api`. + +Listing the Contents of a Dataverse +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +See :ref:`show-contents-of-a-dataverse-api`. + +Managing Permissions +-------------------- + +Granting Permission +~~~~~~~~~~~~~~~~~~~ + +See :ref:`assign-role-on-a-dataverse-api`. + +Revoking Permission +~~~~~~~~~~~~~~~~~~~ + +See :ref:`revoke-role-on-a-dataverse-api`. + +Listing Permissions (Role Assignments) +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +See :ref:`list-role-assignments-on-a-dataverse-api`. 
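+The permission operations above follow the same curl pattern as the rest of this guide. For example, listing the role assignments on a dataverse might look like the sketch below (placeholder token and the ``root`` alias are assumptions to replace with your own values):

```shell
# Placeholder values: substitute your own token, server, and dataverse alias.
export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
export SERVER_URL=https://demo.dataverse.org
export ALIAS=root

# List role assignments (who holds which role) on the dataverse
curl -H X-Dataverse-key:$API_TOKEN $SERVER_URL/api/dataverses/$ALIAS/assignments
```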
+ +Beyond "Getting Started" Tasks +------------------------------ + +In addition to the tasks listed above, Dataverse supports many other operations via API. + +See :ref:`list-of-dataverse-apis` and :ref:`types-of-api-users` to get oriented. + +If you're looking for some inspiration for how you can use Dataverse APIs, there are open source projects that integrate with Dataverse listed in the :doc:`apps` section. + +Getting Help +------------- + +See :ref:`getting-help-with-apis`. diff --git a/doc/sphinx-guides/source/api/index.rst b/doc/sphinx-guides/source/api/index.rst index dd70e871bd0..e70c369eeeb 100755 --- a/doc/sphinx-guides/source/api/index.rst +++ b/doc/sphinx-guides/source/api/index.rst @@ -10,11 +10,15 @@ API Guide .. toctree:: + new intro - sword + getting-started + auth search dataaccess native-api metrics + sword client-libraries apps + faq diff --git a/doc/sphinx-guides/source/api/intro.rst b/doc/sphinx-guides/source/api/intro.rst index 6e0e1a9e0d7..fce2824faa4 100755 --- a/doc/sphinx-guides/source/api/intro.rst +++ b/doc/sphinx-guides/source/api/intro.rst @@ -1,52 +1,235 @@ Introduction ============ -We encourage anyone interested in building tools that interoperate with Dataverse to utilize our APIs. The Dataverse community has supplied :doc:`client-libraries` for Python, R, and Java and we are always interested in helping the community develop libraries for additional languages. The :doc:`apps` section links to open source Javascript, PHP, Python, and Java code that you can learn from while developing against Dataverse APIs. +Dataverse APIs allow users to accomplish many tasks such as... + +- creating datasets +- uploading files +- publishing datasets +- and much, much more + +... all without using the Dataverse web interface. + +APIs open the door for integrations between Dataverse and other software. For a list, see the :doc:`/admin/integrations` section of the Admin Guide. .. contents:: |toctitle| :local: +What is an API? 
+--------------- + +API stands for "Application Programming Interface" and an example is Dataverse's "file upload" API. In the diagram below, we can see that while users can click a button within Dataverse's web interface to upload a file, there are many other ways to get files into Dataverse, all using an API that allows for uploading of files. + +.. graphviz:: + + digraph { + //rankdir="LR"; + node [fontsize=10] + + browser [label="Web Browser"] + terminal [label="Terminal"] + + osf [label="OSF",shape=box] + ojs [label="OJS",shape=box] + rspace [label="RSpace",shape=box] + uploader [label="DvUploader"] + script [label="Script\n(Python,\nR, etc.)"] + + addfilebutton [label="Add File Button"] + addfileapi [label="Add File API"] + storage [label="Storage",shape=box3d] + + terminal -> script + terminal -> uploader + + browser -> ojs + browser -> osf + browser -> rspace + browser -> addfilebutton + + uploader -> addfileapi + ojs -> addfileapi + osf -> addfileapi + rspace -> addfileapi + script -> addfileapi + + subgraph cluster_dataverse { + label="Dataverse" + labeljust="r" + labelloc="b" + addfilebutton -> storage + addfileapi -> storage + } + } + +The components above that use the "file upload" API are: + +- DvUploader is a terminal-based application for uploading files that is described in the :doc:`/user/dataset-management` section of the User Guide. +- OJS, OSF, and RSpace are all web applications that can integrate with Dataverse and are described in "Getting Data In" in the :doc:`/admin/integrations` section of the Admin Guide. +- The script in the diagram can be as simple as a single line of code that is run in a terminal. You can copy and paste "one-liners" like this from the guide. See the :doc:`getting-started` section for examples using a tool called "curl". + +The diagram above shows only a few examples of software using a specific API but many more APIs are available. + +.. 
_types-of-api-users: + +Types of Dataverse API Users +---------------------------- + +This guide is intended to serve multiple audiences, but pointers to various sections of the guide are provided below based on the type of API user you are. + +API Users Within a Single Installation of Dataverse +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +Each installation of Dataverse will have its own groups of people interested in APIs. + +Users of Integrations and Apps ++++++++++++++++++++++++++++++++ + +Integrations and apps can take many forms but two examples are: + +- Using Open Science Framework (OSF), a web application, to deposit and publish data into Dataverse. +- Using DVUploader, a terminal-based desktop application, to upload files into Dataverse. + +In both examples, users need to obtain an API Token to authenticate with Dataverse. + +|Start| A good starting point is "API Tokens" in the :doc:`/user/account` section of the User Guide. DvUploader is documented in the :doc:`/user/dataset-management` section of the User Guide. The integrations that are enabled depend on your installation of Dataverse. You can find a list in the :doc:`/admin/integrations` section of the Admin Guide. + +Power Users ++++++++++++ + +Power users may be researchers or curators who are comfortable with automating parts of their workflow by writing Python code or similar. + +|Start| The recommended starting point for power users is the :doc:`getting-started` section. + +Support Teams and Superusers +++++++++++++++++++++++++++++ + +Support teams that answer questions about their installation of Dataverse should familiarize themselves with the :doc:`getting-started` section to get a sense of common tasks that researchers and curators might be trying to accomplish by using Dataverse APIs. + +Superusers of an installation of Dataverse have access to a superuser dashboard described in the :doc:`/admin/dashboard` section of the Admin Guide, but some operations can only be done via API. 
+ +|Start| A good starting point for both groups is the :doc:`getting-started` section of this guide, followed by the :doc:`/admin/troubleshooting` section of the Admin Guide. + +Sysadmins ++++++++++ + +Sysadmins often write scripts to automate tasks, and Dataverse APIs make this possible. Sysadmins have control over the server that Dataverse is running on and may be called upon to execute API commands that are limited to "localhost" (the server itself) for security reasons. + +|Start| A good starting point for sysadmins is "Blocking API Endpoints" in the :doc:`/installation/config` section of the Installation Guide, followed by the :doc:`getting-started` section of this guide, followed by the :doc:`/admin/troubleshooting` section of the Admin Guide. + +In-House Developers ++++++++++++++++++++ + +Some organizations that run Dataverse employ developers who are tasked with using Dataverse APIs to accomplish specific tasks such as building custom integrations with in-house systems or creating reports specific to the organization's needs. + +|Start| A good starting point for in-house developers is the :doc:`getting-started` section. + +API Users Across the Dataverse Project +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +The Dataverse project loves contributors! Depending on your interests and skills, you might fall into one or more of the groups below. + +Developers of Integrations, External Tools, and Apps +++++++++++++++++++++++++++++++++++++++++++++++++++++ + +One of the primary purposes of Dataverse APIs is to enable integrations with third-party software. Integrations are listed in the following places: + +- The :doc:`/admin/integrations` section of the Admin Guide. +- The :doc:`/installation/external-tools` section of the Installation Guide. +- The :doc:`apps` section of this guide. 
+ +|Start| Good starting points are the three sections above to get a sense of third-party software that already integrates with Dataverse, followed by the :doc:`getting-started` section. + +Developers of Dataverse API Client Libraries +++++++++++++++++++++++++++++++++++++++++++++ + +A client library helps developers using a specific programming language such as Python, R, or Java interact with Dataverse APIs in a manner that is idiomatic for their language. For example, a Python programmer may want to use a client library rather than constructing raw HTTP requests by hand. + +|Start| A good starting point is the :doc:`client-libraries` section, followed by the :doc:`getting-started` section. + +Developers of Dataverse Itself +++++++++++++++++++++++++++++++ + +Developers working on Dataverse itself use Dataverse APIs when adding features, fixing bugs, and testing those features and bug fixes. + +|Start| A good starting point is the :doc:`/developers/testing` section of the Developer Guide. + +.. |Start| raw:: html + + + Starting point +   + How This Guide is Organized --------------------------- -We document the Dataverse API in five sections: +Getting Started +~~~~~~~~~~~~~~~ + +See :doc:`getting-started`. + +API Tokens and Authentication +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +See :doc:`auth`. + +.. _list-of-dataverse-apis: + +Lists of Dataverse APIs +~~~~~~~~~~~~~~~~~~~~~~~ -- :doc:`sword`: For depositing data using a standards-based approach rather than the :doc:`native-api`. - :doc:`search`: For searching dataverses, datasets, and files. - :doc:`dataaccess`: For downloading and subsetting data. -- :doc:`native-api`: For performing most tasks that are possible in the GUI. +- :doc:`native-api`: For performing most tasks that are possible in the GUI. See :doc:`getting-started` for the most common commands, which operate on endpoints with names like: + + - Dataverses + - Datasets + - Files + - etc. + - :doc:`metrics`: For querying statistics about usage of a Dataverse installation. 
+- :doc:`sword`: For depositing data using a standards-based approach rather than the :doc:`native-api`. + +Please note that some APIs are only documented in other guides that are more suited to their audience: -We use the term "native" to mean that the API is not based on any standard. For this reason, the :doc:`search` and :doc:`dataaccess` could also be considered "native" and in the future we may reorganize the API Guide to split the :doc:`native-api` section into "Datasets API", "Files" API, etc. +- Admin Guide -Authentication --------------- + - :doc:`/admin/metadatacustomization` + - :doc:`/admin/metadataexport` + - :doc:`/admin/make-data-count` + - :doc:`/admin/geoconnect-worldmap` + - :doc:`/admin/solr-search-index` -Most Dataverse APIs require the use of an API token. (In code we sometimes call it a "key" because it's shorter.) Instructions for getting a token are described in the :doc:`/user/account` section of the User Guide. +- Installation Guide -There are two ways to pass your API token to Dataverse APIs. The preferred method is to send the token in the ``X-Dataverse-key`` HTTP header, as in the following curl example:: + - :doc:`/installation/config` + - :doc:`/installation/external-tools` - curl -H "X-Dataverse-key: 8b955f87-e49a-4462-945c-67d32e391e7e" https://demo.dataverse.org/api/datasets/:persistentId?persistentId=doi:TEST/12345 +Client Libraries +~~~~~~~~~~~~~~~~ -Throughout this guide you will often see Bash shell envionmental variables being used, like this:: +See :doc:`client-libraries` for how to use Dataverse APIs from Python, R, and Java. - export API_TOKEN='8b955f87-e49a-4462-945c-67d32e391e7e' - curl -H "X-Dataverse-key: $API_TOKEN" https://demo.dataverse.org/api/datasets/:persistentId?persistentId=doi:TEST/12345 +Examples +~~~~~~~~ -The second way to pass your API token is via an extra query parameter called ``key`` in the URL like this:: +:doc:`apps` links to example open source code you can study. 
:doc:`getting-started` also has many examples. - curl "https://demo.dataverse.org/api/datasets/:persistentId?persistentId=doi:TEST/12345&key=$API_TOKEN" +Frequently Asked Questions +~~~~~~~~~~~~~~~~~~~~~~~~~~ -Use of the ``X-Dataverse-key`` HTTP header form is preferred because putting the query parameters in URLs often results in them finding their way into web server access logs. Your API token should be kept as secret as your password because it can be used to perform any action *as you* in the Dataverse application. +See :doc:`faq`. -Testing -------- +.. _getting-help-with-apis: -Rather than using a production installation of Dataverse, API users are welcome to use http://demo.dataverse.org for testing. +Getting Help +------------ -Support -------- +Dataverse API questions are on topic in all the usual places: -If you are using the APIs for an installation of Dataverse hosted by your institution, you may want to reach out to the team that supports it. In the header at the top of the site, there should be a form you can fill out by clicking the "Support" link. +- The dataverse-community Google Group: https://groups.google.com/forum/#!forum/dataverse-community +- Dataverse community calls: https://dataverse.org/community-calls +- The Dataverse chat room: http://chat.dataverse.org +- The Dataverse ticketing system: support@dataverse.org -If you are having trouble with http://demo.dataverse.org or have questions about the APIs, please feel free to reach out to the Dataverse community via https://groups.google.com/forum/#!forum/dataverse-community . +After your question has been answered, you are welcome to help improve the :doc:`faq` section of this guide. 
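Whether or not you use a client library, every Dataverse API call is an ordinary HTTP request. As an illustrative sketch (the server URL and API token below are placeholders, not real credentials), here is how a Python programmer might prepare an authenticated call using only the standard library, passing the token in the ``X-Dataverse-key`` header as this guide recommends:

```python
import urllib.request

# Placeholders -- substitute your installation's URL and your own API token.
SERVER_URL = "https://demo.dataverse.org"
API_TOKEN = "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx"

# Build (but do not yet send) a GET request for the root dataverse.
# The token travels in the X-Dataverse-key header, the preferred
# alternative to a ?key= query parameter that could end up in access logs.
request = urllib.request.Request(
    url=f"{SERVER_URL}/api/dataverses/root",
    headers={"X-Dataverse-key": API_TOKEN},
)
```

Calling ``urllib.request.urlopen(request)`` would actually send the request; it is left out so the sketch can be read (and run) without network access.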
diff --git a/doc/sphinx-guides/source/api/native-api.rst b/doc/sphinx-guides/source/api/native-api.rst index c5bee4e250c..382c84191ee 100644 --- a/doc/sphinx-guides/source/api/native-api.rst +++ b/doc/sphinx-guides/source/api/native-api.rst @@ -5,7 +5,7 @@ Dataverse 4 exposes most of its GUI functionality via a REST-based API. This sec .. note:: |CORS| Some API endpoint allow CORS_ (cross-origin resource sharing), which makes them usable from scripts runing in web browsers. These endpoints are marked with a *CORS* badge. -.. note:: Bash environment variables shown below. The idea is that you can "export" these environment variables before copying and pasting the commands that use them. For example, you can set ``$SERVER_URL`` by running ``export SERVER_URL="https://demo.dataverse.org"`` in your Bash shell. To check if the environment variable was set properly, you can "echo" it (e.g. ``echo $SERVER_URL``). +.. note:: Bash environment variables shown below. The idea is that you can "export" these environment variables before copying and pasting the commands that use them. For example, you can set ``$SERVER_URL`` by running ``export SERVER_URL="https://demo.dataverse.org"`` in your Bash shell. To check if the environment variable was set properly, you can "echo" it (e.g. ``echo $SERVER_URL``). See also :ref:`curl-examples-and-environment-variables`. .. _CORS: https://www.w3.org/TR/cors/ @@ -17,17 +17,20 @@ Dataverse 4 exposes most of its GUI functionality via a REST-based API. This sec Dataverses ---------- +.. _create-dataverse-api: + Create a Dataverse ~~~~~~~~~~~~~~~~~~ -Generates a new dataverse under ``$id``. Expects a JSON content describing the dataverse, as in the example below. -If ``$id`` is omitted, a root dataverse is created. ``$id`` can either be a dataverse id (long) or a dataverse alias (more robust). 
In the example below, "root" is the id, which means that the dataverse will be created as a child of the root dataverse::
+A dataverse is a container for datasets and other dataverses as explained in the :doc:`/user/dataverse-management` section of the User Guide.
 
-``export id=root`
+The steps for creating a dataverse are:
 
-``curl -H "X-Dataverse-key:$API_TOKEN" -X POST $SERVER_URL/api/dataverses/$id --upload-file dataverse-complete.json``
+- Prepare a JSON file containing the name, description, etc., of the dataverse you'd like to create.
+- Figure out the alias or database id of the "parent" dataverse into which you will be creating your new dataverse.
+- Execute a curl command or equivalent.
 
-Download the :download:`JSON example <../_static/api/dataverse-complete.json>` file and modified to create dataverses to suit your needs. The fields ``name``, ``alias``, and ``dataverseContacts`` are required. The controlled vocabulary for ``dataverseType`` is
+Download the :download:`dataverse-complete.json <../_static/api/dataverse-complete.json>` file and modify it to suit your needs. The fields ``name``, ``alias``, and ``dataverseContacts`` are required. The controlled vocabulary for ``dataverseType`` is the following:
 
 - ``DEPARTMENT``
 - ``JOURNALS``
@@ -41,6 +44,28 @@ Download the :download:`JSON example <../_static/api/dataverse-complete.json>` f
 
 .. literalinclude:: ../_static/api/dataverse-complete.json
 
+The curl command below assumes you have kept the name "dataverse-complete.json" and that this file is in your current working directory.
+
+Next you need to figure out the alias or database id of the "parent" dataverse into which you will be creating your new dataverse. Out of the box the top level dataverse has an alias of "root" and a database id of "1" but your installation may vary. The easiest way to determine the alias of your root dataverse is to click "Advanced Search" and look at the URL. You may also choose a parent dataverse under the root.
+
+.. 
note:: See :ref:`curl-examples-and-environment-variables` if you are unfamiliar with the use of ``export`` below. + +.. code-block:: bash + + export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx + export PARENT=root + export SERVER_URL=https://demo.dataverse.org + + curl -H X-Dataverse-key:$API_TOKEN -X POST $SERVER_URL/api/dataverses/$PARENT --upload-file dataverse-complete.json + +The fully expanded example above (without environment variables) looks like this: + +.. code-block:: bash + + curl -H X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx -X POST https://demo.dataverse.org/api/dataverses/root --upload-file dataverse-complete.json + +You should expect a 201 ("CREATED") response and JSON indicating the database id that has been assigned to your newly created dataverse. + .. _view-dataverse: View a Dataverse @@ -59,13 +84,28 @@ Deletes the dataverse whose ID is given: ``curl -H "X-Dataverse-key:$API_TOKEN" -X DELETE $SERVER_URL/api/dataverses/$id`` +.. _show-contents-of-a-dataverse-api: + Show Contents of a Dataverse ~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -|CORS| Lists all the DvObjects under dataverse ``id``. :: +|CORS| Lists all the dataverses and datasets directly under a dataverse (direct children only). You must specify the "alias" of a dataverse or its database id. If you specify your API token and have access, unpublished dataverses and datasets will be included in the listing. + +.. note:: See :ref:`curl-examples-and-environment-variables` if you are unfamiliar with the use of ``export`` below. + +.. code-block:: bash + + export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx + export ALIAS=root + export SERVER_URL=https://demo.dataverse.org + + curl -H X-Dataverse-key:$API_TOKEN $SERVER_URL/api/dataverses/$ALIAS/contents + +The fully expanded example above (without environment variables) looks like this: -``curl -H "X-Dataverse-key:$API_TOKEN" http://$SERVER_URL/api/dataverses/$id/contents`` +.. 
code-block:: bash + curl -H X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx https://demo.dataverse.org/api/dataverses/root/contents Report the data (file) size of a Dataverse ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ @@ -119,6 +159,8 @@ POSTed JSON example:: ] } +.. _list-role-assignments-on-a-dataverse-api: + List Role Assignments in a Dataverse ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ @@ -135,6 +177,7 @@ Assign a default role to a user creating a dataset in a dataverse ``id`` where ` Note: You may use "none" as the ``roleAlias``. This will prevent a user who creates a dataset from having any role on that dataset. It is not recommended for dataverses with human contributors. +.. _assign-role-on-a-dataverse-api: Assign a New Role on a Dataverse ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ @@ -150,6 +193,8 @@ POSTed JSON example:: "role": "curator" } +.. _revoke-role-on-a-dataverse-api: + Delete Role Assignment from a Dataverse ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ @@ -194,9 +239,38 @@ values are ``true`` and ``false`` (both are valid JSON expressions). :: Create a Dataset in a Dataverse ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ -To create a dataset, you must create a JSON file containing all the metadata you want such as in this example file: :download:`dataset-finch1.json <../../../../scripts/search/tests/data/dataset-finch1.json>`. Then, you must decide which dataverse to create the dataset in and target that datavese with either the "alias" of the dataverse (e.g. "root" or the database id of the dataverse (e.g. "1"). The initial version state will be set to ``DRAFT``:: +A dataset is a container for files as explained in the :doc:`/user/dataset-management` section of the User Guide. 
+ +To create a dataset, you must supply a JSON file that contains at least the following required metadata fields: + +- Title +- Author +- Description +- Subject + +As a starting point, you can download :download:`dataset-finch1.json <../../../../scripts/search/tests/data/dataset-finch1.json>` and modify it to meet your needs. (In addition to this minimal example, you can download :download:`dataset-create-new-all-default-fields.json <../../../../scripts/api/data/dataset-create-new-all-default-fields.json>` which populates all of the metadata fields that ship with Dataverse.) + +The curl command below assumes you have kept the name "dataset-finch1.json" and that this file is in your current working directory. + +Next you need to figure out the alias or database id of the "parent" dataverse into which you will be creating your new dataset. Out of the box the top level dataverse has an alias of "root" and a database id of "1" but your installation may vary. The easiest way to determine the alias of your root dataverse is to click "Advanced Search" and look at the URL. You may also choose a parent dataverse under the root dataverse. + +.. note:: See :ref:`curl-examples-and-environment-variables` if you are unfamiliar with the use of ``export`` below. + +.. code-block:: bash + + export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx + export PARENT=root + export SERVER_URL=https://demo.dataverse.org + + curl -H X-Dataverse-key:$API_TOKEN -X POST $SERVER_URL/api/dataverses/$PARENT/datasets --upload-file dataset-finch1.json + +The fully expanded example above (without the environment variables) looks like this: + +.. 
code-block:: bash - curl -H "X-Dataverse-key: $API_TOKEN" -X POST $SERVER_URL/api/dataverses/$DV_ALIAS/datasets --upload-file dataset-finch1.json + curl -H X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx -X POST https://demo.dataverse.org/api/dataverses/root/datasets --upload-file dataset-finch1.json + +You should expect a 201 ("CREATED") response and JSON indicating the database ID and Persistent ID (PID such as DOI or Handle) that has been assigned to your newly created dataset. Import a Dataset into a Dataverse ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ @@ -246,13 +320,30 @@ The file is a DDI xml file. * This API does not handle files related to the DDI file. * A Dataverse server can import datasets with a valid PID that uses a different protocol or authority than said server is configured for. However, the server will not update the PID metadata on subsequent update and publish actions. +.. _publish-dataverse-api: Publish a Dataverse ~~~~~~~~~~~~~~~~~~~ -Publish the Dataverse pointed by ``identifier``, which can either by the dataverse alias or its numerical id. :: +In order to publish a dataverse, you must know either its "alias" (which the GUI calls an "identifier") or its database ID. + +.. note:: See :ref:`curl-examples-and-environment-variables` if you are unfamiliar with the use of ``export`` below. + +.. code-block:: bash + + export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx + export ALIAS=root + export SERVER_URL=https://demo.dataverse.org + + curl -H X-Dataverse-key:$API_TOKEN -X POST $SERVER_URL/api/dataverses/$ALIAS/actions/:publish + +The fully expanded example above (without environment variables) looks like this: + +.. code-block:: bash + + curl -H X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx -X POST https://demo.dataverse.org/api/dataverses/root/actions/:publish - POST http://$SERVER/api/dataverses/$identifier/actions/:publish?key=$apiKey +You should expect a 200 ("OK") response and JSON output. 
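The JSON that comes back from calls like the ones above is easy to check from a script. Below is a small Python sketch; note that the response body shown is illustrative (the ``status``/``data`` envelope is an assumption about typical native API output), so compare it against the actual output from your installation:

```python
import json

# Illustrative response body -- the "status"/"data" envelope is an
# assumption of typical Dataverse native API output; your installation's
# actual fields may differ.
body = '{"status": "OK", "data": {"id": 5, "alias": "root"}}'

response = json.loads(body)
if response.get("status") == "OK":
    print("Publish succeeded for dataverse id", response["data"]["id"])
```

In a real script, ``body`` would be the text of the HTTP response rather than a hard-coded string.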
Datasets -------- @@ -298,6 +389,8 @@ Get Version of a Dataset GET http://$SERVER/api/datasets/$id/versions/$versionNumber?key=$apiKey +.. _export-dataset-metadata-api: + Export Metadata of a Dataset in Various Formats ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ @@ -305,7 +398,8 @@ Export Metadata of a Dataset in Various Formats GET http://$SERVER/api/datasets/export?exporter=ddi&persistentId=$persistentId -.. note:: Supported exporters (export formats) are ``ddi``, ``oai_ddi``, ``dcterms``, ``oai_dc``, ``schema.org`` , ``OAI_ORE`` , ``Datacite``, ``oai_datacite`` and ``dataverse_json``. +.. note:: Supported exporters (export formats) are ``ddi``, ``oai_ddi``, ``dcterms``, ``oai_dc``, ``schema.org`` , ``OAI_ORE`` , ``Datacite``, ``oai_datacite`` and ``dataverse_json``. Descriptive names can be found under :ref:`metadata-export-formats` in the User Guide. + Schema.org JSON-LD ^^^^^^^^^^^^^^^^^^ @@ -382,19 +476,37 @@ You may delete some of the metadata of a dataset version by supplying a file wit For these deletes your JSON file must include an exact match of those dataset fields which you would like to delete. A sample JSON file may be downloaded here: :download:`dataset-delete-author-metadata.json <../_static/api/dataset-delete-author-metadata.json>` +.. _publish-dataset-api: + Publish a Dataset ~~~~~~~~~~~~~~~~~ -Publishes the dataset whose id is passed. If this is the first version of the dataset, its version number will be set to ``1.0``. Otherwise, the new dataset version number is determined by the most recent version number and the ``type`` parameter. Passing ``type=minor`` increases the minor version number (2.3 is updated to 2.4). Passing ``type=major`` increases the major version number (2.3 is updated to 3.0). 
Superusers can pass ``type=updatecurrent`` to update metadata without changing the version number::
+When publishing a dataset, it's good to be aware of Dataverse's versioning system, which is described in the :doc:`/user/dataset-management` section of the User Guide.
 
-    POST http://$SERVER/api/datasets/$id/actions/:publish?type=$type&key=$apiKey
+If this is the first version of the dataset, its version number will be set to ``1.0``. Otherwise, the new dataset version number is determined by the most recent version number and the ``type`` parameter. Passing ``type=minor`` increases the minor version number (2.3 is updated to 2.4). Passing ``type=major`` increases the major version number (2.3 is updated to 3.0). (Superusers can pass ``type=updatecurrent`` to update metadata without changing the version number.)
 
-.. note:: POST should be used to publish a dataset. GET is supported for backward compatibility but is deprecated and may be removed: https://github.com/IQSS/dataverse/issues/2431
+.. note:: See :ref:`curl-examples-and-environment-variables` if you are unfamiliar with the use of ``export`` below.
+
+.. code-block:: bash
+
+  export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
+  export SERVER_URL=https://demo.dataverse.org
+  export PERSISTENT_ID=doi:10.5072/FK2/J8SJZB
+  export MAJOR_OR_MINOR=major
+
+  curl -H X-Dataverse-key:$API_TOKEN -X POST "$SERVER_URL/api/datasets/:persistentId/actions/:publish?persistentId=$PERSISTENT_ID&type=$MAJOR_OR_MINOR"
 
-.. note:: When there are no default workflows, a successful publication process will result in ``200 OK`` response. When there are workflows, it is impossible for Dataverse to know
-   how long they are going to take and whether they will succeed or not (recall that some stages might require human intervention). Thus,
-   a ``202 ACCEPTED`` is returned immediately. 
To know whether the publication process succeeded or not, the client code has to check the status of the dataset periodically, - or perform some push request in the post-publish workflow. +The fully expanded example above (without environment variables) looks like this: + +.. code-block:: bash + + curl -H X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx -X POST "https://demo.dataverse.org/api/datasets/:persistentId/actions/:publish?persistentId=doi:10.5072/FK2/J8SJZB&type=major" + +The quotes around the URL are required because there is more than one query parameter separated by an ampersand (``&``), which has special meaning to Unix shells such as Bash. Putting the ``&`` in quotes ensures that "type" is interpreted as one of the query parameters. + +You should expect JSON output and a 200 ("OK") response in most cases. If you receive a 202 ("ACCEPTED") response, this is normal for installations that have workflows configured. Workflows are described in the :doc:`/developers/workflows` section of the Developer Guide. + +.. note:: POST should be used to publish a dataset. GET is supported for backward compatibility but is deprecated and may be removed: https://github.com/IQSS/dataverse/issues/2431 Delete Dataset Draft ~~~~~~~~~~~~~~~~~~~~ @@ -447,20 +559,41 @@ Delete a Private URL from a dataset (if it exists):: DELETE http://$SERVER/api/datasets/$id/privateUrl?key=$apiKey +.. _add-file-api: + Add a File to a Dataset ~~~~~~~~~~~~~~~~~~~~~~~ -Add a file to an existing Dataset. Description and tags are optional:: +When adding a file to a dataset, you can optionally specify the following: + +- A description of the file. +- The "File Path" of the file, indicating which folder the file should be uploaded to within the dataset. +- Whether or not the file is restricted. + +In the curl example below, all of the above are specified but they are optional. + +.. 
note:: See :ref:`curl-examples-and-environment-variables` if you are unfamiliar with the use of ``export`` below.
+
+.. code-block:: bash
+
+  export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
+  export FILENAME='data.tsv'
+  export SERVER_URL=https://demo.dataverse.org
+  export PERSISTENT_ID=doi:10.5072/FK2/J8SJZB
+
+  curl -H X-Dataverse-key:$API_TOKEN -X POST -F "file=@$FILENAME" -F 'jsonData={"description":"My description.","directoryLabel":"data/subdir1","categories":["Data"], "restrict":"false"}' "$SERVER_URL/api/datasets/:persistentId/add?persistentId=$PERSISTENT_ID"
+
+The fully expanded example above (without environment variables) looks like this:
 
-    POST http://$SERVER/api/datasets/$id/add?key=$apiKey
+.. code-block:: bash
 
-A more detailed "add" example using curl::
+  curl -H X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx -X POST -F 'file=@data.tsv' -F 'jsonData={"description":"My description.","directoryLabel":"data/subdir1","categories":["Data"], "restrict":"false"}' "https://demo.dataverse.org/api/datasets/:persistentId/add?persistentId=doi:10.5072/FK2/J8SJZB"
 
-    curl -H "X-Dataverse-key:$API_TOKEN" -X POST -F 'file=@data.tsv' -F 'jsonData={"description":"My description.","directoryLabel":"data/subdir1","categories":["Data"], "restrict":"true"}' "https://example.dataverse.edu/api/datasets/:persistentId/add?persistentId=$PERSISTENT_ID"
+You should expect a 201 ("CREATED") response and JSON indicating the database id that has been assigned to your newly uploaded file. Please note that it's possible to "trick" Dataverse into giving a file a content type (MIME type) of your choosing. For example, you can make a text file be treated like a video file with ``-F 'file=@README.txt;type=video/mpeg4'``. If Dataverse does not properly detect a file type, specifying the content type via API like this is a potential workaround.
 
-Example python code to add a file. 
This may be run by changing these parameters in the sample code:
+The curl syntax above for uploading a file is tricky, so a Python version is provided below. (Please note that it depends on libraries such as "requests" that you may need to install; doing so is out of scope for this guide.) Here are some parameters you can set in the script:
 
 * ``dataverse_server`` - e.g. https://demo.dataverse.org
 * ``api_key`` - See the top of this document for a description
@@ -640,7 +773,7 @@ Dataset Metrics
 
 Please note that these dataset level metrics are only available if support for Make Data Count has been enabled in your installation of Dataverse. See the :ref:`Dataset Metrics ` in the :doc:`/user/dataset-management` section of the User Guide and the :doc:`/admin/make-data-count` section of the Admin Guide for details.
 
-Please note that in the curl examples, Bash environment variables are used with the idea that you can set a few environment variables and copy and paste the examples as is. For example, "$DV_BASE_URL" could become "https://demo.dataverse.org" by issuing the following ``export`` command from Bash:
+.. note:: See :ref:`curl-examples-and-environment-variables` if you are unfamiliar with the use of ``export`` below.
 
 ``export DV_BASE_URL=https://demo.dataverse.org``
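Many of the endpoints above combine several query parameters, and as this guide notes, the ``&`` separating them must be quoted in a shell. When scripting outside the shell, letting a URL library do the encoding avoids the problem entirely. A brief Python sketch (the DOI and server URL are the same placeholders used in the curl examples):

```python
from urllib.parse import urlencode

SERVER_URL = "https://demo.dataverse.org"  # placeholder

# urlencode joins the parameters with "&" and percent-encodes reserved
# characters (the ":" and "/" inside the DOI become %3A and %2F).
params = urlencode({
    "persistentId": "doi:10.5072/FK2/J8SJZB",
    "type": "major",
})
url = f"{SERVER_URL}/api/datasets/:persistentId/actions/:publish?{params}"
print(url)
```

Because the ampersand never passes through a shell here, no extra quoting is needed.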
Users will be prompted to create a new account but can choose to convert an existing local account after confirming their password. +.. _converting-oauth-users-to-local: + Converting OAuth Users to Local ------------------------------- diff --git a/doc/sphinx-guides/source/installation/shibboleth.rst b/doc/sphinx-guides/source/installation/shibboleth.rst index 46ce52b7508..813c46203c3 100644 --- a/doc/sphinx-guides/source/installation/shibboleth.rst +++ b/doc/sphinx-guides/source/installation/shibboleth.rst @@ -390,6 +390,8 @@ If you are running in "remote and local" mode and have existing local users that - If the email address associated with your local account matches the email address asserted by the Identity Provider (IdP), you will be prompted for the password of your local account and asked to confirm the conversion of your account. You're done! Browse around to ensure you see all the data you expect to see. Permissions have been preserved. - If the email address asserted by the Identity Provider (IdP) does not match the email address of any local user, you will be prompted to create a new account. If you were expecting account conversion, you should decline creating a new Shibboleth account, log back in to your local account, and let Support know the email on file for your local account. Support may ask you to change your email address for your local account to the one that is being asserted by the Identity Provider. Someone with access to the Glassfish logs will see this email address there. +.. 
_converting-shibboleth-users-to-local:
+
 Converting Shibboleth Users to Local
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
diff --git a/doc/sphinx-guides/source/user/account.rst b/doc/sphinx-guides/source/user/account.rst
index ffdbecdc4f6..8b6ccfff8a3 100755
--- a/doc/sphinx-guides/source/user/account.rst
+++ b/doc/sphinx-guides/source/user/account.rst
@@ -164,5 +164,33 @@ Notifications will only be emailed one time even if you haven't read the notific
 API Token
 ---------
 
-#. To create your API token, click on your name in the header on the right hand side and then click on API Token.
-#. In this tab, you can create your API Token for the first time as well as recreate it if you need a new API Token or if your API Token becomes compromised.
+What APIs Are and Why They Are Useful
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+API stands for "Application Programming Interface" and Dataverse APIs allow you to take advantage of integrations with other software that may have been set up by admins of your installation of Dataverse. See the :doc:`/admin/integrations` section of the Admin Guide and the :doc:`/installation/external-tools` section of the Installation Guide for examples of software that is commonly integrated with Dataverse.
+
+Additionally, if you are willing to write a little code (or find someone to write it for you), APIs provide a way to automate parts of your workflow. See the :doc:`/api/getting-started` section of the API Guide for details.
+
+How Your API Token Is Like a Password
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+In many cases, such as when depositing data, an API Token is required to interact with Dataverse APIs. The word "token" indicates a series of letters and numbers such as ``c6527048-5bdc-48b0-a1d5-ed1b62c8113b``. Anyone who has your API Token can add and delete data as you, so you should treat it with the same care as a password. 
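The export-an-environment-variable pattern used throughout the API Guide is also good hygiene for a secret like this: the token stays out of your scripts, and commands you share or paste around show ``$API_TOKEN`` rather than the token itself. A minimal sketch (the token value is a placeholder):

```shell
# Placeholder token -- substitute the value shown in your API Token tab.
export API_TOKEN='xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx'

# The token then travels in the X-Dataverse-key header, e.g.:
#   curl -H "X-Dataverse-key:$API_TOKEN" https://demo.dataverse.org/api/...
# (The curl line is commented out so this sketch runs without network access.)

echo "Token is set (${#API_TOKEN} characters)"
```

If you suspect the token has leaked, recreate it as described below and update the environment variable.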
+ +How to Create Your API Token +~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +To create your API token, click on your name in the upper right and then click "API Token". In this tab, click "Create Token". + +How to Recreate Your API Token +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +If your API Token becomes compromised or has expired, click on your name in the upper right and click "API Token". In this tab, click "Recreate Token". + +Additional Information about API Tokens and Dataverse APIs +~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ + +Dataverse APIs are documented in the :doc:`/api/index` but the following sections may be of particular interest: + +- :doc:`/api/getting-started` +- :doc:`/api/auth` +- :doc:`/api/faq`