From 8195b1ed98e4312b93e28c7577c22b61aac13991 Mon Sep 17 00:00:00 2001 From: Jim Myers Date: Mon, 11 Mar 2019 13:55:17 -0400 Subject: [PATCH 01/18] use dataset thumbnail if available --- src/main/webapp/dataset.xhtml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/src/main/webapp/dataset.xhtml b/src/main/webapp/dataset.xhtml index 0cb0f9d80e4..133bb263b7f 100644 --- a/src/main/webapp/dataset.xhtml +++ b/src/main/webapp/dataset.xhtml @@ -43,7 +43,7 @@ - + From 372a3738278bb72cba08b3ed934d9c9e0e0bc817 Mon Sep 17 00:00:00 2001 From: Jim Myers Date: Tue, 23 Apr 2024 14:52:48 -0400 Subject: [PATCH 02/18] add release note --- doc/release-notes/5621_dataset image in header.md | 1 + 1 file changed, 1 insertion(+) create mode 100644 doc/release-notes/5621_dataset image in header.md diff --git a/doc/release-notes/5621_dataset image in header.md b/doc/release-notes/5621_dataset image in header.md new file mode 100644 index 00000000000..34b445fd9e1 --- /dev/null +++ b/doc/release-notes/5621_dataset image in header.md @@ -0,0 +1 @@ +Dataverse will use the Dataset thumbnail, if one is defined, rather than the generic Dataverse logo in the Open Graph metadata header. This means the image will be seen when, for example, the dataset is referenced in Facebook. 
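A quick way to verify the change above once deployed is to fetch a dataset landing page and inspect its Open Graph metadata. This is a sketch only: the `og:image` property name comes from the standard Open Graph vocabulary (the patch itself only shows `dataset.xhtml` being edited), and `page.html` stands in for the HTML a real `curl` of a dataset page would return, with placeholder URLs.

```shell
# page.html stands in for the output of something like:
#   curl -s "https://your.dataverse.example/dataset.xhtml?persistentId=doi:..."
cat > page.html <<'EOF'
<head>
  <meta property="og:title" content="Example Dataset" />
  <meta property="og:image" content="https://your.dataverse.example/api/datasets/42/thumbnail" />
</head>
EOF

# With the patch applied, og:image should point at the dataset thumbnail
# rather than the generic installation logo.
grep -o '<meta property="og:image"[^>]*>' page.html
```

Social platforms such as Facebook read this tag when a dataset link is shared, which is the behavior the release note describes.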
From cf1e35a1c6f9bc82d8062cb70bdff3134a8d82de Mon Sep 17 00:00:00 2001 From: Jim Myers Date: Tue, 23 Apr 2024 16:49:15 -0400 Subject: [PATCH 03/18] method now takes a size param --- src/main/webapp/dataset.xhtml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/src/main/webapp/dataset.xhtml b/src/main/webapp/dataset.xhtml index 3a109243f22..441c817da8f 100644 --- a/src/main/webapp/dataset.xhtml +++ b/src/main/webapp/dataset.xhtml @@ -86,7 +86,7 @@ - + From 2cb05a2591b2c20447f0b21c8b358cf971e60914 Mon Sep 17 00:00:00 2001 From: Philip Durbin Date: Tue, 14 May 2024 11:33:24 -0400 Subject: [PATCH 04/18] add 3DViewer by openforestdata.pl to list of external tools #10561 --- doc/release-notes/10561-3dviewer.md | 1 + .../source/_static/admin/dataverse-external-tools.tsv | 1 + 2 files changed, 2 insertions(+) create mode 100644 doc/release-notes/10561-3dviewer.md diff --git a/doc/release-notes/10561-3dviewer.md b/doc/release-notes/10561-3dviewer.md new file mode 100644 index 00000000000..47da10f8837 --- /dev/null +++ b/doc/release-notes/10561-3dviewer.md @@ -0,0 +1 @@ +3DViewer by openforestdata.pl has been added to the list of external tools: https://preview.guides.gdcc.io/en/develop/admin/external-tools.html#inventory-of-external-tools diff --git a/doc/sphinx-guides/source/_static/admin/dataverse-external-tools.tsv b/doc/sphinx-guides/source/_static/admin/dataverse-external-tools.tsv index fe256828d44..3df5ed5d24f 100644 --- a/doc/sphinx-guides/source/_static/admin/dataverse-external-tools.tsv +++ b/doc/sphinx-guides/source/_static/admin/dataverse-external-tools.tsv @@ -7,3 +7,4 @@ Data Curation Tool configure file "A GUI for curating data by adding labels, gro Ask the Data query file Ask the Data is an experimental tool that allows you ask natural language questions about the data contained in Dataverse tables (tabular data). 
See the README.md file at https://github.com/IQSS/askdataverse/tree/main/askthedata for the instructions on adding Ask the Data to your Dataverse installation. TurboCurator by ICPSR configure dataset TurboCurator generates metadata improvements for title, description, and keywords. It relies on open AI's ChatGPT & ICPSR best practices. See the `TurboCurator Dataverse Administrator `_ page for more details on how it works and adding TurboCurator to your Dataverse installation. JupyterHub explore file The `Dataverse-to-JupyterHub Data Transfer Connector `_ is a tool that simplifies the transfer of data between Dataverse repositories and the cloud-based platform JupyterHub. It is designed for researchers, scientists, and data analysts, facilitating collaboration on projects by seamlessly moving datasets and files. The tool is a lightweight client-side web application built using React and relies on the Dataverse External Tool feature, allowing for easy deployment on modern integration systems. Currently optimized for small to medium-sized files, future plans include extending support for larger files and signed Dataverse endpoints. For more details, you can refer to the external tool manifest: https://forgemia.inra.fr/dipso/eosc-pillar/dataverse-jupyterhub-connector/-/blob/master/externalTools.json +3DViewer by openforestdata.pl explore file The 3DViewer by openforestdata.pl can be used to explore 3D files (e.g. STL format). 
It was presented by Kamil Guryn during the 2020 community meeting (`slide deck `_, `video `_) and can be found at https://github.com/OpenForestData/open-forest-data-previewers From 44d9daf8ffe748f3094c6e4bec7d8c2a0a0aea51 Mon Sep 17 00:00:00 2001 From: Philip Durbin Date: Thu, 23 May 2024 12:18:28 -0400 Subject: [PATCH 05/18] docs typo, add $ to env var --- doc/sphinx-guides/source/api/native-api.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/doc/sphinx-guides/source/api/native-api.rst b/doc/sphinx-guides/source/api/native-api.rst index f22f8727fb0..8c54a937353 100644 --- a/doc/sphinx-guides/source/api/native-api.rst +++ b/doc/sphinx-guides/source/api/native-api.rst @@ -1179,7 +1179,7 @@ See also :ref:`batch-exports-through-the-api` and the note below: export PERSISTENT_IDENTIFIER=doi:10.5072/FK2/J8SJZB export METADATA_FORMAT=ddi - curl "$SERVER_URL/api/datasets/export?exporter=$METADATA_FORMAT&persistentId=PERSISTENT_IDENTIFIER" + curl "$SERVER_URL/api/datasets/export?exporter=$METADATA_FORMAT&persistentId=$PERSISTENT_IDENTIFIER" The fully expanded example above (without environment variables) looks like this: From be6c6f7a9e85ce75bb257d8138671d5053ef8374 Mon Sep 17 00:00:00 2001 From: Julian Gautier Date: Thu, 23 May 2024 14:24:12 -0400 Subject: [PATCH 06/18] Update deployment.rst Correct info about what happens when -b is not used in command to create a Dataverse instance on AWS --- doc/sphinx-guides/source/developers/deployment.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/doc/sphinx-guides/source/developers/deployment.rst b/doc/sphinx-guides/source/developers/deployment.rst index 678e29f4079..6bd42a7ab95 100755 --- a/doc/sphinx-guides/source/developers/deployment.rst +++ b/doc/sphinx-guides/source/developers/deployment.rst @@ -96,7 +96,7 @@ To run it with default values you just need the script, but you may also want a ec2-create-instance accepts a number of command-line switches, including: * -r: GitHub 
Repository URL (defaults to https://github.com/IQSS/dataverse.git) -* -b: branch to build (defaults to develop) +* -b: branch to build (defaults to the latest release of Dataverse) * -p: pemfile directory (defaults to $HOME) * -g: Ansible GroupVars file (if you wish to override role defaults) * -h: help (displays usage for each available option) From 42172f72a172bdd0e853c2c8927a18007e3c5f64 Mon Sep 17 00:00:00 2001 From: Julian Gautier Date: Fri, 24 May 2024 16:43:23 -0400 Subject: [PATCH 07/18] Update deployment.rst Removing list of options and encouraging users to user -h to see the options. --- doc/sphinx-guides/source/developers/deployment.rst | 12 +++--------- 1 file changed, 3 insertions(+), 9 deletions(-) diff --git a/doc/sphinx-guides/source/developers/deployment.rst b/doc/sphinx-guides/source/developers/deployment.rst index 6bd42a7ab95..89ae9ac4c2e 100755 --- a/doc/sphinx-guides/source/developers/deployment.rst +++ b/doc/sphinx-guides/source/developers/deployment.rst @@ -91,17 +91,11 @@ Download `ec2-create-instance.sh`_ and put it somewhere reasonable. For the purp .. _ec2-create-instance.sh: https://raw.githubusercontent.com/GlobalDataverseCommunityConsortium/dataverse-ansible/master/ec2/ec2-create-instance.sh -To run it with default values you just need the script, but you may also want a current copy of the ansible `group vars `_ file. 
+To run the script, you can make it executable (``chmod 755 ec2-create-instance.sh``) or run it with bash; for example, pass ``-h`` to print the help: -ec2-create-instance accepts a number of command-line switches, including: +``bash ~/Downloads/ec2-create-instance.sh -h`` -* -r: GitHub Repository URL (defaults to https://github.com/IQSS/dataverse.git) -* -b: branch to build (defaults to the latest release of Dataverse) -* -p: pemfile directory (defaults to $HOME) -* -g: Ansible GroupVars file (if you wish to override role defaults) -* -h: help (displays usage for each available option) - -``bash ~/Downloads/ec2-create-instance.sh -b develop -r https://github.com/scholarsportal/dataverse.git -g main.yml`` +If you run the script without any arguments, it should spin up the latest version of Dataverse. You will need to wait for 15 minutes or so until the deployment is finished, longer if you've enabled sample data and/or the API test suite. Eventually, the output should tell you how to access the Dataverse installation in a web browser or via SSH. It will also provide instructions on how to delete the instance when you are finished with it. Please be aware that AWS charges per minute for a running instance. You may also delete your instance from https://console.aws.amazon.com/console/home?region=us-east-1 . From c5954454e7ec025c4fdfd9db709939ea4ae1993f Mon Sep 17 00:00:00 2001 From: Philip Durbin Date: Wed, 29 May 2024 22:14:59 -0400 Subject: [PATCH 08/18] mention Docker in main README, add docker/README.md #10157 Also, remove the dummy Dockerfile.
--- Dockerfile | 1 - README.md | 3 ++- docker/README.md | 5 +++++ 3 files changed, 7 insertions(+), 2 deletions(-) delete mode 100644 Dockerfile create mode 100644 docker/README.md diff --git a/Dockerfile b/Dockerfile deleted file mode 100644 index b0864a0c55f..00000000000 --- a/Dockerfile +++ /dev/null @@ -1 +0,0 @@ -# See http://guides.dataverse.org/en/latest/developers/containers.html diff --git a/README.md b/README.md index 651d0352dec..77720453d5f 100644 --- a/README.md +++ b/README.md @@ -7,7 +7,7 @@ Dataverse is an [open source][] software platform for sharing, finding, citing, We maintain a demo site at [demo.dataverse.org][] which you are welcome to use for testing and evaluating Dataverse. -To install Dataverse, please see our [Installation Guide][] which will prompt you to download our [latest release][]. +To install Dataverse, please see our [Installation Guide][] which will prompt you to download our [latest release][]. Docker users should consult the [Container Guide][]. To discuss Dataverse with the community, please join our [mailing list][], participate in a [community call][], chat with us at [chat.dataverse.org][], or attend our annual [Dataverse Community Meeting][]. @@ -28,6 +28,7 @@ Dataverse is a trademark of President and Fellows of Harvard College and is regi [Dataverse community]: https://dataverse.org/developers [Installation Guide]: https://guides.dataverse.org/en/latest/installation/index.html [latest release]: https://github.com/IQSS/dataverse/releases +[Container Guide]: https://guides.dataverse.org/en/latest/container/index.html [features]: https://dataverse.org/software-features [project board]: https://github.com/orgs/IQSS/projects/34 [roadmap]: https://www.iq.harvard.edu/roadmap-dataverse-project diff --git a/docker/README.md b/docker/README.md new file mode 100644 index 00000000000..3e3e6c1b9ff --- /dev/null +++ b/docker/README.md @@ -0,0 +1,5 @@ +# Dataverse in Docker + +Please see the [Container Guide][]. 
+ +[Container Guide]: https://guides.dataverse.org/en/latest/container/index.html From b1d4e03ffb100edc69ee8210005da6294d3cd010 Mon Sep 17 00:00:00 2001 From: Ludovic DANIEL Date: Wed, 29 May 2024 15:59:13 +0200 Subject: [PATCH 09/18] Fix NoResultException on DatasetServiceBean.findDeep (.getSingleResult():L137) --- .../iq/dataverse/DatasetServiceBean.java | 53 ++++++++++--------- 1 file changed, 27 insertions(+), 26 deletions(-) diff --git a/src/main/java/edu/harvard/iq/dataverse/DatasetServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/DatasetServiceBean.java index 2686584f307..dab0ff43fcf 100644 --- a/src/main/java/edu/harvard/iq/dataverse/DatasetServiceBean.java +++ b/src/main/java/edu/harvard/iq/dataverse/DatasetServiceBean.java @@ -19,8 +19,6 @@ import edu.harvard.iq.dataverse.export.ExportService; import edu.harvard.iq.dataverse.globus.GlobusServiceBean; import edu.harvard.iq.dataverse.harvest.server.OAIRecordServiceBean; -import edu.harvard.iq.dataverse.pidproviders.PidProvider; -import edu.harvard.iq.dataverse.pidproviders.PidUtil; import edu.harvard.iq.dataverse.search.IndexServiceBean; import edu.harvard.iq.dataverse.settings.SettingsServiceBean; import edu.harvard.iq.dataverse.util.BundleUtil; @@ -41,11 +39,10 @@ import jakarta.ejb.TransactionAttributeType; import jakarta.inject.Named; import jakarta.persistence.EntityManager; -import jakarta.persistence.LockModeType; import jakarta.persistence.NoResultException; +import jakarta.persistence.NonUniqueResultException; import jakarta.persistence.PersistenceContext; import jakarta.persistence.Query; -import jakarta.persistence.StoredProcedureQuery; import jakarta.persistence.TypedQuery; import org.apache.commons.lang3.StringUtils; @@ -115,28 +112,32 @@ public Dataset find(Object pk) { * @return a dataset with pre-fetched file objects */ public Dataset findDeep(Object pk) { - return (Dataset) em.createNamedQuery("Dataset.findById") - .setParameter("id", pk) - // Optimization hints: retrieve all data 
in one query; this prevents point queries when iterating over the files - .setHint("eclipselink.left-join-fetch", "o.files.ingestRequest") - .setHint("eclipselink.left-join-fetch", "o.files.thumbnailForDataset") - .setHint("eclipselink.left-join-fetch", "o.files.dataTables") - .setHint("eclipselink.left-join-fetch", "o.files.auxiliaryFiles") - .setHint("eclipselink.left-join-fetch", "o.files.ingestReports") - .setHint("eclipselink.left-join-fetch", "o.files.dataFileTags") - .setHint("eclipselink.left-join-fetch", "o.files.fileMetadatas") - .setHint("eclipselink.left-join-fetch", "o.files.fileMetadatas.fileCategories") - .setHint("eclipselink.left-join-fetch", "o.files.fileMetadatas.varGroups") - //.setHint("eclipselink.left-join-fetch", "o.files.guestbookResponses - .setHint("eclipselink.left-join-fetch", "o.files.embargo") - .setHint("eclipselink.left-join-fetch", "o.files.retention") - .setHint("eclipselink.left-join-fetch", "o.files.fileAccessRequests") - .setHint("eclipselink.left-join-fetch", "o.files.owner") - .setHint("eclipselink.left-join-fetch", "o.files.releaseUser") - .setHint("eclipselink.left-join-fetch", "o.files.creator") - .setHint("eclipselink.left-join-fetch", "o.files.alternativePersistentIndentifiers") - .setHint("eclipselink.left-join-fetch", "o.files.roleAssignments") - .getSingleResult(); + try { + return (Dataset) em.createNamedQuery("Dataset.findById") + .setParameter("id", pk) + // Optimization hints: retrieve all data in one query; this prevents point queries when iterating over the files + .setHint("eclipselink.left-join-fetch", "o.files.ingestRequest") + .setHint("eclipselink.left-join-fetch", "o.files.thumbnailForDataset") + .setHint("eclipselink.left-join-fetch", "o.files.dataTables") + .setHint("eclipselink.left-join-fetch", "o.files.auxiliaryFiles") + .setHint("eclipselink.left-join-fetch", "o.files.ingestReports") + .setHint("eclipselink.left-join-fetch", "o.files.dataFileTags") + .setHint("eclipselink.left-join-fetch", 
"o.files.fileMetadatas") + .setHint("eclipselink.left-join-fetch", "o.files.fileMetadatas.fileCategories") + .setHint("eclipselink.left-join-fetch", "o.files.fileMetadatas.varGroups") + //.setHint("eclipselink.left-join-fetch", "o.files.guestbookResponses + .setHint("eclipselink.left-join-fetch", "o.files.embargo") + .setHint("eclipselink.left-join-fetch", "o.files.retention") + .setHint("eclipselink.left-join-fetch", "o.files.fileAccessRequests") + .setHint("eclipselink.left-join-fetch", "o.files.owner") + .setHint("eclipselink.left-join-fetch", "o.files.releaseUser") + .setHint("eclipselink.left-join-fetch", "o.files.creator") + .setHint("eclipselink.left-join-fetch", "o.files.alternativePersistentIndentifiers") + .setHint("eclipselink.left-join-fetch", "o.files.roleAssignments") + .getSingleResult(); + } catch (NoResultException | NonUniqueResultException ex) { + return null; + } } public List findByOwnerId(Long ownerId) { From 14e4c514bf65eed516faafff5c14b466760c8e05 Mon Sep 17 00:00:00 2001 From: plecor <146710476+plecor@users.noreply.github.com> Date: Fri, 31 May 2024 15:14:15 +0200 Subject: [PATCH 10/18] Add info about DOI2PMH project --- doc/sphinx-guides/source/admin/harvestclients.rst | 5 +++++ doc/sphinx-guides/source/api/apps.rst | 7 +++++++ 2 files changed, 12 insertions(+) diff --git a/doc/sphinx-guides/source/admin/harvestclients.rst b/doc/sphinx-guides/source/admin/harvestclients.rst index 59fc4dc2c64..73bdc5058f9 100644 --- a/doc/sphinx-guides/source/admin/harvestclients.rst +++ b/doc/sphinx-guides/source/admin/harvestclients.rst @@ -47,3 +47,8 @@ What if a Run Fails? Each harvesting client run logs a separate file per run to the app server's default logging directory (``/usr/local/payara6/glassfish/domains/domain1/logs/`` unless you've changed it). Look for filenames in the format ``harvest_TARGET_YYYY_MM_DD_timestamp.log`` to get a better idea of what's going wrong. 
Note that you'll want to run a minimum of Dataverse Software 4.6, optimally 4.18 or beyond, for the best OAI-PMH interoperability. + +Harvesting non-OAI-PMH +~~~~~~~~~~~~~~~~~~~~~~ + +`DOI2PMH `__ is a community-driven project intended to allow OAI-PMH harvesting from non-OAI-PMH sources. \ No newline at end of file diff --git a/doc/sphinx-guides/source/api/apps.rst b/doc/sphinx-guides/source/api/apps.rst index 44db666736c..4c8612d0d80 100755 --- a/doc/sphinx-guides/source/api/apps.rst +++ b/doc/sphinx-guides/source/api/apps.rst @@ -133,6 +133,13 @@ https://github.com/libis/rdm-integration PHP --- +DOI2PMH +~~~~~~~ + +The DOI2PMH server allows Dataverse instances to harvest DOIs through OAI-PMH from otherwise unharvestable sources. + +https://github.com/IQSS/doi2pmh-server + OJS ~~~ From 26be8e191fd31ca338c830eb2dd30daa12d431e3 Mon Sep 17 00:00:00 2001 From: Steven Ferey Date: Mon, 3 Jun 2024 16:56:48 +0200 Subject: [PATCH 11/18] Allow merging of accounts that are members of the same group (#9909) MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit * Remove unused import * Merge users with same groups * Added additional line in Permalinks config Added an additional line to restart Payara after changing settings in Permalinks section * Revert "#9717 grant CREATE instead of ALL per pdurbin" This reverts commit f71274e7c7a4d47ab7fb973320bcfdb7e6822fbd.
* CREATE instead of ALL for public schema * Added: getZipDownloadLimit and getEmbargoEnabled API info endpoints * Added: docs for new info API endpoints * Fixed: missing guides reference in config.rst * Changed: :MaxEmbargoDurationInMonths setting directly exposed via API info endpoint * Changed: updated release notes * Changed: private Info.java method renamed * stub out page on API design, esp paths #9880 * remove embargo example, no longer used in #9881 * typo #9880 * Remove unused GPL-licensed code For unknown reasons, in 2009 several files from the JDK were copied into the Dataverse codebase, instead of referenced. It appears that these classes weren't really used. * Removed unused code --------- Co-authored-by: Jérôme ROUCOU Co-authored-by: Pradyumna Sridhara <95268690+prsridha@users.noreply.github.com> Co-authored-by: Philip Durbin Co-authored-by: GPortas Co-authored-by: bencomp Co-authored-by: jeromeroucou --- .../engine/command/impl/MergeInAccountCommand.java | 3 +-- src/test/java/edu/harvard/iq/dataverse/api/UsersIT.java | 7 +++---- 2 files changed, 4 insertions(+), 6 deletions(-) diff --git a/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/MergeInAccountCommand.java b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/MergeInAccountCommand.java index 1ec51764d73..03f4dceef88 100644 --- a/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/MergeInAccountCommand.java +++ b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/MergeInAccountCommand.java @@ -14,7 +14,6 @@ import edu.harvard.iq.dataverse.UserNotification; import edu.harvard.iq.dataverse.authorization.AuthenticatedUserLookup; import edu.harvard.iq.dataverse.authorization.providers.builtin.BuiltinUser; -import edu.harvard.iq.dataverse.authorization.providers.oauth2.OAuth2TokenData; import edu.harvard.iq.dataverse.authorization.users.ApiToken; import edu.harvard.iq.dataverse.authorization.users.AuthenticatedUser; import 
edu.harvard.iq.dataverse.batch.util.LoggingUtil; @@ -25,7 +24,6 @@ import edu.harvard.iq.dataverse.engine.command.RequiredPermissions; import edu.harvard.iq.dataverse.engine.command.exception.CommandException; import edu.harvard.iq.dataverse.engine.command.exception.IllegalCommandException; -import edu.harvard.iq.dataverse.passwordreset.PasswordResetData; import edu.harvard.iq.dataverse.search.IndexResponse; import edu.harvard.iq.dataverse.search.savedsearch.SavedSearch; import edu.harvard.iq.dataverse.workflows.WorkflowComment; @@ -177,6 +175,7 @@ protected void executeImpl(CommandContext ctxt) throws CommandException { ctxt.em().createNativeQuery("Delete from OAuth2TokenData where user_id ="+consumedAU.getId()).executeUpdate(); + ctxt.em().createNativeQuery("DELETE FROM explicitgroup_authenticateduser consumed USING explicitgroup_authenticateduser ongoing WHERE consumed.containedauthenticatedusers_id="+ongoingAU.getId()+" AND ongoing.containedauthenticatedusers_id="+consumedAU.getId()).executeUpdate(); ctxt.em().createNativeQuery("UPDATE explicitgroup_authenticateduser SET containedauthenticatedusers_id="+ongoingAU.getId()+" WHERE containedauthenticatedusers_id="+consumedAU.getId()).executeUpdate(); ctxt.actionLog().changeUserIdentifierInHistory(consumedAU.getIdentifier(), ongoingAU.getIdentifier()); diff --git a/src/test/java/edu/harvard/iq/dataverse/api/UsersIT.java b/src/test/java/edu/harvard/iq/dataverse/api/UsersIT.java index 5880b08e5c2..0189ffd6e58 100644 --- a/src/test/java/edu/harvard/iq/dataverse/api/UsersIT.java +++ b/src/test/java/edu/harvard/iq/dataverse/api/UsersIT.java @@ -8,6 +8,7 @@ import edu.harvard.iq.dataverse.authorization.DataverseRole; import edu.harvard.iq.dataverse.settings.SettingsServiceBean; import java.util.ArrayList; +import java.util.Arrays; import java.util.List; import java.util.UUID; import jakarta.json.Json; @@ -206,15 +207,13 @@ public void testMergeAccounts(){ String aliasInOwner = "groupFor" + dataverseAlias; String 
displayName = "Group for " + dataverseAlias; String user2identifier = "@" + usernameConsumed; + String target2identifier = "@" + targetname; Response createGroup = UtilIT.createGroup(dataverseAlias, aliasInOwner, displayName, superuserApiToken); createGroup.prettyPrint(); createGroup.then().assertThat() .statusCode(CREATED.getStatusCode()); - String groupIdentifier = JsonPath.from(createGroup.asString()).getString("data.identifier"); - - List roleAssigneesToAdd = new ArrayList<>(); - roleAssigneesToAdd.add(user2identifier); + List roleAssigneesToAdd = Arrays.asList(user2identifier, target2identifier); Response addToGroup = UtilIT.addToGroup(dataverseAlias, aliasInOwner, roleAssigneesToAdd, superuserApiToken); addToGroup.prettyPrint(); addToGroup.then().assertThat() From 836a775486ab188d887c41b43115bef74806d521 Mon Sep 17 00:00:00 2001 From: Philip Durbin Date: Mon, 3 Jun 2024 11:37:09 -0400 Subject: [PATCH 12/18] Title Case --- doc/sphinx-guides/source/admin/harvestclients.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/doc/sphinx-guides/source/admin/harvestclients.rst b/doc/sphinx-guides/source/admin/harvestclients.rst index 73bdc5058f9..c4c63c80786 100644 --- a/doc/sphinx-guides/source/admin/harvestclients.rst +++ b/doc/sphinx-guides/source/admin/harvestclients.rst @@ -48,7 +48,7 @@ Each harvesting client run logs a separate file per run to the app server's defa Note that you'll want to run a minimum of Dataverse Software 4.6, optimally 4.18 or beyond, for the best OAI-PMH interoperability. -Harvesting non-OAI-PMH +Harvesting Non-OAI-PMH ~~~~~~~~~~~~~~~~~~~~~~ `DOI2PMH `__ is a community-driven project intended to allow OAI-PMH harvesting from non-OAI-PMH sources. 
\ No newline at end of file From 34ed8d863b10fe6234df6b5eec04523f3cd82ed7 Mon Sep 17 00:00:00 2001 From: qqmyers Date: Mon, 3 Jun 2024 14:52:58 -0400 Subject: [PATCH 13/18] IQSS/10568-Fix File Reingest from UI (#10569) * fix reingest * release note --- doc/release-notes/10568-Fix File Reingest.md | 1 + .../edu/harvard/iq/dataverse/FilePage.java | 21 +++++++++---------- 2 files changed, 11 insertions(+), 11 deletions(-) create mode 100644 doc/release-notes/10568-Fix File Reingest.md diff --git a/doc/release-notes/10568-Fix File Reingest.md b/doc/release-notes/10568-Fix File Reingest.md new file mode 100644 index 00000000000..354aa847f01 --- /dev/null +++ b/doc/release-notes/10568-Fix File Reingest.md @@ -0,0 +1 @@ +A bug that prevented the Ingest option in the File page Edit File menu from working has been fixed \ No newline at end of file diff --git a/src/main/java/edu/harvard/iq/dataverse/FilePage.java b/src/main/java/edu/harvard/iq/dataverse/FilePage.java index 9889d23cf55..afede00f3eb 100644 --- a/src/main/java/edu/harvard/iq/dataverse/FilePage.java +++ b/src/main/java/edu/harvard/iq/dataverse/FilePage.java @@ -522,10 +522,9 @@ public String ingestFile() throws CommandException{ return null; } - DataFile dataFile = fileMetadata.getDataFile(); - editDataset = dataFile.getOwner(); + editDataset = file.getOwner(); - if (dataFile.isTabularData()) { + if (file.isTabularData()) { JH.addMessage(FacesMessage.SEVERITY_WARN, BundleUtil.getStringFromBundle("file.ingest.alreadyIngestedWarning")); return null; } @@ -537,25 +536,25 @@ public String ingestFile() throws CommandException{ return null; } - if (!FileUtil.canIngestAsTabular(dataFile)) { + if (!FileUtil.canIngestAsTabular(file)) { JH.addMessage(FacesMessage.SEVERITY_WARN, BundleUtil.getStringFromBundle("file.ingest.cantIngestFileWarning")); return null; } - dataFile.SetIngestScheduled(); + file.SetIngestScheduled(); - if (dataFile.getIngestRequest() == null) { - dataFile.setIngestRequest(new 
IngestRequest(dataFile)); + if (file.getIngestRequest() == null) { + file.setIngestRequest(new IngestRequest(file)); } - dataFile.getIngestRequest().setForceTypeCheck(true); + file.getIngestRequest().setForceTypeCheck(true); // update the datafile, to save the newIngest request in the database: datafileService.save(file); // queue the data ingest job for asynchronous execution: - String status = ingestService.startIngestJobs(editDataset.getId(), new ArrayList<>(Arrays.asList(dataFile)), (AuthenticatedUser) session.getUser()); + String status = ingestService.startIngestJobs(editDataset.getId(), new ArrayList<>(Arrays.asList(file)), (AuthenticatedUser) session.getUser()); if (!StringUtil.isEmpty(status)) { // This most likely indicates some sort of a problem (for example, @@ -565,9 +564,9 @@ public String ingestFile() throws CommandException{ // successfully gone through the process of trying to schedule the // ingest job... - logger.warning("Ingest Status for file: " + dataFile.getId() + " : " + status); + logger.warning("Ingest Status for file: " + file.getId() + " : " + status); } - logger.fine("File: " + dataFile.getId() + " ingest queued"); + logger.fine("File: " + file.getId() + " ingest queued"); init(); JsfHelper.addInfoMessage(BundleUtil.getStringFromBundle("file.ingest.ingestQueued")); From 3c55c3fa503b200fbe3b20e8d7d8965424e71160 Mon Sep 17 00:00:00 2001 From: Philip Durbin Date: Tue, 4 Jun 2024 15:23:14 -0400 Subject: [PATCH 14/18] avoid expensive Solr join for public dvObjects in search (experimental) (#10555) * avoid expensive Solr join when guest users search (affect IP Groups) #10554 * fix copy/past error, target doc for file, not dataset #10554 * Checking a few experimental changes into the branch: Jim's soft commit fixes from 10547; A quick experiment, replacing join on public objects with a boolean publicObject_b:true for logged-in users as well (with a join added for just for their own personal documents; groups are ignored for now). 
#10554 * Step 3, of the performance improvement effort relying on a boolean "publicObject" flag for published documents - now for logged-in users, AND with support for groups. Group support experimental, but appears to be working. #10554 * Modified the implementation for the guest user, to support ip groups. #10554 * Removed the few autocommit-related changes previously borrowed from 10547, to keep things separate and clear, for testing etc. #10554 * Reorganized the optimized code in SearchServiceBean; combined the code block for the guest and authenticated users. #10554 * updated the release note. #10554 * Removed the warning from the ip groups guide about the effect of the new search optimization feture that was no longer true. #10554 * Updated the section of the guide describing the new Solr optimization feature flags. #10554 * Updated the performance section of the guide. #10554 * Modified IndexServiceBean to use the new feature flag, that has been separated from the flag that enables the search-side optimization; Fixed the groups sub-query for the guest user. 
#10554 * cosmetic #10554 * doc tweaks #10554 * no-op code cleanup, correct case of publicObject_b #10554 --------- Co-authored-by: Leonid Andreev --- .../10554-avoid-solr-join-guest.md | 5 + .../source/developers/performance.rst | 4 + .../source/installation/config.rst | 6 + .../iq/dataverse/search/IndexServiceBean.java | 10 + .../iq/dataverse/search/SearchFields.java | 9 + .../dataverse/search/SearchServiceBean.java | 172 +++++++++++++----- .../iq/dataverse/settings/FeatureFlags.java | 22 +++ 7 files changed, 181 insertions(+), 47 deletions(-) create mode 100644 doc/release-notes/10554-avoid-solr-join-guest.md diff --git a/doc/release-notes/10554-avoid-solr-join-guest.md b/doc/release-notes/10554-avoid-solr-join-guest.md new file mode 100644 index 00000000000..956c658dbed --- /dev/null +++ b/doc/release-notes/10554-avoid-solr-join-guest.md @@ -0,0 +1,5 @@ +Two experimental feature flags called "add-publicobject-solr-field" and "avoid-expensive-solr-join" have been added to change how Solr documents are indexed for public objects and how Solr queries are constructed to accommodate access to restricted content (drafts, etc.). It is hoped that they will help with performance, especially on large instances and under load. + +Before the search feature flag ("avoid-expensive...") can be turned on, the indexing flag must be enabled, and a full reindex performed. Otherwise publicly available objects are NOT going to be shown in search results. + +For details see https://dataverse-guide--10555.org.readthedocs.build/en/10555/installation/config.html#feature-flags and #10555.
diff --git a/doc/sphinx-guides/source/developers/performance.rst b/doc/sphinx-guides/source/developers/performance.rst index 46c152f322e..562fa330d75 100644 --- a/doc/sphinx-guides/source/developers/performance.rst +++ b/doc/sphinx-guides/source/developers/performance.rst @@ -118,6 +118,10 @@ Solr While in the past Solr performance hasn't been much of a concern, in recent years we've noticed performance problems when Harvard Dataverse is under load. Improvements were made in `PR #10050 `_, for example. +We are tracking performance problems in `#10469 `_. + +In a meeting with a Solr expert on 2024-05-10 we were advised to avoid joins as much as possible. (It was acknowledged that many Solr users make use of joins because they have to, like we do, to keep some documents private.) Toward that end we have added two feature flags called ``avoid-expensive-solr-join`` and ``add-publicobject-solr-field`` as explained under :ref:`feature-flags`. It was confirmed experimentally that performing the join on all the public objects (published collections, datasets and files), i.e., the bulk of the content in the search index, was indeed very expensive, especially on a large instance the size of the IQSS prod. archive, especially under indexing load. We confirmed that it was in fact unnecessary and were able to replace it with a boolean field directly in the indexed documents, which is achieved by the two feature flags above. However, as of writing this, this mechanism should still be considered experimental. + Datasets with Large Numbers of Files or Versions ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ diff --git a/doc/sphinx-guides/source/installation/config.rst b/doc/sphinx-guides/source/installation/config.rst index 907631e6236..8fb9460892b 100644 --- a/doc/sphinx-guides/source/installation/config.rst +++ b/doc/sphinx-guides/source/installation/config.rst @@ -3268,6 +3268,12 @@ please find all known feature flags below. 
Any of these flags can be activated u * - api-session-auth - Enables API authentication via session cookie (JSESSIONID). **Caution: Enabling this feature flag exposes the installation to CSRF risks!** We expect this feature flag to be temporary (only used by frontend developers, see `#9063 `_) and for the feature to be removed in the future. - ``Off`` + * - avoid-expensive-solr-join + - Changes the way Solr queries are constructed for public content (published Collections, Datasets and Files). It removes a very expensive Solr join on all such documents, improving overall performance, especially for large instances under heavy load. Before this feature flag is enabled, the corresponding indexing feature (see next feature flag) must be turned on and a full reindex performed (otherwise public objects are not going to be shown in search results). See :doc:`/admin/solr-search-index`. + - ``Off`` + * - add-publicobject-solr-field + - Adds an extra boolean field `publicObject_b:true` for public content (published Collections, Datasets and Files). Once reindexed with this field, we can rely on it to remove a very expensive Solr join on all such documents in Solr queries, significantly improving overall performance (by enabling the feature flag above, `avoid-expensive-solr-join`). These two flags are separate so that an instance can reindex its holdings before enabling the optimization in searches, thus avoiding having its public objects temporarily disappear from search results while the reindexing is in progress. + - ``Off`` **Note:** Feature flags can be set via any `supported MicroProfile Config API source`_, e.g. the environment variable ``DATAVERSE_FEATURE_XXX`` (e.g. ``DATAVERSE_FEATURE_API_SESSION_AUTH=1``). These environment variables can be set in your shell before starting Payara. If you are using :doc:`Docker for development `, you can set them in the `docker compose `_ file. 
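As the note above says, any flag can be raised through a MicroProfile Config source such as an environment variable following the ``DATAVERSE_FEATURE_XXX`` convention. A minimal sketch of that name mapping, assuming only the documented convention (this helper is illustrative, not part of Dataverse, which resolves ``dataverse.feature.<flag>`` via MicroProfile Config):

```java
import java.util.Locale;

// Illustrative sketch of how a feature-flag name maps to its property
// and environment-variable names, per the DATAVERSE_FEATURE_XXX
// convention documented above. Not the actual Dataverse implementation.
public class FeatureFlagNames {

    // "avoid-expensive-solr-join" -> "DATAVERSE_FEATURE_AVOID_EXPENSIVE_SOLR_JOIN"
    static String toEnvVar(String flag) {
        return "DATAVERSE_FEATURE_" + flag.toUpperCase(Locale.ROOT).replace('-', '_');
    }

    // "avoid-expensive-solr-join" -> "dataverse.feature.avoid-expensive-solr-join"
    static String toProperty(String flag) {
        return "dataverse.feature." + flag;
    }

    public static void main(String[] args) {
        for (String flag : new String[] {"avoid-expensive-solr-join", "add-publicobject-solr-field"}) {
            System.out.println(toProperty(flag) + " / " + toEnvVar(flag));
        }
    }
}
```

For example, exporting `DATAVERSE_FEATURE_ADD_PUBLICOBJECT_SOLR_FIELD=1` before starting Payara raises the indexing flag, matching the `DATAVERSE_FEATURE_API_SESSION_AUTH=1` example in the note.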
diff --git a/src/main/java/edu/harvard/iq/dataverse/search/IndexServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/search/IndexServiceBean.java index e61b93a741f..24efec4cca3 100644 --- a/src/main/java/edu/harvard/iq/dataverse/search/IndexServiceBean.java +++ b/src/main/java/edu/harvard/iq/dataverse/search/IndexServiceBean.java @@ -12,6 +12,7 @@ import edu.harvard.iq.dataverse.datavariable.VariableMetadataUtil; import edu.harvard.iq.dataverse.datavariable.VariableServiceBean; import edu.harvard.iq.dataverse.harvest.client.HarvestingClient; +import edu.harvard.iq.dataverse.settings.FeatureFlags; import edu.harvard.iq.dataverse.settings.JvmSettings; import edu.harvard.iq.dataverse.settings.SettingsServiceBean; import edu.harvard.iq.dataverse.util.FileUtil; @@ -214,6 +215,9 @@ public Future indexDataverse(Dataverse dataverse, boolean processPaths) solrInputDocument.addField(SearchFields.DATAVERSE_CATEGORY, dataverse.getIndexableCategoryName()); if (dataverse.isReleased()) { solrInputDocument.addField(SearchFields.PUBLICATION_STATUS, PUBLISHED_STRING); + if (FeatureFlags.ADD_PUBLICOBJECT_SOLR_FIELD.enabled()) { + solrInputDocument.addField(SearchFields.PUBLIC_OBJECT, true); + } solrInputDocument.addField(SearchFields.RELEASE_OR_CREATE_DATE, dataverse.getPublicationDate()); } else { solrInputDocument.addField(SearchFields.PUBLICATION_STATUS, UNPUBLISHED_STRING); @@ -878,6 +882,9 @@ public SolrInputDocuments toSolrDocs(IndexableDataset indexableDataset, Set groups; + + if (user instanceof GuestUser) { + // Yes, GuestUser may be part of one or more groups; such as IP Groups. + groups = groupService.collectAncestors(groupService.groupsFor(dataverseRequest)); + } else { + if (!(user instanceof AuthenticatedUser)) { + logger.severe("Should never reach here. 
A User must be an AuthenticatedUser or a Guest"); + throw new IllegalStateException("A User must be an AuthenticatedUser or a Guest"); + } + + au = (AuthenticatedUser) user; + + // ---------------------------------------------------- + // (3) Is this a Super User? + // If so, they can see everything + // ---------------------------------------------------- + if (au.isSuperuser()) { + // Somewhat dangerous because this user (a superuser) will be able + // to see everything in Solr with no regard to permissions. But it's + // been this way since Dataverse 4.0. So relax. :) + + return dangerZoneNoSolrJoin; + } + + // ---------------------------------------------------- + // (4) User is logged in AND onlyDatatRelatedToMe == true + // Yes, give back everything -> the settings will be in + // the filterqueries given to search + // ---------------------------------------------------- + if (onlyDatatRelatedToMe == true) { + if (systemConfig.myDataDoesNotUsePermissionDocs()) { + logger.fine("old 4.2 behavior: MyData is not using Solr permission docs"); + return dangerZoneNoSolrJoin; + } else { + // fall-through + logger.fine("new post-4.2 behavior: MyData is using Solr permission docs"); + } + } + + // ---------------------------------------------------- + // (5) Work with Authenticated User who is not a Superuser + // ---------------------------------------------------- + + groups = groupService.collectAncestors(groupService.groupsFor(dataverseRequest)); + } + + if (FeatureFlags.AVOID_EXPENSIVE_SOLR_JOIN.enabled()) { + /** + * Instead of doing a super expensive join, we will rely on the + * new boolean field PublicObject:true for public objects. This field + * is indexed on the content document itself, rather than a permission + * document. An additional join will be added only for any extra, + * more restricted groups that the user may be part of. + * **Note the experimental nature of this optimization**. 
+ */ + StringBuilder sb = new StringBuilder(); + StringBuilder sbgroups = new StringBuilder(); + + // All users, guests and authenticated, should see all the + // documents marked as publicObject_b:true, at least: + sb.append(SearchFields.PUBLIC_OBJECT + ":" + true); + + // One or more groups *may* also be available for this user. Once again, + // do note that Guest users may be part of some groups, such as + // IP groups. + + int groupCounter = 0; + + // An AuthenticatedUser should also be able to see all the content + // on which they have direct permissions: + if (au != null) { + groupCounter++; + sbgroups.append(IndexServiceBean.getGroupPerUserPrefix() + au.getId()); + } + + // In addition to the user referenced directly, we will also + // add joins on all the non-public groups that may exist for the + // user: + for (Group group : groups) { + String groupAlias = group.getAlias(); + if (groupAlias != null && !groupAlias.isEmpty() && !groupAlias.startsWith("builtIn")) { + groupCounter++; + if (groupCounter > 1) { + sbgroups.append(" OR "); + } + sbgroups.append(IndexServiceBean.getGroupPrefix() + groupAlias); + } + } + + if (groupCounter > 1) { + // If there is more than one group, the parentheses must be added: + sbgroups.insert(0, "("); + sbgroups.append(")"); + } + + if (groupCounter > 0) { + // If there are any groups for this user, an extra join must be + // added to the query, and the extra sub-query must be added to + // the combined Solr query: + sb.append(" OR {!join from=" + SearchFields.DEFINITION_POINT + " to=id v=$q1}"); + // Add the subquery to the combined Solr query: + solrQuery.setParam("q1", SearchFields.DISCOVERABLE_BY + ":" + sbgroups.toString()); + logger.info("The sub-query q1 set to " + SearchFields.DISCOVERABLE_BY + ":" + sbgroups.toString()); + } + + String ret = sb.toString(); + logger.info("Returning experimental query: " + ret); + return ret; + } + + // END OF EXPERIMENTAL OPTIMIZATION + + // Old, un-optimized way of handling 
permissions. + // Largely left intact, minus the lookups that have already been performed + // above. + // ---------------------------------------------------- // (1) Is this a GuestUser? - // Yes, see if GuestUser is part of any groups such as IP Groups. // ---------------------------------------------------- if (user instanceof GuestUser) { - String groupsFromProviders = ""; - Set groups = groupService.collectAncestors(groupService.groupsFor(dataverseRequest)); + StringBuilder sb = new StringBuilder(); + + String groupsFromProviders = ""; for (Group group : groups) { logger.fine("found group " + group.getIdentifier() + " with alias " + group.getAlias()); String groupAlias = group.getAlias(); @@ -1025,51 +1144,11 @@ private String getPermissionFilterQuery(DataverseRequest dataverseRequest, SolrQ return guestWithGroups; } - // ---------------------------------------------------- - // (2) Retrieve Authenticated User - // ---------------------------------------------------- - if (!(user instanceof AuthenticatedUser)) { - logger.severe("Should never reach here. A User must be an AuthenticatedUser or a Guest"); - throw new IllegalStateException("A User must be an AuthenticatedUser or a Guest"); - } - - AuthenticatedUser au = (AuthenticatedUser) user; - - // if (addFacets) { - // // Logged in user, has publication status facet - // // - // solrQuery.addFacetField(SearchFields.PUBLICATION_STATUS); - // } - - // ---------------------------------------------------- - // (3) Is this a Super User? - // Yes, give back everything - // ---------------------------------------------------- - if (au.isSuperuser()) { - // Somewhat dangerous because this user (a superuser) will be able - // to see everything in Solr with no regard to permissions. But it's - // been this way since Dataverse 4.0. So relax. 
:) - - return dangerZoneNoSolrJoin; - } - - // ---------------------------------------------------- - // (4) User is logged in AND onlyDatatRelatedToMe == true - // Yes, give back everything -> the settings will be in - // the filterqueries given to search - // ---------------------------------------------------- - if (onlyDatatRelatedToMe == true) { - if (systemConfig.myDataDoesNotUsePermissionDocs()) { - logger.fine("old 4.2 behavior: MyData is not using Solr permission docs"); - return dangerZoneNoSolrJoin; - } else { - logger.fine("new post-4.2 behavior: MyData is using Solr permission docs"); - } - } - // ---------------------------------------------------- // (5) Work with Authenticated User who is not a Superuser - // ---------------------------------------------------- + // ---------------------------------------------------- + // It was already confirmed, that if the user is not GuestUser, we + // have an AuthenticatedUser au which is not null. /** * @todo all this code needs cleanup and clarification. */ @@ -1100,7 +1179,6 @@ private String getPermissionFilterQuery(DataverseRequest dataverseRequest, SolrQ * a given "content document" (dataset version, etc) in Solr. */ String groupsFromProviders = ""; - Set groups = groupService.collectAncestors(groupService.groupsFor(dataverseRequest)); StringBuilder sb = new StringBuilder(); for (Group group : groups) { logger.fine("found group " + group.getIdentifier() + " with alias " + group.getAlias()); diff --git a/src/main/java/edu/harvard/iq/dataverse/settings/FeatureFlags.java b/src/main/java/edu/harvard/iq/dataverse/settings/FeatureFlags.java index afa5a1c986a..14a7ab86f22 100644 --- a/src/main/java/edu/harvard/iq/dataverse/settings/FeatureFlags.java +++ b/src/main/java/edu/harvard/iq/dataverse/settings/FeatureFlags.java @@ -36,6 +36,28 @@ public enum FeatureFlags { * @since Dataverse @TODO: */ API_BEARER_AUTH("api-bearer-auth"), + /** + * For published (public) objects, don't use a join when searching Solr. 
+ * Experimental! Requires a reindex with the following feature flag enabled, + * in order to add the boolean publicObject_b:true field to all the public + * Solr documents. + * + * @apiNote Raise flag by setting + * "dataverse.feature.avoid-expensive-solr-join" + * @since Dataverse 6.3 + */ + AVOID_EXPENSIVE_SOLR_JOIN("avoid-expensive-solr-join"), + /** + * With this flag enabled, the boolean field publicObject_b:true will be + * added to all the indexed Solr documents for publicly-available collections, + * datasets and files. This flag makes it possible to rely on it in searches, + * instead of the very expensive join (the feature flag above). + * + * @apiNote Raise flag by setting + * "dataverse.feature.add-publicobject-solr-field" + * @since Dataverse 6.3 + */ + ADD_PUBLICOBJECT_SOLR_FIELD("add-publicobject-solr-field"), ; final String flag; From c0527738abd7b13053eeccfbff60e283ea98d602 Mon Sep 17 00:00:00 2001 From: Juan Pablo Tosca Villanueva <142103991+jp-tosca@users.noreply.github.com> Date: Thu, 6 Jun 2024 09:24:02 -0400 Subject: [PATCH 15/18] OpenAPI definition endpoint (#10328) * Plugin initial config * Initial changes to provide OpenAPI definition * Added integration test * Imports fix * Add patchnotes * Update the changelog * Update src/main/java/edu/harvard/iq/dataverse/api/Info.java Co-authored-by: Philip Durbin * Update doc/release-notes/10236-openapi-definition-endpoint.md Co-authored-by: Philip Durbin * Update doc/release-notes/10236-openapi-definition-endpoint.md Co-authored-by: Philip Durbin * Add native API docs * Remove generated definitions * Add to gitignore generated openapi files * Updates to docs * Ignore files correction * Remove files created by the plugin * Changes to move the definition files to META-INF * Changes to move the definitions to WEB-INF * Changes to get the files from META-INF * Changed the phase of execution of the smallrye plugin * Changes of names to improve the generation of the spec * Add support for OpenAPI 
annotations and documents the version endpoint * Multipart Annotations * Typos correction * Changes for tags * Renaming of methods * Changes to the endpoint * Added test * Add test * Deleted extra import * Docs updated * openapi doc tweaks #9981 #10236 * improve release note #9981 #10236 * Remove old test and changes response to JSON * stub out guidance on openapi validation #9981 #10236 * add InfoIT to list of tests * use description of Dataverse from website * mention status codes in openapi doc * update api faq about changelog, link to breaking changes doc * typo * Change to OpenApi * Changes to docs * Name fix * Removing the multipart from unirest --------- Co-authored-by: Philip Durbin --- .gitignore | 1 + .../10236-openapi-definition-endpoint.md | 8 ++ doc/sphinx-guides/source/api/changelog.rst | 1 + doc/sphinx-guides/source/api/faq.rst | 6 +- .../source/api/getting-started.rst | 26 +++++ .../source/developers/api-design.rst | 15 +++ pom.xml | 37 ++++++- .../edu/harvard/iq/dataverse/api/Access.java | 22 ++++ .../edu/harvard/iq/dataverse/api/Admin.java | 4 +- .../iq/dataverse/api/BuiltinUsers.java | 2 +- .../harvard/iq/dataverse/api/Datasets.java | 49 ++++++++- .../edu/harvard/iq/dataverse/api/Files.java | 17 ++- .../edu/harvard/iq/dataverse/api/Groups.java | 12 +-- .../edu/harvard/iq/dataverse/api/Info.java | 24 +++++ .../edu/harvard/iq/dataverse/api/TestApi.java | 2 +- .../edu/harvard/iq/dataverse/api/Users.java | 2 +- .../iq/dataverse/api/WorkflowsAdmin.java | 4 +- .../harvard/iq/dataverse/openapi/OpenApi.java | 101 ++++++++++++++++++ src/main/java/propertyFiles/Bundle.properties | 5 + .../edu/harvard/iq/dataverse/api/InfoIT.java | 4 +- .../harvard/iq/dataverse/api/OpenApiIT.java | 40 +++++++ .../edu/harvard/iq/dataverse/api/UtilIT.java | 9 ++ tests/integration-tests.txt | 2 +- 23 files changed, 372 insertions(+), 21 deletions(-) create mode 100644 doc/release-notes/10236-openapi-definition-endpoint.md create mode 100644 
src/main/java/edu/harvard/iq/dataverse/openapi/OpenApi.java create mode 100644 src/test/java/edu/harvard/iq/dataverse/api/OpenApiIT.java diff --git a/.gitignore b/.gitignore index a9733538f7c..514f82116de 100644 --- a/.gitignore +++ b/.gitignore @@ -34,6 +34,7 @@ oauth-credentials.md /src/main/webapp/oauth2/newAccount.html scripts/api/setup-all.sh* scripts/api/setup-all.*.log +src/main/resources/edu/harvard/iq/dataverse/openapi/ # ctags generated tag file tags diff --git a/doc/release-notes/10236-openapi-definition-endpoint.md b/doc/release-notes/10236-openapi-definition-endpoint.md new file mode 100644 index 00000000000..60492c29d78 --- /dev/null +++ b/doc/release-notes/10236-openapi-definition-endpoint.md @@ -0,0 +1,8 @@ +In Dataverse 6.0 Payara was updated, which caused the URL `/openapi` to stop working: + +- https://github.com/IQSS/dataverse/issues/9981 +- https://github.com/payara/Payara/issues/6369 + +When it worked in Dataverse 5.x, the `/openapi` output was generated automatically by Payara, but in this release we have switched to OpenAPI output produced by the [SmallRye OpenAPI plugin](https://github.com/smallrye/smallrye-open-api/tree/main/tools/maven-plugin). This gives us finer control over the output. + +For more information, see the section on [OpenAPI](https://dataverse-guide--10328.org.readthedocs.build/en/10328/api/getting-started.html#openapi) in the API Guide. diff --git a/doc/sphinx-guides/source/api/changelog.rst b/doc/sphinx-guides/source/api/changelog.rst index db994a629b3..a7af3e84b28 100644 --- a/doc/sphinx-guides/source/api/changelog.rst +++ b/doc/sphinx-guides/source/api/changelog.rst @@ -30,6 +30,7 @@ v6.0 ---- - **/api/access/datafile**: When a null or invalid API token is provided to download a public (non-restricted) file with this API call, it will result in a ``401`` error response. Previously, the download was allowed (``200`` response). Please note that we noticed this change sometime between 5.9 and 6.0. 
If you can help us pinpoint the exact version (or commit!), please get in touch. See :doc:`dataaccess`. +- **/openapi**: This endpoint is currently broken. See https://github.com/IQSS/dataverse/issues/9981 v5.6 ---- diff --git a/doc/sphinx-guides/source/api/faq.rst b/doc/sphinx-guides/source/api/faq.rst index b9d4be18373..439783779c3 100644 --- a/doc/sphinx-guides/source/api/faq.rst +++ b/doc/sphinx-guides/source/api/faq.rst @@ -56,12 +56,12 @@ Where is the Comprehensive List of All API Functionality? There are so many Dataverse Software APIs that a single page in this guide would probably be overwhelming. See :ref:`list-of-dataverse-apis` for links to various pages. -It is possible to get a complete list of API functionality in Swagger/OpenAPI format if you deploy Dataverse Software 5.x. For details, see https://github.com/IQSS/dataverse/issues/5794 +It is possible to get a complete list of API functionality in Swagger/OpenAPI format. See :ref:`openapi`. Is There a Changelog of API Functionality That Has Been Added Over Time? ------------------------------------------------------------------------ -No, but there probably should be. If you have suggestions for how it should look, please create an issue at https://github.com/IQSS/dataverse/issues +Changes to the API that don't break anything can be found in the `release notes `_ of each release. Breaking changes are documented in :doc:`changelog`. .. _no-api: @@ -89,6 +89,8 @@ Why Are the Return Values (HTTP Status Codes) Not Documented? They should be. Please consider making a pull request to help. The :doc:`/developers/documentation` section of the Developer Guide should help you get started. :ref:`create-dataverse-api` has an example you can follow or you can come up with a better way. +Also, please note that we are starting to experiment with putting response codes in our OpenAPI document. See :ref:`openapi`. + What If My Question Is Not Answered Here? 
----------------------------------------- diff --git a/doc/sphinx-guides/source/api/getting-started.rst b/doc/sphinx-guides/source/api/getting-started.rst index c12fb01a269..3f93f4ac444 100644 --- a/doc/sphinx-guides/source/api/getting-started.rst +++ b/doc/sphinx-guides/source/api/getting-started.rst @@ -154,6 +154,32 @@ Listing Permissions (Role Assignments) See :ref:`list-role-assignments-on-a-dataverse-api`. +.. _openapi: + +Getting the OpenAPI Document +---------------------------- + +You can access our `OpenAPI document`_ using the ``/openapi`` endpoint. The default format is YAML if no parameter is provided, but you can also obtain the JSON version by either passing ``format=json`` as a query parameter or by sending ``Accept:application/json`` (case-sensitive) as a header. + +.. _OpenAPI document: https://spec.openapis.org/oas/latest.html#openapi-document + +.. note:: See :ref:`curl-examples-and-environment-variables` if you are unfamiliar with the use of export below. + +.. code-block:: bash + + export SERVER_URL=https://demo.dataverse.org + export FORMAT=json + + curl "$SERVER_URL/openapi?format=$FORMAT" + +The fully expanded example above (without environment variables) looks like this: + +.. code-block:: bash + + curl "https://demo.dataverse.org/openapi?format=json" + +We are aware that our OpenAPI document is not perfect. You can find more information about validating the document under :ref:`openapi-dev` in the Developer Guide. + Beyond "Getting Started" Tasks ------------------------------ diff --git a/doc/sphinx-guides/source/developers/api-design.rst b/doc/sphinx-guides/source/developers/api-design.rst index e7a7a6408bb..d51481fece4 100755 --- a/doc/sphinx-guides/source/developers/api-design.rst +++ b/doc/sphinx-guides/source/developers/api-design.rst @@ -7,6 +7,21 @@ API design is a large topic. We expect this page to grow over time. .. contents:: |toctitle| :local: +.. 
_openapi-dev: + +OpenAPI +------- + +As you add API endpoints, please be conscious that we are exposing these endpoints as an OpenAPI document at ``/openapi`` (e.g. http://localhost:8080/openapi ). See :ref:`openapi` in the API Guide for the user-facing documentation. + +We've played around with validation tools such as https://quobix.com/vacuum/ and https://pb33f.io/doctor/ only to discover that our OpenAPI output is less than ideal, generating various warnings and errors. + +You can prevent additional problems in our OpenAPI document by observing the following practices: + +- When creating a method name within an API class, make it unique. + +If you are looking for a reference about the annotations used to generate the OpenAPI document, you can find it in the `MicroProfile OpenAPI Specification `_. + Paths ----- diff --git a/pom.xml b/pom.xml index 091ea206bd2..8fa1d993f6e 100644 --- a/pom.xml +++ b/pom.xml @@ -32,6 +32,12 @@ 5.2.1 2.4.1 5.5.3 + + Dataverse API + ${project.version} + Open source research data repository software. 
+ + ${project.build.outputDirectory}/META-INF + process-classes + + ${openapi.outputDirectory} + openapi + ${openapi.infoTitle} + ${openapi.infoVersion} + ${openapi.infoDescription} + CLASS_METHOD + edu.harvard.iq.dataverse + true + + + + @@ -1087,4 +1122,4 @@ - + \ No newline at end of file diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Access.java b/src/main/java/edu/harvard/iq/dataverse/api/Access.java index e95500426c0..00da4990996 100644 --- a/src/main/java/edu/harvard/iq/dataverse/api/Access.java +++ b/src/main/java/edu/harvard/iq/dataverse/api/Access.java @@ -130,6 +130,14 @@ import jakarta.ws.rs.core.MediaType; import static jakarta.ws.rs.core.Response.Status.FORBIDDEN; import static jakarta.ws.rs.core.Response.Status.UNAUTHORIZED; + +import org.eclipse.microprofile.openapi.annotations.Operation; +import org.eclipse.microprofile.openapi.annotations.media.Content; +import org.eclipse.microprofile.openapi.annotations.media.Schema; +import org.eclipse.microprofile.openapi.annotations.parameters.RequestBody; +import org.eclipse.microprofile.openapi.annotations.responses.APIResponse; +import org.eclipse.microprofile.openapi.annotations.responses.APIResponses; +import org.eclipse.microprofile.openapi.annotations.tags.Tag; import org.glassfish.jersey.media.multipart.FormDataBodyPart; import org.glassfish.jersey.media.multipart.FormDataParam; @@ -1248,6 +1256,20 @@ private String getWebappImageResource(String imageName) { @AuthRequired @Path("datafile/{fileId}/auxiliary/{formatTag}/{formatVersion}") @Consumes(MediaType.MULTIPART_FORM_DATA) + @Produces("application/json") + @Operation(summary = "Save auxiliary file with version", + description = "Saves an auxiliary file") + @APIResponses(value = { + @APIResponse(responseCode = "200", + description = "File saved response"), + @APIResponse(responseCode = "403", + description = "User not authorized to edit the dataset."), + @APIResponse(responseCode = "400", + description = "File not found based on id.") + }) 
+ @Tag(name = "saveAuxiliaryFileWithVersion", + description = "Save Auxiliary File With Version") + @RequestBody(content = @Content(mediaType = MediaType.MULTIPART_FORM_DATA)) public Response saveAuxiliaryFileWithVersion(@Context ContainerRequestContext crc, @PathParam("fileId") Long fileId, @PathParam("formatTag") String formatTag, diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Admin.java b/src/main/java/edu/harvard/iq/dataverse/api/Admin.java index 802904b5173..154fa2350bd 100644 --- a/src/main/java/edu/harvard/iq/dataverse/api/Admin.java +++ b/src/main/java/edu/harvard/iq/dataverse/api/Admin.java @@ -201,7 +201,7 @@ public Response putSetting(@PathParam("name") String name, String content) { @Path("settings/{name}/lang/{lang}") @PUT - public Response putSetting(@PathParam("name") String name, @PathParam("lang") String lang, String content) { + public Response putSettingLang(@PathParam("name") String name, @PathParam("lang") String lang, String content) { Setting s = settingsSvc.set(name, lang, content); return ok("Setting " + name + " - " + lang + " - added."); } @@ -224,7 +224,7 @@ public Response deleteSetting(@PathParam("name") String name) { @Path("settings/{name}/lang/{lang}") @DELETE - public Response deleteSetting(@PathParam("name") String name, @PathParam("lang") String lang) { + public Response deleteSettingLang(@PathParam("name") String name, @PathParam("lang") String lang) { settingsSvc.delete(name, lang); return ok("Setting " + name + " - " + lang + " deleted."); } diff --git a/src/main/java/edu/harvard/iq/dataverse/api/BuiltinUsers.java b/src/main/java/edu/harvard/iq/dataverse/api/BuiltinUsers.java index 50862bc0d35..ba99cf33c5b 100644 --- a/src/main/java/edu/harvard/iq/dataverse/api/BuiltinUsers.java +++ b/src/main/java/edu/harvard/iq/dataverse/api/BuiltinUsers.java @@ -119,7 +119,7 @@ public Response create(BuiltinUser user, @PathParam("password") String password, */ @POST @Path("{password}/{key}/{sendEmailNotification}") - public 
Response create(BuiltinUser user, @PathParam("password") String password, @PathParam("key") String key, @PathParam("sendEmailNotification") Boolean sendEmailNotification) { + public Response createWithNotification(BuiltinUser user, @PathParam("password") String password, @PathParam("key") String key, @PathParam("sendEmailNotification") Boolean sendEmailNotification) { return internalSave(user, password, key, sendEmailNotification); } diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Datasets.java b/src/main/java/edu/harvard/iq/dataverse/api/Datasets.java index 1befb3869c3..fc0afc562fc 100644 --- a/src/main/java/edu/harvard/iq/dataverse/api/Datasets.java +++ b/src/main/java/edu/harvard/iq/dataverse/api/Datasets.java @@ -1,6 +1,7 @@ package edu.harvard.iq.dataverse.api; import com.amazonaws.services.s3.model.PartETag; + import edu.harvard.iq.dataverse.*; import edu.harvard.iq.dataverse.DatasetLock.Reason; import edu.harvard.iq.dataverse.actionlogging.ActionLogRecord; @@ -66,6 +67,12 @@ import jakarta.ws.rs.core.*; import jakarta.ws.rs.core.Response.Status; import org.apache.commons.lang3.StringUtils; +import org.eclipse.microprofile.openapi.annotations.Operation; +import org.eclipse.microprofile.openapi.annotations.media.Content; +import org.eclipse.microprofile.openapi.annotations.media.Schema; +import org.eclipse.microprofile.openapi.annotations.parameters.RequestBody; +import org.eclipse.microprofile.openapi.annotations.responses.APIResponse; +import org.eclipse.microprofile.openapi.annotations.tags.Tag; import org.glassfish.jersey.media.multipart.FormDataBodyPart; import org.glassfish.jersey.media.multipart.FormDataContentDisposition; import org.glassfish.jersey.media.multipart.FormDataParam; @@ -796,7 +803,7 @@ public Response getVersionJsonLDMetadata(@Context ContainerRequestContext crc, @ @AuthRequired @Path("{id}/metadata") @Produces("application/ld+json, application/json-ld") - public Response getVersionJsonLDMetadata(@Context ContainerRequestContext 
crc, @PathParam("id") String id, @Context UriInfo uriInfo, @Context HttpHeaders headers) { + public Response getJsonLDMetadata(@Context ContainerRequestContext crc, @PathParam("id") String id, @Context UriInfo uriInfo, @Context HttpHeaders headers) { return getVersionJsonLDMetadata(crc, id, DS_VERSION_LATEST, uriInfo, headers); } @@ -2261,6 +2268,14 @@ public Response setDataFileAsThumbnail(@Context ContainerRequestContext crc, @Pa @AuthRequired @Path("{id}/thumbnail") @Consumes(MediaType.MULTIPART_FORM_DATA) + @Produces("application/json") + @Operation(summary = "Uploads a logo for a dataset", + description = "Uploads a logo for a dataset") + @APIResponse(responseCode = "200", + description = "Dataset logo uploaded successfully") + @Tag(name = "uploadDatasetLogo", + description = "Uploads a logo for a dataset") + @RequestBody(content = @Content(mediaType = MediaType.MULTIPART_FORM_DATA)) public Response uploadDatasetLogo(@Context ContainerRequestContext crc, @PathParam("id") String idSupplied, @FormDataParam("file") InputStream inputStream) { try { DatasetThumbnail datasetThumbnail = execCommand(new UpdateDatasetThumbnailCommand(createDataverseRequest(getRequestUser(crc)), findDatasetOrDie(idSupplied), UpdateDatasetThumbnailCommand.UserIntent.setNonDatasetFileAsThumbnail, null, inputStream)); @@ -2733,6 +2748,14 @@ public Response completeMPUpload(@Context ContainerRequestContext crc, String pa @AuthRequired @Path("{id}/add") @Consumes(MediaType.MULTIPART_FORM_DATA) + @Produces("application/json") + @Operation(summary = "Uploads a file for a dataset", + description = "Uploads a file for a dataset") + @APIResponse(responseCode = "200", + description = "File uploaded successfully to dataset") + @Tag(name = "addFileToDataset", + description = "Uploads a file for a dataset") + @RequestBody(content = @Content(mediaType = MediaType.MULTIPART_FORM_DATA)) public Response addFileToDataset(@Context ContainerRequestContext crc, @PathParam("id") String idSupplied, 
@FormDataParam("jsonData") String jsonData, @@ -3958,6 +3981,14 @@ public Response requestGlobusUpload(@Context ContainerRequestContext crc, @PathP @AuthRequired @Path("{id}/addGlobusFiles") @Consumes(MediaType.MULTIPART_FORM_DATA) + @Produces("application/json") + @Operation(summary = "Uploads a Globus file for a dataset", + description = "Uploads a Globus file for a dataset") + @APIResponse(responseCode = "200", + description = "Globus file uploaded successfully to dataset") + @Tag(name = "addGlobusFilesToDataset", + description = "Uploads a Globus file for a dataset") + @RequestBody(content = @Content(mediaType = MediaType.MULTIPART_FORM_DATA)) public Response addGlobusFilesToDataset(@Context ContainerRequestContext crc, @PathParam("id") String datasetId, @FormDataParam("jsonData") String jsonData, @@ -4340,6 +4371,14 @@ public Response monitorGlobusDownload(@Context ContainerRequestContext crc, @Pat @AuthRequired @Path("{id}/addFiles") @Consumes(MediaType.MULTIPART_FORM_DATA) + @Produces("application/json") + @Operation(summary = "Uploads a set of files to a dataset", + description = "Uploads a set of files to a dataset") + @APIResponse(responseCode = "200", + description = "Files uploaded successfully to dataset") + @Tag(name = "addFilesToDataset", + description = "Uploads a set of files to a dataset") + @RequestBody(content = @Content(mediaType = MediaType.MULTIPART_FORM_DATA)) public Response addFilesToDataset(@Context ContainerRequestContext crc, @PathParam("id") String idSupplied, @FormDataParam("jsonData") String jsonData) { @@ -4407,6 +4446,14 @@ public Response addFilesToDataset(@Context ContainerRequestContext crc, @PathPar @AuthRequired @Path("{id}/replaceFiles") @Consumes(MediaType.MULTIPART_FORM_DATA) + @Produces("application/json") + @Operation(summary = "Replace a set of files in a dataset", + description = "Replace a set of files in a dataset") + @APIResponse(responseCode = "200", + description = "Files replaced successfully in the dataset") + 
@Tag(name = "replaceFilesInDataset", + description = "Replace a set of files in a dataset") + @RequestBody(content = @Content(mediaType = MediaType.MULTIPART_FORM_DATA)) public Response replaceFilesInDataset(@Context ContainerRequestContext crc, @PathParam("id") String idSupplied, @FormDataParam("jsonData") String jsonData) { diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Files.java b/src/main/java/edu/harvard/iq/dataverse/api/Files.java index 2d48322c90e..d786aab35a8 100644 --- a/src/main/java/edu/harvard/iq/dataverse/api/Files.java +++ b/src/main/java/edu/harvard/iq/dataverse/api/Files.java @@ -64,6 +64,13 @@ import static jakarta.ws.rs.core.Response.Status.FORBIDDEN; import jakarta.ws.rs.core.UriInfo; + +import org.eclipse.microprofile.openapi.annotations.Operation; +import org.eclipse.microprofile.openapi.annotations.media.Content; +import org.eclipse.microprofile.openapi.annotations.media.Schema; +import org.eclipse.microprofile.openapi.annotations.parameters.RequestBody; +import org.eclipse.microprofile.openapi.annotations.responses.APIResponse; +import org.eclipse.microprofile.openapi.annotations.tags.Tag; import org.glassfish.jersey.media.multipart.FormDataBodyPart; import org.glassfish.jersey.media.multipart.FormDataContentDisposition; import org.glassfish.jersey.media.multipart.FormDataParam; @@ -176,6 +183,14 @@ public Response restrictFileInDataset(@Context ContainerRequestContext crc, @Pat @AuthRequired @Path("{id}/replace") @Consumes(MediaType.MULTIPART_FORM_DATA) + @Produces("application/json") + @Operation(summary = "Replace a file in a dataset", + description = "Replace a file in a dataset") + @APIResponse(responseCode = "200", + description = "File replaced successfully in the dataset") + @Tag(name = "replaceFileInDataset", + description = "Replace a file in a dataset") + @RequestBody(content = @Content(mediaType = MediaType.MULTIPART_FORM_DATA)) public Response replaceFileInDataset( @Context ContainerRequestContext crc, @PathParam("id") 
String fileIdOrPersistentId, @@ -497,7 +512,7 @@ public Response getFileData(@Context ContainerRequestContext crc, @GET @AuthRequired @Path("{id}/versions/{datasetVersionId}") - public Response getFileData(@Context ContainerRequestContext crc, + public Response getFileDataForVersion(@Context ContainerRequestContext crc, @PathParam("id") String fileIdOrPersistentId, @PathParam("datasetVersionId") String datasetVersionId, @QueryParam("includeDeaccessioned") boolean includeDeaccessioned, diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Groups.java b/src/main/java/edu/harvard/iq/dataverse/api/Groups.java index d56a787c7ff..ed996b8ecf9 100644 --- a/src/main/java/edu/harvard/iq/dataverse/api/Groups.java +++ b/src/main/java/edu/harvard/iq/dataverse/api/Groups.java @@ -88,8 +88,8 @@ public Response postIpGroup( JsonObject dto ){ * that group from being created. */ @PUT - @Path("ip/{groupName}") - public Response putIpGroups( @PathParam("groupName") String groupName, JsonObject dto ){ + @Path("ip/{group}") + public Response putIpGroups( @PathParam("group") String groupName, JsonObject dto ){ try { if ( groupName == null || groupName.trim().isEmpty() ) { return badRequest("Group name cannot be empty"); @@ -118,8 +118,8 @@ public Response listIpGroups() { } @GET - @Path("ip/{groupIdtf}") - public Response getIpGroup( @PathParam("groupIdtf") String groupIdtf ) { + @Path("ip/{group}") + public Response getIpGroup( @PathParam("group") String groupIdtf ) { IpGroup grp; if ( isNumeric(groupIdtf) ) { grp = ipGroupPrv.get( Long.parseLong(groupIdtf) ); @@ -131,8 +131,8 @@ public Response getIpGroup( @PathParam("groupIdtf") String groupIdtf ) { } @DELETE - @Path("ip/{groupIdtf}") - public Response deleteIpGroup( @PathParam("groupIdtf") String groupIdtf ) { + @Path("ip/{group}") + public Response deleteIpGroup( @PathParam("group") String groupIdtf ) { IpGroup grp; if ( isNumeric(groupIdtf) ) { grp = ipGroupPrv.get( Long.parseLong(groupIdtf) ); diff --git 
a/src/main/java/edu/harvard/iq/dataverse/api/Info.java b/src/main/java/edu/harvard/iq/dataverse/api/Info.java index 40ce6cd25b7..257519677d3 100644 --- a/src/main/java/edu/harvard/iq/dataverse/api/Info.java +++ b/src/main/java/edu/harvard/iq/dataverse/api/Info.java @@ -1,16 +1,35 @@ package edu.harvard.iq.dataverse.api; +import java.io.FileInputStream; +import java.io.InputStream; +import java.net.URL; +import java.nio.charset.StandardCharsets; +import java.util.Arrays; +import java.util.List; +import java.util.logging.Level; +import java.util.logging.Logger; + +import jakarta.ws.rs.Produces; +import org.apache.commons.io.IOUtils; + import edu.harvard.iq.dataverse.settings.JvmSettings; import edu.harvard.iq.dataverse.settings.SettingsServiceBean; +import edu.harvard.iq.dataverse.util.BundleUtil; import edu.harvard.iq.dataverse.util.SystemConfig; import jakarta.ejb.EJB; import jakarta.json.Json; import jakarta.json.JsonValue; import jakarta.ws.rs.GET; import jakarta.ws.rs.Path; +import jakarta.ws.rs.PathParam; +import jakarta.ws.rs.core.MediaType; import jakarta.ws.rs.core.Response; +import org.eclipse.microprofile.openapi.annotations.Operation; +import org.eclipse.microprofile.openapi.annotations.responses.APIResponse; +import org.eclipse.microprofile.openapi.annotations.tags.Tag; @Path("info") +@Tag(name = "info", description = "General information about the Dataverse installation.") public class Info extends AbstractApiBean { @EJB @@ -19,6 +38,8 @@ public class Info extends AbstractApiBean { @EJB SystemConfig systemConfig; + private static final Logger logger = Logger.getLogger(Info.class.getCanonicalName()); + @GET @Path("settings/:DatasetPublishPopupCustomText") public Response getDatasetPublishPopupCustomText() { @@ -33,6 +54,9 @@ public Response getMaxEmbargoDurationInMonths() { @GET @Path("version") + @Operation(summary = "Get version and build information", description = "Get version and build information") + @APIResponse(responseCode = "200", + description 
= "Version and build information") public Response getInfo() { String versionStr = systemConfig.getVersion(true); String[] comps = versionStr.split("build",2); diff --git a/src/main/java/edu/harvard/iq/dataverse/api/TestApi.java b/src/main/java/edu/harvard/iq/dataverse/api/TestApi.java index b9db44b2671..46747b50c29 100644 --- a/src/main/java/edu/harvard/iq/dataverse/api/TestApi.java +++ b/src/main/java/edu/harvard/iq/dataverse/api/TestApi.java @@ -21,7 +21,7 @@ public class TestApi extends AbstractApiBean { @GET @Path("datasets/{id}/externalTools") - public Response getExternalToolsforFile(@PathParam("id") String idSupplied, @QueryParam("type") String typeSupplied) { + public Response getDatasetExternalToolsforFile(@PathParam("id") String idSupplied, @QueryParam("type") String typeSupplied) { ExternalTool.Type type; try { type = ExternalTool.Type.fromString(typeSupplied); diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Users.java b/src/main/java/edu/harvard/iq/dataverse/api/Users.java index 791fc7aa774..1f5430340c2 100644 --- a/src/main/java/edu/harvard/iq/dataverse/api/Users.java +++ b/src/main/java/edu/harvard/iq/dataverse/api/Users.java @@ -234,7 +234,7 @@ public Response getTraces(@Context ContainerRequestContext crc, @PathParam("iden @AuthRequired @Path("{identifier}/traces/{element}") @Produces("text/csv, application/json") - public Response getTraces(@Context ContainerRequestContext crc, @Context Request req, @PathParam("identifier") String identifier, @PathParam("element") String element) { + public Response getTracesElement(@Context ContainerRequestContext crc, @Context Request req, @PathParam("identifier") String identifier, @PathParam("element") String element) { try { AuthenticatedUser userToQuery = authSvc.getAuthenticatedUser(identifier); if(!elements.contains(element)) { diff --git a/src/main/java/edu/harvard/iq/dataverse/api/WorkflowsAdmin.java b/src/main/java/edu/harvard/iq/dataverse/api/WorkflowsAdmin.java index 8d5024c1c14..15478aacff7 
100644 --- a/src/main/java/edu/harvard/iq/dataverse/api/WorkflowsAdmin.java +++ b/src/main/java/edu/harvard/iq/dataverse/api/WorkflowsAdmin.java @@ -111,9 +111,9 @@ public Response deleteDefault(@PathParam("triggerType") String triggerType) { } } - @Path("/{identifier}") + @Path("/{id}") @GET - public Response getWorkflow(@PathParam("identifier") String identifier ) { + public Response getWorkflow(@PathParam("id") String identifier ) { try { long idtf = Long.parseLong(identifier); return workflows.getWorkflow(idtf) diff --git a/src/main/java/edu/harvard/iq/dataverse/openapi/OpenApi.java b/src/main/java/edu/harvard/iq/dataverse/openapi/OpenApi.java new file mode 100644 index 00000000000..6bd54916e0d --- /dev/null +++ b/src/main/java/edu/harvard/iq/dataverse/openapi/OpenApi.java @@ -0,0 +1,101 @@ +package edu.harvard.iq.dataverse.openapi; + +import java.io.*; +import java.net.URL; +import java.nio.charset.StandardCharsets; +import java.util.*; +import java.util.logging.*; + +import jakarta.json.Json; +import jakarta.json.JsonObject; +import jakarta.servlet.ServletException; +import jakarta.servlet.annotation.WebServlet; +import jakarta.servlet.http.*; +import jakarta.ws.rs.core.*; +import org.apache.commons.io.IOUtils; +import edu.harvard.iq.dataverse.api.Info; +import edu.harvard.iq.dataverse.util.BundleUtil; + +@WebServlet("/openapi") +public class OpenApi extends HttpServlet { + + private static final Logger logger = Logger.getLogger(Info.class.getCanonicalName()); + + private static final String YAML_FORMAT = "yaml"; + private static final String JSON_FORMAT = "json"; + + + @Override + protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws ServletException, IOException { + + + String format = req.getParameter("format"); + String accept = req.getHeader("Accept"); + + /* + * We first check for the headers, if the request accepts application/json + * have to check for the format parameter, if it is different from json + * return BAD_REQUEST 
(400) + */ + if (MediaType.APPLICATION_JSON.equals(accept)){ + if (format != null && !JSON_FORMAT.equals(format)){ + List<String> args = Arrays.asList(accept, format); + String bundleResponse = BundleUtil.getStringFromBundle("openapi.exception.unaligned", args); + resp.sendError(Response.Status.BAD_REQUEST.getStatusCode(), + bundleResponse); + return; + } else { + format = JSON_FORMAT; + } + } + + /* + * We currently support only JSON and YAML, with YAML as the default when + * no format is specified. If an unsupported format is requested we return + * UNSUPPORTED_MEDIA_TYPE (415), indicating that the format is not supported. + */ + + format = format == null ? YAML_FORMAT : format.toLowerCase(); + + if (JSON_FORMAT.equals(format)) { + resp.setContentType(MediaType.APPLICATION_JSON_TYPE.toString()); + } else if (YAML_FORMAT.equals(format)){ + resp.setContentType(MediaType.TEXT_PLAIN_TYPE.toString()); + } else { + + List<String> args = Arrays.asList(format); + String bundleResponse = BundleUtil.getStringFromBundle("openapi.exception.invalid.format", args); + + JsonObject errorResponse = Json.createObjectBuilder() + .add("status", "ERROR") + .add("code", HttpServletResponse.SC_UNSUPPORTED_MEDIA_TYPE) + .add("message", bundleResponse) + .build(); + + resp.setContentType(MediaType.APPLICATION_JSON_TYPE.toString()); + resp.setStatus(HttpServletResponse.SC_UNSUPPORTED_MEDIA_TYPE); + + PrintWriter responseWriter = resp.getWriter(); + responseWriter.println(errorResponse.toString()); + responseWriter.flush(); + return; + } + + try { + String baseFileName = "/META-INF/openapi." 
+ format; + ClassLoader classLoader = this.getClass().getClassLoader(); + URL aliasesResource = classLoader.getResource(baseFileName); + InputStream openapiDefinitionStream = aliasesResource.openStream(); + String content = IOUtils.toString(openapiDefinitionStream, StandardCharsets.UTF_8); + resp.getWriter().write(content); + } catch (Exception e) { + logger.log(Level.SEVERE, "OpenAPI Definition format not found " + format + ":" + e.getMessage(), e); + String bundleResponse = BundleUtil.getStringFromBundle("openapi.exception"); + resp.sendError(Response.Status.INTERNAL_SERVER_ERROR.getStatusCode(), + bundleResponse); + } + + + } + +} diff --git a/src/main/java/propertyFiles/Bundle.properties b/src/main/java/propertyFiles/Bundle.properties index 0441853eee9..7bc3ec33a80 100644 --- a/src/main/java/propertyFiles/Bundle.properties +++ b/src/main/java/propertyFiles/Bundle.properties @@ -3023,3 +3023,8 @@ publishDatasetCommand.pidNotReserved=Cannot publish dataset because its persiste api.errors.invalidApiToken=Invalid API token. api.ldninbox.citation.alert={0},

The {1} has just been notified that the {2}, {3}, cites "{6}" in this repository. api.ldninbox.citation.subject={0}: A Dataset Citation has been reported! + +#Info.java +openapi.exception.invalid.format=Invalid format {0}, currently supported formats are YAML and JSON. +openapi.exception=Supported format definition not found. +openapi.exception.unaligned=Unaligned parameters on Headers [{0}] and Request [{1}] diff --git a/src/test/java/edu/harvard/iq/dataverse/api/InfoIT.java b/src/test/java/edu/harvard/iq/dataverse/api/InfoIT.java index 3d5691dbe03..5e436dd0e98 100644 --- a/src/test/java/edu/harvard/iq/dataverse/api/InfoIT.java +++ b/src/test/java/edu/harvard/iq/dataverse/api/InfoIT.java @@ -1,13 +1,12 @@ package edu.harvard.iq.dataverse.api; import static io.restassured.RestAssured.given; - import io.restassured.response.Response; import edu.harvard.iq.dataverse.settings.SettingsServiceBean; import org.junit.jupiter.api.AfterAll; import org.junit.jupiter.api.BeforeAll; import org.junit.jupiter.api.Test; - +import static jakarta.ws.rs.core.Response.Status.BAD_REQUEST; import static jakarta.ws.rs.core.Response.Status.NOT_FOUND; import static jakarta.ws.rs.core.Response.Status.OK; import static org.hamcrest.CoreMatchers.equalTo; @@ -82,6 +81,7 @@ public void testGetZipDownloadLimit() { .body("data", notNullValue()); } + private void testSettingEndpoint(SettingsServiceBean.Key settingKey, String testSettingValue) { String endpoint = "/api/info/settings/" + settingKey; // Setting not found diff --git a/src/test/java/edu/harvard/iq/dataverse/api/OpenApiIT.java b/src/test/java/edu/harvard/iq/dataverse/api/OpenApiIT.java new file mode 100644 index 00000000000..eb98bdcda8e --- /dev/null +++ b/src/test/java/edu/harvard/iq/dataverse/api/OpenApiIT.java @@ -0,0 +1,40 @@ +package edu.harvard.iq.dataverse.api; + +import org.junit.jupiter.api.BeforeAll; +import org.junit.jupiter.api.Test; +import io.restassured.RestAssured; +import io.restassured.response.Response; + +public 
class OpenApiIT { + + @BeforeAll + public static void setUpClass() { + RestAssured.baseURI = UtilIT.getRestAssuredBaseUri(); + } + + @Test + public void testOpenApi(){ + + Response openApi = UtilIT.getOpenAPI("application/json", "json"); + openApi.prettyPrint(); + openApi.then().assertThat() + .statusCode(200); + + openApi = UtilIT.getOpenAPI("", "json"); + openApi.prettyPrint(); + openApi.then().assertThat() + .statusCode(200); + + openApi = UtilIT.getOpenAPI("", "yaml"); + openApi.prettyPrint(); + openApi.then().assertThat() + .statusCode(200); + + openApi = UtilIT.getOpenAPI("application/json", "yaml"); + openApi.prettyPrint(); + openApi.then().assertThat() + .statusCode(400); + + + } +} \ No newline at end of file diff --git a/src/test/java/edu/harvard/iq/dataverse/api/UtilIT.java b/src/test/java/edu/harvard/iq/dataverse/api/UtilIT.java index 507c9b302b3..12f36901faa 100644 --- a/src/test/java/edu/harvard/iq/dataverse/api/UtilIT.java +++ b/src/test/java/edu/harvard/iq/dataverse/api/UtilIT.java @@ -11,6 +11,7 @@ import jakarta.json.JsonObjectBuilder; import jakarta.json.JsonArrayBuilder; import jakarta.json.JsonObject; + import static jakarta.ws.rs.core.Response.Status.CREATED; import java.nio.charset.StandardCharsets; @@ -3954,4 +3955,12 @@ static Response updateDataverseInputLevels(String dataverseAlias, String[] input .contentType(ContentType.JSON) .put("/api/dataverses/" + dataverseAlias + "/inputLevels"); } + + public static Response getOpenAPI(String accept, String format) { + Response response = given() + .header("Accept", accept) + .queryParam("format", format) + .get("/openapi"); + return response; + } } diff --git a/tests/integration-tests.txt b/tests/integration-tests.txt index 58d8d814bb9..44bbfdcceb7 100644 --- a/tests/integration-tests.txt +++ b/tests/integration-tests.txt @@ -1 +1 @@ 
-DataversesIT,DatasetsIT,SwordIT,AdminIT,BuiltinUsersIT,UsersIT,UtilIT,ConfirmEmailIT,FileMetadataIT,FilesIT,SearchIT,InReviewWorkflowIT,HarvestingServerIT,HarvestingClientsIT,MoveIT,MakeDataCountApiIT,FileTypeDetectionIT,EditDDIIT,ExternalToolsIT,AccessIT,DuplicateFilesIT,DownloadFilesIT,LinkIT,DeleteUsersIT,DeactivateUsersIT,AuxiliaryFilesIT,InvalidCharactersIT,LicensesIT,NotificationsIT,BagIT,MetadataBlocksIT,NetcdfIT,SignpostingIT,FitsIT,LogoutIT,DataRetrieverApiIT,ProvIT,S3AccessIT +DataversesIT,DatasetsIT,SwordIT,AdminIT,BuiltinUsersIT,UsersIT,UtilIT,ConfirmEmailIT,FileMetadataIT,FilesIT,SearchIT,InReviewWorkflowIT,HarvestingServerIT,HarvestingClientsIT,MoveIT,MakeDataCountApiIT,FileTypeDetectionIT,EditDDIIT,ExternalToolsIT,AccessIT,DuplicateFilesIT,DownloadFilesIT,LinkIT,DeleteUsersIT,DeactivateUsersIT,AuxiliaryFilesIT,InvalidCharactersIT,LicensesIT,NotificationsIT,BagIT,MetadataBlocksIT,NetcdfIT,SignpostingIT,FitsIT,LogoutIT,DataRetrieverApiIT,ProvIT,S3AccessIT,OpenApiIT,InfoIT From e7a0e375f4082aa3816370535861e724225c6c97 Mon Sep 17 00:00:00 2001 From: jeromeroucou Date: Mon, 10 Jun 2024 15:41:17 +0200 Subject: [PATCH 16/18] #10466: quick fix for math challenge contact form on 403 error page (#10602) --- doc/release-notes/10466-math-challenge-403-error-page.md | 1 + .../java/edu/harvard/iq/dataverse/SendFeedbackDialog.java | 4 ++++ 2 files changed, 5 insertions(+) create mode 100644 doc/release-notes/10466-math-challenge-403-error-page.md diff --git a/doc/release-notes/10466-math-challenge-403-error-page.md b/doc/release-notes/10466-math-challenge-403-error-page.md new file mode 100644 index 00000000000..160c760dc9d --- /dev/null +++ b/doc/release-notes/10466-math-challenge-403-error-page.md @@ -0,0 +1 @@ +On the forbidden access error page, also known as the 403 error page, the math challenge is now correctly displayed so that the contact form can be submitted. 
diff --git a/src/main/java/edu/harvard/iq/dataverse/SendFeedbackDialog.java b/src/main/java/edu/harvard/iq/dataverse/SendFeedbackDialog.java index 5a522eb7e45..46941c8b5b6 100644 --- a/src/main/java/edu/harvard/iq/dataverse/SendFeedbackDialog.java +++ b/src/main/java/edu/harvard/iq/dataverse/SendFeedbackDialog.java @@ -129,6 +129,10 @@ public void setUserSum(Long userSum) { } public String getMessageTo() { + if (op1 == null || op2 == null) { + // Fix for the 403 error page: initUserInput has not been called before this point + initUserInput(null); + } if (feedbackTarget == null) { return BrandingUtil.getSupportTeamName(systemAddress); } else if (feedbackTarget.isInstanceofDataverse()) { From 3934c3f4a968e60350075546a280c8e72caf0fd6 Mon Sep 17 00:00:00 2001 From: Jose Lucas Cordeiro Date: Mon, 10 Jun 2024 11:35:18 -0300 Subject: [PATCH 17/18] API: Handling creation of duplicate role for a dataset object (#10474) * 9729: Handling repeated role creation error for datasets * 9729: Fixing unit test * 9729: Optimizing Imports * 9729: Deleting white space * 9729: Deleting redundant line * Update doc/release-notes/9729-release-notes.md Co-authored-by: Philip Durbin * 9375: Rolling back wrong code removal and reformatting * 9375: Refactoring and adding test * 9375: Changing message to use property file * 9887: rolling back formatting * 9729: rolling back formatting * 9729: rolling back formatting --------- Co-authored-by: Philip Durbin --- doc/release-notes/9729-release-notes.md | 1 + .../engine/command/impl/AssignRoleCommand.java | 14 +++++++++++++- src/main/java/propertyFiles/Bundle.properties | 1 + .../edu/harvard/iq/dataverse/api/DatasetsIT.java | 11 +++++++++++ .../command/impl/CreatePrivateUrlCommandTest.java | 10 ++++++++-- 5 files changed, 34 insertions(+), 3 deletions(-) create mode 100644 doc/release-notes/9729-release-notes.md diff --git a/doc/release-notes/9729-release-notes.md b/doc/release-notes/9729-release-notes.md new file mode 100644 index 00000000000..9dc27995405 --- 
/dev/null +++ b/doc/release-notes/9729-release-notes.md @@ -0,0 +1 @@ +An error is now correctly reported when an attempt is made to assign an identical role to the same collection, dataset, or file. #9729 #10465 \ No newline at end of file diff --git a/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/AssignRoleCommand.java b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/AssignRoleCommand.java index e4edb973cd9..121af765737 100644 --- a/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/AssignRoleCommand.java +++ b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/AssignRoleCommand.java @@ -3,7 +3,6 @@ */ package edu.harvard.iq.dataverse.engine.command.impl; -import edu.harvard.iq.dataverse.DataFile; import edu.harvard.iq.dataverse.Dataset; import edu.harvard.iq.dataverse.Dataverse; import edu.harvard.iq.dataverse.authorization.DataverseRole; @@ -18,6 +17,8 @@ import edu.harvard.iq.dataverse.engine.command.DataverseRequest; import edu.harvard.iq.dataverse.engine.command.exception.CommandException; import edu.harvard.iq.dataverse.engine.command.exception.IllegalCommandException; +import edu.harvard.iq.dataverse.util.BundleUtil; + import java.util.Collections; import java.util.HashSet; import java.util.Map; @@ -68,11 +69,22 @@ public RoleAssignment execute(CommandContext ctxt) throws CommandException { throw new IllegalCommandException("User " + user.getUserIdentifier() + " is deactivated and cannot be given a role.", this); } } + if(isExistingRole(ctxt)){ + throw new IllegalCommandException(BundleUtil.getStringFromBundle("datasets.api.grant.role.assignee.has.role.error"), this); + } // TODO make sure the role is defined on the dataverse. 
RoleAssignment roleAssignment = new RoleAssignment(role, grantee, defPoint, privateUrlToken, anonymizedAccess); return ctxt.roles().save(roleAssignment); } + private boolean isExistingRole(CommandContext ctxt) { + return ctxt.roles() + .directRoleAssignments(grantee, defPoint) + .stream() + .map(RoleAssignment::getRole) + .anyMatch(it -> it.equals(role)); + } + @Override public Map> getRequiredPermissions() { // for data file check permission on owning dataset diff --git a/src/main/java/propertyFiles/Bundle.properties b/src/main/java/propertyFiles/Bundle.properties index 7bc3ec33a80..2996ccb509b 100644 --- a/src/main/java/propertyFiles/Bundle.properties +++ b/src/main/java/propertyFiles/Bundle.properties @@ -2694,6 +2694,7 @@ datasets.api.datasize.ioerror=Fatal IO error while trying to determine the total datasets.api.grant.role.not.found.error=Cannot find role named ''{0}'' in dataverse {1} datasets.api.grant.role.cant.create.assignment.error=Cannot create assignment: {0} datasets.api.grant.role.assignee.not.found.error=Assignee not found +datasets.api.grant.role.assignee.has.role.error=User already has this role for this dataset datasets.api.revoke.role.not.found.error="Role assignment {0} not found" datasets.api.revoke.role.success=Role {0} revoked for assignee {1} in {2} datasets.api.privateurl.error.datasetnotfound=Could not find dataset. 
diff --git a/src/test/java/edu/harvard/iq/dataverse/api/DatasetsIT.java b/src/test/java/edu/harvard/iq/dataverse/api/DatasetsIT.java index 5b603d88c6d..d2d14b824bd 100644 --- a/src/test/java/edu/harvard/iq/dataverse/api/DatasetsIT.java +++ b/src/test/java/edu/harvard/iq/dataverse/api/DatasetsIT.java @@ -1717,6 +1717,9 @@ public void testAddRoles(){ giveRandoPermission.prettyPrint(); assertEquals(200, giveRandoPermission.getStatusCode()); + //Asserting same role creation is covered + validateAssignExistingRole(datasetPersistentId,randomUsername,apiToken, "fileDownloader"); + // Create another random user to become curator: Response createCuratorUser = UtilIT.createRandomUser(); @@ -1853,6 +1856,14 @@ public void testListRoleAssignments() { assertEquals(UNAUTHORIZED.getStatusCode(), notPermittedToListRoleAssignmentOnDataverse.getStatusCode()); } + private static void validateAssignExistingRole(String datasetPersistentId, String randomUsername, String apiToken, String role) { + final Response failedGrantPermission = UtilIT.grantRoleOnDataset(datasetPersistentId, role, "@" + randomUsername, apiToken); + failedGrantPermission.prettyPrint(); + failedGrantPermission.then().assertThat() + .body("message", containsString("User already has this role for this dataset")) + .statusCode(FORBIDDEN.getStatusCode()); + } + @Test public void testFileChecksum() { diff --git a/src/test/java/edu/harvard/iq/dataverse/engine/command/impl/CreatePrivateUrlCommandTest.java b/src/test/java/edu/harvard/iq/dataverse/engine/command/impl/CreatePrivateUrlCommandTest.java index 33f9acd0e1a..b67fc8cb4c3 100644 --- a/src/test/java/edu/harvard/iq/dataverse/engine/command/impl/CreatePrivateUrlCommandTest.java +++ b/src/test/java/edu/harvard/iq/dataverse/engine/command/impl/CreatePrivateUrlCommandTest.java @@ -3,8 +3,10 @@ import edu.harvard.iq.dataverse.Dataset; import edu.harvard.iq.dataverse.DatasetVersion; import edu.harvard.iq.dataverse.DataverseRoleServiceBean; +import 
edu.harvard.iq.dataverse.DvObject; import edu.harvard.iq.dataverse.RoleAssignment; import edu.harvard.iq.dataverse.authorization.DataverseRole; +import edu.harvard.iq.dataverse.authorization.RoleAssignee; import edu.harvard.iq.dataverse.authorization.users.PrivateUrlUser; import edu.harvard.iq.dataverse.engine.TestCommandContext; import edu.harvard.iq.dataverse.engine.TestDataverseEngine; @@ -18,8 +20,8 @@ import org.junit.jupiter.api.Test; import static org.junit.jupiter.api.Assertions.assertEquals; -import static org.junit.jupiter.api.Assertions.*; -import static org.junit.jupiter.api.Assertions.*; +import static org.junit.jupiter.api.Assertions.assertNotNull; +import static org.junit.jupiter.api.Assertions.assertNull; import static org.junit.jupiter.api.Assertions.assertTrue; public class CreatePrivateUrlCommandTest { @@ -73,6 +75,10 @@ public RoleAssignment save(RoleAssignment assignment) { // no-op return assignment; } + @Override + public List directRoleAssignments(RoleAssignee roas, DvObject dvo) { + return List.of(); + } }; } From 5bf6b6defb1c22971951233f30b679d762496832 Mon Sep 17 00:00:00 2001 From: qqmyers Date: Mon, 10 Jun 2024 10:36:35 -0400 Subject: [PATCH 18/18] Solr: Try Soft Commit on Indexing (#10547) * try soft commit * keep softcommit short to avoid delays in visibility * add test delay for autosoft, make hardcommit 30s like auto setting * add 1-2 second delays in tests for softAutocomplete at 1s * more sleeps * more delays * remove commented out deletes * more commented out code to remove * add 1 sec on failing tests * add missing perm reindex * change waiting * fix index object and add null check for unit test * remove test-specific null check * reindex linking dv * general solr release note * more fixes * revert change - was correct * another sleepforsearch * test adding explicit reindexing * avoid other uses of cache in test that looks for exact counts * Adding longer max sleep, add count param to sleep method * Revert "add missing perm 
reindex" This reverts commit 317038ae8083e5d421e91c65176df0a806b04011. --- conf/solr/9.3.0/solrconfig.xml | 4 +- doc/release-notes/10547-solr-updates.md | 1 + .../iq/dataverse/search/IndexServiceBean.java | 34 ++++------------- .../search/SolrIndexServiceBean.java | 13 ++----- .../iq/dataverse/api/DataversesIT.java | 2 +- .../edu/harvard/iq/dataverse/api/LinkIT.java | 6 +++ .../harvard/iq/dataverse/api/SearchIT.java | 37 ++++++------------- .../edu/harvard/iq/dataverse/api/UtilIT.java | 19 ++++++++-- .../impl/CreatePrivateUrlCommandTest.java | 16 ++++++++ .../util/cache/CacheFactoryBeanTest.java | 2 + 10 files changed, 66 insertions(+), 68 deletions(-) create mode 100644 doc/release-notes/10547-solr-updates.md diff --git a/conf/solr/9.3.0/solrconfig.xml b/conf/solr/9.3.0/solrconfig.xml index 36ed4f23390..34386375fe1 100644 --- a/conf/solr/9.3.0/solrconfig.xml +++ b/conf/solr/9.3.0/solrconfig.xml @@ -290,7 +290,7 @@ have some sort of hard autoCommit to limit the log size. --> - ${solr.autoCommit.maxTime:15000} + ${solr.autoCommit.maxTime:30000} false @@ -301,7 +301,7 @@ --> - ${solr.autoSoftCommit.maxTime:-1} + ${solr.autoSoftCommit.maxTime:1000}
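The content negotiation performed by the new OpenApi servlet introduced in PATCH 15 (Accept header checked first, then the format query parameter, with YAML as the default) reduces to a few branches. The following standalone Java sketch is illustrative only — the class and method names are invented here and are not part of the patches:

```java
// Illustrative sketch (not project code) of the negotiation in OpenApi.doGet:
// returns the HTTP status and, when successful, the resolved format.
public class OpenApiNegotiation {
    static final String JSON = "json";
    static final String YAML = "yaml";

    // Returns "200:json", "200:yaml", "400:" (header/parameter mismatch),
    // or "415:" (unsupported format), mirroring the servlet's branches.
    static String negotiate(String accept, String format) {
        if ("application/json".equals(accept)) {
            if (format != null && !JSON.equals(format)) {
                return "400:"; // Accept header and ?format= parameter disagree
            }
            format = JSON;
        }
        format = (format == null) ? YAML : format.toLowerCase(); // YAML is the default
        if (!JSON.equals(format) && !YAML.equals(format)) {
            return "415:"; // only JSON and YAML are supported
        }
        return "200:" + format;
    }

    public static void main(String[] args) {
        System.out.println(negotiate("application/json", "json")); // 200:json
        System.out.println(negotiate(null, null));                 // 200:yaml
        System.out.println(negotiate("application/json", "yaml")); // 400:
        System.out.println(negotiate(null, "xml"));                // 415:
    }
}
```

These are the same cases OpenApiIT exercises: an Accept header of application/json combined with format=yaml is rejected with 400, while an unknown format such as xml yields 415.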