DANS - Exporters in external jars #9175

Merged

Commits (78)
5208393
initial capability
qqmyers Nov 18, 2022
2cc7ec2
add mp setting for dir
qqmyers Nov 18, 2022
fa8f633
Merge remote-tracking branch 'IQSS/develop' into DANS-external_exporters
qqmyers Dec 15, 2022
a3fcd00
add a map where we can prefer external exporters over internal ones
qqmyers Dec 15, 2022
89ea8b9
Cleanup/ formatting
qqmyers Dec 16, 2022
1a09f6e
Merge remote-tracking branch 'IQSS/develop' into DANS-external_exporters
qqmyers Feb 14, 2023
a2472d9
Merge remote-tracking branch 'IQSS/develop' into DANS-external_exporters
qqmyers Mar 3, 2023
8b672e0
Merge remote-tracking branch 'IQSS/develop' into DANS-external_exporters
qqmyers Mar 22, 2023
8123310
Merge remote-tracking branch 'IQSS/develop' into DANS-external_exporters
qqmyers Mar 28, 2023
8e283f5
initial attempt to create a data provider interface
qqmyers Mar 29, 2023
5f0b041
remove unused method, add interface call to get prereq.
qqmyers Mar 29, 2023
baf3234
Convert DDI Exporter to no use internal DV classes
qqmyers Mar 31, 2023
18ca8a7
add filedetails json
qqmyers Apr 13, 2023
f8376b2
update tests
qqmyers Apr 13, 2023
fd2e66a
Merge remote-tracking branch 'IQSS/develop' into DANS-external_exporters
qqmyers Apr 13, 2023
3b5d196
test fixes - these already had a dataset object
qqmyers Apr 13, 2023
aef8630
Merge remote-tracking branch 'IQSS/develop' into DANS-external_exporters
qqmyers Apr 24, 2023
7943ae4
Merge remote-tracking branch 'IQSS/develop' into DANS-external_exporters
qqmyers May 2, 2023
835df29
Merge remote-tracking branch 'IQSS/develop' into DANS-external_exporters
qqmyers May 8, 2023
c7b632a
create spi pom, refactor exception, move code, add jar locally/as dep
qqmyers May 8, 2023
9465859
formatting
qqmyers May 8, 2023
0bb6e53
add Locale param
qqmyers May 9, 2023
7f8baa4
add example
qqmyers May 9, 2023
d074965
import cleanup
qqmyers May 9, 2023
73410eb
Merge remote-tracking branch 'IQSS/develop' into DANS-external_exporters
qqmyers May 9, 2023
38e71eb
use io.gdcc package, use artifact name for dir
qqmyers May 9, 2023
a5c0e36
style(spi): cleanup whitespace and line endings in POM
poikilotherm May 10, 2023
f1be0d2
fix(spi): make dataverse-exporter-spi a submodule
poikilotherm May 10, 2023
8644a34
style(spi): cleanup dependencies in dataverse-exporter-spi
poikilotherm May 10, 2023
6ae546c
Merge pull request #35 from poikilotherm/9175-external-exporters
qqmyers May 10, 2023
fa0e281
naming changes per review
qqmyers May 10, 2023
a9c720a
use Optional per review
qqmyers May 10, 2023
b92c8b7
extend IOException, remove @author per review
qqmyers May 10, 2023
9948aa8
Create XMLExporter interface with added methods
qqmyers May 10, 2023
12e5e96
refactor example
qqmyers May 10, 2023
89ec230
further changes per review
qqmyers May 10, 2023
3288ae5
Apply suggestions from code review
qqmyers May 10, 2023
60ac4a0
Trying Optional<Path>
qqmyers May 10, 2023
22420ba
style change per review
qqmyers May 10, 2023
318b01f
Revert "Trying Optional<Path>"
qqmyers May 10, 2023
6501b0d
update example
qqmyers May 10, 2023
e1f509a
handle prereqs in ExportService and add some javadocs
qqmyers May 10, 2023
47b21e8
documentation/release note
qqmyers May 11, 2023
dcdb71c
remove local deploy plugin
qqmyers May 11, 2023
45c2418
remove example - now in gdcc/dataverse-exporters repo
qqmyers May 11, 2023
9252f13
fix guide formatting issue
qqmyers May 11, 2023
b643565
add page to toc
qqmyers May 11, 2023
c541bc2
use lists
qqmyers May 11, 2023
b5d2794
add blank lines
qqmyers May 11, 2023
83a2af3
remove default license header
qqmyers May 12, 2023
dec9b08
remove isXMLFormat()
qqmyers May 12, 2023
8c091da
note about XML spec
qqmyers May 12, 2023
6d8f2e5
Merge branch 'develop' into 7050-exporters-spi
poikilotherm May 23, 2023
e1be9ca
ci: don't trigger all of the main app Maven tests by dataverse-spi / …
poikilotherm May 24, 2023
d97bc09
chore(build): add more Maven plugins to parent
poikilotherm May 24, 2023
ef331ea
build(spi): add necessary POM changes to release packages
poikilotherm May 24, 2023
8035537
build(spi): add version suffix to enable snapshots
poikilotherm May 24, 2023
dcf8e32
ci(spi): add CI configuration for dataverse-spi package
poikilotherm May 24, 2023
7e9d1e7
ci(spi): tune workflow a little more
poikilotherm May 24, 2023
7566c63
chore(git): add flattened POM to gitignore
poikilotherm May 24, 2023
8b30280
Merge branch 'develop' into DANS-external_exporters #9175
pdurbin May 26, 2023
e6b48b5
ci(spi): do not run SPI release when no access to secrets
poikilotherm May 26, 2023
b34282c
ci(dataverse): switch to build unit tests with parent module
poikilotherm May 26, 2023
3c099d7
ci(dataverse): install after build to make dependencies available in …
poikilotherm May 26, 2023
b14b669
ci(dataverse): cache new Maven packages from build
poikilotherm May 28, 2023
a565267
ci(dataverse): update GitHub Actions to latest versions
poikilotherm May 28, 2023
36413e6
ci(dataverse): fix non-critical build fails from coverage by making t…
poikilotherm May 28, 2023
e6bc0e4
ci(dataverse): avoid cache exists problem
poikilotherm May 28, 2023
c582de2
ci(dataverse): clear cache before trying to save to it again
poikilotherm May 28, 2023
937fa9c
ci(dataverse): add permission to clear and save the cache
poikilotherm May 28, 2023
db2d26b
ci(dataverse): fix typo in env var for running gh CLI
poikilotherm May 28, 2023
ffd547a
ci(dataverse): switch back to normal caching
poikilotherm May 28, 2023
61e1978
build,ci(ct): make app image build use parent module
poikilotherm May 28, 2023
0d8e386
build,ci(ct): make multiarch app image build use parent module
poikilotherm May 29, 2023
dfa7432
build,ci(ct): readd install step to multiarch app image builds
poikilotherm May 29, 2023
bdce3b4
fix for #9601
qqmyers May 19, 2023
c927490
Fix filedetails errors from QA
qqmyers May 30, 2023
4322d50
catch null possible in test case
qqmyers May 30, 2023
4 changes: 4 additions & 0 deletions .github/workflows/maven_unit_test.yml
@@ -6,11 +6,15 @@ on:
- "**.java"
- "pom.xml"
- "modules/**/pom.xml"
- "!modules/container-base/**"
- "!modules/dataverse-spi/**"
pull_request:
paths:
- "**.java"
- "pom.xml"
- "modules/**/pom.xml"
- "!modules/container-base/**"
- "!modules/dataverse-spi/**"

jobs:
unittest:
74 changes: 74 additions & 0 deletions .github/workflows/spi_release.yml
@@ -0,0 +1,74 @@
name: Dataverse SPI

on:
push:
branch:
- "develop"
paths:
- "modules/dataverse-spi/**"
pull_request:
branch:
- "develop"
paths:
- "modules/dataverse-spi/**"

jobs:
snapshot:
name: Release Snapshot
runs-on: ubuntu-latest
if: github.event_name == 'pull_request'
steps:
- uses: actions/checkout@v3
- uses: actions/setup-java@v3
with:
java-version: '11'
distribution: 'adopt'
server-id: ossrh
server-username: MAVEN_USERNAME
server-password: MAVEN_PASSWORD
- uses: actions/cache@v2
with:
path: ~/.m2
key: ${{ runner.os }}-m2-${{ hashFiles('**/pom.xml') }}
restore-keys: ${{ runner.os }}-m2

- name: Deploy Snapshot
run: mvn -f modules/dataverse-spi -Dproject.version.suffix="-PR${{ github.event.number }}-SNAPSHOT" deploy
env:
MAVEN_USERNAME: ${{ secrets.DATAVERSEBOT_SONATYPE_USERNAME }}
MAVEN_PASSWORD: ${{ secrets.DATAVERSEBOT_SONATYPE_TOKEN }}

release:
name: Release
runs-on: ubuntu-latest
if: github.event_name == 'push'
steps:
- uses: actions/checkout@v3
- uses: actions/setup-java@v3
with:
java-version: '11'
distribution: 'adopt'
- uses: actions/cache@v2
with:
path: ~/.m2
key: ${{ runner.os }}-m2-${{ hashFiles('**/pom.xml') }}
restore-keys: ${{ runner.os }}-m2

# Running setup-java again overwrites the settings.xml - IT'S MANDATORY TO DO THIS SECOND SETUP!!!
- name: Set up Maven Central Repository
uses: actions/setup-java@v3
with:
java-version: '11'
distribution: 'adopt'
server-id: ossrh
server-username: MAVEN_USERNAME
server-password: MAVEN_PASSWORD
gpg-private-key: ${{ secrets.DATAVERSEBOT_GPG_KEY }}
gpg-passphrase: MAVEN_GPG_PASSPHRASE

- name: Sign + Publish Release
run: mvn -f modules/dataverse-spi -P release deploy
env:
MAVEN_USERNAME: ${{ secrets.DATAVERSEBOT_SONATYPE_USERNAME }}
MAVEN_PASSWORD: ${{ secrets.DATAVERSEBOT_SONATYPE_TOKEN }}
MAVEN_GPG_PASSPHRASE: ${{ secrets.DATAVERSEBOT_GPG_PASSWORD }}
11 changes: 11 additions & 0 deletions doc/release-notes/9175-external-exporters.md
@@ -0,0 +1,11 @@
## Ability to Create New Exporters

It is now possible for third parties to develop and share code to provide new metadata export formats for Dataverse. Export formats can be made available via the Dataverse UI and API or configured for use in Harvesting. Dataverse now provides developers with a separate dataverse-spi JAR file that contains the Java interfaces and classes required to create a new metadata Exporter. Once a new Exporter has been created and packaged as a JAR file, administrators can use it by specifying a local directory for third-party Exporters, dropping the Exporter JAR there, and restarting Payara. This mechanism also allows new Exporters to replace any of Dataverse's existing metadata export formats.

## Backward Incompatibilities

Care should be taken when replacing Dataverse's internal metadata export formats, as third-party code, including other third-party Exporters, may depend on the contents of those export formats. When replacing an existing format, one must also remember to delete the cached metadata export files or run the reExport command so that the metadata exports of existing datasets are updated.

## New JVM/MicroProfile Settings

dataverse.spi.exporters.directory - specifies a directory, readable by the Dataverse server, for third-party Exporters. Any Exporter JAR files placed in this directory will be read by Dataverse and used to add new metadata export formats or replace existing ones.
3 changes: 2 additions & 1 deletion doc/sphinx-guides/source/developers/index.rst
@@ -4,7 +4,7 @@
contain the root `toctree` directive.
Developer Guide
=======================================================
===============

**Contents:**

@@ -27,6 +27,7 @@ Developer Guide
deployment
containers
making-releases
metadataexport
tools
unf/index
make-data-count
88 changes: 88 additions & 0 deletions doc/sphinx-guides/source/developers/metadataexport.rst
@@ -0,0 +1,88 @@
=======================
Metadata Export Formats
=======================

.. contents:: |toctitle|
:local:

Introduction
------------

Dataverse ships with a number of metadata export formats available for published datasets. A given metadata export
format may be available for user download (via the UI and API) and/or be available for use in Harvesting between
Dataverse instances.

As of v5.14, Dataverse provides a mechanism for third-party developers to create new metadata Exporters that implement
new metadata formats or that replace existing formats. All the necessary dependencies are packaged in an interface JAR file
available from Maven Central. Developers can distribute their new Exporters as JAR files which can be dynamically loaded
into Dataverse instances - see :ref:`external-exporters`. Developers are encouraged to make their Exporter code available
via https://github.com/gdcc/dataverse-exporters (or minimally, to list their existence in the README there).

Exporter Basics
---------------

New Exporters must implement the ``io.gdcc.spi.export.Exporter`` interface. The interface includes a few methods through which the Exporter
provides Dataverse with the name of the format it produces, a display name, the format's mime type, and whether the format is intended for download
and/or harvesting use, etc. It also includes a main ``exportDataset(ExportDataProvider dataProvider, OutputStream outputStream)``
method through which the Exporter receives metadata about the given dataset (via the ``ExportDataProvider``, described further
below) and writes its output (as an OutputStream).

Exporters that create an XML format must implement the ``io.gdcc.spi.export.XMLExporter`` interface (which extends the Exporter
interface). XMLExporter adds a few methods through which the XMLExporter provides information to Dataverse about the XML
namespace and version being used.

Exporters also need to use the ``@AutoService(Exporter.class)`` annotation, which makes the class discoverable as an Exporter implementation.
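
A minimal sketch of what such a class might look like is shown below. The method names and signatures are assumptions inferred from the description above and the example repository; the authoritative contract is the ``io.gdcc.spi.export.Exporter`` interface in the dataverse-spi JAR::

    package org.example.exporter; // hypothetical package

    import java.io.OutputStream;
    import java.util.Locale;

    import com.google.auto.service.AutoService;
    import io.gdcc.spi.export.ExportDataProvider;
    import io.gdcc.spi.export.ExportException;
    import io.gdcc.spi.export.Exporter;

    @AutoService(Exporter.class)
    public class MyJsonExporter implements Exporter {

        @Override
        public String getFormatName() {
            // Internal format name, used in API calls and for naming the cached export file
            return "myjson";
        }

        @Override
        public String getDisplayName(Locale locale) {
            // Label shown in the dataset Metadata Export download menu
            return "My JSON";
        }

        @Override
        public Boolean isHarvestable() {
            return false; // do not offer this format for harvesting
        }

        @Override
        public Boolean isAvailableToUsers() {
            return true; // offer this format in the UI/API download menu
        }

        @Override
        public String getMediaType() {
            // Mime type of the produced output; the MediaType constants from the
            // jakarta.ws.rs-api dependency can be used here instead of a literal string
            return "application/json";
        }

        @Override
        public void exportDataset(ExportDataProvider dataProvider, OutputStream outputStream) throws ExportException {
            // A possible implementation is sketched further below
        }
    }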

The ``ExportDataProvider`` interface provides several methods through which your Exporter can receive dataset and file metadata
in various formats. Your exporter would parse the information in one or more of these inputs to retrieve the values needed to
generate the Exporter's output format.

The most important methods/input formats are:

- ``getDatasetJson()`` - metadata in the internal Dataverse JSON format used in the native API and available via the built-in JSON metadata export.
- ``getDatasetORE()`` - metadata in the OAI_ORE format available as a built-in metadata format and as used in Dataverse's BagIT-based Archiving capability.
- ``getDatasetFileDetails()`` - detailed file-level metadata for ingested tabular files.

The first two of these provide nearly complete metadata about the dataset along with the metadata common to all files. This includes all metadata
entries from all metadata blocks, PIDs, tags, licenses and custom terms, etc. Almost all built-in exporters today use the JSON input.
The newer OAI_ORE export, which is JSON-LD-based, provides a flatter structure and references metadata terms by their external vocabulary ids
(e.g. http://purl.org/dc/terms/title), which may make it a preferable starting point in some cases.

The last method above provides a new JSON-formatted serialization of the variable-level file metadata Dataverse generates during ingest of tabular files.
This information has only been included in the built-in DDI export, as the content of a ``dataDscr`` element. (Hence inspecting the edu.harvard.iq.dataverse.export.DDIExporter and related classes would be a good way to explore how the JSON is structured.)

The interface also provides

- ``getDatasetSchemaDotOrg()`` and
- ``getDataCiteXml()``.

These provide subsets of metadata in the indicated formats. They may be useful starting points if your exporter will, for example, only add one or two additional fields to the given format.

If an Exporter cannot create a requested metadata format for some reason, it should throw an ``io.gdcc.spi.export.ExportException``.
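
As a hedged sketch of how these pieces fit together, the ``exportDataset`` fragment below pulls the built-in JSON metadata via ``getDatasetJson()``, writes it out unchanged, and signals failure with an ``ExportException`` (the assumption that ``ExportException`` offers a ``String`` constructor follows from it extending ``IOException``; a real Exporter would transform the metadata into its own format)::

    @Override
    public void exportDataset(ExportDataProvider dataProvider, OutputStream outputStream) throws ExportException {
        try {
            // Fetch the dataset metadata in Dataverse's native JSON format and write it
            // back out unchanged; a real Exporter would build its own format from it here
            String json = dataProvider.getDatasetJson().toString();
            outputStream.write(json.getBytes(java.nio.charset.StandardCharsets.UTF_8));
            outputStream.flush();
        } catch (java.io.IOException e) {
            // Tell Dataverse that this format could not be produced for this dataset
            throw new ExportException("myjson export failed: " + e.getMessage());
        }
    }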

Building an Exporter
--------------------

The example at https://github.com/gdcc/dataverse-exporters provides a Maven pom.xml file suitable for building an Exporter JAR file, and that repository provides additional development guidance.

There are four dependencies needed to build an Exporter:

- ``io.gdcc dataverse-spi`` library containing the interfaces discussed above and the ExportException class
- ``com.google.auto.service auto-service``, which provides the @AutoService annotation
- ``jakarta.json jakarta.json-api`` for JSON classes
- ``jakarta.ws.rs jakarta.ws.rs-api``, which provides a MediaType enumeration for specifying mime types.

Specifying a Prerequisite Export
--------------------------------

An advanced feature of the Exporter mechanism allows a new Exporter to specify that it requires, as input,
the output of another Exporter. An example of this is the built-in HTMLExporter, which requires the output
of the DDI XML Exporter to produce an HTML document with the same DDI content.

This is configured by providing the metadata format name via the ``Exporter.getPrerequisiteFormatName()`` method.
When this method returns a non-empty format name, Dataverse will provide the requested format to the Exporter via
the ``ExportDataProvider.getPrerequisiteInputStream()`` method.
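
A sketch of how these two methods might be used together is given below, assuming the built-in DDI format is registered under the name ``ddi`` and that ``getPrerequisiteFormatName()`` returns an ``Optional<String>`` (a class fragment; imports of ``java.io.InputStream``, ``java.io.IOException``, and ``java.util.Optional`` are omitted)::

    @Override
    public Optional<String> getPrerequisiteFormatName() {
        // Ask Dataverse to run the DDI Exporter first and pass its output to this Exporter
        return Optional.of("ddi");
    }

    @Override
    public void exportDataset(ExportDataProvider dataProvider, OutputStream outputStream) throws ExportException {
        InputStream ddi = dataProvider.getPrerequisiteInputStream()
                .orElseThrow(() -> new ExportException("Prerequisite DDI export was not provided"));
        try (ddi) {
            // Transform the DDI XML into this Exporter's own format here;
            // this sketch just copies it through unchanged
            ddi.transferTo(outputStream);
        } catch (IOException e) {
            throw new ExportException("Transforming the DDI input failed: " + e.getMessage());
        }
    }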

Developers and administrators deploying Exporters using this mechanism should be aware that, since metadata formats
can be changed by other Exporters, the InputStream received may not hold the expected metadata. Developers should clearly
document their compatibility with the built-in or third-party Exporters they support as prerequisites.
26 changes: 26 additions & 0 deletions doc/sphinx-guides/source/installation/advanced.rst
@@ -115,3 +115,29 @@ To activate in your Dataverse installation::

curl -X PUT -d '/cgi-bin/zipdownload' http://localhost:8080/api/admin/settings/:CustomZipDownloadServiceUrl

.. _external-exporters:

Installing External Metadata Exporters
++++++++++++++++++++++++++++++++++++++

As of Dataverse Software 5.14, Dataverse supports the use of external Exporters as a way to add additional metadata
export formats to Dataverse or replace the built-in formats. This should be considered an **experimental** capability
in that the mechanism is expected to evolve and using it may require additional effort when upgrading to new Dataverse
versions.

This capability is enabled by specifying a directory in which Dataverse should look for third-party Exporters. See
:ref:`dataverse.spi.exporters.directory`.

See :doc:`/developers/metadataexport` for details about how to develop new Exporters.

A minimal example Exporter is available at https://github.com/gdcc/dataverse-exporters. The community is encouraged to
add additional exporters (and/or links to exporters elsewhere) to this repository. Once you have downloaded the
dataverse-spi-export-examples-1.0.0.jar (or other exporter jar), installed it in the directory specified above, and
restarted your Payara server, the new exporter should be available.

The example dataverse-spi-export-examples-1.0.0.jar replaces the ``JSON`` export with a ``MyJSON in <locale>`` version
that just wraps the existing JSON export object in a new JSON object with the key ``inputJson`` containing the original
JSON. (Note that the ``MyJSON in <locale>`` label will appear in the dataset Metadata Export download menu immediately,
but the content for already published datasets will only be updated after you delete the cached exports and/or use a
reExport API call (see :ref:`batch-exports-through-the-api`).)

20 changes: 15 additions & 5 deletions doc/sphinx-guides/source/installation/config.rst
@@ -2419,6 +2419,19 @@ Defaults to ``false``.
Can also be set via any `supported MicroProfile Config API source`_, e.g. the environment variable
``DATAVERSE_UI_SHOW_VALIDITY_FILTER``. Will accept ``[tT][rR][uU][eE]|1|[oO][nN]`` as "true" expressions.

.. _dataverse.spi.exporters.directory:

dataverse.spi.exporters.directory
+++++++++++++++++++++++++++++++++

This JVM option is used to configure the file system path where external Exporter JARs can be placed. See :ref:`external-exporters` for more information.

``./asadmin create-jvm-options '-Ddataverse.spi.exporters.directory=PATH_LOCATION_HERE'``

If this value is set, Dataverse will examine all JARs in the specified directory and will use them to add, or replace existing, metadata export formats.
If this value is not set (the default), Dataverse will not use external Exporters.

Can also be set via *MicroProfile Config API* sources, e.g. the environment variable ``DATAVERSE_SPI_EXPORTERS_DIRECTORY``.

.. _feature-flags:

@@ -2440,9 +2453,6 @@ please find all known feature flags below. Any of these flags can be activated u
* - api-session-auth
- Enables API authentication via session cookie (JSESSIONID). **Caution: Enabling this feature flag exposes the installation to CSRF risks!** We expect this feature flag to be temporary (only used by frontend developers, see `#9063 <https://github.com/IQSS/dataverse/issues/9063>`_) and removed once support for bearer tokens has been implemented (see `#9229 <https://github.com/IQSS/dataverse/issues/9229>`_).
- ``Off``
* - api-bearer-auth
- Enables API authentication via Bearer Token for OIDC User Accounts. **Information: This feature works only for OIDC UserAccounts!**
- ``Off``

**Note:** Feature flags can be set via any `supported MicroProfile Config API source`_, e.g. the environment variable
``DATAVERSE_FEATURE_XXX`` (e.g. ``DATAVERSE_FEATURE_API_SESSION_AUTH=1``). These environment variables can be set in your shell before starting Payara. If you are using :doc:`Docker for development </container/dev-usage>`, you can set them in the `docker compose <https://docs.docker.com/compose/environment-variables/set-environment-variables/>`_ file.
@@ -3889,7 +3899,7 @@ To use the current GDCC version directly:
:CategoryOrder
++++++++++++++

A comma separated list of Category/Tag names defining the order in which files with those tags should be displayed.
A comma separated list of Category/Tag names defining the order in which files with those tags should be displayed.
The setting can include custom tag names along with the pre-defined tags(Documentation, Data, and Code are the defaults but the :ref:`:FileCategories` setting can be used to use a different set of tags).
The default is category ordering disabled.

@@ -3901,7 +3911,7 @@ A true(default)/false option determining whether datafiles listed on the dataset
:AllowUserManagementOfOrder
+++++++++++++++++++++++++++

A true/false (default) option determining whether the dataset datafile table display includes checkboxes enabling users to turn folder ordering and/or category ordering (if an order is defined by :CategoryOrder) on and off dynamically.
A true/false (default) option determining whether the dataset datafile table display includes checkboxes enabling users to turn folder ordering and/or category ordering (if an order is defined by :CategoryOrder) on and off dynamically.

.. _supported MicroProfile Config API source: https://docs.payara.fish/community/docs/Technical%20Documentation/MicroProfile/Config/Overview.html

50 changes: 50 additions & 0 deletions modules/dataverse-parent/pom.xml
@@ -14,6 +14,7 @@
<module>../../pom.xml</module>
<module>../../scripts/zipdownload</module>
<module>../container-base</module>
<module>../dataverse-spi</module>
</modules>

<!-- Transitive dependencies, bigger library "bill of materials" (BOM) and
@@ -186,7 +187,16 @@
<maven-surefire-plugin.version>3.0.0-M5</maven-surefire-plugin.version>
<maven-failsafe-plugin.version>3.0.0-M5</maven-failsafe-plugin.version>
<maven-assembly-plugin.version>3.3.0</maven-assembly-plugin.version>
<maven-release-plugin.version>3.0.0-M7</maven-release-plugin.version>
<maven-gpg-plugin.version>3.0.1</maven-gpg-plugin.version>
<maven-site-plugin.version>4.0.0-M4</maven-site-plugin.version>
<maven-source-plugin.version>3.2.1</maven-source-plugin.version>
<maven-javadoc-plugin.version>3.4.1</maven-javadoc-plugin.version>
<maven-flatten-plugin.version>1.3.0</maven-flatten-plugin.version>

<maven-checkstyle-plugin.version>3.1.2</maven-checkstyle-plugin.version>
<nexus-staging-plugin.version>1.6.13</nexus-staging-plugin.version>
<pomchecker-maven-plugin.version>1.9.0</pomchecker-maven-plugin.version>

<!-- Container related -->
<fabric8-dmp.version>0.42.1</fabric8-dmp.version>
@@ -262,6 +272,46 @@
<artifactId>docker-maven-plugin</artifactId>
<version>${fabric8-dmp.version}</version>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-site-plugin</artifactId>
<version>${maven-site-plugin.version}</version>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-source-plugin</artifactId>
<version>${maven-source-plugin.version}</version>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<version>${maven-javadoc-plugin.version}</version>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-gpg-plugin</artifactId>
<version>${maven-gpg-plugin.version}</version>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>flatten-maven-plugin</artifactId>
<version>${maven-flatten-plugin.version}</version>
</plugin>
<plugin>
<groupId>org.kordamp.maven</groupId>
<artifactId>pomchecker-maven-plugin</artifactId>
<version>${pomchecker-maven-plugin.version}</version>
</plugin>
<plugin>
<groupId>org.sonatype.plugins</groupId>
<artifactId>nexus-staging-maven-plugin</artifactId>
<version>${nexus-staging-plugin.version}</version>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-release-plugin</artifactId>
<version>${maven-release-plugin.version}</version>
</plugin>
</plugins>
</pluginManagement>
</build>