diff --git a/.gitignore b/.gitignore index 36a8cec089b..b734f6da787 100644 --- a/.gitignore +++ b/.gitignore @@ -12,3 +12,5 @@ michael-local ehthumbs.db Thumbs.db .vagrant +*.pyc +scripts/api/py_api_wrapper/demo-data/* \ No newline at end of file diff --git a/doc/Sphinx/source/API/data-deposit.rst b/doc/Sphinx/source/API/data-deposit.rst new file mode 100644 index 00000000000..aed5456ed0d --- /dev/null +++ b/doc/Sphinx/source/API/data-deposit.rst @@ -0,0 +1,8 @@ +Data Deposit API +================ + +Please note that while Dataverse Network (DVN) 3.6 and higher support a +new Data Deposit API documented at +http://thedata.harvard.edu/guides/dataverse-api-main.html#data-deposit-api +the API has not yet been added to Dataverse 4.0 Beta. That work will be +completed in https://redmine.hmdc.harvard.edu/issues/3385 diff --git a/doc/Sphinx/source/API/dataverse-apis.rst b/doc/Sphinx/source/API/data-sharing.rst similarity index 91% rename from doc/Sphinx/source/API/dataverse-apis.rst rename to doc/Sphinx/source/API/data-sharing.rst index ebd485bdc73..6e93cc438cf 100644 --- a/doc/Sphinx/source/API/dataverse-apis.rst +++ b/doc/Sphinx/source/API/data-sharing.rst @@ -1,13 +1,5 @@ -Dataverse Network APIs -++++++++++++++++++++++++ - -We strongly encourage anyone interested in building tools to -interoperate with the Dataverse Network to utilize our open source -APIs. Please visit our `website `__ for -examples of external apps that have been built to work with our APIs. - Data Sharing API -================== +================ As of version 3.0, a new API for programmatic access to the DVN data and metadata has been added. The API allows a remote, non-DVN @@ -307,17 +299,3 @@ type, file name and such) | Authorization header supplied, but the authenticated user is not | authorized to directly access the object protected by Access | Permissions and/or Access Restrictions (“Terms of Use”). - - -Data Deposit API -=========== - -As of version 3.6, a new API for programmatic deposit of data and metadata to the Dataverse Network will be added. The API will allow a remote, non-Dataverse Network archive/application to deposit files and metadata to a Dataverse Network installation. - -The latest information on this plugin is available here: -`https://redmine.hmdc.harvard.edu/issues/3108 `__ - - - - - diff --git a/doc/Sphinx/source/API/index.rst b/doc/Sphinx/source/API/index.rst index ad9a2913a80..ff372eb2e18 100644 --- a/doc/Sphinx/source/API/index.rst +++ b/doc/Sphinx/source/API/index.rst @@ -4,11 +4,17 @@ contain the root `toctree` directive. API Guide -======================================================= +========= + +We strongly encourage anyone interested in building tools to +interoperate with the Dataverse Network to utilize our open source +APIs. Please visit http://datascience.iq.harvard.edu/collaborations for +examples of external apps that have been built to work with our APIs. Contents: ..
toctree:: :maxdepth: 2 - dataverse-apis + data-sharing + data-deposit diff --git a/doc/Sphinx/source/Developers/index.rst b/doc/Sphinx/source/Developers/index.rst index 501f3aadfec..9e547cd547d 100644 --- a/doc/Sphinx/source/Developers/index.rst +++ b/doc/Sphinx/source/Developers/index.rst @@ -12,5 +12,6 @@ Contents: :maxdepth: 2 dev-main + tools diff --git a/doc/Sphinx/source/Developers/tools.rst b/doc/Sphinx/source/Developers/tools.rst new file mode 100644 index 00000000000..9d245e38d76 --- /dev/null +++ b/doc/Sphinx/source/Developers/tools.rst @@ -0,0 +1,20 @@ +===== +Tools +===== + +PageKite +++++++++ + +PageKite is a fantastic service that can be used to share your +local development environment over the Internet on a public IP address. + +With PageKite running on your laptop, the world can access a URL such as +http://pdurbin.pagekite.me to see what you see at http://localhost:8080. + +Sign up at https://pagekite.net and follow the installation instructions, or simply download https://pagekite.net/pk/pagekite.py. + +The first time you run ``./pagekite.py`` a file at ``~/.pagekite.rc`` will be +created. You can edit this file to configure PageKite to serve up port 8080 +(the default GlassFish HTTP port) or the port of your choosing. + +According to https://pagekite.net/support/free-for-foss/ PageKite (very generously!) offers free accounts to developers writing software that meets the Open Source Definition (http://opensource.org/docs/definition.php), such as Dataverse. diff --git a/doc/Sphinx/source/User/appendix.rst b/doc/Sphinx/source/User/appendix.rst index 8c16e9bcd2f..eabc8c3fdf2 100644 --- a/doc/Sphinx/source/User/appendix.rst +++ b/doc/Sphinx/source/User/appendix.rst @@ -16,39 +16,12 @@ currently proposed metadata fields for 4.0 (per metadata block): - `Citation Metadata `__ (compliant with `DDI 2.5 `__ and `DataCite 3.0 `__) - `Social Science & Humanities Metadata (DDI 2.5 compliant) `__ - `Astronomy and Astrophysics Metadata `__ - : These metadata elements can be mapped/exported to the International Virtual Observatory Alliance’s (IVOA) Resource Metadata for the Virtual Observatory (VOResource Schema format) and based on `Virtual Observatory (VO) Discovery and Provenance Metadata `__) + : These metadata elements can be mapped/exported to the International Virtual Observatory Alliance’s (IVOA) + `VOResource Schema format `__ and are based on + `Virtual Observatory (VO) Discovery and Provenance Metadata `__ - `Biomedical Metadata `__ (based on `ISA-Tab `__ and `Stem Cell Commons `__) -DDI 2.5 -------- -The Dataverse metadata is compliant with the `DDI schema -version 2.5 `__. The Metadata fields in the Citation and Social Science blocks -associated with each Dataset contain most of the fields -in the study description section of the DDI. That way the Dataverse -metadata can be mapped easily to a DDI, and be exported into XML -format for preservation and interoperability. - -DataCite Schema 3.0 -------------------- - -Dataverse metadata is compliant with the `DataCite Metadata Schema v.3.0 `__ requirements. That way the Dataverse -metadata can be mapped easily to a DataCite, and be exported into XML -format for preservation and interoperability. - -DCMI Terms ----------- - -Dataverse metadata is compliant with `Dublin Core Metadata Initiative Terms `__ (DCMI Terms) requirements. That way the Dataverse -metadata can be mapped easily to DCMI Terms, and be exported into XML -format for preservation and interoperability.
- - -FGDC/CSDGM (imports) --------------------- - -For imports only, Dataverse data is compliant with the `Content Standard -for Digital Geospatial Metadata (CSDGM), Vers. 2 (FGDC-STD-001-1998) `__ (FGDC). diff --git a/pom.xml b/pom.xml index dfda62dd6ff..82d3ddc7250 100644 --- a/pom.xml +++ b/pom.xml @@ -61,7 +61,46 @@ <version>3.0.1</version> <type>jar</type> </dependency> - + <dependency> + <groupId>org.swordapp</groupId> + <artifactId>sword2-server</artifactId> + <version>1.0</version> + <type>jar</type> + <classifier>classes</classifier> + </dependency> + + <dependency> + <groupId>org.swordapp</groupId> + <artifactId>sword2-server</artifactId> + <version>1.0</version> + <type>war</type> + </dependency> + + <dependency> + <groupId>org.apache.abdera</groupId> + <artifactId>abdera-core</artifactId> + <version>1.1.3</version> + </dependency> + + <dependency> + <groupId>org.apache.abdera</groupId> + <artifactId>abdera-parser</artifactId> + <version>1.1.3</version> + </dependency> + + <dependency> + <groupId>commons-fileupload</groupId> + <artifactId>commons-fileupload</artifactId> + <version>1.3.1</version> + </dependency> + + <dependency> + <groupId>xom</groupId> + <artifactId>xom</artifactId> + <version>1.1</version> + </dependency> + <dependency> <groupId>javax</groupId> <artifactId>javaee-api</artifactId> <version>7.0</version> diff --git a/scripts/api/data-deposit/create-dataset b/scripts/api/data-deposit/create-dataset new file mode 100755 index 00000000000..52096569541 --- /dev/null +++ b/scripts/api/data-deposit/create-dataset @@ -0,0 +1,7 @@ +#!/bin/bash +USERNAME=pete +PASSWORD=pete +DVN_SERVER=localhost:8181 +DATAVERSE_ALIAS=peteTop +curl --insecure --data-binary "@scripts/api/data-deposit/data/atom-entry-study.xml" -H "Content-Type: application/atom+xml" https://$USERNAME:$PASSWORD@$DVN_SERVER/dvn/api/data-deposit/v1/swordv2/collection/dataverse/$DATAVERSE_ALIAS \ | xmllint -format - diff --git a/scripts/api/data-deposit/data/atom-entry-study.xml b/scripts/api/data-deposit/data/atom-entry-study.xml new file mode 100644 index 00000000000..8adfb574188 --- /dev/null +++ b/scripts/api/data-deposit/data/atom-entry-study.xml @@ -0,0 +1,41 @@ +<?xml version="1.0"?> +<entry xmlns="http://www.w3.org/2005/Atom" + xmlns:dcterms="http://purl.org/dc/terms/"> + <dcterms:title>Roasting at Home</dcterms:title> + <dcterms:creator>Peets, John</dcterms:creator> + <dcterms:creator>Stumptown, Jane</dcterms:creator> + + <dcterms:publisher>Coffee Bean State University</dcterms:publisher> + + <dcterms:isReferencedBy>Peets, J., &amp; Stumptown, J. (2013). Roasting at Home. New England Journal of Coffee, 3(1), 22-34.</dcterms:isReferencedBy> + + <dcterms:date>2013-07-11</dcterms:date> + + + <dcterms:description>Considerations before you start roasting your own coffee at home.</dcterms:description> + + <dcterms:subject>coffee</dcterms:subject> + <dcterms:subject>beverage</dcterms:subject> + <dcterms:subject>caffeine</dcterms:subject> + + <dcterms:coverage>United States</dcterms:coverage> + <dcterms:coverage>Canada</dcterms:coverage> + + <dcterms:type>aggregate data</dcterms:type> + + <dcterms:relation>Stumptown, Jane. 2011. Home Roasting. Coffeemill Press.</dcterms:relation> + + <dcterms:license>Creative Commons CC-BY 3.0 (unported) http://creativecommons.org/licenses/by/3.0/</dcterms:license> + + <dcterms:references>Peets, John. 2010. Roasting Coffee at the Coffee Shop. Coffeemill Press</dcterms:references> +</entry> diff --git a/scripts/api/data-deposit/data/example.zip b/scripts/api/data-deposit/data/example.zip new file mode 100644 index 00000000000..8870dd7d8c7 Binary files /dev/null and b/scripts/api/data-deposit/data/example.zip differ diff --git a/scripts/api/data-deposit/delete-dataset b/scripts/api/data-deposit/delete-dataset new file mode 100755 index 00000000000..bcbee607f5f --- /dev/null +++ b/scripts/api/data-deposit/delete-dataset @@ -0,0 +1,9 @@ +#!/bin/bash -x +USERNAME=pete +PASSWORD=pete +DVN_SERVER=localhost:8181 +#GLOBAL_ID=hdl:TEST/12345 +GLOBAL_ID=`scripts/api/data-deposit/list-datasets | xpath '//id/text()' | cut -d'/' -f11,12` +#curl --insecure -X DELETE https://$DVN_SERVER/api/datasets/$DATABASE_ID?key=$USERNAME +curl --insecure -i -X DELETE https://$USERNAME:$PASSWORD@$DVN_SERVER/dvn/api/data-deposit/v1/swordv2/edit/study/$GLOBAL_ID +#| xmllint -format - diff --git a/scripts/api/data-deposit/list-datasets b/scripts/api/data-deposit/list-datasets new file mode 100755 index 00000000000..91fb9dc4f8d --- /dev/null +++ b/scripts/api/data-deposit/list-datasets @@ -0,0 +1,7 @@ +#!/bin/bash +USERNAME=pete +PASSWORD=pete +DVN_SERVER=localhost:8181 +DATAVERSE_ALIAS=peteTop +curl --insecure https://$USERNAME:$PASSWORD@$DVN_SERVER/dvn/api/data-deposit/v1/swordv2/collection/dataverse/$DATAVERSE_ALIAS \ | xmllint -format - diff --git a/scripts/api/data-deposit/service-document b/scripts/api/data-deposit/service-document new file mode 100755 index 00000000000..87ccdf8c18d --- /dev/null +++ b/scripts/api/data-deposit/service-document @@ -0,0 +1,2 @@ +#!/bin/bash +curl --insecure https://pete:pete@localhost:8181/dvn/api/data-deposit/v1/swordv2/service-document | xmllint -format - diff --git a/scripts/api/data-deposit/show-atom-entry b/scripts/api/data-deposit/show-atom-entry new file mode 100755 index 00000000000..bd0c588213d --- /dev/null +++ b/scripts/api/data-deposit/show-atom-entry @@ -0,0 +1,8 @@ +#!/bin/sh +USERNAME=pete +PASSWORD=pete +DVN_SERVER=localhost:8181 +#GLOBAL_ID=hdl:TEST/12345 +GLOBAL_ID=`scripts/api/data-deposit/list-datasets | xpath '//id/text()' | cut -d'/' -f11,12` +curl --insecure -s https://$USERNAME:$PASSWORD@$DVN_SERVER/dvn/api/data-deposit/v1/swordv2/edit/study/$GLOBAL_ID \ | xmllint -format - diff --git a/scripts/api/data-deposit/show-files b/scripts/api/data-deposit/show-files new file mode 100755 index 00000000000..42dd477b434 --- /dev/null +++ b/scripts/api/data-deposit/show-files @@ -0,0 +1,3 @@ +#!/bin/sh +#scripts/api/data-deposit/show-statement | xpath "//entry/content/@*[name()='type' or name()='src']" +scripts/api/data-deposit/show-statement | xpath '//entry/id/text()' | cut -d'/' -f11,12 diff --git a/scripts/api/data-deposit/show-statement b/scripts/api/data-deposit/show-statement new file mode 100755 index 00000000000..9648c18df9b --- /dev/null +++ b/scripts/api/data-deposit/show-statement @@ -0,0 +1,8 @@ +#!/bin/sh +USERNAME=pete +PASSWORD=pete +DVN_SERVER=localhost:8181 +GLOBAL_ID=`scripts/api/data-deposit/list-datasets | xpath '//id/text()' | cut -d'/' -f11,12` +curl --insecure -s https://$USERNAME:$PASSWORD@$DVN_SERVER/dvn/api/data-deposit/v1/swordv2/statement/study/$GLOBAL_ID \ | xmllint -format - \ +#| xpath '//entry/title' diff --git a/scripts/api/data-deposit/upload-file b/scripts/api/data-deposit/upload-file new file mode 100755 index 00000000000..935f8a82fcd --- /dev/null +++ b/scripts/api/data-deposit/upload-file @@ -0,0 +1,7 @@ +#!/bin/sh +USERNAME=pete +PASSWORD=pete +DVN_SERVER=localhost:8181
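+# Note on the GLOBAL_ID line below: list-datasets prints the collection feed, xpath pulls out the +# <id> edit URIs, and `cut -d'/' -f11,12` keeps the last two slash-separated fields, so a URI like +# .../swordv2/edit/study/hdl:TEST/12345 yields the global id hdl:TEST/12345 (field positions assume +# the /dvn/api/data-deposit/v1/swordv2 base path used throughout these scripts).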
+GLOBAL_ID=`scripts/api/data-deposit/list-datasets | xpath '//id/text()' | cut -d'/' -f11,12` +curl -s --insecure --data-binary @scripts/api/data-deposit/data/example.zip -H "Content-Disposition: filename=example.zip" -H "Content-Type: application/zip" -H "Packaging: http://purl.org/net/sword/package/SimpleZip" https://$USERNAME:$PASSWORD@$DVN_SERVER/dvn/api/data-deposit/v1/swordv2/edit-media/study/$GLOBAL_ID \ +| xmllint -format - diff --git a/scripts/api/data/dataset-sample1.json b/scripts/api/data/dataset-sample1.json index 79d35126ab2..dd43fe4c394 100644 --- a/scripts/api/data/dataset-sample1.json +++ b/scripts/api/data/dataset-sample1.json @@ -1,48 +1,44 @@ { - "authority": "anAuthority", - "identifier": "dataset-one", - "protocol": "chadham-house-rule", - "initialVersion": { - "authors": [ - { - "displayOrder": 0, - "affiliation": { - "value": "IQSS" - }, - "name": { - "value": "Castro, Eleni" - } - } - ], - "UNF": "UNF", - "title": "Replication Data for: Building a Bridge Between Journal Articles and Research Data", - "distributionDate": "Distribution Date", - "productionDate": "Production Date", + "data": { "metadataBlocks": { "citation": { "fields": [ { - "value": "Replication Data for: Building a Bridge Between Journal Articles and Research Data", + "value": "sample dataset", "typeClass": "primitive", "multiple": false, "typeName": "title" }, { "value": [ - [ - { - "value": "Castro, Eleni", + { + "authorAffiliation": { + "value": "Top", "typeClass": "primitive", "multiple": false, - "typeName": "authorName" + "typeName": "authorAffiliation" }, - { - "value": "IQSS", + "authorName": { + "value": "Privileged, Pete", + "typeClass": "primitive", + "multiple": false, + "typeName": "authorName" + } + }, + { + "authorAffiliation": { + "value": "Coca-cola co", "typeClass": "primitive", "multiple": false, "typeName": "authorAffiliation" + }, + "authorName": { + "value": "Borrator, Colla", + "typeClass": "primitive", + "multiple": false, + "typeName": "authorName" } - ] + } ], "typeClass": "compound", "multiple": true, @@ -50,21 +46,23 @@ }, { "value": [ - "ecastro@fas.harvard.edu" + "pete@malinator.com" ], "typeClass": "primitive", "multiple": true, "typeName": "distributorContact" }, { - "value": "Research dataset for my publication on connecting journal articles and their underlying research data. 
Includes an analysis of current data publication practices.", + "value": "description description description description description description description description description ", "typeClass": "primitive", "multiple": false, - "typeName": "description" + "typeName": "dsDescription" }, { "value": [ - "data publication" + "Keyword1", + "KeywordTwo", + "TheThirdKeyWord" ], "typeClass": "primitive", "multiple": true, @@ -78,6 +76,80 @@ "multiple": true, "typeName": "subject" }, + { + "value": "notes notes notes notes notes notes notes notes notes notes notes ", + "typeClass": "primitive", + "multiple": false, + "typeName": "notesText" + }, + { + "value": [ + { + "otherIdAgency": { + "value": "otheridAgency", + "typeClass": "primitive", + "multiple": false, + "typeName": "otherIdAgency" + }, + "otherIdValue": { + "value": "otherid", + "typeClass": "primitive", + "multiple": false, + "typeName": "otherIdValue" + } + } + ], + "typeClass": "compound", + "multiple": true, + "typeName": "otherId" + }, + { + "value": "2014-04-01", + "typeClass": "primitive", + "multiple": false, + "typeName": "productionDate" + }, + { + "value": "2014-04-01", + "typeClass": "primitive", + "multiple": false, + "typeName": "productionPlace" + }, + { + "value": [ + { + "grantNumberAgency": { + "value": "NSF", + "typeClass": "primitive", + "multiple": false, + "typeName": "grantNumberAgency" + }, + "grantNumberValue": { + "value": "NSF12345", + "typeClass": "primitive", + "multiple": false, + "typeName": "grantNumberValue" + } + }, + { + "grantNumberAgency": { + "value": "NIH", + "typeClass": "primitive", + "multiple": false, + "typeName": "grantNumberAgency" + }, + "grantNumberValue": { + "value": "NIH99999", + "typeClass": "primitive", + "multiple": false, + "typeName": "grantNumberValue" + } + } + ], + "typeClass": "compound", + "multiple": true, + "typeName": "grantNumber" + }, { "value": "Privileged, Pete", "typeClass": "primitive", @@ -85,14 +157,54 @@ "typeName": "depositor" }, { - "value": "2014-05-01", + "value": "2014-05-06", "typeClass": "primitive", "multiple": false, "typeName": "dateOfDeposit" + }, + { + "value": [ + "Other reference number one", + "Other reference number two" + ], + "typeClass": "primitive", + "multiple": true, + "typeName": "otherReferences" } ], "displayName": "Citation Metadata" } - } - } -} \ No newline at end of file + }, + "authors": [ + { + "displayOrder": 0, + "affiliation": { + "value": null + }, + "name": { + "value": null + } + }, + { + "displayOrder": 0, + "affiliation": { + "value": null + }, + "name": { + "value": null + } + } + ], + "createTime": "2014-05-06 01:38:01 -04", + "UNF": "UNF", + "id": 4, + "version": 1, + "versionNumber": 1, + "versionMinorNumber": 0, + "versionState": "DRAFT", + "title": "sample dataset", + "distributionDate": "Distribution Date", + "productionDate": "Production Date" + }, + "status": "OK" +} diff --git a/scripts/api/py_api_wrapper/dataverse_api.py b/scripts/api/py_api_wrapper/dataverse_api.py new file mode 100644 index 00000000000..3ac375aeb24 --- /dev/null +++ b/scripts/api/py_api_wrapper/dataverse_api.py @@ -0,0 +1,241 @@ +""" +Use APIs from: +https://github.com/IQSS/dataverse/tree/master/scripts/api + +5/8/2013 - scratch work, examining API +""" +import urllib +import urllib2 +import json +from msg_util import * + +class DataverseAPILink: + """Used to test the Dataverse API described in github: + https://github.com/IQSS/dataverse/tree/master/scripts/api + """ + RETURN_MODE_STR = 'RETURN_MODE_STR' + RETURN_MODE_PYTHON = 
'RETURN_MODE_PYTHON' + HTTP_GET = 'GET' + HTTP_POST = 'POST' + HTTP_DELETE = 'DELETE' + HTTP_METHODS = [HTTP_GET, HTTP_POST, HTTP_DELETE] + + def __init__(self, server_name, use_https, apikey=None): + """ + :param server_name: e.g. dataverse.org, dvn-build.hmdc.harvard.edu, etc. + :type server_name: str + :param use_https: Use https for api calls? + :type use_https: boolean + """ + self.server_name = server_name + self.use_https = use_https + self.apikey = apikey + self.update_server_name() + self.return_mode = self.RETURN_MODE_STR + + def set_return_mode_python(self): + """API calls return JSON text response as a Python object + Uses json.loads(json_str) + """ + self.return_mode = self.RETURN_MODE_PYTHON + + def set_return_mode_string(self): + """API calls return JSON responses as a string""" + self.return_mode = self.RETURN_MODE_STR + + + def update_server_name(self): + if self.server_name is None: + raise Exception('Server name is None!') + + if self.server_name.endswith('/'): # cut trailing slash + self.server_name = self.server_name[:-1] + + server_name_pieces = self.server_name.split('//') + if len(server_name_pieces) > 1: + self.server_name = server_name_pieces[1] + + def get_server_name(self): + + if self.use_https: + return 'https://' + self.server_name + return 'http://' + self.server_name + + + def make_api_call(self, url_str, method, kwargs=None): + msg('url_str: [%s] method:[%s] kwargs:[%s]' % (url_str, method, kwargs)) + if url_str is None: + return None + if method not in self.HTTP_METHODS: + msgt('Error: Method not found: %s' % method) + return None + + request = urllib2.Request(url_str) + request.get_method = lambda:method # GET, POST, DELETE + if kwargs: + request.add_data(urllib.urlencode(kwargs)) + + response = urllib2.urlopen(request) + + print response.info() + + if self.return_mode == self.RETURN_MODE_PYTHON: + json_response = json.loads(response.read()) + response.close() + return json_response + + json_str = response.read() + response.close() + return json_str + + + def get_dataverse_metadata(self, dv_id=None): + """List all dataverses using GET http://{{SERVER}}/api/dvs + :param dv_id: dataverse id or None. None lists all dataverses + :type dv_id: int, None, ':root' + :return: JSON, dataverses metadata + """ + msgt('get_dataverse_metadata: [%s]' % dv_id) + url_str = self.get_server_name() + '/api/dvs' + if dv_id is not None: + url_str += '/%s' % dv_id + return self.make_api_call(url_str, self.HTTP_GET) + + def list_dataverses(self): + msgt('list_dataverses') + return self.get_dataverse_metadata() + + def get_root_dataverse_metadata(self): + msgt('get_root_dataverse_metadata') + return self.get_dataverse_metadata(':root') + + + def get_dataset_metadata(self, dataset_id, dataset_version=None): + """Get the metadata for a single dataset using GET http://{{SERVER}}/api/datasets/{{id}}?key={{apikey}} + @return: JSON, dataset metadata + """ + msgt('get_dataset_metadata') + if not self.apikey: + msg('Sorry! 
You need an api key!') + return + if dataset_version: + url_str = self.get_server_name() + '/api/datasets/%s/versions/%s/metadata' % (dataset_id, dataset_version) + else: + url_str = self.get_server_name() + '/api/datasets/%s' % (dataset_id) + + url_str = '%s?key=%s' % (url_str, self.apikey) + + return self.make_api_call(url_str, self.HTTP_GET) + + + def list_datasets(self): + """List all datasets using GET http://{{SERVER}}/api/datasets?key={{apikey}} + @return: JSON, list of datasets + """ + msgt('list_datasets') + if not self.apikey: + msg('Sorry! You need an api key!') + return + url_str = self.get_server_name() + '/api/datasets?key=%s' % self.apikey + return self.make_api_call(url_str, self.HTTP_GET) + + + + def delete_dataverse_by_id(self, id_val): + msgt('delete_dataverse_by_id: %s' % id_val) + url_str = self.get_server_name() + '/api/dvs/%s' % id_val + kwargs = { 'key': self.apikey } + return self.make_api_call(url_str, self.HTTP_DELETE, kwargs) + + def get_user_data(self, uid_or_username=None): + """Get metadata for a specific user + GET http://{{SERVER}}/api/users/{{uid}} + """ + msgt('get_user_data: %s' % uid_or_username) + url_str = self.get_server_name() + '/api/users/%s' % uid_or_username + return self.make_api_call(url_str, self.HTTP_GET) + + # USERS + def list_users(self): + """List users + GET http://{{SERVER}}/api/users + """ + msgt('list_users') + + url_str = self.get_server_name() + '/api/users' + return self.make_api_call(url_str, self.HTTP_GET) + + + ''' + def make_dataverse(self, username, dvn_info_as_dict, dv_id=None): + """Make a dataverse + POST http://{{SERVER}}/api/dvs?key={{username}} + """ + msgt('make_dataverse: [username:%s] [dv_id:%s]' % (username, dv_id)) + url_str = self.get_server_name() + '/api/dvs?key=%s' % username + #kwargs = {'key': username} + return self.make_api_call(url_str, self.HTTP_POST, dvn_info_as_dict) + ''' +def write_to_file(content, fname): + open(fname, 'w').write(content) + print 'file written: %s' % fname + +def save_current_metadata(server_name): + dat = DataverseAPILink(server_name, use_https=False, apikey='admin') + + output_dir = 'demo-data' + + dv_json = dat.list_dataverses() + write_to_file(dv_json, '%s/2014_0513_dataverses.json' % output_dir) + + # Currently, troubleshooting list datasets api which is giving a null pointer exception + + dataset_ids = """113:41 121:42 91:30 87:29 84:28 83:27 61:22 71:33 62:23 35:21 + 56:19 53:44 49:45 46:18 40:36 22:13 32:20 26:10 29:38""".split() + + geo_list = [] + for ds in dataset_ids: + did, vid = ds.split(':') + geo_meta = dat.get_dataset_metadata(did,':latest') + geo_list.append(geo_meta.strip()) + write_to_file(geo_meta, '%s/2014_0513_dataset_id_%s.json' % (output_dir, did)) + #geo_content = '\n'.join(geo_list) + + +if __name__=='__main__': + + #server_with_api = 'dvn-build.hmdc.harvard.edu' + server_with_api = 'dataverse-demo.iq.harvard.edu' + + save_current_metadata(server_with_api) + """ + dat = DataverseAPILink(server_with_api, use_https=False, apikey='pete') + json_text = dat.list_dataverses() + print json_text + + print dat.list_datasets() + """ + """ + + dat.set_return_mode_python() + d = dat.list_dataverses() # python dictionary {} + print d.keys() + dv_names = [dv_info.get('name', '?') for dv_info in d['data']] + print dv_names + """ + + """ + dat.set_return_mode_python() + user_info = dat.list_users() + user_ids = [info['id'] for info in user_info['data'] if info['id'] is not None] + for uid in user_ids: 
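+ # one request per user; after set_return_mode_python() each response is a + # dict shaped like {u'status': u'OK', u'data': {...}} rather than a JSON string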
print dat.get_user_data(uid) + """ + + #print dat.get_root_dataverse_metadata() + #print dat.get_dataverse_metadata(':root') + #print dat.list_datasets() + #print dat.get_dataset_metadata(113,':latest') + #print dat.make_dataverse('pete') + #print dat.delete_dataverse_by_id() + \ No newline at end of file diff --git a/scripts/api/py_api_wrapper/msg_util.py b/scripts/api/py_api_wrapper/msg_util.py new file mode 100644 index 00000000000..78606dff4cc --- /dev/null +++ b/scripts/api/py_api_wrapper/msg_util.py @@ -0,0 +1,4 @@ +def msg(s): print s +def dashes(): msg(40*'-') +def msgt(s): dashes(); msg(s); dashes() + diff --git a/scripts/api/py_api_wrapper/readme.md b/scripts/api/py_api_wrapper/readme.md new file mode 100644 index 00000000000..ac2fc5cc86a --- /dev/null +++ b/scripts/api/py_api_wrapper/readme.md @@ -0,0 +1,75 @@ +# Python API Wrapper Guide + +(in progress) + +This is a Python class, "DataverseAPILink", which may be used to make the API calls described in dataverse/scripts/api/readme.md. +Results of API calls may be returned as JSON (string format) or as Python dictionaries. + +Given a Dataverse server name and optional apikey, the class uses Python's basic urllib2 to make the API calls. +(The setup scripts in that directory generate the infamous users _Pete_, _Uma_, and _Gabbi_.) + + +## Quick example + +List the dataverses: + + server_with_api = 'dataverse-demo.iq.harvard.edu' + dat = DataverseAPILink(server_with_api, use_https=False, apikey='admin') + json_text = dat.list_dataverses() + print json_text + +Output: + + { + "status":"OK", + "data":[ + { + "id":93, + "alias":"b", + "name":"b", + "affiliation":"b", + "contactEmail":"b@b", + "permissionRoot":false, + "creator":{ + "id":13, + "firstName":"b", + "lastName":"b", + "userName":"b", + "affiliation":"b", + "position":"b", + "email":"b@b" + }, + "description":"b", + "ownerId":1, + "creationDate":"2014-05-12 02:38:36 -04" + }, + + (etc, etc) + +Return the same list as a python object: + + dat.set_return_mode_python() + d = dat.list_dataverses() # python dictionary {} + print d.keys() + dv_names = [dv_info.get('name', '?no name?') for dv_info in d['data']] + print dv_names + +Output: + + [u'status', u'data'] + [u'b', u'Beta Candidate', u'kc58', u'Kevin Smoke Test 5/8', u'Penultimate Smoke Test', u"Pete's public place", u"Pete's restricted data", u"Pete's secrets", u'Root', u'smoke 5/7', u'testadd', u'testauthor', u'Test Cliosed', u'Test Open', u'testpete', u'Top dataverse of Pete', u'Top dataverse of Uma', u"Uma's first", u"Uma's restricted"] + +### Users + +List Users: + + dat.set_return_mode_python() + user_info = dat.list_users() + print user_info + +Iterate through each user and pull the same data by 'id': + + user_ids = [info['id'] for info in user_info['data'] if info['id'] is not None] + for uid in user_ids: + print dat.get_user_data(uid) + diff --git a/scripts/installer/glassfish-setup.sh b/scripts/installer/glassfish-setup.sh index 513b329f275..7551f7acbb1 100755 --- a/scripts/installer/glassfish-setup.sh +++ b/scripts/installer/glassfish-setup.sh @@ -202,6 +202,8 @@ fi ./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddataverse.rserve.port=${RSERVE_PORT}" ./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddataverse.rserve.user=${RSERVE_USER}" ./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddataverse.rserve.password=${RSERVE_PASS}" +# Data Deposit API options +./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddataverse.datadeposit.hostname=${HOST_ADDRESS}" # enable comet support ./asadmin $ASADMIN_OPTS set
server-config.network-config.protocols.protocol.http-listener-1.http.comet-support-enabled="true" diff --git a/scripts/search/files b/scripts/search/files new file mode 100755 index 00000000000..361c9843458 --- /dev/null +++ b/scripts/search/files @@ -0,0 +1,3 @@ +#!/bin/sh +curl http://localhost:8080/api/index +curl -s 'http://localhost:8983/solr/collection1/select?rows=100&wt=json&indent=true&q=*&fq=dvtype:files' | jq '.response.docs[] | {name_sort, id, parentid}' diff --git a/src/main/java/edu/harvard/iq/dataverse/Dataset.java b/src/main/java/edu/harvard/iq/dataverse/Dataset.java index ee47d9d9c13..69b84f62c73 100644 --- a/src/main/java/edu/harvard/iq/dataverse/Dataset.java +++ b/src/main/java/edu/harvard/iq/dataverse/Dataset.java @@ -211,4 +211,8 @@ public T accept(Visitor v) { return v.visit(this); } + public String getGlobalId() { + return protocol + ":" + authority + "/" + getIdentifier(); + } + } diff --git a/src/main/java/edu/harvard/iq/dataverse/DatasetPage.java b/src/main/java/edu/harvard/iq/dataverse/DatasetPage.java index 600013712cc..9fb7fbbafcf 100644 --- a/src/main/java/edu/harvard/iq/dataverse/DatasetPage.java +++ b/src/main/java/edu/harvard/iq/dataverse/DatasetPage.java @@ -264,7 +264,7 @@ public void init() { } } } - FacesContext.getCurrentInstance().addMessage(null, new FacesMessage(FacesMessage.SEVERITY_INFO, "Add New Dataset", " - Enter metadata to create the dataset's citation. You can add more metadata about this dataset after it's created.")); + // FacesContext.getCurrentInstance().addMessage(null, new FacesMessage(FacesMessage.SEVERITY_INFO, "Add New Dataset", " - Enter metadata to create the dataset's citation. You can add more metadata about this dataset after it's created.")); displayVersion = editVersion; } else { throw new RuntimeException("On Dataset page without id or ownerid."); // improve error handling diff --git a/src/main/java/edu/harvard/iq/dataverse/DatasetRelPublication.java b/src/main/java/edu/harvard/iq/dataverse/DatasetRelPublication.java index 8c02352af5d..6bf55445d57 100644 --- a/src/main/java/edu/harvard/iq/dataverse/DatasetRelPublication.java +++ b/src/main/java/edu/harvard/iq/dataverse/DatasetRelPublication.java @@ -6,14 +6,6 @@ package edu.harvard.iq.dataverse; -import javax.persistence.Column; -import javax.persistence.Entity; -import javax.persistence.GeneratedValue; -import javax.persistence.GenerationType; -import javax.persistence.Id; -import javax.persistence.JoinColumn; -import javax.persistence.ManyToOne; -import javax.persistence.Version; /** * @@ -22,11 +14,13 @@ public class DatasetRelPublication { - @Column(columnDefinition = "TEXT") + private String text; private String idType; private String idNumber; private String url; + private String title; + private String description; private boolean replicationData; private int displayOrder; @@ -68,6 +62,23 @@ public String getUrl() { public void setUrl(String url) { this.url = url; } + + + public String getTitle() { + return title; + } + + public void setTitle(String title) { + this.title = title; + } + + public String getDescription() { + return description; + } + + public void setDescription(String description) { + this.description = description; + } public boolean isEmpty() { return ((text==null || text.trim().equals("")) diff --git a/src/main/java/edu/harvard/iq/dataverse/DatasetServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/DatasetServiceBean.java index ac843229ec3..5ffff921859 100644 --- a/src/main/java/edu/harvard/iq/dataverse/DatasetServiceBean.java +++ 
b/src/main/java/edu/harvard/iq/dataverse/DatasetServiceBean.java @@ -49,7 +49,23 @@ public void generateFileSystemName(DataFile dataFile) { dataFile.setFileSystemName(result.toString()); } - - + + /** + * @todo write this method for real. Don't just iterate through every single + * dataset! See https://redmine.hmdc.harvard.edu/issues/3988 + */ + public Dataset findByGlobalId(String globalId) { + Dataset foundDataset = null; + if (globalId != null) { + Query query = em.createQuery("select object(o) from Dataset as o order by o.id"); + List datasets = query.getResultList(); + for (Dataset dataset : datasets) { + if (globalId.equals(dataset.getGlobalId())) { + foundDataset = dataset; + } + } + } + return foundDataset; + } } diff --git a/src/main/java/edu/harvard/iq/dataverse/DatasetVersionUI.java b/src/main/java/edu/harvard/iq/dataverse/DatasetVersionUI.java index 9e715dd2ce8..9e2cf22a108 100644 --- a/src/main/java/edu/harvard/iq/dataverse/DatasetVersionUI.java +++ b/src/main/java/edu/harvard/iq/dataverse/DatasetVersionUI.java @@ -85,6 +85,8 @@ public DatasetVersionUI(DatasetVersion datasetVersion) { if (this.datasetRelPublications.isEmpty()) { for (DatasetFieldCompoundValue relPubVal : dsf.getDatasetFieldCompoundValues()) { DatasetRelPublication datasetRelPublication = new DatasetRelPublication(); + datasetRelPublication.setTitle(dsf.getDatasetFieldType().getTitle()); + datasetRelPublication.setDescription(dsf.getDatasetFieldType().getDescription()); for (DatasetField subField : relPubVal.getChildDatasetFields()) { if (subField.getDatasetFieldType().getName().equals(DatasetFieldConstant.publicationCitation)) { datasetRelPublication.setText(subField.getValue()); diff --git a/src/main/java/edu/harvard/iq/dataverse/Dataverse.java b/src/main/java/edu/harvard/iq/dataverse/Dataverse.java index 1e54d97faac..84f6e8ef967 100644 --- a/src/main/java/edu/harvard/iq/dataverse/Dataverse.java +++ b/src/main/java/edu/harvard/iq/dataverse/Dataverse.java @@ -82,6 +82,11 @@ public enum ImageFormat { SQUARE, RECTANGLE } @Enumerated(EnumType.STRING) private ImageFormat logoFormat; + + public enum Alignment { LEFT, CENTER, RIGHT } + @Enumerated(EnumType.STRING) + private Alignment logoAlignment; + private String logoBackgroundColor; private String logo; private String tagline; private String linkUrl; @@ -198,6 +203,22 @@ public void setLogoFormat(ImageFormat logoFormat) { this.logoFormat = logoFormat; } + public Alignment getLogoAlignment() { + return logoAlignment; + } + + public void setLogoAlignment(Alignment logoAlignment) { + this.logoAlignment = logoAlignment; + } + + public String getLogoBackgroundColor() { + return logoBackgroundColor; + } + + public void setLogoBackgroundColor(String logoBackgroundColor) { + this.logoBackgroundColor = logoBackgroundColor; + } + public String getLogo() { return logo; } diff --git a/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/CollectionDepositManagerImpl.java b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/CollectionDepositManagerImpl.java new file mode 100644 index 00000000000..6348d16e875 --- /dev/null +++ b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/CollectionDepositManagerImpl.java @@ -0,0 +1,220 @@ +package edu.harvard.iq.dataverse.api.datadeposit; + +import edu.harvard.iq.dataverse.Dataset; +import edu.harvard.iq.dataverse.DatasetField; +import edu.harvard.iq.dataverse.DatasetFieldServiceBean; +import edu.harvard.iq.dataverse.DatasetFieldType; +import edu.harvard.iq.dataverse.DatasetFieldValidator; +import 
edu.harvard.iq.dataverse.DatasetFieldValue; +import edu.harvard.iq.dataverse.DatasetVersion; +import edu.harvard.iq.dataverse.Dataverse; +import edu.harvard.iq.dataverse.DataverseServiceBean; +import edu.harvard.iq.dataverse.DataverseUser; +import edu.harvard.iq.dataverse.EjbDataverseEngine; +import edu.harvard.iq.dataverse.engine.command.impl.CreateDatasetCommand; +import java.io.File; +import java.io.IOException; +import java.util.ArrayList; +import java.util.List; +import java.util.Map; +import java.util.UUID; +import java.util.logging.Logger; +import javax.ejb.EJB; +import javax.inject.Inject; +import javax.validation.ConstraintViolation; +import javax.validation.ConstraintViolationException; +import org.apache.commons.io.FileUtils; +import org.swordapp.server.AuthCredentials; +import org.swordapp.server.CollectionDepositManager; +import org.swordapp.server.Deposit; +import org.swordapp.server.DepositReceipt; +import org.swordapp.server.SwordAuthException; +import org.swordapp.server.SwordConfiguration; +import org.swordapp.server.SwordError; +import org.swordapp.server.SwordServerException; +import org.swordapp.server.UriRegistry; + +public class CollectionDepositManagerImpl implements CollectionDepositManager { + + private static final Logger logger = Logger.getLogger(CollectionDepositManagerImpl.class.getCanonicalName()); + @EJB + DataverseServiceBean dataverseService; + @Inject + SwordAuth swordAuth; + @Inject + UrlManager urlManager; + @EJB + EjbDataverseEngine engineSvc; + @EJB + DatasetFieldServiceBean datasetFieldService; + + @Override + public DepositReceipt createNew(String collectionUri, Deposit deposit, AuthCredentials authCredentials, SwordConfiguration config) + throws SwordError, SwordServerException, SwordAuthException { + + DataverseUser dataverseUser = swordAuth.auth(authCredentials); + + urlManager.processUrl(collectionUri); + String dvAlias = urlManager.getTargetIdentifier(); + if (urlManager.getTargetType().equals("dataverse") && dvAlias != null) { + + logger.fine("attempting deposit into this dataverse alias: " + dvAlias); + + Dataverse dvThatWillOwnDataset = dataverseService.findByAlias(dvAlias); + + if (dvThatWillOwnDataset != null) { + + if (swordAuth.hasAccessToModifyDataverse(dataverseUser, dvThatWillOwnDataset)) { + + logger.fine("multipart: " + deposit.isMultipart()); + logger.fine("binary only: " + deposit.isBinaryOnly()); + logger.fine("entry only: " + deposit.isEntryOnly()); + logger.fine("in progress: " + deposit.isInProgress()); + logger.fine("metadata relevant: " + deposit.isMetadataRelevant()); + + if (deposit.isEntryOnly()) { + logger.fine("deposit XML received by createNew():\n" + deposit.getSwordEntry()); + // require title *and* exercise the SWORD jar a bit + Map<String, List<String>> dublinCore = deposit.getSwordEntry().getDublinCore(); + if (dublinCore.get("title") == null || dublinCore.get("title").get(0) == null || dublinCore.get("title").get(0).isEmpty()) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "title field is required"); + } + + if (dublinCore.get("date") != null) { + String date = dublinCore.get("date").get(0); + if (date != null) { + /** + * @todo re-enable this. use + * datasetFieldValidator.isValid? + */ +// boolean isValid = DateUtil.validateDate(date); +// if (!isValid) { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Invalid date: '" + date + "'. Valid formats are YYYY-MM-DD, YYYY-MM, or YYYY."); +// } + } + } + + /** + * @todo think about the implications of no longer using + * importStudy(), such as the comment below... do we
do we + * really need to write the XML to disk? + */ + // instead of writing a tmp file, maybe importStudy() could accept an InputStream? + String tmpDirectory = config.getTempDirectory(); + if (tmpDirectory == null) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Could not determine temp directory"); + } + String uploadDirPath = tmpDirectory + File.separator + "import" + File.separator + dvThatWillOwnDataset.getId(); + File uploadDir = new File(uploadDirPath); + if (!uploadDir.exists()) { + if (!uploadDir.mkdirs()) { + logger.info("couldn't create directory: " + uploadDir.getAbsolutePath()); + throw new SwordServerException("Couldn't create upload directory."); + } + } + String tmpFilePath = uploadDirPath + File.separator + "newStudyViaSwordv2.xml"; + File tmpFile = new File(tmpFilePath); + try { + FileUtils.writeStringToFile(tmpFile, deposit.getSwordEntry().getEntry().toString()); + } catch (IOException ex) { + logger.info("couldn't write temporary file: " + tmpFile.getAbsolutePath()); + throw new SwordServerException("Couldn't write temporary file"); + } finally { + uploadDir.delete(); + } + + /** + * @todo properly create a dataset and datasetVersion + */ + Dataset dataset = new Dataset(); + dataset.setOwner(dvThatWillOwnDataset); + /** + * @todo don't hard code these! See also saving a + * dataset in GUI changes globalId to hard coded value + * (doi:10.5072/FK2/5555) + * https://redmine.hmdc.harvard.edu/issues/3993 + */ + dataset.setProtocol("doi"); + dataset.setAuthority("myAuthority"); + dataset.setIdentifier(UUID.randomUUID().toString()); + + DatasetVersion newDatasetVersion = dataset.getVersions().get(0); + newDatasetVersion.setVersionState(DatasetVersion.VersionState.DRAFT); + + List datasetFields = new ArrayList<>(); + DatasetField titleDatasetField = new DatasetField(); + DatasetFieldType titleDatasetFieldType = datasetFieldService.findByName("title"); + titleDatasetField.setDatasetFieldType(titleDatasetFieldType); + List datasetFieldValues = new ArrayList<>(); + DatasetFieldValue titleDatasetFieldValue = new DatasetFieldValue(titleDatasetField, dublinCore.get("title").get(0)); + datasetFieldValues.add(titleDatasetFieldValue); + titleDatasetField.setDatasetFieldValues(datasetFieldValues); + datasetFields.add(titleDatasetField); + + newDatasetVersion.setDatasetFields(datasetFields); + + try { + // there is no importStudy method in 4.0 :( + // study = studyService.importStudy(tmpFile, dcmiTermsFormatId, dvThatWillOwnStudy.getId(), vdcUser.getId()); + engineSvc.submit(new CreateDatasetCommand(dataset, dataverseUser)); + } catch (Exception ex) { +// StringWriter stringWriter = new StringWriter(); +// ex.printStackTrace(new PrintWriter(stringWriter)); +// String stackTrace = stringWriter.toString(); + /** + * @todo in DVN 3.x we printed the whole stack trace + * here. Is that really necessary? 
+ */ +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Couldn't import study: " + stackTrace); + + Throwable cause = ex; + StringBuilder sb = new StringBuilder(); + sb.append(ex.getLocalizedMessage()); + while (cause.getCause() != null) { + cause = cause.getCause(); + if (cause instanceof ConstraintViolationException) { + ConstraintViolationException constraintViolationException = (ConstraintViolationException) cause; + for (ConstraintViolation<?> violation : constraintViolationException.getConstraintViolations()) { + sb.append(" Invalid value: <<<").append(violation.getInvalidValue()).append(">>> for ") + .append(violation.getPropertyPath()).append(" at ") + .append(violation.getLeafBean()).append(" - ") + .append(violation.getMessage()); + } + } + } + logger.info(sb.toString()); + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Couldn't create dataset: " + sb.toString()); + } finally { + tmpFile.delete(); + uploadDir.delete(); + } + ReceiptGenerator receiptGenerator = new ReceiptGenerator(); + String baseUrl = urlManager.getHostnamePlusBaseUrlPath(collectionUri); + DepositReceipt depositReceipt = receiptGenerator.createReceipt(baseUrl, dataset); + return depositReceipt; + } else if (deposit.isBinaryOnly()) { + // get here with this: + // curl --insecure -s --data-binary "@example.zip" -H "Content-Disposition: filename=example.zip" -H "Content-Type: application/zip" https://sword:sword@localhost:8181/dvn/api/data-deposit/v1/swordv2/collection/dataverse/sword/ + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Binary deposit to the collection IRI via POST is not supported. Please POST an Atom entry instead."); + } else if (deposit.isMultipart()) { + // get here with this: + // wget https://raw.github.com/swordapp/Simple-Sword-Server/master/tests/resources/multipart.dat + // curl --insecure --data-binary "@multipart.dat" -H 'Content-Type: multipart/related; boundary="===============0670350989=="' -H "MIME-Version: 1.0" https://sword:sword@localhost:8181/dvn/api/data-deposit/v1/swordv2/collection/dataverse/sword/hdl:1902.1/12345 + // but... 
+ // "Yeah, multipart is critically broken across all implementations" -- http://www.mail-archive.com/sword-app-tech@lists.sourceforge.net/msg00327.html + throw new UnsupportedOperationException("Not yet implemented"); + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "expected deposit types are isEntryOnly, isBinaryOnly, and isMultiPart"); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "user " + dataverseUser.getUserName() + " is not authorized to modify dataset"); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Could not find dataverse: " + dvAlias); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Could not determine target type or identifier from URL: " + collectionUri); + } + } + +} diff --git a/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/CollectionListManagerImpl.java b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/CollectionListManagerImpl.java new file mode 100644 index 00000000000..c6b43f9ed19 --- /dev/null +++ b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/CollectionListManagerImpl.java @@ -0,0 +1,92 @@ +package edu.harvard.iq.dataverse.api.datadeposit; + +import edu.harvard.iq.dataverse.Dataset; +import edu.harvard.iq.dataverse.DatasetServiceBean; +import edu.harvard.iq.dataverse.Dataverse; +import edu.harvard.iq.dataverse.DataverseServiceBean; +import edu.harvard.iq.dataverse.DataverseUser; +import java.util.ArrayList; +import java.util.List; +import java.util.logging.Logger; +import javax.ejb.EJB; +import javax.inject.Inject; +import javax.xml.namespace.QName; +import org.apache.abdera.Abdera; +import org.apache.abdera.i18n.iri.IRI; +import org.apache.abdera.model.Entry; +import org.apache.abdera.model.Feed; +import org.swordapp.server.AuthCredentials; +import org.swordapp.server.CollectionListManager; +import org.swordapp.server.SwordAuthException; +import org.swordapp.server.SwordConfiguration; +import org.swordapp.server.SwordError; +import org.swordapp.server.SwordServerException; +import org.swordapp.server.UriRegistry; + +public class CollectionListManagerImpl implements CollectionListManager { + + private static final Logger logger = Logger.getLogger(CollectionListManagerImpl.class.getCanonicalName()); + @EJB + DataverseServiceBean dataverseService; + @EJB + DatasetServiceBean datasetService; + @Inject + SwordAuth swordAuth; + @Inject + UrlManager urlManager; + + @Override + public Feed listCollectionContents(IRI iri, AuthCredentials authCredentials, SwordConfiguration swordConfiguration) throws SwordServerException, SwordAuthException, SwordError { + DataverseUser dataverseUser = swordAuth.auth(authCredentials); + + urlManager.processUrl(iri.toString()); + String dvAlias = urlManager.getTargetIdentifier(); + if (urlManager.getTargetType().equals("dataverse") && dvAlias != null) { + + Dataverse dv = dataverseService.findByAlias(dvAlias); + + if (dv != null) { + if (swordAuth.hasAccessToModifyDataverse(dataverseUser, dv)) { + Abdera abdera = new Abdera(); + Feed feed = abdera.newFeed(); + feed.setTitle(dv.getName()); + /** + * @todo how do I get a list of datasets belonging to a + * user? 
+ */ +// Collection<Study> studies = dv.getOwnedStudies(); + List childDvObjects = dataverseService.findByOwnerId(dv.getId()); + childDvObjects.addAll(datasetService.findByOwnerId(dv.getId())); + List<Dataset> studies = new ArrayList<>(); + for (Object object : childDvObjects) { + if (object instanceof Dataset) { + studies.add((Dataset) object); + } + } + String baseUrl = urlManager.getHostnamePlusBaseUrlPath(iri.toString()); + for (Dataset study : studies) { + String editUri = baseUrl + "/edit/study/" + study.getGlobalId(); + String editMediaUri = baseUrl + "/edit-media/study/" + study.getGlobalId(); + Entry entry = feed.addEntry(); + entry.setId(editUri); + entry.setTitle(study.getLatestVersion().getTitle()); + entry.setBaseUri(new IRI(editUri)); + entry.addLink(editMediaUri, "edit-media"); + feed.addEntry(entry); + } + Boolean dvHasBeenReleased = dv.isReleased(); + feed.addSimpleExtension(new QName(UriRegistry.SWORD_STATE, "dataverseHasBeenReleased"), dvHasBeenReleased.toString()); + return feed; + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "user " + dataverseUser.getUserName() + " is not authorized to list datasets in dataverse " + dv.getAlias()); + } + + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Could not find dataverse: " + dvAlias); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Couldn't determine target type or identifier from URL: " + iri); + } + } + +} diff --git a/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/ContainerManagerImpl.java b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/ContainerManagerImpl.java new file mode 100644 index 00000000000..cf1ee53e852 --- /dev/null +++ b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/ContainerManagerImpl.java @@ -0,0 +1,516 @@ +package edu.harvard.iq.dataverse.api.datadeposit; + +import edu.harvard.iq.dataverse.Dataset; +import edu.harvard.iq.dataverse.DatasetServiceBean; +import edu.harvard.iq.dataverse.DatasetVersion; +import edu.harvard.iq.dataverse.Dataverse; +import edu.harvard.iq.dataverse.DataverseUser; +import edu.harvard.iq.dataverse.EjbDataverseEngine; +import edu.harvard.iq.dataverse.IndexServiceBean; +import edu.harvard.iq.dataverse.engine.command.exception.CommandException; +import edu.harvard.iq.dataverse.engine.command.exception.CommandExecutionException; +import edu.harvard.iq.dataverse.engine.command.impl.DeleteDatasetCommand; +import edu.harvard.iq.dataverse.export.DDIExportServiceBean; +import java.io.File; +import java.util.List; +import java.util.Map; +import java.util.logging.Logger; +import javax.ejb.EJB; +import javax.ejb.EJBException; +import javax.inject.Inject; +import javax.naming.Context; +import javax.naming.InitialContext; +import javax.naming.NamingException; +import javax.persistence.EntityManager; +import javax.persistence.PersistenceContext; +import org.swordapp.server.AuthCredentials; +import org.swordapp.server.ContainerManager; +import org.swordapp.server.Deposit; +import org.swordapp.server.DepositReceipt; +import org.swordapp.server.SwordAuthException; +import org.swordapp.server.SwordConfiguration; +import org.swordapp.server.SwordError; +import org.swordapp.server.SwordServerException; +import org.swordapp.server.UriRegistry; + +public class ContainerManagerImpl implements ContainerManager { + + private static final Logger logger = Logger.getLogger(ContainerManagerImpl.class.getCanonicalName()); + + @EJB + protected EjbDataverseEngine engineSvc; + @EJB + DatasetServiceBean datasetService; + @EJB + IndexServiceBean 
indexService; + @PersistenceContext(unitName = "VDCNet-ejbPU") + EntityManager em; + @EJB + DDIExportServiceBean ddiService; + @Inject + SwordAuth swordAuth; + @Inject + UrlManager urlManager; +// SwordConfigurationImpl swordConfiguration = new SwordConfigurationImpl(); + + @Override + public DepositReceipt getEntry(String uri, Map<String, String> map, AuthCredentials authCredentials, SwordConfiguration swordConfiguration) throws SwordServerException, SwordError, SwordAuthException { + DataverseUser dataverseUser = swordAuth.auth(authCredentials); + logger.fine("getEntry called with url: " + uri); + urlManager.processUrl(uri); + String targetType = urlManager.getTargetType(); + if (!targetType.isEmpty()) { + logger.fine("operating on target type: " + urlManager.getTargetType()); + if ("study".equals(targetType)) { + String globalId = urlManager.getTargetIdentifier(); + Dataset dataset = null; + try { + dataset = datasetService.findByGlobalId(globalId); + } catch (EJBException ex) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Could not find study based on global id (" + globalId + ") in URL: " + uri); + } + if (dataset != null) { + Dataverse dvThatOwnsStudy = dataset.getOwner(); + if (swordAuth.hasAccessToModifyDataverse(dataverseUser, dvThatOwnsStudy)) { + ReceiptGenerator receiptGenerator = new ReceiptGenerator(); + String baseUrl = urlManager.getHostnamePlusBaseUrlPath(uri); + DepositReceipt depositReceipt = receiptGenerator.createReceipt(baseUrl, dataset); + if (depositReceipt != null) { + return depositReceipt; + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Could not generate deposit receipt."); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "User " + dataverseUser.getUserName() + " is not authorized to retrieve entry for " + dataset.getGlobalId()); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Could not find study based on URL: " + uri); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Unsupported target type (" + targetType + ") in URL: " + uri); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Unable to determine target type from URL: " + uri); + } + } + + @Override + public DepositReceipt replaceMetadata(String uri, Deposit deposit, AuthCredentials authCredentials, SwordConfiguration swordConfiguration) throws SwordError, SwordServerException, SwordAuthException { + DataverseUser vdcUser = swordAuth.auth(authCredentials); + logger.fine("replaceMetadata called with url: " + uri); + urlManager.processUrl(uri); + String targetType = urlManager.getTargetType(); + if (!targetType.isEmpty()) { + logger.fine("operating on target type: " + urlManager.getTargetType()); + if ("dataverse".equals(targetType)) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Metadata replace of dataverse is not supported."); + } else if ("study".equals(targetType)) { + logger.fine("replacing metadata for study"); + logger.fine("deposit XML received by replaceMetadata():\n" + deposit.getSwordEntry()); + + String globalId = urlManager.getTargetIdentifier(); + +// EditStudyService editStudyService; + Context ctx; + try { + ctx = new InitialContext(); +// editStudyService = (EditStudyService) ctx.lookup("java:comp/env/editStudy"); + } catch (NamingException ex) { + logger.info("problem looking up editStudyService"); + throw new SwordServerException("problem looking up editStudyService"); + } +// StudyServiceLocal studyService; + try { + ctx = new InitialContext(); +// studyService = (StudyServiceLocal) 
ctx.lookup("java:comp/env/studyService"); + } catch (NamingException ex) { + logger.info("problem looking up studyService"); + throw new SwordServerException("problem looking up studyService"); + } +// Study studyToLookup; + Dataset studyToLookup = null; + try { + /** + * @todo: why doesn't + * editStudyService.setStudyVersionByGlobalId(globalId) + * work? + */ +// studyToLookup = studyService.getStudyByGlobalId(globalId); + } catch (EJBException ex) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Could not find study based on global id (" + globalId + ") in URL: " + uri); + } + if (studyToLookup != null) { +// StudyLock lockOnStudyLookedup = studyToLookup.getStudyLock(); +// if (lockOnStudyLookedup != null) { +// String message = Util.getStudyLockMessage(lockOnStudyLookedup, studyToLookup.getGlobalId()); +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, message); +// } +// editStudyService.setStudyVersion(studyToLookup.getId()); +// Study studyToEdit = editStudyService.getStudyVersion().getStudy(); + Dataset studyToEdit = null; + Dataverse dvThatOwnsStudy = studyToEdit.getOwner(); + if (swordAuth.hasAccessToModifyDataverse(vdcUser, dvThatOwnsStudy)) { + Map<String, List<String>> dublinCore = deposit.getSwordEntry().getDublinCore(); + if (dublinCore.get("title") == null || dublinCore.get("title").get(0) == null || dublinCore.get("title").get(0).isEmpty()) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "title field is required"); + } + if (dublinCore.get("date") != null) { + String date = dublinCore.get("date").get(0); + if (date != null) { +// boolean isValid = DateUtil.validateDate(date); +// if (!isValid) { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Invalid date: '" + date + "'. Valid formats are YYYY-MM-DD, YYYY-MM, or YYYY."); +// } + } + } + String tmpDirectory = swordConfiguration.getTempDirectory(); + if (tmpDirectory == null) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Could not determine temp directory"); + } + String uploadDirPath = tmpDirectory + File.separator + "import" + File.separator + studyToEdit.getId(); + Long dcmiTermsHarvestFormatId = new Long(4); +// HarvestFormatType dcmiTermsHarvestFormatType = em.find(HarvestFormatType.class, dcmiTermsHarvestFormatId); + String xmlAtomEntry = deposit.getSwordEntry().getEntry().toString(); +// File ddiFile = studyService.transformToDDI(xmlAtomEntry, dcmiTermsHarvestFormatType.getStylesheetFileName(), uploadDirPath); + // erase all metadata before running ddiService.mapDDI() because + // for multivalued fields (such as author) that function appends + // values rather than replacing them + /** + * @todo how best to replace metadata on a dataset in + * 4.0? + */ +// studyToEdit.getLatestVersion().setMetadata(new Metadata()); +// ddiService.mapDDI(ddiFile, studyToEdit.getLatestVersion(), true); + try { +// editStudyService.save(dvThatOwnsStudy.getId(), vdcUser.getId()); + } catch (EJBException ex) { + // OptimisticLockException +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Unable to replace cataloging information for study " + studyToEdit.getGlobalId() + " (may be locked). 
Please try again later."); + } + ReceiptGenerator receiptGenerator = new ReceiptGenerator(); + String baseUrl = urlManager.getHostnamePlusBaseUrlPath(uri); + DepositReceipt depositReceipt = receiptGenerator.createReceipt(baseUrl, studyToEdit); + return depositReceipt; + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "User " + vdcUser.getUserName() + " is not authorized to modify dataverse " + dvThatOwnsStudy.getAlias()); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Could not find study based on global id (" + globalId + ") in URL: " + uri); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Unknown target type specified on which to replace metadata: " + uri); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "No target specified on which to replace metadata: " + uri); + } + } + + @Override + public DepositReceipt replaceMetadataAndMediaResource(String string, Deposit dpst, AuthCredentials ac, SwordConfiguration sc) throws SwordError, SwordServerException, SwordAuthException { + throw new UnsupportedOperationException("Not supported yet."); + } + + @Override + public DepositReceipt addMetadataAndResources(String string, Deposit dpst, AuthCredentials ac, SwordConfiguration sc) throws SwordError, SwordServerException, SwordAuthException { + throw new UnsupportedOperationException("Not supported yet."); + } + + @Override + public DepositReceipt addMetadata(String string, Deposit dpst, AuthCredentials ac, SwordConfiguration sc) throws SwordError, SwordServerException, SwordAuthException { + throw new UnsupportedOperationException("Not supported yet."); + } + + @Override + public DepositReceipt addResources(String string, Deposit dpst, AuthCredentials ac, SwordConfiguration sc) throws SwordError, SwordServerException, SwordAuthException { + throw new UnsupportedOperationException("Not supported yet."); + } + + @Override + public void deleteContainer(String uri, AuthCredentials authCredentials, SwordConfiguration sc) throws SwordError, SwordServerException, SwordAuthException { +// swordConfiguration = (SwordConfigurationImpl) sc; + DataverseUser vdcUser = swordAuth.auth(authCredentials); + logger.fine("deleteContainer called with url: " + uri); + urlManager.processUrl(uri); + logger.fine("original url: " + urlManager.getOriginalUrl()); + if (!"edit".equals(urlManager.getServlet())) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "edit servlet expected, not " + urlManager.getServlet()); + } + String targetType = urlManager.getTargetType(); + if (!targetType.isEmpty()) { + logger.fine("operating on target type: " + urlManager.getTargetType()); + +// StudyServiceLocal studyService; + Context ctx; + try { + ctx = new InitialContext(); +// studyService = (StudyServiceLocal) ctx.lookup("java:comp/env/studyService"); + } catch (NamingException ex) { + logger.info("problem looking up studyService"); + throw new SwordServerException("problem looking up studyService"); + } + + if ("dataverse".equals(targetType)) { + /** + * @todo throw SWORD error recommending use of 4.0 "native" API + * to delete dataverses + */ +// String dvAlias = urlManager.getTargetIdentifier(); +// List userVDCs = vdcService.getUserVDCs(vdcUser.getId()); +// VDC dataverseToEmpty = vdcService.findByAlias(dvAlias); +// if (dataverseToEmpty != null) { +// if ("Admin".equals(vdcUser.getNetworkRole().getName())) { +// if (swordConfiguration.allowNetworkAdminDeleteAllStudies()) { +// +// /** +// * @todo: this is the deleteContainer method... 
+// * should move this to some sort of "emptyContainer" +// * method +// */ +// // curl --insecure -s -X DELETE https://sword:sword@localhost:8181/dvn/api/data-deposit/v1/swordv2/edit/dataverse/sword +// Collection studies = dataverseToEmpty.getOwnedStudies(); +// for (Study study : studies) { +// logger.info("In dataverse " + dataverseToEmpty.getAlias() + " about to delete study id " + study.getId()); +// studyService.deleteStudy(study.getId()); +// } +// } else { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "DELETE on a dataverse is not supported"); +// } + +// } else { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Role was " + vdcUser.getNetworkRole().getName() + " but admin required."); +// } +// } else { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Couldn't find dataverse to delete from URL: " + uri); +// } + } else if ("study".equals(targetType)) { + String globalId = urlManager.getTargetIdentifier(); + logger.info("globalId: " + globalId); + if (globalId != null) { + Dataset study = null; + try { + study = datasetService.findByGlobalId(globalId); + } catch (EJBException ex) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Could not find study based on global id (" + globalId + ") in URL: " + uri); + } + if (study != null) { + Dataverse dvThatOwnsStudy = study.getOwner(); + if (swordAuth.hasAccessToModifyDataverse(vdcUser, dvThatOwnsStudy)) { + DatasetVersion.VersionState studyState = study.getLatestVersion().getVersionState(); + if (studyState.equals(DatasetVersion.VersionState.DRAFT)) { + logger.info("destroying working copy version of study " + study.getGlobalId()); + /** + * @todo in DVN 3.x we had a convenient + * destroyWorkingCopyVersion method but the + * DeleteDatasetCommand is pretty scary... what + * if a released study has a new draft version? + * What we need is a + * DeleteDatasetVersionCommand, I suppose... 
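+ * As a sketch only (DeleteDatasetVersionCommand is hypothetical + * and not part of this change), the call site might eventually + * look like: + * + * engineSvc.submit(new DeleteDatasetVersionCommand(study.getLatestVersion(), vdcUser)); + * + * destroying just the draft version and leaving any released + * versions of the dataset untouched.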
+ */ +// studyService.destroyWorkingCopyVersion(study.getLatestVersion().getId()); + try { + engineSvc.submit(new DeleteDatasetCommand(study, vdcUser)); + /** + * @todo re-index after deletion + * https://redmine.hmdc.harvard.edu/issues/3544#note-21 + */ + logger.info("dataset deleted"); + } catch (CommandExecutionException ex) { + // internal error + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Can't delete dataset: " + ex.getMessage()); + } catch (CommandException ex) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Can't delete dataset: " + ex.getMessage()); + } + /** + * @todo think about how to handle non-drafts + */ + } else if (studyState.equals(DatasetVersion.VersionState.RELEASED)) { +// logger.fine("deaccessioning latest version of study " + study.getGlobalId()); +// studyService.deaccessionStudy(study.getLatestVersion()); + } else if (studyState.equals(DatasetVersion.VersionState.DEACCESSIONED)) { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Latest version of study " + study.getGlobalId() + " has already been deaccessioned."); + } else if (studyState.equals(DatasetVersion.VersionState.ARCHIVED)) { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Latest version of study " + study.getGlobalId() + " has been archived and cannot be deleted or deaccessioned."); + } else if (studyState.equals(DatasetVersion.VersionState.IN_REVIEW)) { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Latest version of study " + study.getGlobalId() + " is in review and cannot be deleted or deaccessioned."); + } else { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Operation not valid for study " + study.getGlobalId() + " in state " + studyState); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "User " + vdcUser.getUserName() + " is not authorized to modify " + dvThatOwnsStudy.getAlias()); + } + } else { + throw new SwordError(404); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Could not find study to delete from URL: " + uri); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Unsupported delete target in URL: " + uri); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "No target for deletion specified"); + } + } + + @Override + public DepositReceipt useHeaders(String uri, Deposit deposit, AuthCredentials authCredentials, SwordConfiguration swordConfiguration) throws SwordError, SwordServerException, SwordAuthException { + logger.fine("uri was " + uri); + logger.fine("isInProgress:" + deposit.isInProgress()); + DataverseUser vdcUser = swordAuth.auth(authCredentials); + urlManager.processUrl(uri); + String targetType = urlManager.getTargetType(); + if (!targetType.isEmpty()) { + logger.fine("operating on target type: " + urlManager.getTargetType()); + if ("study".equals(targetType)) { + String globalId = urlManager.getTargetIdentifier(); + if (globalId != null) { + Dataset studyToRelease = null; + try { + studyToRelease = datasetService.findByGlobalId(globalId); + } catch (EJBException ex) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Could not find study based on global id (" + globalId + ") in URL: " + uri); + } + if (studyToRelease != null) { + Dataverse dvThatOwnsStudy = studyToRelease.getOwner(); + if (swordAuth.hasAccessToModifyDataverse(vdcUser, dvThatOwnsStudy)) { + if (!deposit.isInProgress()) { + /** + * We are considering a draft version of a study + * to be incomplete and are saying that sending + * isInProgress=false means the study 
version is + * complete and can be released. + * + * 9.2. Deposit Incomplete + * + * "If In-Progress is true, the server SHOULD + * expect the client to provide further updates + * to the item some undetermined time in the + * future. Details of how this is implemented is + * dependent on the server's purpose. For + * example, a repository system may hold items + * which are marked In-Progress in a workspace + * until such time as a client request indicates + * that the deposit is complete." -- + * http://swordapp.github.io/SWORDv2-Profile/SWORDProfile.html#continueddeposit_incomplete + */ + if (!studyToRelease.getLatestVersion().getVersionState().equals(DatasetVersion.VersionState.RELEASED)) { + /** + * @todo DVN 3.x had + * studyService.setReleased... what should + * we do in 4.0? + */ +// studyService.setReleased(studyToRelease.getId()); + ReceiptGenerator receiptGenerator = new ReceiptGenerator(); + String baseUrl = urlManager.getHostnamePlusBaseUrlPath(uri); + DepositReceipt depositReceipt = receiptGenerator.createReceipt(baseUrl, studyToRelease); + return depositReceipt; + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Latest version of dataset " + globalId + " has already been released."); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Pass 'In-Progress: false' header to release a study."); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "User " + vdcUser.getUserName() + " is not authorized to modify dataverse " + dvThatOwnsStudy.getAlias()); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Could not find study using globalId " + globalId); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Unable to find globalId for study in URL:" + uri); + } + } else if ("dataverse".equals(targetType)) { + /** + * @todo support releasing of dataverses via SWORD + */ +// String dvAlias = urlManager.getTargetIdentifier(); +// if (dvAlias != null) { +// VDC dvToRelease = vdcService.findByAlias(dvAlias); +// if (swordAuth.hasAccessToModifyDataverse(vdcUser, dvToRelease)) { +// if (dvToRelease != null) { +// String optionalPort = ""; +// URI u; +// try { +// u = new URI(uri); +// int port = u.getPort(); +// if (port != -1) { +// // https often runs on port 8181 in dev +// optionalPort = ":" + port; +// } +// } catch (URISyntaxException ex) { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "unable to part URL"); +// } +// String hostName = System.getProperty("dvn.inetAddress"); +// String dvHomePage = "https://" + hostName + optionalPort + "/dvn/dv/" + dvToRelease.getAlias(); +// if (deposit.isInProgress()) { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Changing a dataverse to 'not released' is not supported. Please change to 'not released' from the web interface: " + dvHomePage); +// } else { +// try { +// getVDCRequestBean().setVdcNetwork(dvToRelease.getVdcNetwork()); +// } catch (ContextNotActiveException ex) { +// /** +// * todo: observe same rules about dataverse +// * release via web interface such as a study +// * or a collection must be release: +// * https://redmine.hmdc.harvard.edu/issues/3225 +// */ +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Releasing a dataverse is not yet supported. 
Please release from the web interface: " + dvHomePage); +// } +// OptionsPage optionsPage = new OptionsPage(); +// if (optionsPage.isReleasable()) { +// if (dvToRelease.isRestricted()) { +// logger.fine("releasing dataverse via SWORD: " + dvAlias); +// /** +// * @todo: tweet and send email about +// * release +// */ +// dvToRelease.setReleaseDate(DateUtil.getTimestamp()); +// dvToRelease.setRestricted(false); +// vdcService.edit(dvToRelease); + DepositReceipt fakeDepositReceipt = new DepositReceipt(); +// IRI fakeIri = new IRI("fakeIriDvWasJustReleased"); +// fakeDepositReceipt.setEditIRI(fakeIri); +// fakeDepositReceipt.setVerboseDescription("Dataverse alias: " + dvAlias); + return fakeDepositReceipt; +// } else { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Dataverse has already been released: " + dvAlias); +// } +// } else { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "dataverse is not releasable"); +// } +// } +// } else { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Could not find dataverse based on alias in URL: " + uri); +// } +// } else { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "User " + vdcUser.getUserName() + " is not authorized to modify dataverse " + dvAlias); +// } +// } else { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Unable to find dataverse alias in URL: " + uri); +// } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "unsupported target type (" + targetType + ") in URL: " + uri); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Target type missing from URL: " + uri); + } + } + + @Override + public boolean isStatementRequest(String uri, Map<String, String> map, AuthCredentials authCredentials, SwordConfiguration swordConfiguration) throws SwordError, SwordServerException, SwordAuthException { + DataverseUser vdcUser = swordAuth.auth(authCredentials); + urlManager.processUrl(uri); + String servlet = urlManager.getServlet(); + if (servlet != null) { + return servlet.equals("statement"); + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Unable to determine requested IRI from URL: " + uri); + } + + } + +} diff --git a/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/MediaResourceManagerImpl.java b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/MediaResourceManagerImpl.java new file mode 100644 index 00000000000..bef7b3e2261 --- /dev/null +++ b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/MediaResourceManagerImpl.java @@ -0,0 +1,510 @@ +package edu.harvard.iq.dataverse.api.datadeposit; + +import edu.harvard.iq.dataverse.DataFile; +import edu.harvard.iq.dataverse.Dataset; +import edu.harvard.iq.dataverse.DatasetServiceBean; +import edu.harvard.iq.dataverse.DatasetVersion; +import edu.harvard.iq.dataverse.Dataverse; +import edu.harvard.iq.dataverse.DataverseUser; +import edu.harvard.iq.dataverse.EjbDataverseEngine; +import edu.harvard.iq.dataverse.FileMetadata; +import edu.harvard.iq.dataverse.engine.command.Command; +import edu.harvard.iq.dataverse.engine.command.exception.CommandException; +import edu.harvard.iq.dataverse.engine.command.impl.UpdateDatasetCommand; +import edu.harvard.iq.dataverse.ingest.IngestServiceBean; +import java.io.ByteArrayInputStream; +import java.io.File; +import java.io.FileOutputStream; +import java.io.IOException; +import java.io.InputStream; +import java.nio.file.Files; +import java.nio.file.Paths; +import java.nio.file.StandardCopyOption; +import java.util.ArrayList; +import 
java.util.List; +import java.util.Map; +import java.util.logging.Logger; +import java.util.zip.ZipEntry; +import java.util.zip.ZipInputStream; +import javax.ejb.EJB; +import javax.ejb.EJBException; +import javax.inject.Inject; +import javax.naming.Context; +import javax.naming.InitialContext; +import javax.naming.NamingException; +import javax.validation.ConstraintViolation; +import javax.validation.ConstraintViolationException; +import org.swordapp.server.AuthCredentials; +import org.swordapp.server.Deposit; +import org.swordapp.server.DepositReceipt; +import org.swordapp.server.MediaResource; +import org.swordapp.server.MediaResourceManager; +import org.swordapp.server.SwordAuthException; +import org.swordapp.server.SwordConfiguration; +import org.swordapp.server.SwordError; +import org.swordapp.server.SwordServerException; +import org.swordapp.server.UriRegistry; + +public class MediaResourceManagerImpl implements MediaResourceManager { + + private static final Logger logger = Logger.getLogger(MediaResourceManagerImpl.class.getCanonicalName()); + @EJB + EjbDataverseEngine commandEngine; + @EJB + DatasetServiceBean datasetService; + @EJB + IngestServiceBean ingestService; + @Inject + SwordAuth swordAuth; + @Inject + UrlManager urlManager; + + @Override + public MediaResource getMediaResourceRepresentation(String uri, Map<String, String> map, AuthCredentials authCredentials, SwordConfiguration swordConfiguration) throws SwordError, SwordServerException, SwordAuthException { + + DataverseUser vdcUser = swordAuth.auth(authCredentials); + urlManager.processUrl(uri); + String globalId = urlManager.getTargetIdentifier(); + if (urlManager.getTargetType().equals("study") && globalId != null) { +// EditStudyService editStudyService; + Context ctx; + try { + ctx = new InitialContext(); +// editStudyService = (EditStudyService) ctx.lookup("java:comp/env/editStudy"); + } catch (NamingException ex) { + logger.info("problem looking up editStudyService"); + throw new SwordServerException("problem looking up editStudyService"); + } + logger.fine("looking up study with globalId " + globalId); +// Study study = editStudyService.getStudyByGlobalId(globalId); + Dataset study = datasetService.findByGlobalId(globalId); + if (study != null) { + /** + * @todo: support this + */ + boolean getMediaResourceRepresentationSupported = false; + if (getMediaResourceRepresentationSupported) { + Dataverse dvThatOwnsStudy = study.getOwner(); + if (swordAuth.hasAccessToModifyDataverse(vdcUser, dvThatOwnsStudy)) { + InputStream fixmeInputStream = new ByteArrayInputStream("FIXME: replace with zip of all study files".getBytes()); + String contentType = "application/zip"; + String packaging = UriRegistry.PACKAGE_SIMPLE_ZIP; + boolean isPackaged = true; + MediaResource mediaResource = new MediaResource(fixmeInputStream, contentType, packaging, isPackaged); + return mediaResource; + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "user " + vdcUser.getUserName() + " is not authorized to get a media resource representation of the dataset with global ID " + study.getGlobalId()); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Please use the Dataverse Network Data Sharing API instead"); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "couldn't find study with global ID of " + globalId); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Couldn't determine target type or identifier from URL: " + uri); + } + } + + @Override + public DepositReceipt replaceMediaResource(String uri, Deposit deposit, 
AuthCredentials authCredentials, SwordConfiguration swordConfiguration) throws SwordError, SwordServerException, SwordAuthException { + /** + * @todo: Perhaps create a new version of a study here? + * + * "The server MUST effectively replace all the existing content in the + * item, although implementations may choose to provide versioning or + * some other mechanism for retaining the overwritten content." -- + * http://swordapp.github.io/SWORDv2-Profile/SWORDProfile.html#protocoloperations_editingcontent_binary + * + * Also, if you enable this method, think about the SwordError currently + * being returned by replaceOrAddFiles with shouldReplace set to true + * and an empty zip uploaded. If no files are unzipped the user will see + * an error about this but the files will still be deleted! + */ + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Replacing the files of a study is not supported. Please delete and add files separately instead."); + } + + @Override + public void deleteMediaResource(String uri, AuthCredentials authCredentials, SwordConfiguration swordConfiguration) throws SwordError, SwordServerException, SwordAuthException { + DataverseUser vdcUser = swordAuth.auth(authCredentials); + urlManager.processUrl(uri); + String targetType = urlManager.getTargetType(); + String fileId = urlManager.getTargetIdentifier(); + if (targetType != null && fileId != null) { + if ("file".equals(targetType)) { + String fileIdString = urlManager.getTargetIdentifier(); + if (fileIdString != null) { + Long fileIdLong; + try { + fileIdLong = Long.valueOf(fileIdString); + } catch (NumberFormatException ex) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "File id must be a number, not '" + fileIdString + "'. URL was: " + uri); + } + if (fileIdLong != null) { + logger.fine("preparing to delete file id " + fileIdLong); +// StudyFile fileToDelete; + try { +// fileToDelete = studyFileService.getStudyFile(fileIdLong); + } catch (EJBException ex) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Unable to find file id " + fileIdLong); + } +// if (fileToDelete != null) { +// Study study = fileToDelete.getStudy(); +// StudyLock studyLock = study.getStudyLock(); +// if (studyLock != null) { +// String message = Util.getStudyLockMessage(studyLock, study.getGlobalId()); +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, message); +// } +// String globalId = study.getGlobalId(); +// VDC dvThatOwnsFile = fileToDelete.getStudy().getOwner(); +// if (swordAuth.hasAccessToModifyDataverse(vdcUser, dvThatOwnsFile)) { +// EditStudyFilesService editStudyFilesService; +// try { +// Context ctx = new InitialContext(); +// editStudyFilesService = (EditStudyFilesService) ctx.lookup("java:comp/env/editStudyFiles"); +// } catch (NamingException ex) { +// logger.info("problem looking up editStudyFilesService"); +// throw new SwordServerException("problem looking up editStudyFilesService"); +// } +// editStudyFilesService.setStudyVersionByGlobalId(globalId); +// // editStudyFilesService.findStudyFileEditBeanById() would be nice +// List<StudyFileEditBean> studyFileEditBeans = editStudyFilesService.getCurrentFiles(); +// for (Iterator it = studyFileEditBeans.iterator(); it.hasNext();) { +// StudyFileEditBean studyFileEditBean = (StudyFileEditBean) it.next(); +// if (studyFileEditBean.getStudyFile().getId().equals(fileToDelete.getId())) { +// logger.fine("marked for deletion: " + studyFileEditBean.getStudyFile().getFileName()); +// studyFileEditBean.setDeleteFlag(true); +// } else { +// logger.fine("not marked for deletion: " + 
studyFileEditBean.getStudyFile().getFileName()); +// } +// } +// editStudyFilesService.save(dvThatOwnsFile.getId(), vdcUser.getId()); +// } else { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "User " + vdcUser.getUserName() + " is not authorized to modify " + dvThatOwnsFile.getAlias()); +// } +// } else { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Unable to find file id " + fileIdLong + " from URL: " + uri); +// } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Unable to find file id in URL: " + uri); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Could not find file to delete in URL: " + uri); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Unsupported file type found in URL: " + uri); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Target or identifier not specified in URL: " + uri); + } + } + + @Override + public DepositReceipt addResource(String uri, Deposit deposit, AuthCredentials authCredentials, SwordConfiguration swordConfiguration) throws SwordError, SwordServerException, SwordAuthException { + boolean shouldReplace = false; + return replaceOrAddFiles(uri, deposit, authCredentials, swordConfiguration, shouldReplace); + } + + DepositReceipt replaceOrAddFiles(String uri, Deposit deposit, AuthCredentials authCredentials, SwordConfiguration swordConfiguration, boolean shouldReplace) throws SwordError, SwordAuthException, SwordServerException { + DataverseUser vdcUser = swordAuth.auth(authCredentials); + + urlManager.processUrl(uri); + String globalId = urlManager.getTargetIdentifier(); + if (urlManager.getTargetType().equals("study") && globalId != null) { +// EditStudyService editStudyService; + Context ctx; + try { + ctx = new InitialContext(); +// editStudyService = (EditStudyService) ctx.lookup("java:comp/env/editStudy"); + } catch (NamingException ex) { + logger.info("problem looking up editStudyService"); + throw new SwordServerException("problem looking up editStudyService"); + } + logger.fine("looking up study with globalId " + globalId); + Dataset study = datasetService.findByGlobalId(globalId); + if (study == null) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Could not find study with global ID of " + globalId); + } +// StudyLock studyLock = study.getStudyLock(); +// if (studyLock != null) { +// String message = Util.getStudyLockMessage(studyLock, study.getGlobalId()); +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, message); +// } + Long studyId; + try { + studyId = study.getId(); + } catch (NullPointerException ex) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "couldn't find study with global ID of " + globalId); + } + Dataverse dvThatOwnsStudy = study.getOwner(); + if (swordAuth.hasAccessToModifyDataverse(vdcUser, dvThatOwnsStudy)) { +// editStudyService.setStudyVersion(studyId); +// editStudyService.save(dvThatOwnsStudy.getId(), vdcUser.getId()); +// +// EditStudyFilesService editStudyFilesService; +// try { +// editStudyFilesService = (EditStudyFilesService) ctx.lookup("java:comp/env/editStudyFiles"); +// } catch (NamingException ex) { +// logger.info("problem looking up editStudyFilesService"); +// throw new SwordServerException("problem looking up editStudyFilesService"); +// } +// editStudyFilesService.setStudyVersionByGlobalId(globalId); +// List<StudyFileEditBean> studyFileEditBeans = editStudyFilesService.getCurrentFiles(); + List<String> existingFilenames = new ArrayList<String>(); +// for (Iterator it = studyFileEditBeans.iterator(); it.hasNext();) { +// 
StudyFileEditBean studyFileEditBean = (StudyFileEditBean) it.next(); + if (shouldReplace) { +// studyFileEditBean.setDeleteFlag(true); +// logger.fine("marked for deletion: " + studyFileEditBean.getStudyFile().getFileName()); + } else { +// String filename = studyFileEditBean.getStudyFile().getFileName(); +// existingFilenames.add(filename); + } + } +// editStudyFilesService.save(dvThatOwnsStudy.getId(), vdcUser.getId()); + + if (!deposit.getPackaging().equals(UriRegistry.PACKAGE_SIMPLE_ZIP)) { + throw new SwordError(UriRegistry.ERROR_CONTENT, 415, "Package format " + UriRegistry.PACKAGE_SIMPLE_ZIP + " is required but format specified in 'Packaging' HTTP header was " + deposit.getPackaging()); + } + + // Right now we are only supporting UriRegistry.PACKAGE_SIMPLE_ZIP but + // in the future maybe we'll support other formats? Rdata files? Stata files? + // That's what the uploadDir was going to be for, but for now it's commented out + // + String importDirString; + File importDir; + String swordTempDirString = swordConfiguration.getTempDirectory(); + if (swordTempDirString == null) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Could not determine temp directory"); + } else { + importDirString = swordTempDirString + File.separator + "import" + File.separator + study.getId().toString(); + importDir = new File(importDirString); + if (!importDir.exists()) { + if (!importDir.mkdirs()) { + logger.info("couldn't create directory: " + importDir.getAbsolutePath()); + throw new SwordServerException("couldn't create import directory"); + } + } + } + + if (true) { + DataFile dFile = new DataFile("application/octet-stream"); + dFile.setOwner(study); + datasetService.generateFileSystemName(dFile); +// if (true) { +// throw returnEarly("dataFile.getFileSystemName(): " + dFile.getFileSystemName()); +// } + InputStream depositInputStream = deposit.getInputStream(); + try { + Files.copy(depositInputStream, Paths.get(importDirString, dFile.getFileSystemName()), StandardCopyOption.REPLACE_EXISTING); + } catch (IOException ex) { + throw new SwordError("problem running Files.copy"); + } + study.getFiles().add(dFile); + + DatasetVersion editVersion = study.getEditVersion(); +// boolean metadataExtracted = false; +// try { +// metadataExtracted = ingestService.extractIndexableMetadata(importDir.getAbsolutePath() + File.separator + dFile.getFileSystemName(), dFile, editVersion); +// } catch (IOException ex) { +// throw returnEarly("couldn't extract metadata" + ex); +// } + FileMetadata fmd = new FileMetadata(); + fmd.setDataFile(dFile); + fmd.setLabel("myLabel"); + fmd.setDatasetVersion(editVersion); + dFile.getFileMetadatas().add(fmd); + + Command<Dataset> cmd; + cmd = new UpdateDatasetCommand(study, vdcUser); + try { + /** + * @todo at update time indexing is run but the file is not + * indexed. Why? Manually re-indexing later finds it. Fix + * this. Related to + * https://redmine.hmdc.harvard.edu/issues/3809 ? 
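+ * One possible stopgap (a sketch only; it assumes an injected + * IndexServiceBean, which this class does not currently have) + * would be to re-index explicitly after the update: + * + * indexService.indexDataset(study);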
+ */ + study = commandEngine.submit(cmd); + } catch (CommandException ex) { + throw returnEarly("couldn't update dataset"); + } catch (EJBException ex) { + Throwable cause = ex; + StringBuilder sb = new StringBuilder(); + sb.append(ex.getLocalizedMessage()); + while (cause.getCause() != null) { + cause = cause.getCause(); + sb.append(cause + " "); + if (cause instanceof ConstraintViolationException) { + ConstraintViolationException constraintViolationException = (ConstraintViolationException) cause; + for (ConstraintViolation violation : constraintViolationException.getConstraintViolations()) { + sb.append(" Invalid value: <<<").append(violation.getInvalidValue()).append(">>> for ") + .append(violation.getPropertyPath()).append(" at ") + .append(violation.getLeafBean()).append(" - ") + .append(violation.getMessage()); + } + } + } + throw returnEarly("EJBException: " + sb.toString()); + } + + } + + /** + * @todo remove this comment after confirming that the upstream jar + * now has our bugfix + */ + // the first character of the filename is truncated with the official jar + // so we include the bug fix at https://github.com/IQSS/swordv2-java-server-library/commit/aeaef83 + // and use this jar: https://build.hmdc.harvard.edu:8443/job/swordv2-java-server-library-iqss/2/ + String uploadedZipFilename = deposit.getFilename(); + ZipInputStream ziStream = new ZipInputStream(deposit.getInputStream()); + ZipEntry zEntry; + FileOutputStream tempOutStream = null; +// List fbList = new ArrayList(); + + try { + // copied from createStudyFilesFromZip in AddFilesPage + while ((zEntry = ziStream.getNextEntry()) != null) { + // Note that some zip entries may be directories - we + // simply skip them: + if (!zEntry.isDirectory()) { + + String fileEntryName = zEntry.getName(); + logger.fine("file found: " + fileEntryName); + + String dirName = null; + String finalFileName = null; + + int ind = fileEntryName.lastIndexOf('/'); + + if (ind > -1) { + finalFileName = fileEntryName.substring(ind + 1); + if (ind > 0) { + dirName = fileEntryName.substring(0, ind); + dirName = dirName.replace('/', '-'); + } + } else { + finalFileName = fileEntryName; + } + + if (".DS_Store".equals(finalFileName)) { + continue; + } + + // http://superuser.com/questions/212896/is-there-any-way-to-prevent-a-mac-from-creating-dot-underscore-files + if (finalFileName.startsWith("._")) { + continue; + } + + File tempUploadedFile = new File(importDir, finalFileName); + tempOutStream = new FileOutputStream(tempUploadedFile); + + byte[] dataBuffer = new byte[8192]; + int i = 0; + + while ((i = ziStream.read(dataBuffer)) > 0) { + tempOutStream.write(dataBuffer, 0, i); + tempOutStream.flush(); + } + + tempOutStream.close(); + + // We now have the unzipped file saved in the upload directory; + // zero-length dta files (for example) are skipped during zip + // upload in the GUI, so we'll skip them here as well + if (tempUploadedFile.length() != 0) { + + if (true) { +// tempUploadedFile; +// UploadedFile uFile = tempUploadedFile; +// DataFile dataFile = new DataFile(); +// throw new SwordError("let's create a file"); + } +// StudyFileEditBean tempFileBean = new StudyFileEditBean(tempUploadedFile, studyService.generateFileSystemNameSequence(), study); +// tempFileBean.setSizeFormatted(tempUploadedFile.length()); + String finalFileNameAfterReplace = finalFileName; +// if (tempFileBean.getStudyFile() instanceof TabularDataFile) { + // predict what the tabular file name will be +// finalFileNameAfterReplace = 
FileUtil.replaceExtension(finalFileName); +// } + +// validateFileName(existingFilenames, finalFileNameAfterReplace, study); + // And, if this file was in a legit (non-null) directory, + // we'll use its name as the file category: + if (dirName != null) { +// tempFileBean.getFileMetadata().setCategory(dirName); + } + +// fbList.add(tempFileBean); + } + } else { + logger.fine("directory found: " + zEntry.getName()); + } + } + } catch (IOException ex) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Problem with file: " + uploadedZipFilename); + } finally { + /** + * @todo shouldn't we delete this uploadDir? Commented out in + * DVN 3.x + */ +// if (!uploadDir.delete()) { +// logger.fine("Unable to delete " + uploadDir.getAbsolutePath()); +// } + } +// if (fbList.size() > 0) { +// StudyFileServiceLocal studyFileService; +// try { +// studyFileService = (StudyFileServiceLocal) ctx.lookup("java:comp/env/studyFileService"); +// } catch (NamingException ex) { +// logger.info("problem looking up studyFileService"); +// throw new SwordServerException("problem looking up studyFileService"); +// } + try { +// studyFileService.addFiles(study.getLatestVersion(), fbList, vdcUser); + } catch (EJBException ex) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Unable to add file(s) to study: " + ex.getMessage()); + } + ReceiptGenerator receiptGenerator = new ReceiptGenerator(); + String baseUrl = urlManager.getHostnamePlusBaseUrlPath(uri); + DepositReceipt depositReceipt = receiptGenerator.createReceipt(baseUrl, study); + return depositReceipt; + } else { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Problem with zip file '" + uploadedZipFilename + "'. Number of files unzipped: " + fbList.size()); + } +// } else { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "user " + vdcUser.getUserName() + " is not authorized to modify study with global ID " + study.getGlobalId()); + return new DepositReceipt(); // added just to get this to compile 2014-05-14 + } +// } else { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Unable to determine target type or identifier from URL: " + uri); +// } +// } + + // copied from AddFilesPage +// private void validateFileName(List<String> existingFilenames, String fileName, Study study) throws SwordError { +// if (fileName.contains("\\") +// || fileName.contains("/") +// || fileName.contains(":") +// || fileName.contains("*") +// || fileName.contains("?") +// || fileName.contains("\"") +// || fileName.contains("<") +// || fileName.contains(">") +// || fileName.contains("|") +// || fileName.contains(";") +// || fileName.contains("#")) { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Invalid File Name - cannot contain any of the following characters: \\ / : * ? \" < > | ; . 
Filename was '" + fileName + "'"); +// } +// if (existingFilenames.contains(fileName)) { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Filename " + fileName + " already exists in study " + study.getGlobalId()); +// } +// } + private SwordError returnEarly(String error) { + SwordError swordError = new SwordError(error); + StackTraceElement[] emptyStackTrace = new StackTraceElement[0]; + swordError.setStackTrace(emptyStackTrace); + return swordError; + + } +} diff --git a/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/ReceiptGenerator.java b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/ReceiptGenerator.java new file mode 100644 index 00000000000..6f55b594adc --- /dev/null +++ b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/ReceiptGenerator.java @@ -0,0 +1,40 @@ +package edu.harvard.iq.dataverse.api.datadeposit; + +import edu.harvard.iq.dataverse.Dataset; +import java.util.logging.Logger; +import org.apache.abdera.i18n.iri.IRI; +import org.swordapp.server.DepositReceipt; + +public class ReceiptGenerator { + + private static final Logger logger = Logger.getLogger(ReceiptGenerator.class.getCanonicalName()); + + DepositReceipt createReceipt(String baseUrl, Dataset dataset) { + logger.fine("baseUrl was: " + baseUrl); + DepositReceipt depositReceipt = new DepositReceipt(); + String globalId = dataset.getGlobalId(); + /** + * @todo should these URLs continue to have "study" in them? Do we need + * to keep it as "study" for backwards compatibility or it ok to use + * "dataset"? http://irclog.iq.harvard.edu/dvn/2014-05-14#i_9404 + */ + String editIri = baseUrl + "/edit/study/" + globalId; + depositReceipt.setEditIRI(new IRI(editIri)); + /** + * @todo: should setLocation depend on if an atom entry or a zip file + * was deposited? (This @todo has been carried over from the DVN 3.x + * version.) + */ + depositReceipt.setLocation(new IRI(editIri)); + depositReceipt.setEditMediaIRI(new IRI(baseUrl + "/edit-media/study/" + globalId)); + depositReceipt.setStatementURI("application/atom+xml;type=feed", baseUrl + "/statement/study/" + globalId); + depositReceipt.addDublinCore("bibliographicCitation", dataset.getLatestVersion().getCitation()); + /** + * @todo is this still returning the database id? + * https://redmine.hmdc.harvard.edu/issues/3397 ? 
+ */ + depositReceipt.setSplashUri(dataset.getPersistentURL()); + return depositReceipt; + } + +} diff --git a/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SWORDv2CollectionServlet.java b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SWORDv2CollectionServlet.java new file mode 100644 index 00000000000..6d9605a553e --- /dev/null +++ b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SWORDv2CollectionServlet.java @@ -0,0 +1,37 @@ +package edu.harvard.iq.dataverse.api.datadeposit; + +import java.io.IOException; +import java.rmi.ServerException; +import javax.inject.Inject; +import javax.servlet.ServletException; +import javax.servlet.http.HttpServletRequest; +import javax.servlet.http.HttpServletResponse; +import org.swordapp.server.CollectionAPI; +import org.swordapp.server.servlets.SwordServlet; + +public class SWORDv2CollectionServlet extends SwordServlet { + + @Inject + CollectionDepositManagerImpl collectionDepositManagerImpl; + @Inject + CollectionListManagerImpl collectionListManagerImpl; + protected CollectionAPI api; + + public void init() throws ServletException { + super.init(); + this.api = new CollectionAPI(collectionListManagerImpl, collectionDepositManagerImpl, this.config); + } + + @Override + protected void doGet(HttpServletRequest req, HttpServletResponse resp) + throws ServletException, IOException { + this.api.get(req, resp); + } + + @Override + protected void doPost(HttpServletRequest req, HttpServletResponse resp) + throws ServletException, IOException { + this.api.post(req, resp); + } + +} diff --git a/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SWORDv2ContainerServlet.java b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SWORDv2ContainerServlet.java new file mode 100644 index 00000000000..5973b190981 --- /dev/null +++ b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SWORDv2ContainerServlet.java @@ -0,0 +1,66 @@ +package edu.harvard.iq.dataverse.api.datadeposit; + +import java.io.IOException; +import javax.inject.Inject; +import javax.servlet.ServletException; +import javax.servlet.http.HttpServletRequest; +import javax.servlet.http.HttpServletResponse; +import org.swordapp.server.ContainerAPI; +import org.swordapp.server.ContainerManager; +import org.swordapp.server.StatementManager; +import org.swordapp.server.servlets.SwordServlet; + +public class SWORDv2ContainerServlet extends SwordServlet { + + @Inject + ContainerManagerImpl containerManagerImpl; + @Inject + StatementManagerImpl statementManagerImpl; + private ContainerManager cm; + private ContainerAPI api; + private StatementManager sm; + + public void init() throws ServletException { + super.init(); + + // load the container manager implementation + this.cm = containerManagerImpl; + + // load the statement manager implementation + this.sm = statementManagerImpl; + + // initialise the underlying servlet processor + this.api = new ContainerAPI(this.cm, this.sm, this.config); + } + + @Override + protected void doGet(HttpServletRequest req, HttpServletResponse resp) + throws ServletException, IOException { + this.api.get(req, resp); + } + + @Override + protected void doHead(HttpServletRequest req, HttpServletResponse resp) + throws ServletException, IOException { + this.api.head(req, resp); + } + + @Override + protected void doPut(HttpServletRequest req, HttpServletResponse resp) + throws ServletException, IOException { + this.api.put(req, resp); + } + + @Override + protected void doPost(HttpServletRequest req, HttpServletResponse resp) + throws ServletException, 
IOException { + this.api.post(req, resp); + } + + @Override + protected void doDelete(HttpServletRequest req, HttpServletResponse resp) + throws ServletException, IOException { + this.api.delete(req, resp); + } + +} diff --git a/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SWORDv2MediaResourceServlet.java b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SWORDv2MediaResourceServlet.java new file mode 100644 index 00000000000..dd44278ccef --- /dev/null +++ b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SWORDv2MediaResourceServlet.java @@ -0,0 +1,53 @@ +package edu.harvard.iq.dataverse.api.datadeposit; + +import java.io.IOException; +import javax.inject.Inject; +import javax.servlet.ServletException; +import javax.servlet.http.HttpServletRequest; +import javax.servlet.http.HttpServletResponse; +import org.swordapp.server.MediaResourceAPI; +import org.swordapp.server.servlets.SwordServlet; + +public class SWORDv2MediaResourceServlet extends SwordServlet { + + @Inject + MediaResourceManagerImpl mediaResourceManagerImpl; + protected MediaResourceAPI api; + + public void init() throws ServletException { + super.init(); + + // load the api + this.api = new MediaResourceAPI(mediaResourceManagerImpl, this.config); + } + + @Override + protected void doGet(HttpServletRequest req, HttpServletResponse resp) + throws ServletException, IOException { + this.api.get(req, resp); + } + + @Override + protected void doHead(HttpServletRequest req, HttpServletResponse resp) + throws ServletException, IOException { + this.api.head(req, resp); + } + + @Override + protected void doPost(HttpServletRequest req, HttpServletResponse resp) + throws ServletException, IOException { + this.api.post(req, resp); + } + + @Override + protected void doPut(HttpServletRequest req, HttpServletResponse resp) + throws ServletException, IOException { + this.api.put(req, resp); + } + + @Override + protected void doDelete(HttpServletRequest req, HttpServletResponse resp) + throws ServletException, IOException { + this.api.delete(req, resp); + } +} diff --git a/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SWORDv2ServiceDocumentServlet.java b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SWORDv2ServiceDocumentServlet.java new file mode 100644 index 00000000000..5304c60480f --- /dev/null +++ b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SWORDv2ServiceDocumentServlet.java @@ -0,0 +1,28 @@ +package edu.harvard.iq.dataverse.api.datadeposit; + +import java.io.IOException; +import javax.inject.Inject; +import javax.servlet.ServletException; +import javax.servlet.http.HttpServletRequest; +import javax.servlet.http.HttpServletResponse; +import org.swordapp.server.ServiceDocumentAPI; +import org.swordapp.server.servlets.SwordServlet; + +public class SWORDv2ServiceDocumentServlet extends SwordServlet { + + @Inject + ServiceDocumentManagerImpl serviceDocumentManagerImpl; + protected ServiceDocumentAPI api; + + public void init() throws ServletException { + super.init(); + this.api = new ServiceDocumentAPI(serviceDocumentManagerImpl, this.config); + } + + @Override + protected void doGet(HttpServletRequest req, HttpServletResponse resp) + throws ServletException, IOException { + this.api.get(req, resp); + } + +} diff --git a/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SWORDv2StatementServlet.java b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SWORDv2StatementServlet.java new file mode 100644 index 00000000000..b413309886c --- /dev/null +++ 
b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SWORDv2StatementServlet.java @@ -0,0 +1,35 @@ +package edu.harvard.iq.dataverse.api.datadeposit; + +import java.io.IOException; +import javax.inject.Inject; +import javax.servlet.ServletException; +import javax.servlet.http.HttpServletRequest; +import javax.servlet.http.HttpServletResponse; +import org.swordapp.server.StatementAPI; +import org.swordapp.server.StatementManager; +import org.swordapp.server.servlets.SwordServlet; + +public class SWORDv2StatementServlet extends SwordServlet { + + @Inject + StatementManagerImpl statementManagerImpl; + private StatementManager sm; + private StatementAPI statementApi; + + public void init() throws ServletException { + super.init(); + + // load the statement manager implementation + this.sm = statementManagerImpl; + + // initialise the underlying servlet processor + this.statementApi = new StatementAPI(this.sm, this.config); + } + + @Override + protected void doGet(HttpServletRequest req, HttpServletResponse resp) + throws ServletException, IOException { + this.statementApi.get(req, resp); + } + +} diff --git a/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/ServiceDocumentManagerImpl.java b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/ServiceDocumentManagerImpl.java new file mode 100644 index 00000000000..4f5d22795b5 --- /dev/null +++ b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/ServiceDocumentManagerImpl.java @@ -0,0 +1,86 @@ +package edu.harvard.iq.dataverse.api.datadeposit; + +import edu.harvard.iq.dataverse.Dataverse; +import edu.harvard.iq.dataverse.DataverseServiceBean; +import edu.harvard.iq.dataverse.DataverseUser; +import java.util.List; +import java.util.logging.Logger; +import javax.ejb.EJB; +import javax.inject.Inject; +import org.swordapp.server.AuthCredentials; +import org.swordapp.server.ServiceDocument; +import org.swordapp.server.ServiceDocumentManager; +import org.swordapp.server.SwordAuthException; +import org.swordapp.server.SwordCollection; +import org.swordapp.server.SwordConfiguration; +import org.swordapp.server.SwordError; +import org.swordapp.server.SwordServerException; +import org.swordapp.server.SwordWorkspace; +import org.swordapp.server.UriRegistry; + +public class ServiceDocumentManagerImpl implements ServiceDocumentManager { + + private static final Logger logger = Logger.getLogger(ServiceDocumentManagerImpl.class.getCanonicalName()); + @EJB + DataverseServiceBean dataverseService; + @Inject + SwordAuth swordAuth; + @Inject + UrlManager urlManager; + + @Override + public ServiceDocument getServiceDocument(String sdUri, AuthCredentials authCredentials, SwordConfiguration config) + throws SwordError, SwordServerException, SwordAuthException { + + DataverseUser vdcUser = swordAuth.auth(authCredentials); + urlManager.processUrl(sdUri); + ServiceDocument service = new ServiceDocument(); + SwordWorkspace swordWorkspace = new SwordWorkspace(); + Dataverse rootDataverse = dataverseService.findRootDataverse(); + if (rootDataverse != null) { + String name = rootDataverse.getName(); + if (name != null) { + swordWorkspace.setTitle(name); + } + } + service.setMaxUploadSize(config.getMaxUploadSize()); + String hostnamePlusBaseUrl = urlManager.getHostnamePlusBaseUrlPath(sdUri); + if (hostnamePlusBaseUrl == null) { + ServiceDocument serviceDocument = new ServiceDocument(); + return serviceDocument; + } + /** + * @todo All dataverses? listing all dataverses here is horribly + * inefficient. 
We need the equivalent of this from 3.x: + * + * List<VDC> vdcList = vdcService.getUserVDCs(vdcUser.getId()); + */ + List<Dataverse> allDataverses = dataverseService.findAll(); + for (Dataverse dataverse : allDataverses) { + if (swordAuth.hasAccessToModifyDataverse(vdcUser, dataverse)) { + String dvAlias = dataverse.getAlias(); + if (dvAlias != null && !dvAlias.isEmpty()) { + SwordCollection swordCollection = new SwordCollection(); + swordCollection.setTitle(dataverse.getName()); + swordCollection.setHref(hostnamePlusBaseUrl + "/collection/dataverse/" + dvAlias); + swordCollection.addAcceptPackaging(UriRegistry.PACKAGE_SIMPLE_ZIP); + /** + * @todo for backwards-compatibility with DVN 3.x, display + * terms of uses for root dataverse and the dataverse we + * are iterating over. What if the root dataverse is not the + * direct parent of the dataverse we're iterating over? Show + * the terms of use each generation back to the root? + * + * See also https://redmine.hmdc.harvard.edu/issues/3967 + */ + // swordCollection.setCollectionPolicy(dvnNetworkName + " deposit terms of use: " + vdcNetworkService.findRootNetwork().getDepositTermsOfUse() + "\n---\n" + dataverse.getName() + " deposit terms of use: " + dataverse.getDepositTermsOfUse()); + swordWorkspace.addCollection(swordCollection); + } + } + } + service.addWorkspace(swordWorkspace); + + return service; + } + +} diff --git a/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/StatementManagerImpl.java b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/StatementManagerImpl.java new file mode 100644 index 00000000000..8b7eeed0658 --- /dev/null +++ b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/StatementManagerImpl.java @@ -0,0 +1,161 @@ +package edu.harvard.iq.dataverse.api.datadeposit; + +import edu.harvard.iq.dataverse.DataFile; +import edu.harvard.iq.dataverse.Dataset; +import edu.harvard.iq.dataverse.DatasetServiceBean; +import edu.harvard.iq.dataverse.Dataverse; +import edu.harvard.iq.dataverse.DataverseUser; +import edu.harvard.iq.dataverse.FileMetadata; +import java.util.Date; +import java.util.HashMap; +import java.util.List; +import java.util.Map; +import java.util.logging.Logger; +import javax.ejb.EJB; +import javax.inject.Inject; +import org.apache.abdera.i18n.iri.IRI; +import org.apache.abdera.i18n.iri.IRISyntaxException; +import org.apache.abdera.model.AtomDate; +import org.swordapp.server.AtomStatement; +import org.swordapp.server.AuthCredentials; +import org.swordapp.server.ResourcePart; +import org.swordapp.server.Statement; +import org.swordapp.server.StatementManager; +import org.swordapp.server.SwordAuthException; +import org.swordapp.server.SwordConfiguration; +import org.swordapp.server.SwordError; +import org.swordapp.server.SwordServerException; +import org.swordapp.server.UriRegistry; + +public class StatementManagerImpl implements StatementManager { + + private static final Logger logger = Logger.getLogger(StatementManagerImpl.class.getCanonicalName()); + + @EJB + DatasetServiceBean datasetService; + @Inject + SwordAuth swordAuth; + @Inject + UrlManager urlManager; + SwordConfigurationImpl swordConfiguration = new SwordConfigurationImpl(); + + @Override + public Statement getStatement(String editUri, Map<String, String> map, AuthCredentials authCredentials, SwordConfiguration swordConfiguration) throws SwordServerException, SwordError, SwordAuthException { + this.swordConfiguration = (SwordConfigurationImpl) swordConfiguration; + if (authCredentials == null) { 
throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "auth credentials are null"); + } + if (swordAuth == null) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "swordAuth is null"); + } + + DataverseUser vdcUser = swordAuth.auth(authCredentials); + urlManager.processUrl(editUri); + String globalId = urlManager.getTargetIdentifier(); + if (urlManager.getTargetType().equals("study") && globalId != null) { + + logger.fine("request for sword statement by user " + vdcUser.getUserName()); + Dataset study = datasetService.findByGlobalId(globalId); +// try { +// study = studyService.getStudyByGlobalId(globalId); +// } catch (EJBException ex) { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Could not find study based on global id (" + globalId + ") in URL: " + editUri); +// } + Long studyId; + try { + studyId = study.getId(); + } catch (NullPointerException ex) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "couldn't find study with global ID of " + globalId); + } + + Dataverse dvThatOwnsStudy = study.getOwner(); + if (swordAuth.hasAccessToModifyDataverse(vdcUser, dvThatOwnsStudy)) { + String feedUri = urlManager.getHostnamePlusBaseUrlPath(editUri) + "/edit/study/" + study.getGlobalId(); + /** + * @todo is it safe to use this? + */ + String author = study.getLatestVersion().getAuthorsStr(); + String title = study.getLatestVersion().getTitle(); + // in the statement, the element is called "updated" + Date lastUpdatedFinal = new Date(); + Date lastUpdateTime = study.getLatestVersion().getLastUpdateTime(); + if (lastUpdateTime != null) { + lastUpdatedFinal = lastUpdateTime; + } else { + /** + * @todo In DVN 3.x lastUpdated was set on the service bean: + * https://github.com/IQSS/dvn/blob/8ca34aded90511730c35ca32ace844770c24c68e/DVN-root/DVN-web/src/main/java/edu/harvard/iq/dvn/core/study/StudyServiceBean.java#L1803 + * + * In 4.0, lastUpdateTime is always null. + */ + logger.info("lastUpdateTime was null, trying createtime"); + Date createtime = study.getLatestVersion().getCreateTime(); + if (createtime != null) { + lastUpdatedFinal = createtime; + } else { + logger.info("creatime was null, using \"now\""); + lastUpdatedFinal = new Date(); + } + } + AtomDate atomDate = new AtomDate(lastUpdatedFinal); + String datedUpdated = atomDate.toString(); + Statement statement = new AtomStatement(feedUri, author, title, datedUpdated); + Map states = new HashMap(); + states.put("latestVersionState", study.getLatestVersion().getVersionState().toString()); + /** + * @todo DVN 3.x had a studyLock. What's the equivalent in 4.0? + */ +// StudyLock lock = study.getStudyLock(); +// if (lock != null) { +// states.put("locked", "true"); +// states.put("lockedDetail", lock.getDetail()); +// states.put("lockedStartTime", lock.getStartTime().toString()); +// } else { +// states.put("locked", "false"); +// } + statement.setStates(states); + List fileMetadatas = study.getLatestVersion().getFileMetadatas(); + for (FileMetadata fileMetadata : fileMetadatas) { + DataFile studyFile = fileMetadata.getDataFile(); + // We are exposing the filename for informational purposes. The file id is what you + // actually operate on to delete a file, etc. 
+ // + // Replace spaces to avoid IRISyntaxException + String fileNameFinal = fileMetadata.getLabel().replace(' ', '_'); + String studyFileUrlString = urlManager.getHostnamePlusBaseUrlPath(editUri) + "/edit-media/file/" + studyFile.getId() + "/" + fileNameFinal; + IRI studyFileUrl; + try { + studyFileUrl = new IRI(studyFileUrlString); + } catch (IRISyntaxException ex) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Invalid URL for file ( " + studyFileUrlString + " ) resulted in " + ex.getMessage()); + } + ResourcePart resourcePart = new ResourcePart(studyFileUrl.toString()); + /** + * @todo get this working. show the actual file type + */ +// resourcePart.setMediaType(studyFile.getOriginalFileFormat()); + resourcePart.setMediaType("application/octet-stream"); + /** + * @todo: Why are properties set on a ResourcePart not + * exposed when you GET a Statement? + */ +// Map properties = new HashMap(); +// properties.put("filename", studyFile.getFileName()); +// properties.put("category", studyFile.getLatestCategory()); +// properties.put("originalFileType", studyFile.getOriginalFileType()); +// properties.put("id", studyFile.getId().toString()); +// properties.put("UNF", studyFile.getUnf()); +// resourcePart.setProperties(properties); + statement.addResource(resourcePart); + } + return statement; + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "user " + vdcUser.getUserName() + " is not authorized to view study with global ID " + globalId); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Could not determine target type or identifier from URL: " + editUri); + } + } + +} diff --git a/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SwordAuth.java b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SwordAuth.java new file mode 100644 index 00000000000..849ea4b4e05 --- /dev/null +++ b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SwordAuth.java @@ -0,0 +1,85 @@ +package edu.harvard.iq.dataverse.api.datadeposit; + +import edu.harvard.iq.dataverse.Dataverse; +import edu.harvard.iq.dataverse.DataverseUser; +import edu.harvard.iq.dataverse.DataverseUserServiceBean; +import edu.harvard.iq.dataverse.PasswordEncryption; +import java.util.logging.Logger; +import javax.ejb.EJB; +import org.swordapp.server.AuthCredentials; +import org.swordapp.server.SwordAuthException; +import org.swordapp.server.SwordError; +import org.swordapp.server.SwordServerException; + +public class SwordAuth { + + private static final Logger logger = Logger.getLogger(SwordAuth.class.getCanonicalName()); + @EJB + DataverseUserServiceBean dataverseUserService; + + public DataverseUser auth(AuthCredentials authCredentials) throws SwordAuthException, SwordServerException { + + if (authCredentials != null) { + String username = authCredentials.getUsername(); + String password = authCredentials.getPassword(); + logger.fine("Checking username " + username + " ..."); + DataverseUser dataverseUser = dataverseUserService.findByUserName(username); + if (dataverseUser != null) { + String encryptedPassword = PasswordEncryption.getInstance().encrypt(password); + if (encryptedPassword.equals(dataverseUser.getEncryptedPassword())) { + return dataverseUser; + } else { + logger.fine("wrong password"); + throw new SwordAuthException(); + } + } else { + logger.fine("could not find username: " + username); + throw new SwordAuthException(); + } + } else { + // in DVN 3.x at least, it seems this was never reached... 
eaten somewhere by way of ServiceDocumentServletDefault -> ServiceDocumentAPI -> SwordAPIEndpoint + logger.info("no credentials provided"); + throw new SwordAuthException(); + } + } + + boolean hasAccessToModifyDataverse(DataverseUser dataverseUser, Dataverse dataverse) throws SwordError { + boolean authorized = false; + + /** + * @todo use actual roles + */ +// VDCRole role = vdcUser.getVDCRole(dv); +// String roleString = null; +// if (role != null) { +// roleString = role.getRole().getName(); +// if ("admin".equals(roleString)) { +// authorized = true; +// } else if ("contributor".equals(roleString) || "curator".equals(roleString) || "privileged viewer".equals(roleString)) { +// authorized = false; +// return early to avoid throwing exception when getting Service Document +// return authorized; +// } else { +// authorized = false; +// } +// } +// + if (dataverse.getCreator().equals(dataverseUser)) { + authorized = true; + return authorized; + } else { + authorized = false; + return authorized; + } + + /** + * @todo: for backwards compatibility with DVN 3.x do we need to throw + * this SWORD error? + */ +// if (!authorized) { +// throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "User " + dataverseUser.getUserName() + " with role of " + roleString + " is not authorized to modify dataverse " + dataverse.getAlias()); +// } else { +// return authorized; +// } + } +} diff --git a/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SwordConfigurationImpl.java b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SwordConfigurationImpl.java new file mode 100644 index 00000000000..3a579dbad3d --- /dev/null +++ b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/SwordConfigurationImpl.java @@ -0,0 +1,108 @@ +package edu.harvard.iq.dataverse.api.datadeposit; + +import java.io.File; +import java.util.logging.Logger; +import org.swordapp.server.SwordConfiguration; + +public class SwordConfigurationImpl implements SwordConfiguration { + + private static final Logger logger = Logger.getLogger(SwordConfigurationImpl.class.getCanonicalName()); + + String getBaseUrlPath() { + return "/dvn/api/data-deposit/v1/swordv2"; + } + + @Override + public boolean returnDepositReceipt() { + return true; + } + + @Override + public boolean returnStackTraceInError() { + return true; + } + + @Override + public boolean returnErrorBody() { + return true; + } + + @Override + public String generator() { + return "http://www.swordapp.org/"; + } + + @Override + public String generatorVersion() { + return "2.0"; + } + + @Override + public String administratorEmail() { + return null; + } + + @Override + public String getAuthType() { + // using "Basic" here to match what's in SwordAPIEndpoint + return "Basic"; + } + + @Override + public boolean storeAndCheckBinary() { + return true; + } + + @Override + public String getTempDirectory() { + /** + * @todo is it safe to use dataverse.files.directory for this? + */ +// String tmpFileDir = System.getProperty("vdc.temp.file.dir"); + String tmpFileDir = System.getProperty("dataverse.files.directory"); + if (tmpFileDir != null) { + return tmpFileDir + File.separator + "sword"; + } else { + return null; + } + } + + @Override + public int getMaxUploadSize() { + int unlimited = -1; + /** + * @todo rename this from dvn to dataverse? 
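+ * Whatever the final name, this is a plain JVM system property, + * so it can be set with, e.g. (GlassFish syntax; the value is + * only an example): + * + * asadmin create-jvm-options "-Ddvn.dataDeposit.maxUploadInBytes=1000000000"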
+         */
+        String jvmOption = "dvn.dataDeposit.maxUploadInBytes";
+        String maxUploadInBytes = System.getProperty(jvmOption);
+        if (maxUploadInBytes != null) {
+            try {
+                int maxUploadSizeInBytes = Integer.parseInt(maxUploadInBytes);
+                return maxUploadSizeInBytes;
+            } catch (NumberFormatException ex) {
+                logger.fine("Could not convert " + maxUploadInBytes + " from JVM option " + jvmOption + " to int. Setting Data Deposit API max upload size limit to unlimited.");
+                return unlimited;
+            }
+        } else {
+            logger.fine("JVM option " + jvmOption + " is undefined. Setting Data Deposit API max upload size limit to unlimited.");
+            return unlimited;
+
+        }
+    }
+
+    @Override
+    public String getAlternateUrl() {
+        return null;
+    }
+
+    @Override
+    public String getAlternateUrlContentType() {
+        return null;
+    }
+
+    @Override
+    public boolean allowUnauthenticatedMediaAccess() {
+        return false;
+    }
+
+}
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/UrlManager.java b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/UrlManager.java
new file mode 100644
index 00000000000..2c960d54ea9
--- /dev/null
+++ b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/UrlManager.java
@@ -0,0 +1,167 @@
+package edu.harvard.iq.dataverse.api.datadeposit;
+
+import java.net.URI;
+import java.net.URISyntaxException;
+import java.util.Arrays;
+import java.util.List;
+import java.util.logging.Logger;
+import org.apache.commons.lang.StringUtils;
+import org.swordapp.server.SwordError;
+import org.swordapp.server.UriRegistry;
+
+public class UrlManager {
+
+    private static final Logger logger = Logger.getLogger(UrlManager.class.getCanonicalName());
+    String originalUrl;
+    SwordConfigurationImpl swordConfiguration = new SwordConfigurationImpl();
+    String servlet;
+    String targetType;
+    String targetIdentifier;
+    int port;
+
+    void processUrl(String url) throws SwordError {
+        logger.fine("URL was: " + url);
+        this.originalUrl = url;
+        URI javaNetUri;
+        try {
+            javaNetUri = new URI(url);
+        } catch (URISyntaxException ex) {
+            throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Invalid URL syntax: " + url);
+        }
+        if (!"https".equals(javaNetUri.getScheme())) {
+            throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "https is required but protocol was " + javaNetUri.getScheme());
+        }
+        this.port = javaNetUri.getPort();
+        String[] urlPartsArray = javaNetUri.getPath().split("/");
+        List<String> urlParts = Arrays.asList(urlPartsArray);
+        String dataDepositApiBasePath;
+        try {
+            List<String> dataDepositApiBasePathParts;
+            //               1   2        3         4     5         6         7        8    9
+            // for example: /dvn/api/data-deposit/v1/swordv2/collection/dataverse/sword
+            dataDepositApiBasePathParts = urlParts.subList(0, 6);
+            dataDepositApiBasePath = StringUtils.join(dataDepositApiBasePathParts, "/");
+        } catch (IndexOutOfBoundsException ex) {
+            throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Error processing URL: " + url);
+        }
+        if (!dataDepositApiBasePath.equals(swordConfiguration.getBaseUrlPath())) {
+            throw new SwordError(dataDepositApiBasePath + " found but " + swordConfiguration.getBaseUrlPath() + " expected");
+        }
+        try {
+            this.servlet = urlParts.get(6);
+        } catch (ArrayIndexOutOfBoundsException ex) {
+            throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Unable to determine servlet path from URL: " + url);
+        }
+        if (!servlet.equals("service-document")) {
+            List<String> targetTypeAndIdentifier;
+            try {
+                //                 6         7        8
+                // for example: /collection/dataverse/sword
+                targetTypeAndIdentifier = urlParts.subList(7, urlParts.size());
+            } catch (IndexOutOfBoundsException ex) {
+                throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "No target components specified in URL: " + url);
+            }
SwordError(UriRegistry.ERROR_BAD_REQUEST, "No target components specified in URL: " + url); + } + this.targetType = targetTypeAndIdentifier.get(0); + if (targetType != null) { + if (targetType.equals("dataverse")) { + String dvAlias; + try { + dvAlias = targetTypeAndIdentifier.get(1); + } catch (IndexOutOfBoundsException ex) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "No dataverse alias provided in URL: " + url); + } + this.targetIdentifier = dvAlias; + } else if (targetType.equals("study")) { + String globalId; + try { + List globalIdParts = targetTypeAndIdentifier.subList(1, targetTypeAndIdentifier.size()); + globalId = StringUtils.join(globalIdParts, "/"); + } catch (IndexOutOfBoundsException ex) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Invalid study global id provided in URL: " + url); + } + this.targetIdentifier = globalId; + } else if (targetType.equals("file")) { + String fileIdString; + try { + // a user might reasonably pass in a filename as well [.get(2)] since + // we expose it in the statement of a study but we ignore it here + fileIdString = targetTypeAndIdentifier.get(1); + } catch (IndexOutOfBoundsException ex) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "No file id provided in URL: " + url); + } + this.targetIdentifier = fileIdString; + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "unsupported target type: " + targetType); + } + } else { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Unable to determine target type from URL: " + url); + } + logger.fine("target type: " + targetType); + logger.fine("target identifier: " + targetIdentifier); + } + } + + String getHostnamePlusBaseUrlPath(String url) throws SwordError { + String optionalPort = ""; + URI u; + try { + u = new URI(url); + int port = u.getPort(); + if (port != -1) { + // https often runs on port 8181 in dev + optionalPort = ":" + port; + } + } catch (URISyntaxException ex) { + throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "unable to part URL"); + } + /** + * @todo do we need the equivalent of dvn.inetAddress in 4.0? 
+         */
+        String hostName = System.getProperty("dataverse.datadeposit.hostname");
+        if (hostName == null) {
+            hostName = "localhost";
+        }
+        return "https://" + hostName + optionalPort + swordConfiguration.getBaseUrlPath();
+    }
+
+    public String getOriginalUrl() {
+        return originalUrl;
+    }
+
+    public void setOriginalUrl(String originalUrl) {
+        this.originalUrl = originalUrl;
+    }
+
+    public String getServlet() {
+        return servlet;
+    }
+
+    public void setServlet(String servlet) {
+        this.servlet = servlet;
+    }
+
+    public String getTargetIdentifier() {
+        return targetIdentifier;
+    }
+
+    public void setTargetIdentifier(String targetIdentifier) {
+        this.targetIdentifier = targetIdentifier;
+    }
+
+    public String getTargetType() {
+        return targetType;
+    }
+
+    public void setTargetType(String targetType) {
+        this.targetType = targetType;
+    }
+
+    public int getPort() {
+        return port;
+    }
+
+    public void setPort(int port) {
+        this.port = port;
+    }
+
+}
diff --git a/src/main/java/edu/harvard/iq/dataverse/util/DatasetFieldWalker.java b/src/main/java/edu/harvard/iq/dataverse/util/DatasetFieldWalker.java
index 6e592a0532c..b437aa37d6e 100644
--- a/src/main/java/edu/harvard/iq/dataverse/util/DatasetFieldWalker.java
+++ b/src/main/java/edu/harvard/iq/dataverse/util/DatasetFieldWalker.java
@@ -5,6 +5,8 @@
 import edu.harvard.iq.dataverse.DatasetFieldCompoundValue;
 import edu.harvard.iq.dataverse.DatasetFieldType;
 import edu.harvard.iq.dataverse.DatasetFieldValue;
+import java.util.ArrayList;
+import java.util.Collections;
 import java.util.Comparator;
 import java.util.List;
 import java.util.SortedSet;
@@ -97,9 +99,9 @@ public void setL(Listener l) {
     }
 
     static private Iterable<DatasetField> sort( List<DatasetField> list, Comparator<DatasetField> cmp ) {
-        SortedSet<DatasetField> ret = new TreeSet<>(cmp);
-        ret.addAll( list );
-        return ret;
+        ArrayList<DatasetField> tbs = new ArrayList<>(list);
+        Collections.sort(tbs, cmp);
+        return tbs;
     }
 }
diff --git a/src/main/java/edu/harvard/iq/dataverse/util/json/JsonParser.java b/src/main/java/edu/harvard/iq/dataverse/util/json/JsonParser.java
index b43aec1f42f..453c22c0471 100644
--- a/src/main/java/edu/harvard/iq/dataverse/util/json/JsonParser.java
+++ b/src/main/java/edu/harvard/iq/dataverse/util/json/JsonParser.java
@@ -165,10 +165,11 @@ public List<DatasetFieldCompoundValue> parseCompoundValue( DatasetFieldType comp
 
         if ( json.getBoolean("multiple") ) {
             List<DatasetFieldCompoundValue> vals = new LinkedList<>();
 
-            for ( JsonArray arr : json.getJsonArray("value").getValuesAs(JsonArray.class) ) {
+            for ( JsonObject obj : json.getJsonArray("value").getValuesAs(JsonObject.class) ) {
                 DatasetFieldCompoundValue cv = new DatasetFieldCompoundValue();
                 List<DatasetField> fields = new LinkedList<>();
-                for ( JsonObject childFieldJson : arr.getValuesAs(JsonObject.class) ) {
+                for ( String fieldName: obj.keySet() ) {
+                    JsonObject childFieldJson = obj.getJsonObject(fieldName);
                     DatasetField f = parseField( childFieldJson );
                     f.setParentDatasetFieldCompoundValue(cv);
                     fields.add( f );
@@ -183,7 +184,9 @@ public List<DatasetFieldCompoundValue> parseCompoundValue( DatasetFieldType comp
             DatasetFieldCompoundValue cv = new DatasetFieldCompoundValue();
             List<DatasetField> fields = new LinkedList<>();
 
-            for ( JsonObject childFieldJson : json.getJsonArray("value").getValuesAs(JsonObject.class) ) {
+            JsonObject value = json.getJsonObject("value");
+            for ( String key : value.keySet() ) {
+                JsonObject childFieldJson = value.getJsonObject(key);
                 DatasetField f = parseField( childFieldJson );
                 f.setParentDatasetFieldCompoundValue(cv);
                 fields.add( f );
diff --git a/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java b/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java
index e9309ee481b..27d370740e4 100644
--- a/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java
+++ b/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java
@@ -30,11 +30,13 @@
 import java.util.TreeSet;
 
 import static edu.harvard.iq.dataverse.util.json.NullSafeJsonBuilder.jsonObjectBuilder;
+import java.math.BigDecimal;
 import java.util.Deque;
 import java.util.LinkedList;
 import java.util.Map;
 import javax.json.JsonArray;
 import javax.json.JsonObject;
+import javax.json.JsonValue;
 
 /**
  * Convert objects to Json.
@@ -225,17 +227,15 @@ public static String typeClassString( DatasetFieldType typ ) {
         return "primitive";
     }
 
-    public static JsonObjectBuilder json( DatasetField dfv ) {
-        JsonObjectBuilder bld = jsonObjectBuilder();
+    public static JsonObject json( DatasetField dfv ) {
         if ( dfv.isEmpty() ) {
-            bld.addNull("value");
-        } else {
+            return null;
+        } else {
             JsonArrayBuilder fieldArray = Json.createArrayBuilder();
             DatasetFieldWalker.walk(dfv, new DatasetFieldsToJson(fieldArray));
-            bld.add( "value", fieldArray.build().getJsonObject(0) );
+            JsonArray out = fieldArray.build();
+            return out.getJsonObject(0);
         }
-
-        return bld;
     }
 
     public static JsonObjectBuilder json( MetadataBlock blk ) {
@@ -325,6 +325,8 @@ private static class DatasetFieldsToJson implements DatasetFieldWalker.Listener
         Deque<JsonObjectBuilder> objectStack = new LinkedList<>();
         Deque<JsonArrayBuilder> valueArrStack = new LinkedList<>();
+        JsonObjectBuilder result = null;
+
         DatasetFieldsToJson( JsonArrayBuilder result ) {
             valueArrStack.push(result);
diff --git a/src/main/java/log4j.properties b/src/main/java/log4j.properties
new file mode 100644
index 00000000000..9c08addaf9e
--- /dev/null
+++ b/src/main/java/log4j.properties
@@ -0,0 +1,23 @@
+log4j.rootLogger=FATAL, stderr
+
+# Base of all Atmosphere classes
+log4j.logger.org.atmosphere=FATAL
+
+# Example of switching on debug level logging for part of tree
+# log4j.logger.com.hp.hpl.jena.graph.test=debug
+# log4j.logger.com.hp.hpl.jena.reasoner=debug
+# log4j.logger.com.hp.hpl.jena.reasoner.test=debug
+
+# Log format to standard out
+log4j.appender.stdout=org.apache.log4j.ConsoleAppender
+log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
+# Pattern to output the caller's file name and line number.
+log4j.appender.stdout.layout.ConversionPattern=%5p [%t] (%F:%L) - %m%n
+
+# Log format to standard error
+log4j.appender.stderr=org.apache.log4j.ConsoleAppender
+log4j.appender.stderr.target=System.err
+log4j.appender.stderr.layout=org.apache.log4j.PatternLayout
+# Pattern to output the caller's file name and line number.
+log4j.appender.stderr.layout.ConversionPattern=%5p [%t] (%F:%L) - %m%n + diff --git a/src/main/webapp/WEB-INF/web.xml b/src/main/webapp/WEB-INF/web.xml index 4840ef547b2..f2668fc79e0 100644 --- a/src/main/webapp/WEB-INF/web.xml +++ b/src/main/webapp/WEB-INF/web.xml @@ -7,7 +7,6 @@ id="WebApp_ID" version="2.5"> Dataverse - @@ -33,8 +32,8 @@ 1 - Push Servlet - org.primefaces.push.PushServlet + Push Servlet + org.primefaces.push.PushServlet @@ -55,30 +54,103 @@ *.xhtml - Push Servlet - /primepush/* + Push Servlet + /primepush/* - - - eot - application/vnd.ms-fontobject - - - otf - font/opentype - - - ttf - application/x-font-ttf - - - woff - application/x-font-woff - - - svg - image/svg+xml - + + + eot + application/vnd.ms-fontobject + + + otf + font/opentype + + + ttf + application/x-font-ttf + + + woff + application/x-font-woff + + + svg + image/svg+xml + + + + + config-impl + edu.harvard.iq.dataverse.api.datadeposit.SwordConfigurationImpl + + + edu.harvard.iq.dataverse.api.datadeposit.SWORDv2ServiceDocumentServlet + edu.harvard.iq.dataverse.api.datadeposit.SWORDv2ServiceDocumentServlet + + + edu.harvard.iq.dataverse.api.datadeposit.SWORDv2ServiceDocumentServlet + /dvn/api/data-deposit/v1/swordv2/service-document/* + + + + collection-deposit-impl + edu.harvard.iq.dataverse.api.datadeposit.CollectionDepositManagerImpl + + + collection-list-impl + edu.harvard.iq.dataverse.api.datadeposit.CollectionListManagerImpl + + + edu.harvard.iq.dataverse.api.datadeposit.SWORDv2CollectionServlet + edu.harvard.iq.dataverse.api.datadeposit.SWORDv2CollectionServlet + + + edu.harvard.iq.dataverse.api.datadeposit.SWORDv2CollectionServlet + /dvn/api/data-deposit/v1/swordv2/collection/* + + + + media-resource-impl + edu.harvard.iq.dataverse.api.datadeposit.MediaResourceManagerImpl + + + edu.harvard.iq.dataverse.api.datadeposit.SWORDv2MediaResourceServlet + edu.harvard.iq.dataverse.api.datadeposit.SWORDv2MediaResourceServlet + + + edu.harvard.iq.dataverse.api.datadeposit.SWORDv2MediaResourceServlet + /dvn/api/data-deposit/v1/swordv2/edit-media/* + + + + statement-impl + edu.harvard.iq.dataverse.api.datadeposit.StatementManagerImpl + + + edu.harvard.iq.dataverse.api.datadeposit.SWORDv2StatementServlet + edu.harvard.iq.dataverse.api.datadeposit.SWORDv2StatementServlet + + + edu.harvard.iq.dataverse.api.datadeposit.SWORDv2StatementServlet + /dvn/api/data-deposit/v1/swordv2/statement/* + + + + container-impl + edu.harvard.iq.dataverse.api.datadeposit.ContainerManagerImpl + + + edu.harvard.iq.dataverse.api.datadeposit.SWORDv2ContainerServlet + edu.harvard.iq.dataverse.api.datadeposit.SWORDv2ContainerServlet + + + edu.harvard.iq.dataverse.api.datadeposit.SWORDv2ContainerServlet + /dvn/api/data-deposit/v1/swordv2/edit/* + + + + diff --git a/src/main/webapp/dataset.xhtml b/src/main/webapp/dataset.xhtml index 0b480d68345..b1080045ad7 100644 --- a/src/main/webapp/dataset.xhtml +++ b/src/main/webapp/dataset.xhtml @@ -154,11 +154,11 @@
#{DatasetPage.datasetVersionUI.title.value} - - + + -
+
#{DatasetPage.displayCitation} @@ -172,29 +172,38 @@ #{DatasetPage.datasetVersionUI.description.value} -
-
+
+
- - Keyword(s) - #{DatasetPage.datasetVersionUI.keyword.displayValue} + + + #{DatasetPage.datasetVersionUI.keyword.displayValue} - - Subject(s) - #{DatasetPage.datasetVersionUI.subject.displayValue} + + + #{DatasetPage.datasetVersionUI.subject.displayValue} - Related Publication: #{DatasetPage.datasetVersionUI.relPublicationCitation} #{DatasetPage.datasetVersionUI.relPublicationId} #{DatasetPage.datasetVersionUI.relPublicationUrl} + + + #{DatasetPage.datasetVersionUI.relPublicationCitation} #{DatasetPage.datasetVersionUI.relPublicationId} + - - Notes -
- #{DatasetPage.datasetVersionUI.notes.value} -
+ + + #{DatasetPage.datasetVersionUI.notes.value}
@@ -433,6 +442,12 @@
+ + +
+ Tip: You can add more metadata about this dataset after it's created. +
+
diff --git a/src/main/webapp/datasetFieldForEditFragment.xhtml b/src/main/webapp/datasetFieldForEditFragment.xhtml index 1fc4d97ecda..4aa8e3d70b7 100644 --- a/src/main/webapp/datasetFieldForEditFragment.xhtml +++ b/src/main/webapp/datasetFieldForEditFragment.xhtml @@ -37,7 +37,7 @@ -