API cleanup #3358
d813105: Should be "Illegal command translates to
Heard from @michbarsinai that this one is ready for code review! #3381 is the PR.
When I run the tests listed above against develop, they pass. However, when I run the same tests on the 3358-api-cleanup branch (c83f555), I'm getting "Tests run: 23, Failures: 3, Errors: 0, Skipped: 0", along with more detailed failure output. If expected response codes are changing, I would expect to see this noted in the API Guide at least (and the tests above should be adjusted so they continue to pass). We could also consider updating the version of the API from "1" to "2" or whatever (however, see #3325 for challenges in bumping the API version). Please note that the list of tests above is not arbitrary: it's the list of tests that are executed at https://build.hmdc.harvard.edu:8443/job/phoenix.dataverse.org-apitest-develop/ after I kick off a build once a pull request has been merged into develop. See also #1936 (comment). At https://waffle.io/IQSS/dataverse I'm putting this issue and pull request #3381 back in Development and assigning it to @michbarsinai for consideration.
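(For reference, a rough sketch of how these REST Assured suites can be run against a deployed instance; the dataverse.test.baseurl property name and the exact test classes are assumptions here, so check UtilIT and the Jenkins job config for the real values.)

```
# Sketch: run a couple of the REST Assured suites against a running server.
# The property name and test classes below are assumptions, not copied from the job.
mvn test -Dtest=DatasetsIT,DataversesIT \
    -Ddataverse.test.baseurl=http://localhost:8080
```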
The status of this issue is that I'm asking @michbarsinai to make sure DatasetsIT passes and he is having to mess around with JVM options. I recommend the following:
(Perhaps this should be documented in the Developer Guide.) Also, I believe @michbarsinai has agreed that the status code changes at least warrant a mention in the release notes. @kcondon indicated he needs a heads-up about what should be written in the release notes. I favor having the change documented in the API Guide.
@michbarsinai maybe this is scope creep, but #2219, #2431, and #2461 could at least be considered as well for API cleanup.
Current status: DatasetsIT test working, these tests fail:
I'm pretty sure that's a setup issue at this point. Maybe run these on a pre-setup machine, or better yet, script it in a Docker container or something.
@pdurbin - can you please run the test suite on a properly set up machine so we can green the PR?
The stupid EJB timer prevented me from running my normal drop-database-and-get-set-up-again scripts that I've talked about at #2443, so I added a fix in 9b5f7af, which is the same as dc0af34 for #3336. Anyway, yes, the tests pass now.
@michbarsinai I reviewed https://github.com/IQSS/dataverse/pull/3381/files as of 9e06128 and I was a little surprised by how extensive a rewrite this is. As long as you haven't introduced any typos anywhere, I suppose the API should continue to "just work", but @kcondon would probably appreciate a list from you of what changed so he knows what to test. I'll go ahead and pass this to QA at https://waffle.io/IQSS/dataverse so you two can figure out what the testing strategy is.
@michbarsinai any thoughts on rolling a fix for #3209 into your pull request?
Sure, feel free to roll it in (I'm still traveling from the conference, might take some time to get to it).
@michbarsinai I might, but let's discuss the details in the other issue. I just left a comment there: #3209 (comment)
From Michael: Changes:
Here is the latest result from REST Assured testing: https://build.hmdc.harvard.edu:8443/job/phoenix.dataverse.org-apitest-develop/edu.harvard.iq$dataverse/56/testReport/edu.harvard.iq.dataverse.api/ . @kcondon and I looked at console output too and generally poked around. There are a lot of assumptions in the REST Assured tests. They expect the "burrito" key to be there so you can create users, for example. I'm happy to document all of them in the Dev Guide some day.
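(A hedged sketch of what that assumption amounts to; the endpoints are the documented admin-settings and builtin-users APIs, but the key value and user payload below are placeholders, not copied from the tests.)

```
# Sketch: the suite assumes the builtin-users API key (e.g. "burrito") is set
# so tests can create users on the fly. Key and payload here are placeholders.
curl -X PUT -d burrito http://localhost:8080/api/admin/settings/BuiltinUsers.KEY

curl -H "Content-type:application/json" -X POST \
  -d '{"userName":"tester","firstName":"Test","lastName":"User","email":"tester@example.com"}' \
  "http://localhost:8080/api/builtin-users?password=notARealPassword&key=burrito"
```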
@pdurbin we all expect a burrito!
@kcondon this morning I mentioned a "gotcha": the REST Assured test suite assumes that ordinary users can create both datasets and dataverses in the root dataverse. On http://phoenix.dataverse.org I run https://github.com/IQSS/dataverse/blob/v4.5.1/scripts/search/tests/grant-authusers-add-on-root to set these permissions. It passes in the following to the root dataverse:
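(The exact JSON lives in that script; below is only a rough, unverified sketch of such a role assignment, with the role alias being an assumption.)

```
# Sketch only; see grant-authusers-add-on-root for the real payload and role.
curl -H "Content-type:application/json" -X POST \
  -d '{"assignee": ":authenticated-users", "role": "fullContributor"}' \
  "http://localhost:8080/api/dataverses/root/assignments?key=$ADMIN_API_TOKEN"
```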
(The REST Assured tests weren't working on @raprasad's laptop for the same reason. #2443 is related.) Anyway, I hope this helps. Post-install setup of phoenix is all here: https://github.com/IQSS/dataverse/blob/v4.5.1/scripts/deploy/phoenix.dataverse.org/post . The only other important thing is that the root dataverse needs to be published for the REST Assured test suite to run.
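(Publishing root can be done through the native API; a minimal sketch, with the admin token as a placeholder.)

```
# Sketch: publish the root dataverse so the REST Assured suite can run.
curl -X POST \
  "http://localhost:8080/api/dataverses/root/actions/:publish?key=$ADMIN_API_TOKEN"
```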
Verified:
1. Moving the test endpoint.
2. Response code change from 403 to 400 for the permissions issue, using Phil's integration test.
3. The systemic nature of the changes, by a combination of code review and spot-checking calls using existing scripts (setup-all) and Phil's integration test.
Now will look at the last set of changes for coverage.
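(As an aside on the spot-checking approach: one handy pattern is printing just the status code for a single call; the endpoint and token below are placeholders, not taken from the actual scripts.)

```
# Sketch: print only the HTTP status code for a call expected to return 400
# (formerly 403). Endpoint and token are placeholders.
curl -s -o /dev/null -w "%{http_code}\n" \
  -H "X-Dataverse-key: $SOME_NON_PRIVILEGED_TOKEN" \
  -H "Content-type:application/json" -X POST -d '{}' \
  "http://localhost:8080/api/dataverses/root/datasets"
```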
I've asked Phil to review the test coverage for his integration tests and update a spreadsheet accordingly so we can better assess future work, and I've asked Michael about a RESTful API self-discovery feature to make understanding the scope of an API easier. This is primarily future work, though.
OK found 1 regression: #3425
OK, aside from the above regression and an inability to get one endpoint to work at all, everything looks ready to merge. Following up with Michael on that endpoint.
@kcondon I took a quick look at the spreadsheet (1AA-BGtyQywbi8whmnBQnQGJCHU_rpPURNPXq4ttTToE, so I can find it again), but too late, I suppose, since #3381 has already been merged. Unfortunately, I haven't been able to figure out a good way to measure code coverage for REST Assured tests (unit test coverage is shown at https://coveralls.io/github/IQSS/dataverse ). Please see #2746 and how I wrote "Generate code coverage reports for integration tests: pkainulainen/maven-examples#3 and http://www.petrikainulainen.net/programming/maven/creating-code-coverage-reports-for-unit-and-integration-tests-with-the-jacoco-maven-plugin/ ". It's an unsolved problem.
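(One direction the linked article points to, sketched here strictly as an untried idea: attach the JaCoCo agent to the app server the REST Assured tests hit and build a report from the dumped execution data. Every path below is an assumption.)

```
# Untried sketch, paths are assumptions: record integration-test coverage by
# attaching the JaCoCo agent to the server the REST Assured tests run against.
asadmin create-jvm-options \
  "-javaagent\:/path/to/jacocoagent.jar=destfile=/tmp/it-jacoco.exec,append=true"
asadmin restart-domain
# Run the REST Assured suites as usual, then build a report from
# /tmp/it-jacoco.exec (e.g. with the jacoco-maven-plugin report goal
# pointed at that file).
```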
@pdurbin Just a quick eyeball of which API endpoints are called by your integration tests is what I'm looking for. A more comprehensive, automated system down the line would be welcome.
@kcondon I just spent 10 minutes adding a "Phil's integration tests as of pull request 3381 being merged (Phil's take)" column to the spreadsheet. I hope it helps!
@pdurbin Much better, thanks. It would be good to log specific API calls as part of test output and confirm in code those you've marked as "unsure". From your comment I know you are looking at automated code coverage reports, but there is also room/need for top-down, checklist-style coverage, such as the exercise we just went through for this ticket. Thanks again for the effort.
While working towards JavaOne'16, where we will present our API implementation, I found various inconsistencies. This issue is used to track their fixes.