Adds draft backend API, db integration, db schema migration #5
Conversation
…rations using atlas. Adds very simple tests.
…, updates .env.TEMPLATE with a few new values
❌ Deploy Preview for molevolvr failed.
Looking at this now. Do you know if Plumber can export OpenAPI types? If so, I could automatically generate typescript types for everything. I found this page but I can't tell if it's Plumber source code, a tutorial/example, a Plumber plugin, or something else completely.

Edit: Never mind, I forgot that Swagger needs (I think) an OpenAPI schema to work, and you already have that set up. I see the schema at http://localhost:9050/openapi.json. I'll change my question to: can we export that to a folder in the repo so I can read it statically from the frontend build tools without having to run the backend? I'm also just curious about the path the docs endpoint is served at.

Another edit: I see now that's just a Plumber default, I guess. Fine to leave that as is.
Okay, looks like there's something you could call to export that spec to a file. If you want to add that into this PR, I could also set up the typescript generation here too. Hopefully it shouldn't be too much effort. Unlike in past PRs, I think this would actually be relevant to the subject of the PR to tack on.
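For reference, a minimal sketch of one way that export could work, assuming the router is built with `plumber::plumb()`; the entrypoint and output paths below are placeholders rather than this repo's actual layout:

```r
library(plumber)
library(jsonlite)

# build the router from the API definition (entrypoint path is an assumption)
api <- plumb("backend/api/plumber.R")

# getApiSpec() returns the OpenAPI spec as an R list; write it out as JSON
spec <- api$getApiSpec()
write_json(spec, "backend/openapi.json", auto_unbox = TRUE, pretty = TRUE)
```

Checking that file into the repo would let the frontend build tools read it without the backend running.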
Other than my above comments, this seems to work as expected, though I have no experience with Plumber. Maybe Dave or David could look over this?
It works for me. Very cool. One last thought: maybe we rename `app` to `frontend`? We're referring to the backend as "backend", so "frontend" would be consistent with that too.
Right, I can see how that's confusing... kind of out of habit, I put application-specific code in a container in a folder in the container's filesystem called `/app`, which is where the name came from. I do like the idea of changing the `app` name to `frontend`, though.
Right, the OpenAPI spec is generated via reflection at runtime, which means the backend has to be running. I'll pull the OpenAPI spec as a file and add it to this PR. About the name of the spec file, I'd personally prefer a different name.

I will say that the OpenAPI spec Plumber generates by default has limited usefulness; R is famously loose with its typing, so Plumber can't extract much of a schema for the API's return values, and it can barely do it for the parameters either, since base R doesn't support type hinting. Plumber's alternative is for you to add annotations to each endpoint.

Here are two possible options:
1. write the OpenAPI spec by hand as a standalone file in the repo and treat it as the source of truth, or
2. annotate each endpoint in the Plumber code and extract the generated spec into a file.
Option 1 is more straightforward IMO and makes the OpenAPI spec the source of truth rather than an artifact. While option 2 makes the code more verbose and requires a step where the spec is extracted into a file, there's something to be said for the response documentation being near the code that generates it.
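For concreteness, here is a rough sketch of what option 2 could look like on a single endpoint, using Plumber's standard `#*` annotations; the route, parameter, and response description are invented for illustration and aren't this repo's actual API:

```r
#* Retrieve a single analysis by its ID (route and fields are hypothetical)
#* @param id The analysis identifier
#* @response 200 The analysis record, including its current status
#* @serializer unboxedJSON
#* @get /analyses/<id>
function(id) {
  # placeholder body; a real endpoint would look the record up in the database
  list(id = id, status = "complete")
}
```

The extra annotations are what would flow into the generated spec's parameter and response documentation.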
A third option is just to not deal with a spec at all, and I'll just manually write my typescript types as I was planning to do anyway. Do you think a server-side spec would still be worth having for other reasons?
Let me try converting the dump you just pushed to typescript and seeing how useful it is as well.
Okay, I think it may be of some help to me, even if it's just for getting the names of the endpoints and being able to get a TS API client out of it. I'm thinking I'll probably use it, even if I have to supplement it with hand-written types.

I get the benefits of option 1, but I think writing an OpenAPI spec by hand for all of the endpoints and complex data structures we're going to have is going to become a burden. It's fairly verbose and repetitive. Option 2 seems like it's a Plumber special syntax that's a little more terse? And I'm assuming that since it gets converted to a spec anyway, it has the same level of checking (R type hinting? compile warnings?) as option 1, if any at all.

I suppose ultimately someone is going to have to do the work of defining the structure of this data; it's just a question of who. I'm happy to take on that task (the frontend is effectively enforcing this requirement by using TS anyway), especially if doing the type annotations at the backend level doesn't really benefit you in any way. Would it help to have those annotations collocated with the endpoint functions? I feel like most of the structure info is really going to be coming from the package level anyway. Having the Swagger UI be more descriptive and type-safe would be a nice benefit, though.

Maybe I'll just try manually converting the spec to TS for a while -- seeing how useful and accurate it is -- before we make the effort to do one of the options and/or automate the TS conversion. In any case, let's hold off for this PR at least; this is getting too in the weeds. I might copy this discussion to an issue.

Regarding "app" -> "frontend": since you said you have some upcoming changes relevant to this stuff, maybe we can wait until that's done before renaming the folder and associated docker commands. Possibly in a new PR.
The third option sounds good, at least for now. Maybe once things have crystallized a bit more we can go back to looking at a server-side spec, if only for documenting the API and for validation, not necessarily for your frontend.
This PR builds on #5, with two main objectives:

- add SLURM, config, and supporting services to the stack
- integrate async job processing into the backend via SLURM

In addition to the main objectives, this PR includes a few tweaks to API response handling and the testing framework. A new field, `reason`, has been added to the `analyses` table to capture why a status was set; currently it gets set to an exception traceback if the analysis throws an error. The PR includes a skeleton of how analyses would be processed in `backend/api/dispatch/submit.R` (called via `dispatchAnalysis()` in `backend/api/endpoints/analyses.R`), but it doesn't actually do any real work yet.

Things to try:

- bringing up the full stack and seeing if it builds and runs without issue
- submitting analyses and seeing if the status eventually gets updated
- trying the tests: after launching the stack, execute `./run_stack.sh shell` in a separate window, then run `./run_tests.sh`, which will perform a few integration tests
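On the dispatch skeleton mentioned above: as a purely hypothetical illustration (not the code in this PR), submitting an analysis to SLURM from R could look something like the following, where the worker script name and the `sbatch` options are invented:

```r
# hypothetical sketch of a SLURM dispatch helper; not this PR's actual skeleton
dispatchAnalysis <- function(analysis_id) {
  # submit a batch job that processes the analysis; the worker script name
  # and the sbatch options here are assumptions for illustration only
  result <- system2(
    "sbatch",
    args = c(
      "--job-name", paste0("analysis-", analysis_id),
      "--output", paste0("/tmp/analysis-", analysis_id, ".log"),
      "process_analysis.sh", analysis_id
    ),
    stdout = TRUE, stderr = TRUE
  )

  # on success, sbatch prints something like "Submitted batch job <id>"
  message(paste(result, collapse = "\n"))
  invisible(result)
}
```

A real implementation would also need to update the analysis's status (and the new `reason` field) as the job progresses.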
This PR replaces the "dummy" API from the previous PR with one that does the following:

- implements an `analysis` endpoint
- adds a `stats` endpoint, just for testing

Other non-API-related changes:

- adds a `shell` control command to `run_stack.sh` that starts a shell in the backend container

Things you might want to test:

- check that your `.env` file is in place; a template is in `.env.TEMPLATE`
- run `./run_stack.sh` to bring up the stack
- run `./run_stack.sh shell` in another window to get a shell into the backend
- run `./run_tests.sh` (a quick manual check against the API is also sketched below)
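As that last item hints, a quick sanity check once the stack is up could look like this from an R session, assuming the backend is exposed on port 9050 as mentioned earlier in the thread and that the testing endpoint is mounted at `/stats`:

```r
library(httr)

# hit the testing-only stats endpoint on the locally running backend
resp <- GET("http://localhost:9050/stats")
stop_for_status(resp)              # raise an error if the API didn't return 2xx
str(content(resp, as = "parsed"))  # inspect whatever the endpoint returns
```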