This issue was moved to a discussion. You can continue the conversation there.

Support for apollo batching #433

Closed
macnibblet opened this issue Nov 15, 2018 · 24 comments
Labels
enhancement New feature or request transport Related to GraphQL on the wire

Comments

@macnibblet

It would be awesome if we could detect batch operations based on a header and then parse the incoming request as an array of operations or a single operation.

@vektah vektah added the enhancement New feature or request label Nov 16, 2018
@macnibblet
Author

macnibblet commented Dec 2, 2018

This is a really really ugly proof of concept, but it actually works.
https://gist.github.com/macnibblet/8c25cd7964c6c28fe09c40126424032c

One thing to take into account is the number of concurrent operations running.
I'm thinking a worker pool per request is probably a better solution, but I'm not entirely sure.

Secondly, I have no idea how the tracing gets affected by this code.

Thirdly, there are no mutexes around the responses slice (see https://stackoverflow.com/questions/49879322/can-i-concurrently-write-different-slice-elements). I'm not sure whether that's a good idea, but I was too lazy to abstract away the boilerplate.

@mathewbyrne
Contributor

Thanks @macnibblet — this is a great reference for anyone who is interested in implementing this now. It does highlight that the handler could probably do with a refactor to make it a bit more swappable.

@macnibblet
Author

@mathewbyrne If someone can confirm that tracing works with the way I have implemented it now, I have no problem turning this from a proof of concept into production-worthy code.

I could use this in production for a few applications we are running with this awesome library.

@macnibblet
Author

@vektah Do you have any comments on how the tracing should work when batching? Otherwise I'll update my POC and create a merge request.

@vektah
Collaborator

vektah commented Jan 7, 2019

I don't think batching can happen concurrently without a bit more work to establish sync boundaries.

Consider a batch that looks like this:

query q1 { user(id: 1) {name} }
mutation m1 { setName(id: 1, name: "foo") { name } }
query q2 { user(id: 1) {name} }
mutation m2 { setName(id: 1, name: "bar") { name } }
query q3 { user(id: 1) {name} }

I would expect to get back

q1: "previous"
m1: "foo"
q2: "foo"
m2: "bar"
q3: "bar"

The suggested implementation would likely do something very different.

I'm not sure how apollo tracing + batching should work; someone would need to fire up a server and have a look at what's in the extensions. It probably keeps each req/resp separate.

Also, make sure you base any work on the next branch; rebasing between the two is painful.

@macnibblet
Author

@vektah AFAIK you are not allowed to mix queries and mutations?

Regarding the tracing, do we want to split each operation into its own request or keep them under the same request?

@vektah
Collaborator

vektah commented Jan 8, 2019

@vektah AFAIK you are not allowed to mix queries and mutations?

Ah right, let's make sure we have some validation covering that.

Regarding the tracing, do we want to split each operation into its own request or keep them under the same request?

I think that makes sense, but it's still worth having a look at what apollo server does with tracing turned on. apollo tracing is returned in the extensions key of the response, so I imagine it's per query/mutation.

@xzyfer

xzyfer commented Feb 15, 2019

We're interested in this feature too. Happy to collaborate on an implementation.

@sneko

sneko commented Apr 24, 2019

Hi everyone, I'm also interested in this feature 😀

Any news on its progress?

Thank you,

@stale

stale bot commented Aug 28, 2019

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the stale label Aug 28, 2019
@frederikhors
Collaborator

Not stale.

@stale stale bot removed the stale label Aug 28, 2019
@wegged

wegged commented Aug 28, 2019

Interested in this as well

@vektah vektah mentioned this issue Oct 1, 2019
@stale

stale bot commented Oct 27, 2019

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the stale label Oct 27, 2019
@frederikhors
Collaborator

Nope, not stale.

@stale stale bot removed the stale label Oct 28, 2019
@stale

stale bot commented Dec 27, 2019

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the stale label Dec 27, 2019
@sneko

sneko commented Dec 27, 2019

Still interested 😉

@stale stale bot removed the stale label Dec 27, 2019
@frederikhors
Collaborator

Can we disable stale for some issues?

@xzyfer

xzyfer commented Feb 25, 2020

Would this be easier given the recent handler refactor?

@gbartlett

yeah would love to see this as well!

@maxbeatty

I'm interested in funding/sponsoring this enhancement. I'm not sure how it would work through the 99designs organization but happy to work with individuals directly. If anyone else is interested in pitching in, add an 👀 reaction!

@ghost

ghost commented Nov 5, 2020

I would like this feature as well, to work better with the nautilus gateway batching: https://gateway.nautilus.dev/advanced/batching

@lwc lwc added the transport Related to GraphQL on the wire label Sep 17, 2021
@xzyfer

xzyfer commented Sep 17, 2021

@vektah AFAIK you are not allowed to mix queries and mutations?

Digging around the apollo-server implementation, this limitation does not appear to be codified. From looking through some issues, it appears this behaviour is intentional:

batching order isn't a guaranteed order, so mutations executed in a batch will all run in parallel rather than as an ordered serial list of mutations. I would recommend not batching with mutations on the client

Essentially, it's the client's responsibility not to create batched requests that would result in undesirable race conditions.

More context

@Sh1d0w

Sh1d0w commented Oct 29, 2021

Any plans to implement that?

@sneko

sneko commented Jan 5, 2022

Hi @lwc, it seems you are the most recent 99designs team member to have contributed to this project over the last few months.

Do you know if there is any interest from your team for this batching feature?

Thank you,

@frederikhors frederikhors converted this issue into discussion #1932 Feb 4, 2022
