-
I looked at the implementation and it's not possible, although it could be after a minor change.

```ts
import { batchDelegateToSchema } from "@graphql-tools/batch-delegate";
import { OperationTypeNode } from "graphql";
import { MeshContext, Resolvers } from "./.mesh";

const resolvers: Resolvers = {
  Book: {
    media: {
      // Make sure mediaId is always selected so it can be used as the batch key.
      selectionSet: `{ mediaId }`,
      async resolve(root, args, context, info) {
        const schema = {
          schema: context.Service.rawSource.schema,
          executor: context.Service.rawSource.executor,
        };
        // Collect the mediaId keys from sibling resolutions and delegate them
        // as a single getMediaByIds query against the source schema.
        return batchDelegateToSchema({
          schema,
          operation: OperationTypeNode.QUERY,
          fieldName: "getMediaByIds",
          key: root.mediaId,
          // Passed through to the underlying DataLoader.
          dataLoaderOptions: {
            maxBatchSize: 50,
          },
          context,
          info,
        });
      },
    },
  },
};

export default resolvers;
```
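For context, batchDelegateToSchema is backed by DataLoader, and dataLoaderOptions is forwarded to the underlying DataLoader, so maxBatchSize caps how many keys end up in a single delegated getMediaByIds call. Below is a minimal sketch of roughly what that batching looks like if done by hand; executeGetMediaByIds is a hypothetical stand-in for the delegated upstream request, not part of any library.

```ts
import DataLoader from "dataloader";

type Media = { mediaId: string };

// Hypothetical stand-in for the delegated getMediaByIds query:
// one upstream round trip that resolves many ids at once.
declare function executeGetMediaByIds(ids: readonly string[]): Promise<Media[]>;

const mediaLoader = new DataLoader<string, Media>(
  async (mediaIds) => {
    // Every root.mediaId requested in the same tick arrives here as one batch.
    const medias = await executeGetMediaByIds(mediaIds);
    // DataLoader expects results in the same order as the keys.
    const byId = new Map(medias.map((m) => [m.mediaId, m]));
    return mediaIds.map(
      (id) => byId.get(id) ?? new Error(`Media ${id} not found`)
    );
  },
  // Equivalent to dataLoaderOptions: at most 50 keys per upstream call.
  { maxBatchSize: 50 }
);

// Each Book.media resolution then boils down to:
//   mediaLoader.load(root.mediaId)
```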
-
Hi,
I am currently trying out batch processing for the N+1 query problem. It is working as expected, but I would like to know how the batch size is controlled. Can the batch size be set explicitly?
Any further explanation of how the batching works internally would be appreciated.
Thank you
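A brief sketch of the behavior being asked about, assuming the dataloader package that @graphql-tools/batch-delegate uses internally: keys requested in the same tick are collected into one batch, and maxBatchSize splits that batch into chunks.

```ts
import DataLoader from "dataloader";

const loader = new DataLoader<number, number>(
  async (keys) => {
    // With maxBatchSize: 50, 120 keys scheduled in the same tick
    // produce three calls here, with 50, 50 and 20 keys.
    console.log(`batch of ${keys.length} keys`);
    return keys.map((k) => k * 2);
  },
  { maxBatchSize: 50 }
);

async function demo() {
  // All 120 loads are issued synchronously, so they share one dispatch cycle.
  await Promise.all(Array.from({ length: 120 }, (_, i) => loader.load(i)));
}

void demo();
```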