Feature request: support batching of AWS SDK calls with batch processor #2196
Labels

- discussing: the issue needs to be discussed, elaborated, or refined
- feature-request: this item refers to a feature request for an existing or new utility
- need-customer-feedback: requires more customer feedback before making or revisiting a decision
Use case
The batch processing feature helps with handling failures in batches. Beyond that, another reason to use batching is the ability to batch the underlying AWS SDK calls themselves. Examples of this are PutEventsCommand for EventBridge and BatchWriteItemCommand for DynamoDB. Batching such calls reduces costs and improves performance.
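To illustrate the kind of batching meant here, a minimal sketch (helper names are hypothetical, not part of Powertools or the AWS SDK) of chunking DynamoDB write requests into groups of at most 25, the documented limit for a single BatchWriteItemCommand call:

```typescript
// Hypothetical helpers: chunk write requests into BatchWriteItemCommand-sized
// groups and build the input object for each call. The input shape mirrors
// what @aws-sdk/client-dynamodb expects, shown here as plain objects.
type WriteRequest = Record<string, unknown>;

const chunkWriteRequests = (
  requests: WriteRequest[],
  maxBatchSize = 25 // BatchWriteItemCommand accepts at most 25 requests
): WriteRequest[][] => {
  const batches: WriteRequest[][] = [];
  for (let i = 0; i < requests.length; i += maxBatchSize) {
    batches.push(requests.slice(i, i + maxBatchSize));
  }
  return batches;
};

// Build the RequestItems input for one BatchWriteItemCommand call per chunk.
const buildBatchWriteInput = (tableName: string, batch: WriteRequest[]) => ({
  RequestItems: {
    [tableName]: batch.map((item) => ({ PutRequest: { Item: item } })),
  },
});
```

Instead of one PutItem call per record, 60 records become 3 SDK calls, which is where the cost and performance benefit comes from.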
With the current implementation, the only way to batch AWS SDK calls is by customizing the batch processor, as shown in the official documentation: https://docs.powertools.aws.dev/lambda/typescript/latest/utilities/batch/#create-your-own-partial-processor
Solution/User Experience
Currently, it might not be obvious to an inexperienced user that the current batching approach is not ideal: events arrive in batches, but they are still processed one by one. This feature would make working with batches inside an AWS Lambda function smoother.
It would help the user avoid writing boilerplate code by supporting the aggregation of recordHandler calls into params for the AWS SDK call.

Alternative solutions
As far as I can see, there is no generic batching solution for AWS SDK calls: each AWS SDK call that supports batching uses different input and output types. This means it would require custom code for each AWS SDK call, which seems like a lot of effort.
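One way the per-call differences could be abstracted is a small adapter per batchable SDK call plus a generic driver. This is purely a sketch of the idea; the interface and names are assumptions, not an existing or proposed Powertools API:

```typescript
// Hypothetical per-API adapter: each batchable AWS SDK call supplies its own
// batch size limit, request-building logic, and a way to map the response
// back to per-record failures.
interface BatchAdapter<TRecord, TRequest, TResponse> {
  maxBatchSize: number;
  buildRequest: (records: TRecord[]) => TRequest;
  // Indices (within the chunk) of records that failed, so the batch
  // processor can report partial failures back to the event source.
  failedIndices: (response: TResponse) => number[];
}

// Generic driver: chunk the records, send each chunk, collect failures.
const processInBatches = async <TRecord, TRequest, TResponse>(
  records: TRecord[],
  adapter: BatchAdapter<TRecord, TRequest, TResponse>,
  send: (request: TRequest) => Promise<TResponse>
): Promise<TRecord[]> => {
  const failed: TRecord[] = [];
  for (let i = 0; i < records.length; i += adapter.maxBatchSize) {
    const chunk = records.slice(i, i + adapter.maxBatchSize);
    const response = await send(adapter.buildRequest(chunk));
    for (const idx of adapter.failedIndices(response)) {
      failed.push(chunk[idx]);
    }
  }
  return failed;
};
```

Under this shape, only the adapter would need to be written per AWS SDK call (e.g. mapping UnprocessedItems for BatchWriteItemCommand, or Entries with ErrorCode for PutEventsCommand), while chunking and failure collection stay generic.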
An alternative would be to hide some of the boilerplate code: allow returning a custom type from each recordHandler call, and provide a hook to process these results and indicate which ones succeeded and which failed. The current clean hook does not fully support this.

Acknowledgment
Future readers
Please react with 👍 and your use case to help us understand customer demand.