Consumer stream #695
Conversation
@hyperlink Is it normal to see sporadic test failures? Locally and via Travis, we're seeing intermittent issues with brokers not being available.
Thanks for the PR @lliss! Could you add some documentation and examples on using the stream? Regarding the tests, I'm seeing the failures happen more often now that the tests run against different Kafka versions, though I haven't had a chance to investigate.
@lliss if you rebase you'll find the tests to be more stable. I will look at this PR in more detail soon.
…multiple fetches.
The commit stream is a writable stream that you can pipe events to and it will commit them. This is useful so that you can pipe the ConsumerStream through a transform stream and be completely confident that the only way you will commit an offset is by successfully processing the message.
…stream so that the messages consumed can be observed.
…rm stream mode if and when it is piped to another source and cleaning up the _transform logic.
Thanks for your contribution. Closing this PR since this work is continued in #732; any discussion can be done in that PR.
Designed to solve #371, this change reimplements the processing of data as a stream in such a way that back pressure should not cause dropped messages. The change is designed to be fully backwards compatible.