
Feature idea: strategy for when the queue is full #20

Closed

cocowalla opened this issue Aug 1, 2017 · 4 comments
@cocowalla (Contributor)
At present, the BackgroundWorkerSink drops events when the queue is full. It would be good to make this behaviour configurable, so there is an option to block instead of dropping.

@nblumhardt (Member)
I think this would be a great addition to the sink. It should be enough to represent the option as a bool and switch behavior in Emit().

I'm not likely to pick this up in the short term, but can help with a PR if anyone decides to take it on.
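The sink itself is C#, but the bool switch described above can be sketched in Python against a bounded queue. Names like `BackgroundWorkerSink`, `emit`, and `block_when_full` are illustrative here, not the sink's actual API:

```python
import queue

class BackgroundWorkerSink:
    """Toy sketch of the proposed option: a single flag chooses between
    blocking the caller and silently dropping when the queue is full."""

    def __init__(self, capacity, block_when_full=False):
        self._queue = queue.Queue(maxsize=capacity)
        self._block_when_full = block_when_full
        self.dropped = 0  # events discarded under the 'drop' strategy

    def emit(self, log_event):
        if self._block_when_full:
            # 'block' strategy: apply backpressure until the worker
            # frees a slot in the queue.
            self._queue.put(log_event)
        else:
            # 'drop' strategy (the current behaviour): discard the
            # event rather than wait.
            try:
                self._queue.put_nowait(log_event)
            except queue.Full:
                self.dropped += 1
```

With a capacity of 1 and the default 'drop' strategy, a second `emit` before the worker drains the queue increments `dropped` instead of blocking.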

@cocowalla (Contributor, Author) commented Aug 3, 2017

@nblumhardt I'll have a crack at this one.

I also noticed there doesn't seem to be a test for the existing 'drop' behaviour, so I'll try to add tests for both the 'drop' and 'block' strategies.

The while (true) loop also doesn't seem a very elegant way to handle queue enumeration, so I'll switch to _queue.GetConsumingEnumerable() and call _queue.CompleteAdding() in Dispose(). This also means we can lose the CancellationToken, which will simplify the code a bit.
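Python has no direct equivalent of BlockingCollection's GetConsumingEnumerable()/CompleteAdding() pair, but the same shutdown pattern can be approximated with a sentinel object (a sketch only; the real sink is C#):

```python
import queue
import threading

_COMPLETE = object()  # sentinel standing in for CompleteAdding()

def consume(q, handle):
    # Analogue of foreach over _queue.GetConsumingEnumerable():
    # iterate until the producer signals completion, with no need
    # for a CancellationToken or a while (true) poll.
    while True:
        item = q.get()
        if item is _COMPLETE:
            break
        handle(item)

q = queue.Queue()
seen = []
worker = threading.Thread(target=consume, args=(q, seen.append))
worker.start()
for i in range(3):
    q.put(i)
q.put(_COMPLETE)  # Dispose() would call _queue.CompleteAdding() here
worker.join()
# seen == [0, 1, 2]
```

Once the sentinel is enqueued, the worker drains everything ahead of it and exits cleanly, mirroring how CompleteAdding() lets the consuming loop finish on its own.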

@cocowalla (Contributor, Author)
@nblumhardt I submitted PR #21 for this.

nblumhardt added a commit that referenced this issue Aug 7, 2017: "Fulfills #20 - add option to block when the queue is full, instead of dropping events"
@bartelink (Member)

@cocowalla I'm going to go out on a limb and declare this Done Done ;)
