feat: added support for batching FlowMods #172
Conversation
subscribed to 'kytos.flow_manager.send_flows'
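For context, here is a minimal sketch of how a NApp handler could subscribe to that event with kytos's listen_to decorator. The content keys ('dpid', 'flow_dict') and the handler name are illustrative assumptions, not a confirmed payload schema from this PR:

```python
from kytos.core import KytosNApp, log
from kytos.core.helpers import listen_to


class Main(KytosNApp):
    """Minimal NApp sketch handling the send_flows event."""

    def setup(self):
        pass

    def execute(self):
        pass

    def shutdown(self):
        pass

    @listen_to("kytos.flow_manager.send_flows")
    def on_send_flows(self, event):
        # The content keys used here are illustrative assumptions, not
        # a confirmed payload schema for this event.
        dpid = event.content.get("dpid")
        flow_dict = event.content.get("flow_dict")
        log.info("send_flows requested for %s: %s", dpid, flow_dict)
```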
Everything seems to work as expected, except that it allows multiple overlapping requests to execute.
    if batch_size > 0
    else max(1, len(flows_zipped))
)
batch_interval = max(0, batch_interval)
for switch in switches:
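For reference, a hedged sketch of how the batching loop around this fragment might look. The helper name send_flow_mods and the exact wiring are hypothetical; only batch_size, batch_interval, flows_zipped, and switches come from the fragment above:

```python
import time


def send_in_batches(switches, flows_zipped, batch_size, batch_interval,
                    send_flow_mods):
    """Send FlowMods in batches, pausing batch_interval seconds between them.

    send_flow_mods(switch, batch) is a hypothetical sender callable; the
    actual PR may wire this differently.
    """
    batch_size = batch_size if batch_size > 0 else max(1, len(flows_zipped))
    batch_interval = max(0, batch_interval)
    for switch in switches:
        for start in range(0, len(flows_zipped), batch_size):
            batch = flows_zipped[start:start + batch_size]
            send_flow_mods(switch, batch)
            # Sleep only between batches, not after the last one.
            if batch_interval and start + batch_size < len(flows_zipped):
                time.sleep(batch_interval)
```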
Nothing is stopping multiple requests from sending flow_mods to the same switch, which could potentially overload it. Perhaps instead of performing the batching here, we could use a per-switch queue for the received flow_mods to rate limit them.
Additionally, batch size doesn't make much sense as a per-request parameter, but it does make sense as a per-switch parameter.
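A minimal sketch of that per-switch queue idea, assuming one worker thread per switch draining FlowMods at a fixed rate. All names here are hypothetical and only illustrate the suggestion, not the eventual implementation:

```python
import queue
import threading
import time


class PerSwitchFlowModQueue:
    """Rate-limit FlowMods by draining a dedicated queue per switch."""

    def __init__(self, send_flow_mod, interval=0.05):
        # send_flow_mod(dpid, flow_mod) is a hypothetical sender callable.
        self._send = send_flow_mod
        self._interval = interval
        self._queues = {}
        self._lock = threading.Lock()

    def enqueue(self, dpid, flow_mod):
        """Queue a FlowMod for a switch, starting its worker if needed."""
        with self._lock:
            if dpid not in self._queues:
                self._queues[dpid] = queue.Queue()
                threading.Thread(
                    target=self._drain, args=(dpid,), daemon=True
                ).start()
        self._queues[dpid].put(flow_mod)

    def _drain(self, dpid):
        """Send queued FlowMods one at a time, pacing by the interval."""
        q = self._queues[dpid]
        while True:
            flow_mod = q.get()
            self._send(dpid, flow_mod)
            time.sleep(self._interval)
```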
Yes, @Ktmi, that's a good idea. We might as well go ahead and start queueing per switch on flow_manager. Initially, this PR was just adding the same per-request behavior that clients were implementing.
It's also worth noting that we have per-message-type rate limiting on the core queue in our backlog, but it hasn't been prioritized yet: kytos-ng/kytos#245
@Ktmi check out this thread kytos-ng/kytos#245 (comment). I'll need your help with this one if we go for the ideal solution; your review, help, and suggestions are appreciated. In the meantime, I'll keep making progress on telemetry_int. Otherwise, I can still handle it in the future. If you confirm that you can help out with that one in version 2023.2, following the current planned priorities of course, we can close this PR here. Let me know.
Closing this PR in favor of kytos-ng/kytos#412. David is helping out with it. Thanks, David.
Closes #171
Summary
Local Tests
- force: false with batching: it didn't block the request
- force: true with batching: it blocked the request as expected
End-to-End Tests
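For reproducibility, a hedged example of how such a request could be issued against flow_manager's REST API. The endpoint and the flows/force fields follow flow_manager's documented v2 API; the batch_size and batch_interval fields are assumptions about how this PR exposes batching, not a confirmed schema:

```python
import requests

# Host/port assumed to be a local kytos instance; dpid is illustrative.
URL = ("http://localhost:8181/api/kytos/flow_manager/v2/flows/"
       "00:00:00:00:00:00:00:01")

payload = {
    "flows": [{"priority": 10, "match": {"in_port": 1}, "actions": []}],
    "force": False,
    # The two fields below are assumptions about this PR's batching knobs.
    "batch_size": 50,
    "batch_interval": 0.2,
}

response = requests.post(URL, json=payload, timeout=10)
print(response.status_code, response.text)
```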