irc: use deque for "stack" of recent messages #2327
Conversation
Thanks for breaking CI, Coveralls. Thankfully it doesn't actually matter yet, because this is only a draft.
Neat.
Coveralls is fixed. Still keeping as draft until the base PR is completed.
Further discussion on #2320 says we should keep thinking about this, perhaps converting the "stack" to prune messages not just by a dumb count, but by time range and/or maximum size in bytes. This can be considered Replaced if one of us gets the motivation to do the fancy way pre-8.0, but should be merged if we don't.
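Purely as a sketch of that fancier direction (nothing below is part of this PR; the class name, the limits, and the decision to timestamp each entry are assumptions), pruning by age and total size could look roughly like this:

```python
from collections import deque
from dataclasses import dataclass, field
import time


@dataclass
class RecentMessages:
    """Hypothetical buffer that prunes by age and total size instead of a fixed count."""

    max_age: float = 600.0   # seconds to keep a message around
    max_bytes: int = 4096    # rough budget for the total stored text
    _items: deque = field(default_factory=deque)

    def append(self, text: str) -> None:
        self._items.append((time.monotonic(), text))
        self._prune()

    def _prune(self) -> None:
        now = time.monotonic()
        # Drop entries older than the age cutoff.
        while self._items and now - self._items[0][0] > self.max_age:
            self._items.popleft()
        # Drop the oldest entries until the byte budget is respected.
        while sum(len(t.encode('utf-8')) for _, t in self._items) > self.max_bytes:
            self._items.popleft()
```

Either rule (or both) could replace a fixed maxlen, at the cost of a little bookkeeping on every append.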
Doesn't save any code lines, but does make the process of updating what has been said recently look a bit less voodoo-y. I very much prefer declaring how long the list of "messages" should be where it's initialized instead of pruning it later.
Compare 98660eb to 16f8ca3
Nice little improvement
Perfection.
Description
Tin. Reading #2320 made me wonder why we used a list with manual pruning (l = l[-10:] after appending), and I couldn't think of a good reason. Updated tests made me pretty confident I could just swap to a deque with maxlen and get pruning for free. A minimal sketch of the swap appears after the notes below.

Checklist
- make qa (runs make quality and make test)
- master branch (they reference code that wasn't changed in this patch)

Notes
This is a draft because it's built on top of #2320. I will rebase and mark ready for review when that's squashed down.
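For illustration, here is a minimal before/after sketch of the swap the description talks about (the variable name and the length of 10 are placeholders, not the real attribute or limit in the backend):

```python
from collections import deque

# Before: a plain list that has to be re-sliced after every append.
recent = []
recent.append('newest message')
recent = recent[-10:]  # manual pruning step, easy to forget

# After: the deque declares its maximum length once, where it is initialized;
# the oldest entries are discarded automatically on append.
recent = deque(maxlen=10)
recent.append('newest message')  # no separate pruning needed
```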