chore(deps): bump aedes-persistence from 8.1.3 to 9.0.1 #732
Conversation
Bumps [aedes-persistence](https://github.com/moscajs/aedes-persistence) from 8.1.3 to 9.0.0.
- [Release notes](https://github.com/moscajs/aedes-persistence/releases)
- [Commits](moscajs/aedes-persistence@v8.1.3...v9.0.0)

updated-dependencies:
- dependency-name: aedes-persistence
  dependency-type: direct:production
  update-type: version-update:semver-major

Signed-off-by: dependabot[bot] <support@github.com>
Force-pushed from 0f8d27e to 4109cf8
We should drop Node.js 12 here too, cc @seriousme |
@seriousme Tests are still failing here, any clue? |
Tests failing
The issue seems to be triggered by lines 156 to 220 in 7c13920
I will have a look tonight, as last week's Easter leave is over ;-) |
No rush, thanks @seriousme :) |
I did some research and found two issues.

In lines 448 to 511 in 7c13920: the problem here is that 32 messages are sent out (2x the high watermark) and only 17 come back. Reducing the count to the high watermark makes this test pass, which would suggest an issue with backpressure. Looking at aedes-persistence there is little that can go wrong: https://github.com/moscajs/aedes-persistence/blob/master/persistence.js#L232-L234 just creates a Readable stream from an array. I checked the array size, and it is 32 at the moment the Readable stream is created.

In lines 156 to 220 in 7c13920: this seems to be a timing issue where results from outgoingClearMessageId come back in the wrong order in the test script, but I need to spend some more time sorting this out.
Any suggestions would be greatly appreciated ;-) Kind regards, |
So the problem seems related to the stream not handling backpressure correctly, right? Did you try other Node.js versions like 16/18 to see if there is any difference?
Could it be that the order is reversed for some reason? cc @gnought |
Good work, @seriousme |
Well, this is where it gets interesting. I tried to replicate the failure outside Mosca, but I cannot get it to fail:

```js
const { Readable, Transform, pipeline } = require("stream")

const outgoing = []
const total = 32

// busy loop such that it cannot be optimized out
function doWork() {
  let proof = 0
  for (let i = 0; i < 1000000; i++) {
    proof += Math.random()
  }
  return proof
}

for (let sent = 0; sent < total; sent++) {
  outgoing.push({
    cmd: 'publish',
    topic: 'hello',
    payload: 'message-' + sent,
    qos: 1,
    messageId: sent
  })
}

function filter(packet, enc, next) {
  setImmediate(() => {
    console.log(packet, doWork())
    next()
  })
}

// Readable.from is a static factory method, so no `new` is needed
const readable = Readable.from(outgoing)
const done = () => console.log("done")

function through(transform) {
  return new Transform({
    objectMode: true,
    transform
  })
}

pipeline(readable, through(filter), done)
```

Works like a charm, even on Node 14. Any suggestions? |
Btw: I also ran the above code using Tap, just to rule out that it was a Tap failure, and that did not fail either |
Ok, I found it :-) It is indeed a race condition! This can be observed if we change the filter function to:

```js
function filter(packet, enc, next) {
  setImmediate(() => {
    console.log(packet, doWork())
    // note: findIndex returns -1 when not found; `> 0` also leaves messageId 0 in place
    const toRemove = outgoing.findIndex(p => p.messageId === packet.messageId)
    if (toRemove > 0) {
      outgoing.splice(toRemove, 1)
    }
    next()
  })
}
```

Then we only see 17 messages, numbers 0-16 ;-) The solution is trivial and was present in 8.1.3, although I did not notice this before :-( Change:

```js
const readable = Readable.from(outgoing)
```

to:

```js
const readable = Readable.from([].concat(outgoing))
```

as this does a shallow clone of the array, so the Readable stream can continue without interference from the splice in the filter. I will do a PR. Sorry for any inconvenience, |
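The failure mode described above can be reproduced without streams at all. This is an editorial sketch assuming only standard array-iterator semantics (the `consume` helper and the packet objects are illustrative, not the aedes code): splicing the very array being iterated makes the iterator skip elements, while iterating a shallow copy does not.

```javascript
function consume(source, toMutate) {
  const seen = []
  // Readable.from walks `source` via its iterator, just like for-of does
  for (const packet of source) {
    seen.push(packet.messageId)
    // mimic outgoingClearMessageId removing the packet that was just handled
    const i = toMutate.findIndex(p => p.messageId === packet.messageId)
    if (i >= 0) toMutate.splice(i, 1)
  }
  return seen
}

const makePackets = () => Array.from({ length: 6 }, (_, id) => ({ messageId: id }))

// iterating the same array that is being spliced: elements get skipped
const racy = makePackets()
console.log(consume(racy, racy)) // [ 0, 2, 4 ]

// iterating a shallow copy, as [].concat(outgoing) does in the fix: all arrive
const safe = makePackets()
console.log(consume([].concat(safe), safe)) // [ 0, 1, 2, 3, 4, 5 ]
```

Each splice shifts the remaining elements left while the iterator's internal index keeps advancing, so every other element is skipped and iteration ends early, which matches only a subset of the 32 messages coming back in the failing test.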
I'm quite sure @mcollina knows why this happens, as he is involved in Node.js development. About the second test failing: there must be something weird going on in this function: https://github.com/moscajs/aedes/blob/main/lib/handlers/connect.js#L224 It's using persistence. In the previous implementation it was a bit different, as it actually copied the array before passing it to from2 to create the stream |
Sorry, I have just read your answer. Yeah, the problem is the concat: without that there can be a race condition, as you are working on the same array instead of creating a copy of it :) |
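For completeness, the shallow-copy idiom used in the fix is interchangeable with the other common JavaScript copy idioms. A small editorial sketch (illustrative data, not the aedes code) showing that the copies are unaffected by later mutation of the original, while the packet objects themselves remain shared:

```javascript
const outgoing = [{ messageId: 0 }, { messageId: 1 }]

const a = [].concat(outgoing) // the idiom chosen in the fix
const b = [...outgoing]       // spread, equivalent
const c = outgoing.slice()    // slice, equivalent

outgoing.splice(0, 1)         // mutate the original afterwards

console.log(a.length, b.length, c.length) // 2 2 2 — copies are unaffected
// the copies are shallow: the packet objects themselves are still shared
console.log(a[0] === b[0] && b[0] === c[0]) // true
```

Shallow copying is enough here because the consumer only needs a stable list to iterate; it never mutates the packet objects themselves.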
A newer version of aedes-persistence exists, but since this PR has been edited by someone other than Dependabot I haven't updated it. You'll get a PR for the updated version as normal once this PR is merged. |
Let's see how the tests go now |
@seriousme interesting: now it seems there is a test failing on macOS with Node.js 18 :( |
That's 1 out of 9 variants, which makes it a bit tricky to debug. Especially since the module itself is platform independent, as it doesn't do any I/O ;-) Can we rule out a GitHub runner deficiency? E.g. can you rerun the failed GitHub job? |
Did a rerun on my own fork using Node 18 on macOS latest and that succeeded. Kind regards, |
Bumps aedes-persistence from 8.1.3 to 9.0.0.
Release notes
Sourced from aedes-persistence's releases.
Commits
- a374197 Release 9.0.0
- 6132db2 chore: es5 to es6, drop nodejs 12 (#82)
- 883c249 chore: update dependencies (#81)

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.

Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)