
noise: Initial version of the noise chat plugin #68

Merged: 12 commits merged into master from the noise branch on Feb 4, 2020

Conversation


@cdecker cdecker commented Nov 14, 2019

Based on the WhatSat idea by @joostjager, this plugin implements a simple chat
protocol on top of createonion and sendonion.

It still needs a bit of cleanup and some pending changes to c-lightning, but as soon as those are in I'll undraft this 😉

Depends on ElementsProject/lightning#3261 and ElementsProject/lightning#3260, and needs c-lightning to allow TLV payloads.
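For context, the custom payloads these RPCs carry are BOLT-style TLV streams. A minimal sketch of the record encoding (illustrative only; the chat TLV type number below is made up, not the one the plugin uses):

```python
def bigsize(n: int) -> bytes:
    """BigSize varint from the Lightning spec (BOLT #1)."""
    if n < 0xfd:
        return n.to_bytes(1, "big")
    if n <= 0xffff:
        return b"\xfd" + n.to_bytes(2, "big")
    if n <= 0xffffffff:
        return b"\xfe" + n.to_bytes(4, "big")
    return b"\xff" + n.to_bytes(8, "big")


def tlv_record(tlv_type: int, value: bytes) -> bytes:
    """type || length || value, with type and length BigSize-encoded."""
    return bigsize(tlv_type) + bigsize(len(value)) + value


# A chat message as a single TLV record (hypothetical type number).
payload = tlv_record(133773310, "hello".encode())
```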


@darosior darosior left a comment


Can this be undrafted now that ElementsProject/lightning#3261 and ElementsProject/lightning#3260 have been merged? :-)


cdecker commented Dec 24, 2019

> Can this be undrafted now that ElementsProject/lightning#3261 and ElementsProject/lightning#3260 have been merged? :-)

Absolutely, I'd like to clean things up a bit first, but we should be able to merge soon ;-)


cdecker commented Jan 19, 2020

Ok, I had to work around a few quirks (zbase32-encoded signatures from signmessage, better TLV encoding support), but this should now work pretty nicely:

  • The sender is recovered from the message (the TLV type had to change so we don't collide with WhatSat)
  • A payment can now be sent alongside a message using the keysend protocol (again with a different TLV type, since c-lightning does not allow types above 0xFFFFFFFF; I'll fix that asap)

This should be good to review @darosior and @renepickhardt :-)
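The keysend part boils down to the sender inventing the payment preimage and paying to its hash; a tiny illustrative sketch of that relationship (not the plugin's code):

```python
import hashlib
import os

# Keysend: the sender picks a random preimage, ships it in a custom TLV
# inside the onion, and pays to sha256(preimage). The recipient reads the
# preimage out of the TLV and can therefore claim the incoming HTLC
# without ever having issued an invoice.
preimage = os.urandom(32)
payment_hash = hashlib.sha256(preimage).hexdigest()
```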


@darosior darosior left a comment


Just a light mobile review, but it looks great to me.
I guess that now is the time for some mainnet testing? :D


darosior commented Jan 21, 2020


rsbondi commented Jan 21, 2020

This seems to be one behind on the receiving. If I do

l1-cli sendmsg ID3 "message 0"

then l3-cli recvmsg 0 waits and does not retrieve message 0.

If I send another message from l1-cli and am now at id 1, the recvmsg 0 call finally returns, but with message id 1.

I can then call l3-cli recvmsg 0 and get message 0, but l3-cli recvmsg 1 now waits and does not retrieve the received message until the next message is sent. I can repeat this with the same results: if I am receiving on id 10, I can call recvmsg with ids 0-9 but cannot get id 10.

The README says "You can read the last message received" for recvmsg, but that is not what I am seeing; I am one behind.

It seems to work if the receiver requests the next message with recvmsg -1 (any number <= the next expected id seems to work): it waits and pulls in messages as they come in, but I assume this will time out at some point.

@darosior

Got a crash because of a missing method on LegacyOnionPayload, used in response to htlc_accepted (encountered while I was rebalancing two channels).

**BROKEN** lightningd: Plugin for htlc_accepted returned non-result response {"error": "Error while processing htlc_accepted: AttributeError(\"'LegacyOnionPayload' object has no attribute 'get'\",)", "id": 82, "jsonrpc": "2.0"}
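A guard along these lines would avoid the crash (a sketch with assumed shapes, not the plugin's actual fix; the TLV type number is hypothetical):

```python
# Only TLV payloads can carry chat records; a legacy (pre-TLV) payload has
# no .get() method, so the htlc_accepted hook must let those HTLCs pass
# through untouched instead of treating every payload like a dict.
CHAT_TLV_TYPE = 133773310  # hypothetical chat TLV type


def extract_chat_record(payload):
    """Return the chat TLV value, or None for legacy/unrelated payloads."""
    if not hasattr(payload, "get"):  # LegacyOnionPayload: cannot be ours
        return None
    return payload.get(CHAT_TLV_TYPE)
```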


cdecker commented Jan 25, 2020

> This seems to be one behind on the receiving. If I do
> l1-cli sendmsg ID3 "message 0"
> then l3-cli recvmsg 0 waits and does not retrieve message 0.
>
> If I send another message from l1-cli and am now at id 1, the recvmsg 0 call finally returns, but with message id 1. I can then call l3-cli recvmsg 0 and get message 0, but l3-cli recvmsg 1 now waits and does not retrieve the received message until the next message is sent. I can repeat this with the same results: if I am receiving on id 10, I can call recvmsg with ids 0-9 but cannot get id 10.
>
> The README says "You can read the last message received" for recvmsg, but that is not what I am seeing; I am one behind.
>
> It seems to work if the receiver requests the next message with recvmsg -1 (any number <= the next expected id seems to work): it waits and pulls in messages as they come in, but I assume this will time out at some point.

This is working as intended, though the documentation may be unclear: the argument to recvmsg is the ID of the last message you processed. We don't assume a monotonically increasing message ID; the plugin assigns arbitrary IDs, and the parameter acts as an ACK from the caller to the plugin that the previous message was processed correctly.

The reason why I went this way is that it matches the RPCs we have in c-lightning, and it makes it easier once we have different topics that we might want to tail. If we have several chat sessions open with different users for example we might want the next message (after receiving message i) from Alice, but the next incoming message (i+1) might be from Bob, so asking for i+1 would get us an incorrect message from a completely different chat session. With the high-water mark approach it makes a lot more sense to ask for "Messages from Alice since the message with ID i".

How could I formulate this in the README.md to be clearer?
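In other words, the consumer loop looks roughly like this (a sketch assuming a thin RPC wrapper where recvmsg blocks until a message newer than the given ID arrives):

```python
def tail_messages(rpc, last_id=-1):
    """Yield incoming messages forever, ACKing each by passing its ID back."""
    while True:
        msg = rpc.recvmsg(last_id)  # blocks until something newer than last_id
        yield msg
        last_id = msg["id"]         # high-water mark: "I processed this one"
```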

@cdecker cdecker force-pushed the noise branch 4 times, most recently from 8f5695c to 9d4b677 on January 25, 2020 17:45
Based on the WhatSat idea by @joostjager, this plugin implements a simple chat protocol on top of `createonion` and `sendonion`.

This was counted as a failure before, but it actually means we delivered successfully.

Since we don't pass the public key we just rely on the pubkey recovery and the `checkmessage` interface to tell us whether it is a publicly known `node_id` or not.
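The recovery step can be sketched as follows (assumed wrapper shape; c-lightning's checkmessage RPC returns the recovered pubkey and a verified flag indicating whether gossip knows that node):

```python
def identify_sender(rpc, body: str, zbase32_sig: str):
    """Recover the sender of a chat message from its shipped signature."""
    res = rpc.checkmessage(body, zbase32_sig)
    # Even when unverified we still get a recovered pubkey identifying the
    # claimed sender; 'verified' says whether it is a publicly known node_id.
    return res["pubkey"], res["verified"]
```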

codecov bot commented Jan 25, 2020

Codecov Report

Merging #68 into master will increase coverage by 9.38%.
The diff coverage is 85.71%.


@@            Coverage Diff             @@
##           master      #68      +/-   ##
==========================================
+ Coverage   38.24%   47.63%   +9.38%     
==========================================
  Files          11       15       +4     
  Lines        1391     1734     +343     
==========================================
+ Hits          532      826     +294     
- Misses        859      908      +49
Impacted Files Coverage Δ
noise/zbase32.py 100% <100%> (ø)
noise/primitives.py 58.69% <58.69%> (ø)
noise/onion.py 80.13% <80.13%> (ø)
noise/noise.py 99.19% <99.19%> (ø)

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 1c1b8fe...7920186.

@cdecker cdecker merged commit 3290e4e into master Feb 4, 2020
5 participants