
fix: get local testing working, fix memory leak #89

Merged
merged 6 commits into develop from fix/docs-tests-memleak
Jan 10, 2022

Conversation

aalu1418
Collaborator

@aalu1418 aalu1418 commented Jan 7, 2022

Main: fix memory leak by removing WS connection

Secondary: small doc updates + remove extra modulus in contract

Tested locally for 2+ hours: ✅ contract fix, ✅ removed WS, ✅ memory leak

corresponding core implementation: #89

@aalu1418 aalu1418 marked this pull request as ready for review January 7, 2022 16:15
examples/spec/ocr2-bootstrap.spec.toml (outdated)
examples/spec/ocr2-oracle.spec.toml (outdated)
ops/Pulumi.localnet.yaml
pkg/solana/transmitter.go (outdated)
rpc.CommitmentConfirmed,
)

if err != nil {
c.lggr.Errorf("error on Transmit.SendAndConfirmTransaction: %s", err.Error())
}
// TODO: poll rpc for tx confirmation (WS connection unreliable)
Collaborator
Was there an agreement we are dropping WS support for the long term?

I'd prefer we actually have two implementations, both simpler but more resource-intensive RPC polling, and WS subscription so we can make incremental improvements to it and potentially switch again once stable enough.

Collaborator Author

Sort of; there was consensus to drop it for now.

I agree that the RPC polling is more resource intensive, but the WS implementation in the solana-go module was the cause of a memory leak which would take down the core node. Building a stable WS implementation would take a lot of time and effort, so it was shelved for now.

Collaborator
@krebernisak krebernisak Jan 10, 2022

I'd suggest we declare TxConfirmer as an interface, implement the RPC TxConfirmer, and add some docs about the latest issues encountered with the WS TxConfirmer (no need to do any work on it now). That way we have a defined path for switching implementations if/when the WS implementation becomes stable.

Collaborator Author
@aalu1418 aalu1418 Jan 10, 2022

That sounds like a good idea, but it's probably better addressed in a separate PR?
It will also likely tie into the tx management work for Terra as well.

Collaborator
@krebernisak krebernisak Jan 10, 2022

Yup, no need to slow down this PR whose main purpose is to quickly fix the memory leak (get the network stable).

I'll open a GH Issue to track this suggestion for the future.
And yeah, we should define types like TxConfirmer at the Relay SDK level and implement them for specific relays.

Collaborator Author

might be easier to add comments to this open issue for adding a tx confirmer: #90

Collaborator
@krebernisak krebernisak left a comment

LGTM!

@aalu1418 aalu1418 merged commit af5575a into develop Jan 10, 2022
@aalu1418 aalu1418 deleted the fix/docs-tests-memleak branch January 10, 2022 14:44