fix(protocol): new way to calculate meta.difficulty (TKO-11) #15568
Conversation
dantaik commented on Jan 25, 2024
What about just using `block.prevrandao`? If we want to make it unique per L2, then hashing in the chain ID should work.
I like the idea.
Wait, I think we want to make sure each L2 block has a different difficulty; otherwise L2 blocks proposed in the same L1 block will always have the same min-tier...
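A minimal sketch of the idea being discussed (names are illustrative assumptions, not necessarily the exact PR code): mix `block.prevrandao` with a running per-L2-block counter so that two L2 blocks proposed within the same L1 block still get distinct difficulty values.

```solidity
// Sketch only: `numBlocks` is assumed to be a counter incremented for
// every proposed L2 block, so it differs even within one L1 block.
function _calcDifficulty(uint64 numBlocks) internal view returns (bytes32) {
    // block.prevrandao is the same for every tx in one L1 block;
    // numBlocks and block.number make the hash unique per L2 block.
    return keccak256(abi.encodePacked(block.prevrandao, numBlocks, block.number));
}
```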
This reverts commit 501bb45.
I think, realistically speaking, proposers will just avoid submitting blocks when the drawn tier is not the cheapest one. If a certain proof is cheap enough that it doesn't really affect their profitability, that proof should just always be in the base tier, because that's better security. And if the proof is very expensive, it simply will not be profitable to submit a block with the fees users are used to paying for most other blocks. If a ZK proof costs $20, that's a pretty large, unpredictable spike in the fees users are expected to pay, compared to the fairly good (at least short-term) predictability of L1 and L2 fees under EIP-1559. So I think randomness in the proof requirements will result in a worse user experience: either users have to overpay all the time (bad UX because of higher fees), or proposers will simply wait one or more L1 blocks until the randomness is more favorable and only then submit their block when it's actually profitable (bad UX because transactions now take longer to finalize).
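To make the unpredictability concrete (illustrative numbers, not from the PR): if the expensive ZK tier were drawn for, say, 1% of blocks at $20 per proof, the expected extra cost is only $20 × 0.01 = $0.20 per block, but the realized cost is bimodal, either $0 or $20. Pricing user fees at the $0.20 average means the proposer eats a ~$19.80 loss whenever the ZK tier is drawn, which is exactly the overpay-or-wait dynamic described above.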
That would imply that 100% of the blocks shall have 100% of the proofs (SGX, ZK, guardian) there together immediately; otherwise it makes no sense anyway, no? Because simply no one would want to propose something which costs more than "needed"? 🤔
I think it enforces at least some fixed configuration without randomness. For example, SGX + guardian as the base proofs (because both SGX and guardian are very cheap to verify, it doesn't make much sense not to always have them; they won't impact user fees in any significant way). Then it could still have SGX + guardian + ZK as a way to override the SGX + guardian tier, because ZK at the base level may be expensive (remains to be seen, though). But that override would only be used when needed, i.e. when the blockhash is actually wrong, not randomly, so users never have to pay the ZK cost normally (though we do lower security and increase the time to finality by not having it as part of the base proofs).
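A hypothetical sketch of such a fixed (non-random) tier configuration; the contract name, tier IDs, and constants are illustrative assumptions, not the repo's actual `LibTiers` / `ITierProvider` code:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

// Hypothetical fixed tier provider (names and values are assumptions).
contract FixedTierProvider {
    uint16 public constant TIER_SGX_GUARDIAN    = 200; // base tier: cheap to verify
    uint16 public constant TIER_SGX_GUARDIAN_ZK = 300; // contest-only override tier

    // Fixed configuration: every block gets the same minimum tier, so no
    // block randomly requires the expensive ZK proof up front.
    function getMinTier(uint256 /*rand*/) external pure returns (uint16) {
        return TIER_SGX_GUARDIAN;
    }
}
```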
In the future we can try some other ideas, for example determining the minTier of a block by the next proposeBlock transaction, or by using a VRF. But for now, with minimal modification, we have the following 3 options:
Among the above 3 options, I still think #2 is preferred over the other 2.
For the difficulty value itself, I think it is indeed fine, especially because contracts shouldn't depend on it much anyway. Random minimum tiers I think are problematic, but I guess that's another discussion.
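For contrast, a sketch (again assumed, not verbatim from the PR) of how a random minimum tier would be derived from the per-block difficulty; this is the randomness the comment above considers problematic:

```solidity
// Assumed sketch of a random min-tier rule: roughly 1 in 100 blocks
// would require the expensive ZK tier, chosen purely by the difficulty
// hash (tier IDs reuse the illustrative values from the sketch above).
function _minTier(bytes32 difficulty) internal pure returns (uint16) {
    // 300 = SGX + guardian + ZK (expensive), 200 = SGX + guardian (cheap)
    return uint256(difficulty) % 100 == 0 ? 300 : 200;
}
```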