Merge develop into o1js-main #2066

Merged 1 commit into o1js-main from develop on Apr 10, 2024
2 changes: 1 addition & 1 deletion .github/workflows/benches.yml
@@ -17,7 +17,7 @@ jobs:
if: github.event.label.name == 'benchmark'
steps:
- name: Checkout PR
uses: actions/checkout@v2
uses: actions/checkout@v4.1.1

# as action-rs does not seem to be maintained anymore, building from
# scratch the environment using rustup
2 changes: 1 addition & 1 deletion .github/workflows/coverage.yml.disabled
@@ -17,7 +17,7 @@ jobs:
timeout-minutes: 60
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2.3.4
- uses: actions/checkout@v4.1.1
with:
persist-credentials: false

2 changes: 1 addition & 1 deletion .github/workflows/gh-page.yml
@@ -16,7 +16,7 @@ jobs:

steps:
- name: Checkout Repository
uses: actions/checkout@v2
uses: actions/checkout@v4.1.1

# as action-rs does not seem to be maintained anymore, building from
# scratch the environment using rustup
4 changes: 2 additions & 2 deletions .github/workflows/rust.yml
@@ -28,7 +28,7 @@ jobs:
name: Run some basic checks and tests
steps:
- name: Checkout PR
uses: actions/checkout@v3
uses: actions/checkout@v4.1.1

# as action-rs does not seem to be maintained anymore, building from
# scratch the environment using rustup
@@ -62,7 +62,7 @@ jobs:
- name: Install cargo-spec for specifications
run: |
eval $(opam env)
cargo install cargo-spec
cargo install --locked cargo-spec

- name: Build the kimchi specification
run: |
4 changes: 3 additions & 1 deletion .gitignore
@@ -25,4 +25,6 @@ _build

*.html
# If symlink created for kimchi-visu
tools/srs
tools/srs

.ignore
21 changes: 21 additions & 0 deletions CONTRIBUTING.md
@@ -48,3 +48,24 @@ cargo fmt
```

These are enforced by GitHub PR checks, so be sure to have any errors produced by the above tools fixed before pushing the code to your pull request branch. Refer to `.github/workflows` for all PR checks.

## Branching policy

Generally, `proof-systems` intends to stay synchronized with the mina repository (see its [README-branching.md](https://github.com/MinaProtocol/mina/blob/develop/README-branching.md)), so its branching policy is quite similar. However, several important distinctions (some temporary) exist:

- `compatible`:
  - Compatible with `rampup` in `mina`.
  - Mina's `compatible`, like mina's `master`, does not depend on `proof-systems`.
- `berkeley`: future hardfork release; will be going out to Berkeley.
  - This is where hotfixes go.
- `develop`: matches mina's `develop`; soft-fork compatible.
  - Also used by `mina/o1js-main` and `o1js/main`.
- `master`: future feature development, containing breaking changes; anything that does not need to be released alongside mina.
  - Note that `mina`'s `master` does not depend on `proof-systems` at all.
- `izmir`: next hardfork release after Berkeley.
- In the future:
  - `master`/`develop` will reverse roles and become something like gitflow.
  - After the Berkeley release, `compatible` will become properly synced with `mina/compatible`.
- Direction of merge:
  - Back-merging: `compatible` into `berkeley` into `develop` into `master`.
  - Front-merging (introducing new features): the other direction, but where you start depends on where the feature belongs.
1 change: 1 addition & 0 deletions Cargo.toml
@@ -14,6 +14,7 @@ members = [
"utils",
"internal-tracing",
]
resolver = "2"

[profile.release]
lto = true
4 changes: 3 additions & 1 deletion book/Cargo.toml
@@ -10,4 +10,6 @@ edition = "2021"
license = "Apache-2.0"

[build-dependencies]
cargo-spec = { version = "0.5.0" }
cargo-spec = { version = "0.5.0" }
time = { version = "~0.3.23" } # This crate is a known bad-actor for breaking rust version support.
plist = { version = "~1.5.0" } # This crate improperly constrains its bad-actor dependency (`time`).
73 changes: 26 additions & 47 deletions book/src/SUMMARY.md
@@ -9,15 +9,15 @@
- [Rings](./fundamentals/zkbook_rings.md)
- [Fields](./fundamentals/zkbook.md)
- [Polynomials](./fundamentals/zkbook_polynomials.md)
- [Multiplying polynomials](./fundamentals/zkbook_multiplying_polynomials.md)
- [Fast Fourier transform](./fundamentals/zkbook_fft.md)
- [Multiplying Polynomials](./fundamentals/zkbook_multiplying_polynomials.md)
- [Fast Fourier Transform](./fundamentals/zkbook_fft.md)

# Cryptographic tools
# Cryptographic Tools

- [Commitments](./fundamentals/zkbook_commitment.md)
- [Polynomial commitments](./plonk/polynomial_commitments.md)
- [Inner product argument](./plonk/inner_product.md)
- [Different functionnalities](./plonk/inner_product_api.md)
- [Polynomial Commitments](./plonk/polynomial_commitments.md)
- [Inner Product Argument](./plonk/inner_product.md)
- [Different Functionalities](./plonk/inner_product_api.md)
- [Two Party Computation](./fundamentals/zkbook_2pc/overview.md)
- [Garbled Circuits](./fundamentals/zkbook_2pc/gc.md)
- [Basics](./fundamentals/zkbook_2pc/basics.md)
@@ -27,71 +27,50 @@
- [Half Gate](./fundamentals/zkbook_2pc/halfgate.md)
- [Full Description](./fundamentals/zkbook_2pc/fulldesc.md)
- [Fixed-Key-AES Hashes](./fundamentals/zkbook_2pc/fkaes.md)

- [Oblivious Transfer](./fundamentals/zkbook_2pc/ot.md)
- [Base OT](./fundamentals/zkbook_2pc/baseot.md)
- [OT Extension](./fundamentals/zkbook_2pc/ote.md)

- [Full Protocol](./fundamentals/zkbook_2pc/2pc.md)

# Proof systems

- [Overview](./fundamentals/proof_systems.md)
- [zk-SNARKs](./fundamentals/zkbook_plonk.md)
- [Custom constraints](./fundamentals/custom_constraints.md)
- [Proof Systems](./fundamentals/proof_systems.md)
- [zk-SNARKs](./fundamentals/zkbook_plonk.md)

# Background on PLONK

- [Overview](./plonk/overview.md)
- [Glossary](./plonk/glossary.md)
- [Glossary](./plonk/glossary.md)
- [Domain](./plonk/domain.md)
- [Lagrange basis in multiplicative subgroups](./plonk/lagrange.md)
- [Non-interaction with fiat-shamir](./plonk/fiat_shamir.md)
- [Lagrange Basis in Multiplicative Subgroups](./plonk/lagrange.md)
- [Non-Interactivity via Fiat-Shamir](./plonk/fiat_shamir.md)
- [Plookup](./plonk/plookup.md)
- [Maller's optimization](./plonk/maller.md)
- [Maller's Optimization](./plonk/maller.md)
- [Zero-Column Approach to Zero-Knowledge](./plonk/zkpm.md)

# Kimchi

- [Overview](./kimchi/overview.md)
- [Arguments](./kimchi/arguments.md)
- [Custom gates](./kimchi/gates.md)
- [Permutation](./kimchi/permut.md)
- [Lookup](./kimchi/lookup.md)

# Snarky
- [Arguments](./kimchi/arguments.md)
- [Final Check](./kimchi/final_check.md)
- [Maller's Optimization for Kimchi](./kimchi/maller_15.md)
- [Lookup Tables](./kimchi/lookup.md)
- [Extended Lookup Tables](./kimchi/extended-lookup-tables.md)
- [Custom Constraints](./kimchi/custom_constraints.md)
- [Custom Gates](./kimchi/gates.md)
- [Foreign Field Addition](./kimchi/foreign_field_add.md)
- [Foreign Field Multiplication](./kimchi/foreign_field_mul.md)
- [Keccak](./kimchi/keccak.md)

- [Overview](./snarky/overview.md)
- [API](./snarky/api.md)
- [snarky wrapper](./snarky/snarky-wrapper.md)
- [Kimchi backend](./snarky/kimchi-backend.md)
- [Vars](./snarky/vars.md)
- [Booleans](./snarky/booleans.md)
- [Circuit generation](./snarky/circuit-generation.md)
- [Witness generation](./snarky/witness-generation.md)

# Pickles & Inductive Proof Systems

- [Overview](./fundamentals/zkbook_ips.md)
- [Accumulation](./pickles/accumulation.md)
- [Deferred Computation](./pickles/deferred.md)
- [Passthough & Me-Only](./pickles/passthrough.md)

# RFCs

- [RFC 0: Alternative zero-knowledge](./plonk/zkpm.md)
- [RFC 1: Final check](./plonk/final_check.md)
- [RFC 2: Maller's optimization for kimchi](./plonk/maller_15.md)
- [RFC 3: Plookup integration in kimchi](./rfcs/3-lookup.md)
- [RFC 4: Extended lookup tables](./rfcs/extended-lookup-tables.md)
- [RFC 5: Foreign Field Addition](./rfcs/foreign_field_add.md)
- [RFC 6: Foreign Field Multiplication](./rfcs/foreign_field_mul.md)
- [RFC 7: Keccak](./rfcs/keccak.md)

# Specifications
# Technical Specifications

- [Poseidon hash](./specs/poseidon.md)
- [Polynomial commitment](./specs/poly-commitment.md)
- [Pasta curves](./specs/pasta.md)
- [Polynomial Commitment](./specs/poly-commitment.md)
- [Pasta Curves](./specs/pasta.md)
- [Kimchi](./specs/kimchi.md)
- [Universal Reference String (URS)](./specs/urs.md)
- [Pickles](./specs/pickles.md)
59 changes: 1 addition & 58 deletions book/src/fundamentals/custom_constraints.md
@@ -1,58 +1 @@
This section explains how to design and add a custom constraint to our `proof-systems` library.

PLONK is an AIOP. That is, it is a protocol in which the prover sends polynomials as messages, the verifier sends random challenges, and at the end the verifier evaluates the prover's polynomials and performs some final checks on the outputs.

PLONK is very flexible. It can be customized with constraints specific to computations of interest. For example, in Mina, we use a PLONK configuration called kimchi that has custom constraints for Poseidon hashing, elliptic curve operations, and more.

A "PLONK configuration" specifies
- The set of types of constraints that you would like to be able to enforce. We will describe below how these types of constraints are specified.
- A number of "eq-able" columns `W`
- A number of "advice" columns `A`

Under such a configuration, a circuit is specified by
- A number of rows `n`
- A vector `cs` of constraint-types of length `n`. I.e., a vector that specifies, for each row, which types of constraints should be enforced on that row.
- A vector `eqs : Vec<(Position, Position)>` of equalities to enforce, where `struct Position { row: usize, column: usize }`. E.g., if the pair `(Position { row: 0, column: 8 }, Position { row: 10, column: 2 })` is in `eqs`, then the circuit is saying the entries in those two positions should be equal, or in other words that they refer to the same value. This is where the distinction between "eq-able" and "advice" columns comes in: the `column` field of a position in the `eqs` array can only refer to one of the first `W` columns; equalities cannot be enforced on entries in the `A` columns after that.

Then, given such a circuit, PLONK lets you produce proofs for the statement

> I know `W + A` "column vectors" of field elements `vs: [Vec<F>; W + A]` such that, for each row index `i < n`, the constraint of type `cs[i]` holds on the values `[vs[0][i], ..., vs[W+A-1][i], vs[0][i+1], ..., vs[W+A-1][i+1]]`, and all the equalities in `eqs` hold. I.e., for `(p1, p2)` in `eqs` we have `vs[p1.column][p1.row] == vs[p2.column][p2.row]`. So, a constraint can check the values in two adjacent rows.
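To make this concrete, here is a minimal Rust sketch of the data that specifies such a circuit. `ConstraintType` and `Circuit` are illustrative names, not the actual types in `proof-systems`; only `Position` follows the definition given above.

```rust
/// A cell in the constraint table (as defined above).
struct Position {
    row: usize,
    column: usize,
}

/// Illustrative stand-in for the set of constraint types in a configuration.
enum ConstraintType {
    Generic,
    Poseidon,
    // ... one variant per supported constraint type
}

/// A circuit under a fixed PLONK configuration (W eq-able + A advice columns).
struct Circuit {
    /// Number of rows `n`.
    rows: usize,
    /// For each row, which type of constraint is enforced on it (`cs`).
    cs: Vec<ConstraintType>,
    /// Equalities between cells; `column` may only refer to the first W columns.
    eqs: Vec<(Position, Position)>,
}
```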

## Specifying a constraint

Mathematically speaking, a constraint is a multivariate polynomial over the variables $v_{\mathsf{Curr},0}, \dots, v_{\mathsf{Curr}, W+A-1}, v_{\mathsf{Next}, 0}, \dots, v_{\mathsf{Next}, W+A-1}$. In other words, there is one variable corresponding to the value of each column in the "current row" and one variable corresponding to the value of each column in the "next row".

In Rust, $v_{r, i}$ is written `E::cell(Column::Witness(i), r)`. So, for example, the variable $v_{\mathsf{Next}, 3}$ is written
`E::cell(Column::Witness(3), CurrOrNext::Next)`.



For convenience, one can define a shorthand for the witness columns of the current row, e.g. `let w = |i| v(Column::Witness(i));` (with `v` a helper that builds the variable for a given column of the current row).
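As an illustration of how a constraint might then be written with this shorthand, the following self-contained sketch builds a multiplication gate, `w0 * w1 - w2 = 0`, on the current row. The `Expr` type here is a hypothetical stand-in for the library's richer expression type `E`:

```rust
// Hypothetical minimal expression type; it only mirrors the interface used
// in the text, not the real library's API.
#[derive(Clone)]
enum Expr {
    Cell(Column, CurrOrNext),
    Sub(Box<Expr>, Box<Expr>),
    Mul(Box<Expr>, Box<Expr>),
}

#[derive(Clone, Copy)]
enum Column {
    Witness(usize),
}

#[derive(Clone, Copy)]
enum CurrOrNext {
    Curr,
    Next,
}

/// Variable for a column on the current row.
fn v(col: Column) -> Expr {
    Expr::Cell(col, CurrOrNext::Curr)
}

fn main() {
    // Shorthand from the text: witness column `i` on the current row.
    let w = |i| v(Column::Witness(i));

    // Multiplication gate: w0 * w1 - w2 = 0 on the current row.
    let gate = Expr::Sub(
        Box::new(Expr::Mul(Box::new(w(0)), Box::new(w(1)))),
        Box::new(w(2)),
    );
    let _ = gate; // in a real system this would be registered as a constraint
}
```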

## Defining a PLONK configuration

The art in proof systems comes from knowing how to design a PLONK configuration to ensure maximal efficiency for the sorts of computations you are trying to prove. That is, how to choose the numbers of columns `W` and `A`, and how to define the set of constraint types.

Let's describe the trade-offs involved here.

The majority of the proving time for the PLONK prover is in
- committing to the `W + A` column polynomials, which have length equal to the number of rows `n`
- committing to the "permutation accumulator" polynomial, which also has length `n`
- committing to the quotient polynomial, which reduces to computing `max(k, W)` MSMs of size `n`, where `k` is the max degree of a constraint
- performing the commitment opening proof, which is mostly dependent on the number of rows `n`
- and possibly computing the combined constraint polynomial, which has degree `k * n`, where `k` is again the maximum degree of a constraint

So, all in all, the proving time is approximately equal to the time to perform `W + A + 1 + max(k - 1, W)` MSMs of size `n`, plus the cost of an opening proof for polynomials of degree `n - 1`.
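As a quick sanity check on that estimate, here is a tiny helper (purely illustrative, not part of the library) that evaluates the MSM count from the formula above:

```rust
/// Approximate number of size-`n` MSMs per proof, per the estimate above:
/// W + A + 1 + max(k - 1, W). Illustrative only.
fn approx_msm_count(w: usize, a: usize, k: usize) -> usize {
    w + a + 1 + std::cmp::max(k.saturating_sub(1), w)
}

fn main() {
    // E.g., 15 eq-able columns, 0 advice columns, max constraint degree 6:
    // 15 + 0 + 1 + max(5, 15) = 31 MSMs of size n.
    assert_eq!(approx_msm_count(15, 0, 6), 31);
}
```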

- Increasing `W` and `A` increases proof size, and potentially impacts prover time, as the prover must compute a polynomial commitment for each column, and computing a polynomial commitment corresponds to doing one MSM (multi-scalar multiplication, also called a multi-exponentiation).

However, often increasing the number of columns allows you to decrease the number of rows required for a given computation. For example, if you can perform one Poseidon hash in 36 rows with 5 total columns, then you can also perform it in 12 (= 36 / 3) rows with 15 (= 5 * 3) total columns.

**Decreasing the number of rows (even while keeping the total number of table entries the same) is desirable because it reduces the cost of the polynomial commitment opening proof, which is dominated by a factor linear in the number of rows, and barely depends on the number of columns.**

Increasing the number of columns also increases verifier time, as the verifier must perform one scalar-multiplication and one hash per column. Proof length is also affected by a larger number of columns, as more polynomials need to be committed and sent along to the verifier.

There is typically some interplay between these trade-offs.
# Custom constraints
24 changes: 11 additions & 13 deletions book/src/fundamentals/proof_systems.md
@@ -1,31 +1,29 @@
# Overview
# Proof Systems Design Overview

Many modern proof systems (and I think all that are in use) are constructed according to the following recipe.

1. You start out with a class of computations.

2. You devise a way to *arithmetize* those computations. That is, to express your computation as a statement about polynomials.

More specifically, you describe what is often called an "algebraic interactive oracle proof" (AIOP) that encodes your computation. An AIOP is a protocol describing an interaction between a prover and a verifier, in which the prover sends the verifier some "polynomial oracles" (basically a black-box function that, given a point, evaluates a polynomial at that point), the verifier sends the prover random challenges, and at the end, the verifier queries the prover's polynomials at points of its choosing and decides whether to accept the proof.

3. An AIOP is an imagined interaction between parties. It is an abstract description of the protocol that will be "compiled" into a SNARK. There are several "non-realistic" aspects about it. One is that the prover sends the verifier black-box polynomials that the verifier can evaluate. These polynomials have degree comparable to the size of the computation being verified. If we implemented these "polynomial oracles" by having the prover really send the $O(n)$-size polynomials (say, by sending all their coefficients), then we would not have a zk-SNARK at all: the verifier would have to read these linearly sized polynomials, so we would lose succinctness, and the polynomials would not be black-box functions, so we might lose zero-knowledge.

Instead, when we concretely instantiate the AIOP, we have the prover send constant-sized, hiding *polynomial commitments*. Then, in the phase of the AIOP where the verifier queries the polynomials, the prover sends an *opening proof* for the polynomial commitments, which the verifier can check, thus simulating the evaluation of the prover's polynomials on its own.

So this is the next step of making a SNARK: instantiating the AIOP with a polynomial commitment scheme of one's choosing. There are several choices here, and they affect the properties of the SNARK you are constructing, as the SNARK will inherit the efficiency and setup properties of the polynomial commitment scheme used.
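The interface one instantiates the AIOP with can be pictured as follows. This is a hedged sketch, not the actual trait in `proof-systems` (the real scheme there is an inner-product-argument-based commitment over the Pasta curves, with batching, blinding, and SRS handling omitted here):

```rust
/// Illustrative interface for a polynomial commitment scheme.
trait PolynomialCommitment {
    type Scalar;
    type Commitment; // constant-size, hiding commitment
    type OpeningProof;

    /// Commit to a polynomial given by its coefficients.
    fn commit(&self, coeffs: &[Self::Scalar]) -> Self::Commitment;

    /// Evaluate the committed polynomial at `point` and prove the evaluation.
    fn open(
        &self,
        coeffs: &[Self::Scalar],
        point: Self::Scalar,
    ) -> (Self::Scalar, Self::OpeningProof);

    /// Check an opening, succinctly simulating "evaluating the oracle".
    fn verify(
        &self,
        commitment: &Self::Commitment,
        point: Self::Scalar,
        value: Self::Scalar,
        proof: &Self::OpeningProof,
    ) -> bool;
}
```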

4. An AIOP describes an interactive protocol between the verifier and the prover. In reality, typically, we also want our proofs to be non-interactive.
4. An AIOP describes an interactive protocol between the verifier and the prover. In reality, typically, we also want our proofs to be non-interactive.

This is accomplished by what is called the Fiat--Shamir transformation. The basic idea is this: all that the verifier is doing is sampling random values to send to the prover. Instead, to generate a "random" value, the prover simulates the verifier by hashing its messages. The resulting hash is used as the "random" challenge.
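A toy sketch of this transform, assuming the `sha2` crate for hashing (real implementations such as kimchi derive challenges with an algebraic sponge like Poseidon, so this is illustrative only):

```rust
use sha2::{Digest, Sha256};

/// Toy Fiat--Shamir transcript: absorb prover messages, squeeze challenges.
struct Transcript {
    hasher: Sha256,
}

impl Transcript {
    fn new(domain_separator: &[u8]) -> Self {
        let mut hasher = Sha256::new();
        hasher.update(domain_separator);
        Transcript { hasher }
    }

    /// Absorb a prover message (e.g., a serialized polynomial commitment).
    fn absorb(&mut self, message: &[u8]) {
        self.hasher.update(message);
    }

    /// Derive a "verifier" challenge by hashing everything absorbed so far.
    fn challenge(&mut self) -> [u8; 32] {
        let digest = self.hasher.clone().finalize();
        // Re-absorb the digest so successive challenges are all distinct.
        self.hasher.update(&digest);
        let mut out = [0u8; 32];
        out.copy_from_slice(&digest);
        out
    }
}

fn main() {
    let mut t = Transcript::new(b"example-protocol");
    t.absorb(b"serialized commitment bytes");
    let alpha = t.challenge(); // plays the role of a verifier challenge
    let _ = alpha;
}
```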

At this point we have a fully non-interactive proof. Let's review our steps.

1. Start with a computation.

2. Translate the computation into a statement about polynomials and design a corresponding AIOP.

3. Compile the AIOP into an interactive protocol by having the prover send hiding polynomial commitments instead of polynomial oracles.

4. Get rid of the verifier-interaction by replacing it with a hash function. I.e., apply the Fiat--Shamir transform.

3. Compile the AIOP into an interactive protocol by having the prover send hiding polynomial commitments instead of polynomial oracles.

4. Get rid of the verifier-interaction by replacing it with a hash function. I.e., apply the Fiat--Shamir transform.