o1vm/pickles: prover #2603

Merged 2 commits on Oct 4, 2024
29 changes: 23 additions & 6 deletions o1vm/src/pickles/main.rs
@@ -2,6 +2,11 @@ use ark_ff::UniformRand;
use kimchi::circuits::domains::EvaluationDomains;
use kimchi_msm::expr::E;
use log::debug;
use mina_curves::pasta::VestaParameters;
use mina_poseidon::{
constants::PlonkSpongeConstantsKimchi,
sponge::{DefaultFqSponge, DefaultFrSponge},
};
use o1vm::{
cannon::{self, Meta, Start, State},
cannon_cli,
@@ -12,11 +17,14 @@ use o1vm::{
witness::{self as mips_witness},
Instruction,
},
pickles::proof::ProofInputs,
pickles::{
proof::{Proof, ProofInputs},
prover,
},
preimage_oracle::PreImageOracle,
};
use poly_commitment::{ipa::SRS, SRS as _};
use std::{fs::File, io::BufReader, process::ExitCode};
use std::{fs::File, io::BufReader, process::ExitCode, time::Instant};
use strum::IntoEnumIterator;

use mina_curves::pasta::{Fp, Vesta};
@@ -60,7 +68,7 @@ pub fn main() -> ExitCode {
env_logger::Builder::from_env(env_logger::Env::default().default_filter_or("info")).init();

let domain_fp = EvaluationDomains::<Fp>::create(DOMAIN_SIZE).unwrap();
let _srs: SRS<Vesta> = {
let srs: SRS<Vesta> = {
let mut srs = SRS::create(DOMAIN_SIZE);
srs.add_lagrange_basis(domain_fp.d1);
srs
@@ -93,8 +101,7 @@ pub fn main() -> ExitCode {

let mut curr_proof_inputs: ProofInputs<Vesta> = ProofInputs::new(DOMAIN_SIZE);
while !mips_wit_env.halt {
let instr: Instruction = mips_wit_env.step(&configuration, &meta, &start);
debug!("Instruction {:?} has been executed", instr);
let _instr: Instruction = mips_wit_env.step(&configuration, &meta, &start);
// FIXME: add selectors
for (scratch, scratch_chunk) in mips_wit_env
.scratch_state
@@ -117,11 +124,21 @@

if curr_proof_inputs.evaluations.instruction_counter.len() == DOMAIN_SIZE {
// FIXME
let start_iteration = Instant::now();
debug!("Limit of {DOMAIN_SIZE} reached. We make a proof, verify it (for testing) and start with a new branch new chunk");
let _proof: Proof<Vesta> = prover::prove::<
Vesta,
DefaultFqSponge<VestaParameters, PlonkSpongeConstantsKimchi>,
DefaultFrSponge<Fp, PlonkSpongeConstantsKimchi>,
_,
>(domain_fp, &srs, curr_proof_inputs, &mut rng);
debug!(
"Proof generated in {elapsed} μs",
elapsed = start_iteration.elapsed().as_micros()
);
curr_proof_inputs = ProofInputs::new(DOMAIN_SIZE);
}
}

// TODO: Logic
ExitCode::SUCCESS
}
1 change: 1 addition & 0 deletions o1vm/src/pickles/mod.rs
@@ -13,6 +13,7 @@
//! ```

pub mod proof;
pub mod prover;

#[cfg(test)]
mod tests;
10 changes: 10 additions & 0 deletions o1vm/src/pickles/proof.rs
@@ -1,4 +1,5 @@
use kimchi::curve::KimchiCurve;
use poly_commitment::{ipa::OpeningProof, PolyComm};

pub struct WitnessColumns<G> {
pub scratch: [G; crate::interpreters::mips::witness::SCRATCH_SIZE],
@@ -23,3 +24,12 @@ impl<G: KimchiCurve> ProofInputs<G> {
}
}
}

// FIXME: should we blind the commitment?
pub struct Proof<G: KimchiCurve> {
pub commitments: WitnessColumns<PolyComm<G>>,
Contributor: We'll need more explicit names once we have more polynomials than the witness columns, but OK for now.

pub zeta_evaluations: WitnessColumns<G::ScalarField>,
pub zeta_omega_evaluations: WitnessColumns<G::ScalarField>,
/// IPA opening proof
pub opening_proof: OpeningProof<G>,
}
221 changes: 221 additions & 0 deletions o1vm/src/pickles/prover.rs
@@ -0,0 +1,221 @@
use ark_ec::AffineRepr;
use ark_ff::{PrimeField, Zero};
use ark_poly::{univariate::DensePolynomial, Evaluations, Polynomial, Radix2EvaluationDomain as D};
use kimchi::{
circuits::domains::EvaluationDomains, curve::KimchiCurve, groupmap::GroupMap,
plonk_sponge::FrSponge,
};
use mina_poseidon::{sponge::ScalarChallenge, FqSponge};
use poly_commitment::{
commitment::{absorb_commitment, PolyComm},
ipa::{DensePolynomialOrEvaluations, OpeningProof, SRS},
OpenProof as _, SRS as _,
};
use rand::{CryptoRng, RngCore};
use rayon::iter::{IntoParallelIterator, IntoParallelRefIterator, ParallelIterator};

use super::proof::{Proof, ProofInputs, WitnessColumns};

/// Make a PlonKish proof for the given circuit. As inputs, we get the execution
/// trace consisting of evaluations of polynomials over a certain domain
/// `domain`.
///
/// The proof is made of the following steps:
/// 1. For each column, we create a commitment and absorb it in the sponge.
/// 2. FIXME: we compute the quotient polynomial.
/// 3. We evaluate each polynomial (columns + quotient) at the two evaluation points ζ and ζω.
/// 4. We make a batch opening proof using the IPA PCS.
///
/// The final proof consists of the opening proof, the commitments and the
/// evaluations at ζ and ζω.
// FIXME: add constraints (come with the quotient polynomial)
// TODO: we might need blinders when the evaluations of the columns are all zeroes.
pub fn prove<
G: KimchiCurve,
EFqSponge: FqSponge<G::BaseField, G, G::ScalarField> + Clone,
EFrSponge: FrSponge<G::ScalarField>,
RNG,
>(
domain: EvaluationDomains<G::ScalarField>,
srs: &SRS<G>,
inputs: ProofInputs<G>,
rng: &mut RNG,
) -> Proof<G>
where
<G as AffineRepr>::BaseField: PrimeField,
RNG: RngCore + CryptoRng,
{
let num_chunks = 1;
let omega = domain.d1.group_gen;

let mut fq_sponge = EFqSponge::new(G::other_curve_sponge_params());

////////////////////////////////////////////////////////////////////////////
// Round 1: Creating and absorbing column commitments
////////////////////////////////////////////////////////////////////////////

// FIXME: add selectors
// FIXME: evaluate on a domain higher than d1 for the quotient polynomial.
let ProofInputs { evaluations } = inputs;
let polys = {
let WitnessColumns {
scratch,
instruction_counter,
error,
Contributor: Is the error a leftover from folding?

Member (author): No, it is the error code of the process, updated after each instruction.

// FIXME
selector: _,
} = evaluations;
let eval_col = |evals: Vec<G::ScalarField>| {
Evaluations::<G::ScalarField, D<G::ScalarField>>::from_vec_and_domain(evals, domain.d1)
.interpolate()
};
// Doing in parallel
let scratch = scratch.into_par_iter().map(eval_col).collect::<Vec<_>>();
WitnessColumns {
scratch: scratch.try_into().unwrap(),
instruction_counter: eval_col(instruction_counter),
error: eval_col(error.clone()),
// FIXME
selector: eval_col(error),
}
};
let commitments = {
let WitnessColumns {
scratch,
instruction_counter,
error,
// FIXME
selector: _,
} = &polys;
// Note: we do not blind. We might want to in the near future, in case we
// have a column with only zeroes.

Contributor: AFAIK it's not planned for this proof to be zk.

let comm = |poly: &DensePolynomial<G::ScalarField>| srs.commit_non_hiding(poly, num_chunks);
// Doing in parallel
let scratch = scratch.par_iter().map(comm).collect::<Vec<_>>();
WitnessColumns {
scratch: scratch.try_into().unwrap(),
instruction_counter: comm(instruction_counter),
error: comm(error),
// FIXME
selector: comm(error),
}
};

// Absorbing the commitments - Fiat Shamir
// We do not parallelize as we need something deterministic.
for comm in commitments.scratch.iter() {
absorb_commitment(&mut fq_sponge, comm)
}
absorb_commitment(&mut fq_sponge, &commitments.instruction_counter);
absorb_commitment(&mut fq_sponge, &commitments.error);

////////////////////////////////////////////////////////////////////////////
// Round 2: Creating and committing to the quotient polynomial
////////////////////////////////////////////////////////////////////////////

// FIXME: add quotient polynomial

////////////////////////////////////////////////////////////////////////////
// Round 3: Evaluations at ζ and ζω
////////////////////////////////////////////////////////////////////////////

let zeta_chal = ScalarChallenge(fq_sponge.challenge());
let (_, endo_r) = G::endos();

let zeta = zeta_chal.to_field(endo_r);
let zeta_omega = zeta * omega;

// FIXME: add selectors
let evals = |point| {
let WitnessColumns {
scratch,
instruction_counter,
error,
// FIXME
selector: _,
} = &polys;
let eval = |poly: &DensePolynomial<G::ScalarField>| poly.evaluate(point);
let scratch = scratch.par_iter().map(eval).collect::<Vec<_>>();
WitnessColumns {
scratch: scratch.try_into().unwrap(),
instruction_counter: eval(instruction_counter),
error: eval(error),
// FIXME
selector: eval(error),
}
};
// All evaluations at ζ
let zeta_evaluations = evals(&zeta);
// All evaluations at ζω
let zeta_omega_evaluations = evals(&zeta_omega);
Contributor: It's unclear to me which poly should be evaluated at zeta_omega. Also, this info can (should?) be derived from the constraints rather than hardcoded.

Member (author), on deriving it from the constraints: Yes, we will probably use it with the lookup argument soon.
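For context on this exchange: the usual reason a column needs an opening at ζω in addition to ζ is a constraint relating one row to the next, which is checked at the single random point ζ using both openings. A minimal, purely hypothetical sketch of such a check (not a constraint in this PR; the function and the constraint are ours):

use ark_ff::Field;

/// Hypothetical example only: check an "instruction counter increases by one
/// per row" constraint from the openings of the same column at ζ and ζω.
fn next_row_constraint_holds<F: Field>(ic_at_zeta: F, ic_at_zeta_omega: F) -> bool {
    (ic_at_zeta_omega - ic_at_zeta - F::one()).is_zero()
}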


// Absorbing evaluations with a sponge for the other field
// We initialize the state with the previous state of the fq_sponge
let fq_sponge_before_evaluations = fq_sponge.clone();
let mut fr_sponge = EFrSponge::new(G::sponge_params());
fr_sponge.absorb(&fq_sponge.digest());

for (zeta_eval, zeta_omega_eval) in zeta_evaluations
.scratch
.iter()
.zip(zeta_omega_evaluations.scratch.iter())
{
fr_sponge.absorb(zeta_eval);
fr_sponge.absorb(zeta_omega_eval);
}
fr_sponge.absorb(&zeta_evaluations.instruction_counter);
fr_sponge.absorb(&zeta_omega_evaluations.instruction_counter);
fr_sponge.absorb(&zeta_evaluations.error);
fr_sponge.absorb(&zeta_omega_evaluations.error);
// FIXME: add selectors

////////////////////////////////////////////////////////////////////////////
// Round 4: Opening proof w/o linearization polynomial
////////////////////////////////////////////////////////////////////////////

// Preparing the polynomials for the opening proof
let mut polynomials: Vec<_> = polys.scratch.into_iter().collect();
polynomials.push(polys.instruction_counter);
polynomials.push(polys.error);
// FIXME: add selectors
let polynomials: Vec<_> = polynomials
.iter()
.map(|poly| {
(
DensePolynomialOrEvaluations::DensePolynomial(poly),
// We do not have any blinder, therefore we set it to 0.
Contributor: I thought the PolyComm field contains the commitment; this comment makes me think it contains the blinder.

Member (author): As discussed earlier this week, PolyComm is abstracted over any type T, and sometimes a scalar is used.
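To illustrate the reply: the same PolyComm<T> container holds curve points when it is a commitment and scalars when it is a blinder for the opening. A minimal sketch using only items already imported in this file; the helper name is ours:

use ark_ff::Zero;
use ark_poly::univariate::DensePolynomial;
use kimchi::curve::KimchiCurve;
use poly_commitment::{commitment::PolyComm, ipa::SRS, SRS as _};

/// Illustrative helper (not part of this PR): a non-hiding commitment is a
/// `PolyComm` of curve points, while the blinder passed to the opening is a
/// `PolyComm` of scalars, set to zero because nothing is hidden.
fn commitment_and_zero_blinder<G: KimchiCurve>(
    srs: &SRS<G>,
    poly: &DensePolynomial<G::ScalarField>,
) -> (PolyComm<G>, PolyComm<G::ScalarField>) {
    let commitment = srs.commit_non_hiding(poly, 1);
    let blinder = PolyComm {
        elems: vec![G::ScalarField::zero()],
    };
    (commitment, blinder)
}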

PolyComm {
elems: vec![G::ScalarField::zero()],
},
)
})
.collect();

// poly scale
let v_chal = fr_sponge.challenge();
let v = v_chal.to_field(endo_r);
// eval scale
let u_chal = fr_sponge.challenge();
let u = u_chal.to_field(endo_r);

let group_map = G::Map::setup();

// Computing the opening proof for the IPA PCS
let opening_proof = OpeningProof::open::<_, _, D<G::ScalarField>>(
srs,
&group_map,
polynomials.as_slice(),
&[zeta, zeta_omega],
v,
u,
fq_sponge_before_evaluations,
Contributor: Why is that sponge state given?

rng,
);

Proof {
commitments,
zeta_evaluations,
zeta_omega_evaluations,
opening_proof,
}
}
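A closing note on the "poly scale" v and "eval scale" u squeezed above: the batched IPA opening argues about a single scalar obtained by combining all polynomials with powers of v and the two evaluation points (ζ and ζω) with powers of u. A naive sketch of that combination, ignoring chunking and blinders; the function name is ours:

use ark_ff::Field;

/// Naive illustration (not the library code): fold the evaluations of several
/// polynomials at several points into one scalar. `evals[i][j]` is the i-th
/// polynomial evaluated at the j-th point; `v` is the "poly scale" and `u` the
/// "eval scale".
fn combined_evaluation_sketch<F: Field>(evals: &[Vec<F>], v: F, u: F) -> F {
    let mut acc = F::zero();
    let mut v_pow = F::one();
    for per_poly in evals {
        let mut u_pow = F::one();
        for e in per_poly {
            acc += v_pow * u_pow * *e;
            u_pow *= u;
        }
        v_pow *= v;
    }
    acc
}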