Sync branch with latest changes from upstream
chiro-hiro committed Jul 12, 2024
1 parent 11c7d78 commit 848ed02
Showing 70 changed files with 418 additions and 660 deletions.
361 changes: 92 additions & 269 deletions Cargo.lock

Large diffs are not rendered by default.

14 changes: 7 additions & 7 deletions Cargo.toml
@@ -22,13 +22,13 @@ members = [
resolver = "2"

[workspace.dependencies]
ark-algebra-test-templates = "0.3.0"
ark-bn254 = { version = "0.3.0" }
ark-ec = { version = "0.3.0", features = ["parallel"] }
ark-ff = { version = "0.3.0", features = ["parallel", "asm"] }
ark-poly = { version = "0.3.0", features = ["parallel"] }
ark-serialize = "0.3.0"
ark-std = "0.3.0"
ark-algebra-test-templates = "0.4.2"
ark-bn254 = { version = "0.4.0" }
ark-ec = { version = "0.4.2", features = [ "parallel" ] }
ark-ff = { version = "0.4.2", features = [ "parallel", "asm" ] }
ark-poly = { version = "0.4.2", features = [ "parallel" ] }
ark-serialize = { version = "0.4.2", features = ["derive"] }
ark-std = "0.4.0"
bcs = "0.1.3"
base64 = "0.21.5"
bitvec = "1.0.0"
12 changes: 6 additions & 6 deletions book/src/fundamentals/zkbook_groups.md
@@ -20,7 +20,7 @@ First let's do a quick definition and then some examples.

- $e$ is an identity for $*$. I.e., $e * x = x * e = x$ for all $x$.

- $(\quad)^{-1}$ is an inverse map for $*$ with identity $e$. I.e., $x * x^{-1} = x^{-1} * x = e$ for all $x$.

So basically, an invertible binary operation. Definition in hand, we can see some examples:

@@ -82,7 +82,7 @@ This is the general sense of what is called **scalar-multiplication** or sometim

### Cyclic groups

A cyclic group $G$ is a special kind of abelian group. It is an abelian group generated by a single element $g \in G$. That is, a cyclic group $G$ (generated by $g \in G$) is one in which for every $h \in G$ we have $h = n g$ for some $n \in \mathbb{Z}$.
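The definition can be checked concretely on a toy example (a sketch using plain integers, not any library type): the additive group $\mathbb{Z}/5$ is cyclic because $g = 1$ generates it, while $2$ inside $\mathbb{Z}/6$ only generates a proper subgroup.

```rust
// Sketch: the subgroup of the additive group Z/m generated by g,
// i.e. the distinct values of n*g mod m.
fn subgroup(g: u64, m: u64) -> Vec<u64> {
    let mut elems: Vec<u64> = (0..m).map(|n| (n * g) % m).collect();
    elems.sort();
    elems.dedup();
    elems
}

fn main() {
    // 1 generates all of Z/5, so Z/5 is cyclic with generator 1.
    assert_eq!(subgroup(1, 5), vec![0, 1, 2, 3, 4]);
    // In Z/6, the element 2 generates only {0, 2, 4}: a proper subgroup.
    assert_eq!(subgroup(2, 6), vec![0, 2, 4]);
    println!("cyclic group checks passed");
}
```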

### Groups in cryptography

@@ -128,7 +128,7 @@ Now, what are some concrete groups that we can safely make the no-relation or co

Well, the most efficiently implementable groups that people have come up with -- and that we believe satisfy the above assumptions for $\mathcal{A}$ being the class of realistic computations and $\varepsilon$ being something like $1/2^{128}$ -- are elliptic curves over finite fields.

Giving a complete definition of what an elliptic curve is requires a lot of math, and is not very useful from the point of view of cryptography. So we will give a definition that is not complete, but more useful.

An elliptic curve $E$ over a field $F$ is a set of the form

@@ -196,9 +196,9 @@ $$
\{ (X, Y, Z) \in F^3 \mid (Y/Z)^2 = (X/Z)^3 + a(X/Z) + b \}
$$

If you think about it, this is saying that $(X/Z, Y/Z)$ is a point on the original curve in affine form. In other words, in projective form we let the first two coordinates get scaled by some arbitrary scaling factor $Z$, but we keep track of it as the third coordinate.

To be clear, this means curve points have many different representations. If $(x, y, z)$ is a curve point in projective coordinates, and $s$ is any element of $F$, then $(sx,sy,sz)$ is another representation of the same curve point.

This means curve points require more space to store, but it makes the group operation much more efficient to compute, as we can avoid having to do any field divisions, which are expensive.
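The equivalence of affine and projective representations can be sketched on a toy curve $y^2 = x^3 + 2x + 2$ over $F_{17}$ (illustrative parameters only, not a cryptographic curve or any library API): a projective point $(X, Y, Z)$ represents the affine point $(X/Z, Y/Z)$, so scaling all three coordinates by $s$ leaves the point unchanged.

```rust
// Toy curve y^2 = x^3 + 2x + 2 over F_17 (illustrative only).
const P: u64 = 17;
const A: u64 = 2;
const B: u64 = 2;

// Modular exponentiation by repeated squaring.
fn pow_mod(mut b: u64, mut e: u64, m: u64) -> u64 {
    let mut acc = 1;
    b %= m;
    while e > 0 {
        if e & 1 == 1 { acc = acc * b % m; }
        b = b * b % m;
        e >>= 1;
    }
    acc
}

// Field inversion via Fermat's little theorem: a^(p-2) = a^(-1) mod p.
fn inv(a: u64) -> u64 { pow_mod(a, P - 2, P) }

fn on_curve_affine(x: u64, y: u64) -> bool {
    y * y % P == (pow_mod(x, 3, P) + A * x + B) % P
}

// A projective point (X, Y, Z) is on the curve iff its affine image
// (X/Z, Y/Z) is.
fn on_curve_projective(x: u64, y: u64, z: u64) -> bool {
    let zi = inv(z % P);
    on_curve_affine(x % P * zi % P, y % P * zi % P)
}

fn main() {
    assert!(on_curve_affine(5, 1)); // (5, 1) lies on the curve
    // (15, 3, 3) = 3 * (5, 1, 1) represents the same point projectively.
    assert!(on_curve_projective(15, 3, 3));
    println!("projective representation checks passed");
}
```

Note the projective check costs one inversion here only because it converts back to affine; the point of projective formulas in real implementations is to add and double points without any inversion at all.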

@@ -234,6 +234,6 @@ so the triple $(X, Y, Z)$ corresponds to the affine point $(X/Z^2, Y/Z^3)$. Thes

- todo

- Implement `fn decompress<F: SquareRootField>(c: (F, bool)) -> (F, F)`
- Implement `fn decompress<F: Field>(c: (F, bool)) -> (F, F)`

-
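The exercise above asks for point decompression. As a sketch of the underlying idea (not the arkworks API — the function shape, curve parameters, and field below are all made up for illustration): compression stores $(x, \text{parity of } y)$, and decompression recovers $y = \sqrt{x^3 + ax + b}$. Over a prime field with $p \equiv 3 \pmod 4$, the square root, when it exists, is $a^{(p+1)/4}$, and the stored bit picks between $y$ and $p - y$.

```rust
// Toy decompression for y^2 = x^3 + 2x + 2 over F_19 (illustrative
// parameters; 19 % 4 == 3, so square roots of quadratic residues are
// given by a^((p+1)/4) mod p).
const P: u64 = 19;
const A: u64 = 2;
const B: u64 = 2;

fn pow_mod(mut b: u64, mut e: u64, m: u64) -> u64 {
    let mut acc = 1;
    b %= m;
    while e > 0 {
        if e & 1 == 1 { acc = acc * b % m; }
        b = b * b % m;
        e >>= 1;
    }
    acc
}

// Recover (x, y) from the compressed form (x, parity of y).
// Returns None if x^3 + Ax + B is not a square, i.e. no curve point
// has this x-coordinate.
fn decompress(x: u64, y_is_odd: bool) -> Option<(u64, u64)> {
    let rhs = (pow_mod(x, 3, P) + A * x + B) % P;
    let y = pow_mod(rhs, (P + 1) / 4, P);
    if y * y % P != rhs {
        return None; // rhs is a non-residue: invalid compressed point
    }
    let y = if (y % 2 == 1) == y_is_odd { y } else { (P - y) % P };
    Some((x, y))
}

fn main() {
    // x = 1: rhs = 5, sqrt(5) = 9 mod 19, so y is 9 or 10 = 19 - 9.
    assert_eq!(decompress(1, true), Some((1, 9)));
    assert_eq!(decompress(1, false), Some((1, 10)));
    println!("decompression checks passed");
}
```

The diff's change from `SquareRootField` to `Field` reflects arkworks 0.4 folding square-root functionality into the `Field` trait; the modular arithmetic above only stands in for those trait methods.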
8 changes: 4 additions & 4 deletions book/src/specs/kimchi.md
@@ -2035,7 +2035,7 @@ pub struct ProofEvaluations<Evals> {
#[serde_as]
#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(bound = "G: ark_serialize::CanonicalDeserialize + ark_serialize::CanonicalSerialize")]
pub struct LookupCommitments<G: AffineCurve> {
pub struct LookupCommitments<G: AffineRepr> {
/// Commitments to the sorted lookup table polynomial (may have chunks)
pub sorted: Vec<PolyComm<G>>,
/// Commitment to the lookup aggregation polynomial
@@ -2048,7 +2048,7 @@ pub struct LookupCommitments<G: AffineCurve> {
#[serde_as]
#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(bound = "G: ark_serialize::CanonicalDeserialize + ark_serialize::CanonicalSerialize")]
pub struct ProverCommitments<G: AffineCurve> {
pub struct ProverCommitments<G: AffineRepr> {
/// The commitments to the witness (execution trace)
pub w_comm: [PolyComm<G>; COLUMNS],
/// The commitment to the permutation polynomial
@@ -2063,7 +2063,7 @@ pub struct ProverCommitments<G: AffineCurve> {
#[serde_as]
#[derive(Debug, Clone, Serialize, Deserialize)]
#[serde(bound = "G: ark_serialize::CanonicalDeserialize + ark_serialize::CanonicalSerialize")]
pub struct ProverProof<G: AffineCurve, OpeningProof> {
pub struct ProverProof<G: AffineRepr, OpeningProof> {
/// All the polynomial commitments required in the proof
pub commitments: ProverCommitments<G>,

@@ -2091,7 +2091,7 @@ pub struct ProverProof<G: AffineCurve, OpeningProof> {
#[serde(bound = "G: ark_serialize::CanonicalDeserialize + ark_serialize::CanonicalSerialize")]
pub struct RecursionChallenge<G>
where
G: AffineCurve,
G: AffineRepr,
{
/// Vector of scalar field elements
#[serde_as(as = "Vec<o1_utils::serialization::SerdeAs>")]
5 changes: 3 additions & 2 deletions curves/Cargo.toml
@@ -10,12 +10,13 @@ edition = "2021"
license = "Apache-2.0"

[dependencies]
ark-ec = { version = "0.4.2", features = ["parallel"] }
ark-ff = { version = "0.4.2", features = ["parallel", "asm"] }
ark-ec.workspace = true
ark-ff.workspace = true

[dev-dependencies]
rand = { version = "0.8.5", default-features = false }
ark-test-curves = "0.4.2"
ark-algebra-test-templates = "0.4.2"
ark-serialize = "0.4.2"
ark-std = "0.4.0"
num-bigint = "0.4.6"
10 changes: 5 additions & 5 deletions curves/src/pasta/curves/tests.rs
@@ -1,7 +1,7 @@
use std::str::FromStr;

use crate::pasta::{Fp, Pallas};
use ark_algebra_test_templates::{curves::*, groups::*};
use ark_algebra_test_templates::groups::*;
use ark_ec::AffineRepr;
use ark_std::test_rng;
use num_bigint::BigUint;
@@ -26,7 +26,7 @@ fn test_pallas_projective_group() {

#[test]
fn test_pallas_generator() {
let generator = pallas::Pallas::prime_subgroup_generator();
let generator = pallas::Pallas::generator();
assert!(generator.is_on_curve());
assert!(generator.is_in_correct_subgroup_assuming_on_curve());
}
@@ -42,7 +42,7 @@ fn test_regression_vesta_biguint_into_returns_canonical_representation() {
"12418654782883325593414442427049395787963493412651469444558597405572177144507",
)
.unwrap();
let p1 = Pallas::new(p_x, p_y, false);
let p1 = Pallas::new(p_x, p_y);
let p_x_biguint: BigUint = p1.x.into();
let p_y_biguint: BigUint = p1.y.into();

@@ -66,7 +66,7 @@ fn test_regression_vesta_addition_affine() {
"12418654782883325593414442427049395787963493412651469444558597405572177144507",
)
.unwrap();
let p1 = Pallas::new(p1_x, p1_y, false);
let p1 = Pallas::new(p1_x, p1_y);

let p2_x = Fp::from_str(
"20444556541222657078399132219657928148671392403212669005631716460534733845831",
@@ -76,7 +76,7 @@
"12418654782883325593414442427049395787963493412651469444558597405572177144507",
)
.unwrap();
let p2 = Pallas::new(p2_x, p2_y, false);
let p2 = Pallas::new(p2_x, p2_y);

// The type annotation ensures we have a point with affine coordinates,
// relying on implicit conversion if the addition outputs a point in a
40 changes: 20 additions & 20 deletions folding/src/checker.rs
@@ -6,7 +6,7 @@ use crate::{
instance_witness::Instance,
ExpExtension, FoldingConfig, Radix2EvaluationDomain, RelaxedInstance, RelaxedWitness,
};
use ark_ec::AffineCurve;
use ark_ec::AffineRepr;
use ark_ff::{Field, Zero};
use ark_poly::Evaluations;
use kimchi::circuits::{expr::Variable, gate::CurrOrNext};
@@ -85,33 +85,33 @@ pub trait Provide<C: FoldingConfig> {
fn resolve(
&self,
inner: FoldingCompatibleExprInner<C>,
domain: Radix2EvaluationDomain<<C::Curve as AffineCurve>::ScalarField>,
) -> Vec<<C::Curve as AffineCurve>::ScalarField>;
domain: Radix2EvaluationDomain<<C::Curve as AffineRepr>::ScalarField>,
) -> Vec<<C::Curve as AffineRepr>::ScalarField>;
}

impl<C: FoldingConfig> Provide<C> for Provider<C>
where
C::Witness: Index<
C::Column,
Output = Evaluations<
<C::Curve as AffineCurve>::ScalarField,
Radix2EvaluationDomain<<C::Curve as AffineCurve>::ScalarField>,
<C::Curve as AffineRepr>::ScalarField,
Radix2EvaluationDomain<<C::Curve as AffineRepr>::ScalarField>,
>,
>,
C::Witness: Index<
C::Selector,
Output = Evaluations<
<C::Curve as AffineCurve>::ScalarField,
Radix2EvaluationDomain<<C::Curve as AffineCurve>::ScalarField>,
<C::Curve as AffineRepr>::ScalarField,
Radix2EvaluationDomain<<C::Curve as AffineRepr>::ScalarField>,
>,
>,
C::Instance: Index<C::Challenge, Output = <C::Curve as AffineCurve>::ScalarField>,
C::Instance: Index<C::Challenge, Output = <C::Curve as AffineRepr>::ScalarField>,
{
fn resolve(
&self,
inner: FoldingCompatibleExprInner<C>,
domain: Radix2EvaluationDomain<<C::Curve as AffineCurve>::ScalarField>,
) -> Vec<<C::Curve as AffineCurve>::ScalarField> {
domain: Radix2EvaluationDomain<<C::Curve as AffineRepr>::ScalarField>,
) -> Vec<<C::Curve as AffineRepr>::ScalarField> {
let domain_size = domain.size as usize;
match inner {
FoldingCompatibleExprInner::Constant(c) => {
@@ -145,24 +145,24 @@ where
C::Witness: Index<
C::Column,
Output = Evaluations<
<C::Curve as AffineCurve>::ScalarField,
Radix2EvaluationDomain<<C::Curve as AffineCurve>::ScalarField>,
<C::Curve as AffineRepr>::ScalarField,
Radix2EvaluationDomain<<C::Curve as AffineRepr>::ScalarField>,
>,
>,
C::Witness: Index<
C::Selector,
Output = Evaluations<
<C::Curve as AffineCurve>::ScalarField,
Radix2EvaluationDomain<<C::Curve as AffineCurve>::ScalarField>,
<C::Curve as AffineRepr>::ScalarField,
Radix2EvaluationDomain<<C::Curve as AffineRepr>::ScalarField>,
>,
>,
C::Instance: Index<C::Challenge, Output = <C::Curve as AffineCurve>::ScalarField>,
C::Instance: Index<C::Challenge, Output = <C::Curve as AffineRepr>::ScalarField>,
{
fn resolve(
&self,
inner: FoldingCompatibleExprInner<C>,
domain: Radix2EvaluationDomain<<C::Curve as AffineCurve>::ScalarField>,
) -> Vec<<C::Curve as AffineCurve>::ScalarField> {
domain: Radix2EvaluationDomain<<C::Curve as AffineRepr>::ScalarField>,
) -> Vec<<C::Curve as AffineRepr>::ScalarField> {
match inner {
FoldingCompatibleExprInner::Extensions(ext) => match ext {
ExpExtension::U => {
@@ -204,8 +204,8 @@ pub trait Checker<C: FoldingConfig>: Provide<C> {
fn check_rec(
&self,
exp: FoldingCompatibleExpr<C>,
domain: Radix2EvaluationDomain<<C::Curve as AffineCurve>::ScalarField>,
) -> Vec<<C::Curve as AffineCurve>::ScalarField> {
domain: Radix2EvaluationDomain<<C::Curve as AffineRepr>::ScalarField>,
) -> Vec<<C::Curve as AffineRepr>::ScalarField> {
let e2 = exp.clone();
let res = match exp {
FoldingCompatibleExpr::Atom(inner) => self.resolve(inner, domain),
@@ -249,7 +249,7 @@ pub trait Checker<C: FoldingConfig>: Provide<C> {
fn check(
&self,
exp: &FoldingCompatibleExpr<C>,
domain: Radix2EvaluationDomain<<C::Curve as AffineCurve>::ScalarField>,
domain: Radix2EvaluationDomain<<C::Curve as AffineRepr>::ScalarField>,
) {
let res = self.check_rec(exp.clone(), domain);
for (i, row) in res.iter().enumerate() {
4 changes: 2 additions & 2 deletions folding/src/columns.rs
@@ -3,7 +3,7 @@
//! scheme as they describe the basic expressiveness of the system.

use crate::FoldingConfig;
use ark_ec::AffineCurve;
use ark_ec::AffineRepr;
use derivative::Derivative;
use kimchi::circuits::expr::Variable;

@@ -28,7 +28,7 @@ pub enum ExtendedFoldingColumn<C: FoldingConfig> {
/// The error term introduced in the "relaxed" instance.
Error,
/// A constant value in our expression
Constant(<C::Curve as AffineCurve>::ScalarField),
Constant(<C::Curve as AffineRepr>::ScalarField),
/// A challenge used by the PIOP or the folding scheme.
Challenge(C::Challenge),
/// A list of randomizer to combine expressions
18 changes: 9 additions & 9 deletions folding/src/expressions.rs
@@ -276,7 +276,7 @@ use crate::{
quadraticization::{quadraticize, ExtendedWitnessGenerator, Quadraticized},
FoldingConfig, ScalarField,
};
use ark_ec::AffineCurve;
use ark_ec::AffineRepr;
use ark_ff::One;
use derivative::Derivative;
use itertools::Itertools;
@@ -366,7 +366,7 @@ pub enum ExpExtension<C: FoldingConfig> {
Debug(bound = "C: FoldingConfig")
)]
pub enum FoldingCompatibleExprInner<C: FoldingConfig> {
Constant(<C::Curve as AffineCurve>::ScalarField),
Constant(<C::Curve as AffineRepr>::ScalarField),
Challenge(C::Challenge),
Cell(Variable<C::Column>),
/// extra nodes created by folding, should not be passed to folding
@@ -748,7 +748,7 @@ impl<C: FoldingConfig> FoldingExp<C> {
Mul(e1, e2)
}
// TODO: Replace with `Pow`
FoldingExp::Pow(_, 0) => Atom(Constant(<C::Curve as AffineCurve>::ScalarField::one())),
FoldingExp::Pow(_, 0) => Atom(Constant(<C::Curve as AffineRepr>::ScalarField::one())),
FoldingExp::Pow(e, 1) => e.into_compatible(),
FoldingExp::Pow(e, i) => {
let e = e.into_compatible();
@@ -930,7 +930,7 @@ pub fn extract_terms<C: FoldingConfig>(exp: FoldingExp<C>) -> Box<dyn Iterator<I
Pow(_, 0) => Box::new(
[Term {
exp: FoldingExp::Atom(ExtendedFoldingColumn::Constant(
<C::Curve as AffineCurve>::ScalarField::one(),
<C::Curve as AffineRepr>::ScalarField::one(),
)),
sign: Sign::Pos,
}]
@@ -1003,7 +1003,7 @@ pub fn folding_expression<C: FoldingConfig>(

impl<F, Config: FoldingConfig> From<ConstantExprInner<F>> for FoldingCompatibleExprInner<Config>
where
Config::Curve: AffineCurve<ScalarField = F>,
Config::Curve: AffineRepr<ScalarField = F>,
Config::Challenge: From<ChallengeTerm>,
{
fn from(expr: ConstantExprInner<F>) -> Self {
@@ -1024,7 +1024,7 @@ where
impl<F, Col, Config: FoldingConfig<Column = Col>> From<ExprInner<ConstantExprInner<F>, Col>>
for FoldingCompatibleExprInner<Config>
where
Config::Curve: AffineCurve<ScalarField = F>,
Config::Curve: AffineRepr<ScalarField = F>,
Config::Challenge: From<ChallengeTerm>,
{
// TODO: check if this needs some special treatment for Extensions
@@ -1045,7 +1045,7 @@ where
impl<F, Col, Config: FoldingConfig<Column = Col>>
From<Operations<ExprInner<ConstantExprInner<F>, Col>>> for FoldingCompatibleExpr<Config>
where
Config::Curve: AffineCurve<ScalarField = F>,
Config::Curve: AffineRepr<ScalarField = F>,
Config::Challenge: From<ChallengeTerm>,
{
fn from(expr: Operations<ExprInner<ConstantExprInner<F>, Col>>) -> Self {
@@ -1071,7 +1071,7 @@ where
impl<F, Col, Config: FoldingConfig<Column = Col>> From<Operations<ConstantExprInner<F>>>
for FoldingCompatibleExpr<Config>
where
Config::Curve: AffineCurve<ScalarField = F>,
Config::Curve: AffineRepr<ScalarField = F>,
Config::Challenge: From<ChallengeTerm>,
{
fn from(expr: Operations<ConstantExprInner<F>>) -> Self {
@@ -1097,7 +1097,7 @@ impl<F, Col, Config: FoldingConfig<Column = Col>>
From<Operations<ExprInner<Operations<ConstantExprInner<F>>, Col>>>
for FoldingCompatibleExpr<Config>
where
Config::Curve: AffineCurve<ScalarField = F>,
Config::Curve: AffineRepr<ScalarField = F>,
Config::Challenge: From<ChallengeTerm>,
{
fn from(expr: Operations<ExprInner<Operations<ConstantExprInner<F>>, Col>>) -> Self {
8 changes: 4 additions & 4 deletions folding/src/lib.rs
@@ -17,7 +17,7 @@
//! [expressions].
// TODO: the documentation above might need more descriptions.

use ark_ec::AffineCurve;
use ark_ec::AffineRepr;
use ark_ff::{Field, One, Zero};
use ark_poly::{EvaluationDomain, Evaluations, Radix2EvaluationDomain};
use error_term::{compute_error, ExtendedEnv};
@@ -59,8 +59,8 @@ pub mod checker;
// complexity for clippy.
// Should be moved into FoldingConfig, but associated type defaults are unstable
// at the moment.
type ScalarField<C> = <<C as FoldingConfig>::Curve as AffineCurve>::ScalarField;
type BaseField<C> = <<C as FoldingConfig>::Curve as AffineCurve>::BaseField;
type ScalarField<C> = <<C as FoldingConfig>::Curve as AffineRepr>::ScalarField;
type BaseField<C> = <<C as FoldingConfig>::Curve as AffineRepr>::BaseField;

// 'static seems to be used for expressions. Can we get rid of it?
pub trait FoldingConfig: Debug + 'static {
@@ -91,7 +91,7 @@ pub trait FoldingConfig: Debug + 'static {
type Structure: Clone;

type Env: FoldingEnv<
<Self::Curve as AffineCurve>::ScalarField,
<Self::Curve as AffineRepr>::ScalarField,
Self::Instance,
Self::Witness,
Self::Column,
