Preliminaries

This page describes the cryptographic components we use as building blocks in the design of the protocol.

Polynomial commitment:

A polynomial commitment is a useful cryptographic tool for committing to a polynomial. Intuitively, the commitment both binds and hides its content, namely the polynomial. Furthermore, the prover can reveal the evaluation of the polynomial at a point, together with a proof of its correctness, without leaking any other information about the polynomial.

Formally, a polynomial commitment consists of three algorithms:

  • Commit$(\phi(.))$: the algorithm outputs a commitment $R$ to the polynomial $\phi(.)$.

  • Prove$(\phi, \vec{x}, R)$: given an evaluation point $\vec{x}$, the algorithm outputs a tuple $\langle \vec{x}, \phi(\vec{x}), \pi_{\vec{x}} \rangle$, where $\pi_{\vec{x}}$ is the proof.

  • VerifyEval$(\vec{x}, \phi(\vec{x}), \pi_{\vec{x}}, R)$: given $(\vec{x}, \phi(\vec{x}), \pi_{\vec{x}}, R)$, the algorithm checks whether $\phi(\vec{x})$ is the correct evaluation, and outputs accept or reject. A minimal interface sketch of these three algorithms follows this list.
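The three algorithms together form an interface that a concrete scheme implements. Below is a minimal Python sketch of that interface only; the class and method names, and the representation of a polynomial as a coefficient vector, are illustrative assumptions, not a concrete or secure instantiation of any particular scheme.

```python
from dataclasses import dataclass
from typing import Protocol, Sequence, Tuple


@dataclass
class EvalProof:
    """Opaque proof pi_x that an evaluation phi(x) is correct (illustrative)."""
    data: bytes


class PolynomialCommitment(Protocol):
    """Sketch of the three algorithms of a polynomial commitment scheme."""

    def commit(self, phi: Sequence[int]) -> bytes:
        """Commit(phi(.)): output a binding and hiding commitment R to the
        polynomial phi, represented here by its coefficient vector."""
        ...

    def prove(
        self, phi: Sequence[int], x: Sequence[int], R: bytes
    ) -> Tuple[Sequence[int], int, EvalProof]:
        """Prove(phi, x, R): output the tuple <x, phi(x), pi_x>, where pi_x
        proves that phi(x) is the evaluation of the committed polynomial at x."""
        ...

    def verify_eval(self, x: Sequence[int], y: int, pi_x: EvalProof, R: bytes) -> bool:
        """VerifyEval(x, phi(x), pi_x, R): accept (True) iff y is the correct
        evaluation at x of the polynomial committed to by R."""
        ...
```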

Linear code:

A linear error-correcting code with message length $k$ and codeword length $n$ is a linear subspace $C \subseteq F^n$ such that there exists an injective mapping from messages to codewords $E_C : F^k \to C$, called the encoder of the code.

  • Any linear combination of codewords is also a codeword.

  • The rate of the code is defined as $\frac{k}{n}$.

  • The distance between two codewords $u, v$ is the Hamming distance, denoted $\Delta(u, v)$.

  • The minimum distance is $d = \min_{u \neq v} \Delta(u, v)$.

Such a code is denoted an $[n, k, d]$ linear code, and we also refer to $\frac{d}{n}$ as the relative distance of the code.
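As a concrete illustration of these definitions (an arbitrary example, not the code used by the protocol), the sketch below encodes messages with the classical $[7, 4, 3]$ Hamming code over $F_2$ via its generator matrix, and checks the rate and minimum distance by brute force.

```python
from itertools import product

# Generator matrix of the [7, 4, 3] Hamming code over F_2 (systematic form).
# Any other [n, k, d] linear code would work the same way.
G = [
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]
k, n = len(G), len(G[0])


def encode(message):
    """E_C : F_2^k -> C, the linear map m |-> m * G (mod 2)."""
    assert len(message) == k
    return [sum(m * g for m, g in zip(message, col)) % 2 for col in zip(*G)]


def hamming_distance(u, v):
    """Delta(u, v): number of positions in which u and v differ."""
    return sum(a != b for a, b in zip(u, v))


if __name__ == "__main__":
    codewords = [encode(list(m)) for m in product([0, 1], repeat=k)]
    # By linearity, min_{u != v} Delta(u, v) equals the minimum Hamming
    # weight of a nonzero codeword.
    d = min(sum(c) for c in codewords if any(c))
    print(f"[n, k, d] = [{n}, {k}, {d}], rate = {k}/{n}")  # [7, 4, 3], rate 4/7
```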

Tensor code:

Let $C$ be an $[n, k, d]$ linear code. The tensor code $C^{\otimes 2}$ of dimension 2 is the linear code in $F^{n^2}$ with message length $k^2$, codeword length $n^2$, and minimum distance $d^2$. We can view each codeword as an $n \times n$ matrix. We define the encoding function below:

  1. A message of length $k^2$ is parsed as a $k \times k$ matrix. Each row of the matrix is encoded using $E_C$, resulting in a $k \times n$ matrix $C_1$ whose rows are codewords of $C$.

  2. Each column of $C_1$ is then encoded again using $E_C$. The resulting $n \times n$ matrix $C_2$ is the codeword of the tensor code. A short code sketch of these two steps follows.
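The two encoding steps can be sketched directly. For concreteness, the snippet below uses the tiny $[3, 2, 2]$ single-parity-check code over $F_2$ as the base code $C$ (an assumption made only to keep the example small); the row-then-column structure is identical for any encoder $E_C$.

```python
# Base code C: the [3, 2, 2] single-parity-check code over F_2 (k = 2, n = 3).
def encode(row):
    m1, m2 = row
    return [m1, m2, (m1 + m2) % 2]


def tensor_encode(message, k=2):
    """Encode a length-k^2 message under C^{otimes 2}: rows first, then columns."""
    assert len(message) == k * k
    # Step 1: parse the message as a k x k matrix and encode every row,
    # giving a k x n matrix C_1 whose rows are codewords of C.
    C1 = [encode(message[i * k:(i + 1) * k]) for i in range(k)]
    # Step 2: encode every column of C_1, giving the n x n codeword C_2.
    C2_columns = [encode(list(col)) for col in zip(*C1)]
    return [list(row) for row in zip(*C2_columns)]  # transpose back to n x n


if __name__ == "__main__":
    # Message length k^2 = 4, codeword length n^2 = 9, minimum distance d^2 = 4.
    for row in tensor_encode([1, 0, 1, 1]):
        print(row)
```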
