18 changes: 18 additions & 0 deletions .gitignore
@@ -0,0 +1,18 @@
translations/es/content/addresses.aux
translations/es/content/advancedSchnorr.aux
translations/es/content/appendix.aux
translations/es/content/basicConcepts.aux
translations/es/content/blockchain.aux
translations/es/content/commitments.aux
translations/es/content/escrowedMarket.aux
translations/es/content/introduction.aux
translations/es/content/multisig.aux
translations/es/content/transactions.aux
translations/es/content/txKnowledgeProofs.aux
translations/es/content/txTangle.aux
translations/es/front/abstract.aux
translations/es/main.aux
translations/es/main.log
translations/es/main.out
translations/es/main.pdf
translations/es/main.toc
1,393 changes: 1,393 additions & 0 deletions translations/es/back/references.bib

Large diffs are not rendered by default.

40 changes: 40 additions & 0 deletions translations/es/content/Monero.tex
@@ -0,0 +1,40 @@
\chapter{Monero the Cryptocurrency}
\label{chapter:Monero-cryptocurrency}

final chapter ties it all together (question: is it technical enough that readers could skip the rest of the report? that might make the chapter too long - aim to redirect readers to other parts of the report)


-curve ed25519
-address creation
-basic
-subaddresses
-multisig
-payment ID (integrated addresses)
-receipt of old transactions
-scan blockchain, look at output addresses, use view key and transaction public key to see if any spend keys match
-organize results
-building transaction
-structure of a transaction (how the data is serialized/organized and transmitted)
-obtain addresses (or subaddresses) of intended recipients, specify amounts intended for each
-from sending amounts + fee, select a set of owned outputs with amounts sum(a)>=(sum(b)+fee), and set change = sum(a) - (sum(b)+fee)
-construct outputs:
-create transaction public key (or keys if at least one subaddress and >1 output)
-create one-time output address for each output
-commit to each output amount C(b)
-create D-H 'amount' and 'mask' terms for each output commitment
-range proof: borromean ring signature on each output amount
-if payment ids, encrypt them
-construct inputs:
-create pseudo output commitments if transaction type RCTTypeSimple
-select ring members for each input MLSAG
-calculate key images for each input
-build MLSAG signatures on each input, signing a message that contains all other transaction info
-fill out transaction data structure appropriately (continuous throughout transaction procedure)
-submission of transaction
-verified, placed in mempool
-mining into blockchain
-transactions organized into Merkle tree, hashed
-nonce searched until difficulty reached
-block published to network
-block accepted or rejected (orphaned); consensus mechanism
-[potential] signature data pruned
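The output-selection arithmetic in the outline above (select owned outputs with sum(a) >= sum(b) + fee, and set change = sum(a) - (sum(b) + fee)) can be sketched as follows. This is a minimal greedy illustration; the strategy and all names are hypothetical, not actual Monero wallet behavior.

```python
# Sketch of input selection: cover the send amounts plus fee with owned
# outputs, and compute the change. Greedy largest-first is an assumption
# made for illustration only.

def select_inputs(owned_amounts, send_amounts, fee):
    """Return (selected, change) with sum(selected) >= sum(send_amounts) + fee."""
    target = sum(send_amounts) + fee
    selected, total = [], 0
    for amount in sorted(owned_amounts, reverse=True):  # largest first
        if total >= target:
            break
        selected.append(amount)
        total += amount
    if total < target:
        raise ValueError("insufficient funds")
    return selected, total - target

# select_inputs([5, 30, 10], [25], 2) -> ([30], 3)
```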
165 changes: 165 additions & 0 deletions translations/es/content/Transactions.tex


165 changes: 165 additions & 0 deletions translations/es/content/addresses.tex


400 changes: 400 additions & 0 deletions translations/es/content/advancedSchnorr.tex


360 changes: 360 additions & 0 deletions translations/es/content/appendix.tex


519 changes: 519 additions & 0 deletions translations/es/content/basicConcepts.tex


491 changes: 491 additions & 0 deletions translations/es/content/blockchain.tex


136 changes: 136 additions & 0 deletions translations/es/content/bulletproofs.tex
@@ -0,0 +1,136 @@
\chapter{Bulletproofs (WIP)}
\label{chapter:bulletproofs}
%this chapter was not finished since I decided not to include a bulletproofs explanation in ztm2


%----------
\section{Vector Knowledge Proof}
\label{sec:vectorzkproof}

Prove knowledge of all elements in vector set $\boldsymbol{V}$ containing $m$ vectors of size $N$ (with $m \neq N$ in general), committed to in $(C_1, C_2,..., C_m)$ where $C_i = v_{i,1}*H_1 + v_{i,2}*H_2 + ... + v_{i,N}*H_N$. The discrete logarithms of the generators $\boldsymbol{H} = \langle H_1,..., H_N \rangle$ with respect to $G$, namely $\boldsymbol{\lambda} = \langle \lambda_1, ..., \lambda_N \rangle$, are unknown (as is, e.g., the discrete log of $H_2$ with respect to $H_4$). Verifiers know there are $m*N$ elements, but gain no information about them.

\subsubsection*{Non-interactive proof}

\begin{enumerate}
\item Generate a vector (size $N$) of random integers \(\boldsymbol{\alpha} \in_R \mathbb{Z}_l\), and compute the inner product $C_{\alpha} = \boldsymbol{\alpha} \bullet \textbf{H}$.\footnote{The inner product (a.k.a. dot product) between vectors \(\boldsymbol{v} = \langle 1, 2, 3 \rangle\) and \(\boldsymbol{z} = \langle 4, 5, 6 \rangle\) is \(\boldsymbol{v} \bullet \boldsymbol{z} = 1*4 + 2*5 + 3*6 = 32\).}
\item Calculate the {\em challenge} $c = \mathcal{H}(T_{vec},\boldsymbol{V},C_{\alpha})$.\footnote{We explicitly do domain separation here with $T_{vec}$, and key prefixing with $\boldsymbol{V}$. For the rest of the chapter it is implied with ellipses (...). Those ellipses could also include a message $\mathfrak{m}$ to make the proof a signature.}
\item Define the {\em response vector} (size $N$) $\textbf{r}$, with an element $r_j$ for each generator $H_j$. (Imagine the vectors listed out in rows; here we take a column sum with one $v_{i,j}$ from each vector $i$, where each term is multiplied by $c$ raised to the power $i$.)
\[r_j = \alpha_j - \sum^{m}_{i=1} c^i*v_{i,j}\]
\item Publish the proof pair $(c, \boldsymbol{r})$.
\end{enumerate}


\subsubsection*{Verification}

\begin{enumerate}
\item Calculate the challenge: \(c' = \mathcal{H}(...,\textbf{r} \bullet \textbf{H} + \sum^{m}_{i=1} (c')^i*C_i)\).
\item If $c = c'$ then the prover must know $\boldsymbol{V}$ (except with negligible probability).
\end{enumerate}
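The proof and verification steps above can be sketched in Python over a toy multiplicative group standing in for the ed25519 curve. All parameters and helper names here are hypothetical, and the group is far too small to be secure; this is an illustration of the structure only.

```python
# Toy vector knowledge proof: commitments C_i = prod_j H_j^{v_ij} in a
# small prime-order subgroup, Fiat-Shamir challenge, responses
# r_j = alpha_j - sum_i c^i * v_ij (mod l). NOT secure -- tiny parameters.
import hashlib
import random

P = 2039   # safe prime: P = 2*l + 1
l = 1019   # prime order of the subgroup of squares mod P
g = 4      # generator of the order-l subgroup

def hash_to_scalar(*items):
    # Fiat-Shamir challenge with "T_vec" domain separation, as in step 2.
    data = b"T_vec" + b"".join(str(x).encode() for x in items)
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % l

def commit(vec):
    # Multiplicative analogue of the inner product vec . H
    out = 1
    for Hj, vj in zip(H, vec):
        out = out * pow(Hj, vj % l, P) % P
    return out

random.seed(1)
N, m = 4, 3
# Generators H_j = g^{lambda_j}; the lambdas are discarded ("unknown").
H = [pow(g, random.randrange(1, l), P) for _ in range(N)]
# Secret vector set V (m vectors of size N) and its commitments C_i.
V = [[random.randrange(l) for _ in range(N)] for _ in range(m)]
C = [commit(v) for v in V]

def prove():
    alpha = [random.randrange(l) for _ in range(N)]        # step 1
    c = hash_to_scalar(C, commit(alpha))                   # step 2
    r = [(alpha[j] - sum(pow(c, i + 1, l) * V[i][j] for i in range(m))) % l
         for j in range(N)]                                # step 3
    return c, r                                            # step 4

def verify(c, r):
    # Reconstruct C_alpha from (r . H) and sum_i c^i * C_i, then re-hash.
    R = commit(r)
    for i in range(m):
        R = R * pow(C[i], pow(c, i + 1, l), P) % P
    return hash_to_scalar(C, R) == c
```

Exponent arithmetic is reduced mod $l$ because every generator has order $l$; `verify` recovers exactly $C_{\alpha}$ when the responses were built honestly, so the recomputed challenge matches.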

\subsubsection*{Why it works}

\begin{align*}
\textbf{r} \bullet \textbf{H} &= C_{\alpha} - \sum^{m}_{i=1} (c')^i*C_i \\
\sum^{N}_{j=1} [r_j*H_j] &= C_{\alpha} - \sum^{m}_{i=1} (c')^i*C_i \\
\sum^{N}_{j=1} [(\alpha_j - \sum^{m}_{i=1} c^i*v_{i,j})*H_j] &= C_{\alpha} - \sum^{m}_{i=1} (c')^i*C_i \\
(\sum^{N}_{j=1} \alpha_j*H_j) - \sum^{N}_{j=1} [(\sum^{m}_{i=1} c^i*v_{i,j})*H_j] &= C_{\alpha} - \sum^{m}_{i=1} (c')^i*C_i \\
C_{\alpha} - \sum^{m}_{i=1} c^i*[\sum^{N}_{j=1} v_{i,j}*H_j] &= C_{\alpha} - \sum^{m}_{i=1} (c')^i*C_i \\
C_{\alpha} - \sum^{m}_{i=1} c^i*C_i &= C_{\alpha} - \sum^{m}_{i=1} (c')^i*C_i \\
\end{align*}

Verifiers can be confident that provers know all elements of $\boldsymbol{V}$ (except with negligible probability) thanks to a confluence of features in this proof (combined with the logic from basic Schnorr Signatures, explained in Section \ref{sec:schnorr-fiat-shamir}).%Verifiers can compute $R' = C_{\alpha} + \sum^{m}_{i=1} c^i*C_i$ before the prover, so providing $c$ is like saying ``I challenge you to respond with the discrete logarithm vector $\boldsymbol{r}$ of $R'$."

\subsubsection*{Justification of Features}

\begin{enumerate}
\item If we did not use $\boldsymbol{H}$ or powers of $c$, and the response were instead $r = \sum \boldsymbol{\alpha} - c* \sum_{i=1}^{m} (\sum_{j=1}^{N} v_{i,j})$, then provers would only have demonstrated that they know the sum of all vector elements (at most). In other words, they would just prove knowledge of the discrete logarithm of $K = \sum_{i=1}^{m} C_i$, as in a normal Schnorr signature.

\item Adding in $\boldsymbol{H}$ makes $r_j = \alpha_j - c* \sum_{i=1}^{m} v_{i,j}$, which ensures provers must at least know the sum at each vector index $j$. While there may exist many other $\boldsymbol{r'}$ vectors that produce $R = \boldsymbol{r'} \bullet \boldsymbol{H} = r'_1*\lambda_1*G + r'_2*\lambda_2*G + ...$, it is very difficult to find any of them using just the sum of all elements (step 1) without knowing $\boldsymbol{\lambda}$ (which we assume to be unknown).\footnote{Even if two or more vector indices have the same sum ($\sum_{i=1}^{m} v_{i,a} = \sum_{i=1}^{m} v_{i,b} = k$), $k$ will not be revealed since in the equation pair $r_a = \alpha_a - c*k$ and $r_b = \alpha_b - c*k$ there are three unknowns ($\alpha_a$, $\alpha_b$, and $k$), and each additional equation $r_q = \alpha_q - c*k$ adds another unknown ($\alpha_q$).}

\item Including powers of $c$ turns each response element $r_j$ into a polynomial function of $c$, $r_j(c) = \alpha_j - v_{1,j}*c - v_{2,j}*c^2 - ... - v_{m,j}*c^m$. Now, given $c$, there are many combinations of ($\alpha'_j, v'_{1,j}, ..., v'_{m,j}$) that will produce $r_j$, but the probability of guessing one is negligible since $r_j$ is itself unknown (and can be anything in $\mathbb{Z}_l$, where $l$ is the order of the elliptic curve subgroup). Even if the sum at each vector index is known (step 2), guessing a useful factorization becomes increasingly difficult as the size of $\boldsymbol{V}$ rises (usually the only combination that works will be the actual elements of $\boldsymbol{V}$). Moreover, $c$ is unknown in advance, so provers can't report a random $r_j$ (recall Section \ref{sec:schnorr-fiat-shamir}). This means provers must know all elements of $\boldsymbol{V}$.\footnote{A person with partial knowledge of $\boldsymbol{V}$ can increase his chances of faking the proof (by only a negligible amount in most cases). For an extreme example, if he knows all elements of $\boldsymbol{V}$ except one, he knows all commitments $C_j$ and their blinding factors, and he knows the missing element is in the range ($q$ to $p$) where ($p - q < l$), then his chances of guessing that element (and checking by computing $C'_j$) are higher than solving the discrete logarithm problem via guess and check. Though in this case he ultimately knows all elements of $\boldsymbol{V}$ anyway. This logic extends to other partial-knowledge scenarios.}
\end{enumerate}


%----------
\section{Inner Product Proof}

Suppose we have two vectors $\boldsymbol{v}$ and $\boldsymbol{z}$, each with $N$ elements. Their inner product is $s = \boldsymbol{v} \bullet \boldsymbol{z}$, and we have their commitments $[C_s = x_s G + s H_1, C_v = (x_v, \boldsymbol{v}) \bullet (G,\boldsymbol{H}), C_z = (x_z, \boldsymbol{z}) \bullet (G,\boldsymbol{H})]$\footnote{Our notation $(x_v, \boldsymbol{v}) \bullet (G,\boldsymbol{H})$ here means blinding factor $x_v$ is appended to the front of $\boldsymbol{v}$ for the inner product, and likewise with generator $G$.}. We want to prove the inner product equation holds (and that we know all the elements) without revealing any information about $(s, \boldsymbol{v}, \boldsymbol{z})$ aside from $N$. In other words, prove that the value in $C_s$ is the inner product of the two size-$N$ vectors in $C_v$ and $C_z$. Note that we now include blinding factors $x_s$, $x_v$, and $x_z$, which are not strictly part of the vectors under consideration. This concept will be useful in later sections, where it is important to hide the original values with a random mask.

In essence, we perform vector knowledge proofs (Section \ref{sec:vectorzkproof}) for $\boldsymbol{v}$ and $\boldsymbol{z}$, and add an extra component for the inner product.


\subsubsection*{Non-interactive proof}

\begin{enumerate}
\item Generate two vectors (size N) and four integers \((\alpha_{v,0}, \boldsymbol{\alpha_v}, \alpha_{z,0}, \boldsymbol{\alpha_z}, \alpha_{s,0}, \alpha_{s,1}) \in_R \mathbb{Z}_l\), and compute
\begin{align*}
C_{\alpha}^{v} &= (\alpha_{v,0},\boldsymbol{\alpha_v}) \bullet (G,\textbf{H}) \\
C_{\alpha}^{z} &= (\alpha_{z,0}, \boldsymbol{\alpha_z}) \bullet (G,\textbf{H}) \\
C_{\alpha}^{s,0} &= \alpha_{s,0}*G + (\boldsymbol{\alpha_v} \bullet \boldsymbol{\alpha_z})*H_1 \\
C_{\alpha}^{s,1} &= \alpha_{s,1}*G - (\boldsymbol{v} \bullet \boldsymbol{\alpha_z} + \boldsymbol{z} \bullet \boldsymbol{\alpha_v})*H_1
\end{align*}
\item Calculate the {\em challenge} $c = \mathcal{H}(...,C_{\alpha}^{v},C_{\alpha}^{z},C_{\alpha}^{s,0},C_{\alpha}^{s,1})$.
\item Define the {\em response}, $\boldsymbol{r}$, containing vectors (size $N$) $\boldsymbol{r_v}, \boldsymbol{r_z}$ (with elements indexed $j = 1,...,N$), scalars $r_{v,0}, r_{z,0}$ for the vectors' blinding factors, and $r_s$ for the inner product blinding factor
\begin{align*}
r_{v,0} &= \alpha_{v,0} - c*x_v \\
r_{z,0} &= \alpha_{z,0} - c*x_z \\
r_{v,j} &= \alpha_{v,j} - c*v_j \\
r_{z,j} &= \alpha_{z,j} - c*z_j \\
r_s &= \alpha_{s,0} + c*\alpha_{s,1} + c^2*x_s
\end{align*}
\item Publish the proof $(c, \boldsymbol{r}, C^{s,0}_{\alpha}, C^{s,1}_{\alpha})$.
\end{enumerate}

\subsubsection*{Verification}

\begin{enumerate}
\item Calculate the challenge:
\[c' = \mathcal{H}(...,[(r_{v,0},\boldsymbol{r_v}) \bullet (G,\textbf{H}) + c*C_v],[(r_{z,0},\boldsymbol{r_z}) \bullet (G,\textbf{H}) + c*C_z],[C^{s,0}_{\alpha}],[C^{s,1}_{\alpha}])\]
\item Compute\footnote{Since $C^{s,0}_{\alpha}$ and $C^{s,1}_{\alpha}$ are tied together, it isn't possible (within our knowledge) to move $R_s$ into the challenge computation.}
\begin{align*}
%R_v &= (r_{v,0},\boldsymbol{r_v}) \bullet \textbf{H} \\
%R'_v &= C_{\alpha}^{v} + c'*C_v \\
%R_z &= (r_{z,0},\boldsymbol{r_z}) \bullet \textbf{H} \\
%R'_z &= C_{\alpha}^{z} + c'*C_z \\
R_s &= r_s*G + (\boldsymbol{r_v} \bullet \boldsymbol{r_z})*H_1 \\
R'_s &= C_{\alpha}^{s,0} + c'*C_{\alpha}^{s,1} + (c')^2*C_s
\end{align*}
\item If $c = c'$, and $R_s = R'_s$, then the prover must know $\boldsymbol{v}, \boldsymbol{z}$, and $s$, and the inner product $s = \boldsymbol{v} \bullet \boldsymbol{z}$ must hold (except with negligible probability).
\end{enumerate}

\subsubsection*{Why it works (inner product component)}

\begin{align*}
r_s*G + (\boldsymbol{r_v} \bullet \boldsymbol{r_z})*H_1 &= C_{\alpha}^{s,0} + c*C_{\alpha}^{s,1} + c^2*C_s \\
(\alpha_{s,0} + c \alpha_{s,1} + c^2 x_s)*G + (\boldsymbol{\alpha_v} \bullet \boldsymbol{\alpha_z} - c[\boldsymbol{v} \bullet \boldsymbol{\alpha_z} + \boldsymbol{z} \bullet \boldsymbol{\alpha_v}] + c^2 [\boldsymbol{v} \bullet \boldsymbol{z}])*H_1 &= C_{\alpha}^{s,0} + c*C_{\alpha}^{s,1} + c^2*C_s \\
C_{\alpha}^{s,0} + c*C_{\alpha}^{s,1} + c^2 (x_s*G + [\boldsymbol{v} \bullet \boldsymbol{z}]*H_1) &= C_{\alpha}^{s,0} + c*C_{\alpha}^{s,1} + c^2*C_s \\
c^2 (x_s*G + [\boldsymbol{v} \bullet \boldsymbol{z}]*H_1) &= c^2(x_s*G + s*H_1) \\
\boldsymbol{v} \bullet \boldsymbol{z} &= s
\end{align*}

Responses $\boldsymbol{r_v}$ and $\boldsymbol{r_z}$ prove knowledge of vectors $\boldsymbol{v}$ and $\boldsymbol{z}$, so using them in $R_s \stackrel{?}{=} R'_s$ proves the inner product holds. Readers can explore the algebraic logic to confirm this for themselves.
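As a quick sanity check of the $H_1$-exponent algebra, a sketch over plain scalars (with a stand-in Mersenne prime replacing the subgroup order $l$, and using the response sign convention from step 3, $r_{v,j} = \alpha_{v,j} - c*v_j$) confirms the expansion of $\boldsymbol{r_v} \bullet \boldsymbol{r_z}$:

```python
# Check over scalars mod l that (alpha_v - c*v).(alpha_z - c*z) equals
# alpha_v.alpha_z - c*(v.alpha_z + z.alpha_v) + c^2*(v.z).
# l here is a stand-in Mersenne prime, not the ed25519 group order.
import random

l = 2**61 - 1
random.seed(7)
N = 5

def inner(a, b):
    # inner product of two equal-length scalar vectors, mod l
    return sum(x * y for x, y in zip(a, b)) % l

v = [random.randrange(l) for _ in range(N)]
z = [random.randrange(l) for _ in range(N)]
a_v = [random.randrange(l) for _ in range(N)]
a_z = [random.randrange(l) for _ in range(N)]
c = random.randrange(l)

r_v = [(a - c * x) % l for a, x in zip(a_v, v)]
r_z = [(a - c * x) % l for a, x in zip(a_z, z)]

lhs = inner(r_v, r_z)
rhs = (inner(a_v, a_z) - c * (inner(v, a_z) + inner(z, a_v))
       + c * c * inner(v, z)) % l
assert lhs == rhs
```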


\section{Condensed Vector Knowledge Proof}
\label{sec:condensedvectorproof}

Our original vector knowledge proof was $(c, \boldsymbol{r})$, with proof size linear in the vector dimension. Given $N = 500$ (and $m = 1$), the proof takes up $(1 + 500)*32 = 16032$ bytes. We can condense the proof with an approach presented by Bootle et al. \cite{bootle-efficient-zkcircuits}, making its size logarithmic in the vector dimension (roughly $6*\log_2(N)$ elements, so for $N = 500$ the proof size is around $54*32 = 1728$ bytes).
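The size arithmetic above can be reproduced with a couple of hypothetical helper functions (32-byte scalars and group elements assumed, and the condensed count taken as the rough $6*\log_2(N)$ estimate from the text):

```python
# Proof-size arithmetic: linear (one challenge plus N responses) vs.
# condensed (~6*log2(N) elements). Helper names are illustrative only.
import math

SCALAR_BYTES = 32

def linear_proof_size(N):
    # (c, r): one challenge scalar plus N response scalars
    return (1 + N) * SCALAR_BYTES

def condensed_proof_size(N):
    # roughly 6*log2(N) elements, per the estimate in the text
    return round(6 * math.log2(N)) * SCALAR_BYTES
```

For $N = 500$ these give $16032$ and $1728$ bytes respectively.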


\subsection*{Non-interactive proof}

\begin{enumerate}
\item With the intent of minimizing proof size (optimizing verification time is more complex), factor $N$ into its $q$ prime factors, ordered largest to smallest. If $N = 500$ then $\boldsymbol{f} = \langle 500, 5, 5, 5, 2, 2 \rangle$, indexed $0$ to $q = 5$ (the 0\nth term is the original $N$).
\item For $i = 1$ to $q$ (index 0 corresponds to original vectors),
\begin{enumerate}
\item Chunk $\boldsymbol{v}^{i-1}$ and $\boldsymbol{G}^{i-1}$ into smaller vectors size $f[i]$
\[ \langle v^{i-1}[1], ..., v^{i-1}[f[i-1]] \rangle \xrightarrow{} \langle \langle v^{i-1}[1],...,v^{i-1}[f[i]] \rangle, \langle v^{i-1}[f[i]+1],...,v^{i-1}[2 f[i]] \rangle, ..., \langle v^{i-1}[f[i-1] - f[i] + 1],..., v^{i-1}[f[i-1]] \rangle \rangle \]
\end{enumerate}
For example, chunking a size-6 vector into sub-vectors of size 3 can be pictured as the rows of a matrix:
\[\begin{pmatrix}
1 & 2 & 3 \\
4 & 5 & 6
\end{pmatrix}\]
\end{enumerate}
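Steps 1 and 2(a) above can be sketched as follows; the helper names are hypothetical and shown only to make the factoring and chunking concrete.

```python
# Factor N into primes (largest to smallest) and chunk a vector into
# sub-vectors of a given size, as in the condensed-proof setup.

def prime_factors_desc(n):
    """Prime factors of n, ordered largest to smallest."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return sorted(factors, reverse=True)

def chunk(vec, size):
    """Split vec into len(vec)//size consecutive sub-vectors of the given size."""
    return [vec[k:k + size] for k in range(0, len(vec), size)]

f = [500] + prime_factors_desc(500)   # -> [500, 5, 5, 5, 2, 2]
rows = chunk([1, 2, 3, 4, 5, 6], 3)   # -> [[1, 2, 3], [4, 5, 6]]
```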


\subsection*{Verification}