perf/refac: Groth16 MPC setup improvements #1428

Open
5 tasks
ivokub opened this issue Feb 19, 2025 · 7 comments · May be fixed by #1475
Assignees: ivokub
Labels: consolidate, good first issue, perf
Milestone: v0.13.0

Comments

ivokub commented Feb 19, 2025

Collecting tasks from #1372:

  • currently, in Phase 2's Initialize method, we first obtain coeffTau1, coeffTau2, coeffAlphaTau1 and coeffBetaTau1, and only then compute the updated Phase2 key from the circuit description (the big loop which iterates over the constraints). This means we first have to allocate several possibly very big slices before performing any computation on their elements. It may be fine (needs to be benchmarked) to instead allocate each slice individually and iterate over the constraints once per slice. For large circuits this could save tens of GBs of memory we need to allocate.
  • for serialization, we could additionally implement the io.UnsafeReaderFrom and io.BinaryDumper interfaces. The idea is that when the sequencer stores the contributions, it doesn't have to redo expensive checks when restoring state from storage.
    • however, if we implement io.UnsafeReaderFrom, we need to keep in mind that the Verify methods of Phase1 and Phase2 do not explicitly perform subgroup checks and rely on those checks being done at deserialization.
  • allow providing the randomness source directly to the contribution. By default we currently use rand.Reader, which is secure, but in some cases we may want to use something else (for example for some ceremonial contributions).
  • in the srsCommons.update method we currently don't parallelize the computations. We could do it quite nicely: every loop depends directly on tau, so it should be sufficient to compute only the starting points tau, tau^k, tau^2k etc. and let each chunk proceed independently.
  • we should be able to provide the hash function as a parameter when computing the challenge. It would allow cool things like SNARK proofs of ceremony contribution correctness.
ivokub added the consolidate, good first issue and perf labels Feb 19, 2025
ivokub added this to the v0.13.0 milestone Feb 19, 2025
ivokub self-assigned this Feb 19, 2025
ivokub mentioned this issue Feb 19, 2025
Tabaie (Contributor) commented Feb 19, 2025

Not sure if this belongs here, but once we make sure it matches the performance of the regular setup, we can reduce the latter to a wrapper around a trivial MPC (with only one participant, no verification and no beacon contribution) and remove a lot of duplication.


ivokub commented Feb 19, 2025

> Not sure if this belongs here but once we make sure it matches the performance of the regular setup, we can reduce the latter to a wrapper for a trivial MPC (with only one participant, no verification and no beacon contribution) and remove a lot of duplication.

That would be very cool imo; I think it's definitely worth considering.

Manmeetkaur1525 commented

Hey, is the issue open? May I work on it?


ivokub commented Mar 20, 2025

> Hey, is the issue open? May I work on it?

Hi - currently no-one is directly working on it, but I'd recommend starting with some easier issues. Implementing these changes is somewhat difficult, as they would have to stay backwards compatible and follow the style we have in gnark. For example, I'd recommend #1175.

Manmeetkaur1525 commented

Sure, thanks for the suggestion! I'll start with issue #1175 and look into it.

txhsl commented Mar 24, 2025

I have a few ideas about the beacon contribution in MPC. It sounds like a public and trustable contribution from third parties, but when referring to the following code I have some questions.

func (p *Phase1) Seal(beaconChallenge []byte) SrsCommons {
	newContribs := mpcsetup.BeaconContributions(p.hash(), []byte("Groth16 MPC Setup - Phase 1"), beaconChallenge, 3)
	p.parameters.update(&newContribs[0], &newContribs[1], &newContribs[2])
	return p.parameters
}

func (p *Phase2) Seal(commons *SrsCommons, evals *Phase2Evaluations, beaconChallenge []byte) (groth16.ProvingKey, groth16.VerifyingKey) {
	// final contributions
	contributions := mpcsetup.BeaconContributions(p.hash(), []byte("Groth16 MPC Setup - Phase2"), beaconChallenge, 1+len(p.Sigmas))
	p.update(&contributions[0], contributions[1:])

Is it only a seed of randomness? Does it need to be publicly verifiable (as described in the doc)? E.g. can I give it a simple string, or something else that is publicly acknowledged by verifiers?


ivokub commented Mar 25, 2025

> I have a few ideas about the beacon contribution in MPC. It sounds like a public and trustable contribution from third parties, but when referring to the following code I have some questions.

gnark/backend/groth16/bn254/mpcsetup/phase1.go

Lines 158 to 162 in b51a3d4

func (p *Phase1) Seal(beaconChallenge []byte) SrsCommons {
	newContribs := mpcsetup.BeaconContributions(p.hash(), []byte("Groth16 MPC Setup - Phase 1"), beaconChallenge, 3)
	p.parameters.update(&newContribs[0], &newContribs[1], &newContribs[2])
	return p.parameters
}

gnark/backend/groth16/bn254/mpcsetup/setup.go

Lines 27 to 31 in b51a3d4

func (p *Phase2) Seal(commons *SrsCommons, evals *Phase2Evaluations, beaconChallenge []byte) (groth16.ProvingKey, groth16.VerifyingKey) {
	// final contributions
	contributions := mpcsetup.BeaconContributions(p.hash(), []byte("Groth16 MPC Setup - Phase2"), beaconChallenge, 1+len(p.Sigmas))
	p.update(&contributions[0], contributions[1:])
> Is it only a seed of randomness? Does it need to be publicly verifiable (as described in the doc)? E.g. can I give it a simple string, or something else that is publicly acknowledged by verifiers?

It should be fine if you use a seed agreed upon by contributors and proof verifiers. A publicly verifiable seed is one option, but this could also be handled on the social layer.

crStiv linked a pull request Apr 9, 2025 that will close this issue