Honey, I Shrunk the Proof: Enabling On-chain Verification for RISC Zero & Bonsai
Paul Gafni
Last week, we posted and verified our first proof on Ethereum's Sepolia Testnet! This is a huge step toward empowering builders to harness the power of ZKPs in their app development. We're excited to enable builders to make use of complex logic in on-chain applications, without ballooning gas costs. Being able to post and verify proofs on-chain makes it possible for Ethereum developers to build off-chain scaling solutions with Bonsai today.
If you're building on Ethereum, you should be thinking about using a zk coprocessor to reduce your gas costs. The core idea is that zkVMs create the opportunity to move the complex part of your application logic off-chain. This means your contracts can rely on complex application logic without paying gas to execute that logic on-chain.
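To make that concrete, here's a minimal sketch of what the off-chain half might look like as a RISC Zero guest program in Rust. The inputs and logic are made up, and the exact `risc0_zkvm` guest API may differ between versions; treat this as an illustration rather than a recipe.

```rust
// Hypothetical RISC Zero guest program: the expensive application logic runs
// off-chain inside the zkVM; only the committed journal is seen on-chain.
// Sketch only -- exact risc0_zkvm guest API names may vary by version.
#![no_main]
risc0_zkvm::guest::entry!(main);

use risc0_zkvm::guest::env;

fn main() {
    // Read private inputs supplied by the host (e.g., a large batch of orders).
    let orders: Vec<u64> = env::read();

    // Arbitrarily complex application logic -- here, just a stand-in aggregation.
    let total: u64 = orders.iter().copied().sum();

    // Commit only the small result to the journal; this is the public output
    // the on-chain verifier (and your contract) will see.
    env::commit(&total);
}
```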
Cost will vary a bit based on your application, but for this demo, posting & verifying a proof on-chain came to less than $15 USD at today's prices!
$15 may sound expensive for posting & verifying a single proof, but keep in mind that a single proof can represent an arbitrarily large computation or an arbitrarily large batch of computations.
Want to prove the validity of a digital signature? $15.
Want to prove the validity of 1 million digital signatures? Also $15.
When you use a zk coprocessor, you can increase the complexity of your app without impacting gas costs.
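As a sketch of what that signature example could look like inside the zkVM, the guest below verifies a whole batch of Ed25519 signatures and commits only the batch size to the journal. The crate choice (`ed25519-dalek`) and API details are assumptions for illustration.

```rust
// Hypothetical guest that verifies a whole batch of Ed25519 signatures and
// commits only the count -- nothing per-signature -- to the journal.
// Sketch only: crate and API names (ed25519-dalek 2.x, risc0_zkvm) are assumptions.
#![no_main]
risc0_zkvm::guest::entry!(main);

use ed25519_dalek::{Signature, Verifier, VerifyingKey};
use risc0_zkvm::guest::env;

fn main() {
    // Each entry: (public key bytes, message, signature bytes).
    let batch: Vec<([u8; 32], Vec<u8>, [u8; 64])> = env::read();

    for (pk_bytes, msg, sig_bytes) in &batch {
        let key = VerifyingKey::from_bytes(pk_bytes).expect("invalid public key");
        let sig = Signature::from_bytes(sig_bytes);
        key.verify(msg, &sig).expect("signature check failed");
    }

    // Whether the batch holds 1 signature or 1 million, the journal -- and the
    // gas to verify the proof -- stays the same size.
    env::commit(&(batch.len() as u64));
}
```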
Q: Wait, what's a zk coprocessor?
A zk coprocessor is a tool that uses ZKPs to offload computation from on-chain to off-chain. ZK coprocessors are the answer for enabling complex application logic in on-chain applications in a way that will actually scale.
Q: How well does this scale?
A: Gas costs for posting and verifying proofs on-chain *do not depend on the complexity of the computation*. Whether your computation takes 1 million steps or 4 billion steps -- whether it represents one transaction or 1 million transactions -- the gas cost for posting and verifying the proof will be the same. There are two ways that costs will vary with the complexity of the application:
The gas cost to post & verify the proof depends on the size of the journal (i.e., the public outputs that are being proven) -- see the sketch after this list for one way to keep the journal small.
The computational cost to generate a proof depends on the complexity of the computation.
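The first point is the one you can engineer around: if your computation produces a lot of output, you can commit a digest of that output rather than the data itself, keeping the journal (and the calldata you pay for) small. Here's a minimal sketch, assuming the `sha2` crate is usable inside the guest.

```rust
// Hypothetical guest that keeps the journal small: instead of committing a
// large result, commit only its SHA-256 digest. The full data can be made
// available off-chain and checked against the digest.
// Sketch only -- crate choices and the risc0_zkvm guest API are assumptions.
#![no_main]
risc0_zkvm::guest::entry!(main);

use risc0_zkvm::guest::env;
use sha2::{Digest, Sha256};

fn main() {
    let inputs: Vec<u8> = env::read();

    // Stand-in for a computation that produces a large output
    // (e.g., an updated state blob).
    let big_output: Vec<u8> = inputs.iter().rev().copied().collect();

    // Commit a 32-byte digest instead of the full output, so the journal --
    // and the calldata posted on-chain -- stays constant-sized.
    let digest: [u8; 32] = Sha256::digest(&big_output).into();
    env::commit(&digest);
}
```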
Q: I thought RISC Zero's proofs were too big for on-chain verification. How is this possible?
Enabling on-chain verification involved two major engineering milestones:
a STARK-to-SNARK wrapper
a Solidity verifier
Our STARK-to-SNARK wrapper reduces our proof size from hundreds of kilobytes to hundreds of bytes. The key idea here is to verify our STARK proof inside a Groth16 prover. Then, we can verify the (much smaller) Groth16 proof on-chain.
With this in place, posting proofs on-chain becomes feasible, as does on-chain verification. With our on-chain verifier contract up and running, using Bonsai as a zk coprocessor is now a real thing.
Q: If I'm building an app with Bonsai, is on-chain verification available to me?
As of this writing, Bonsai is set up to return small proofs that can be cheaply verified on-chain. If you're generating proofs using Bonsai, you can easily post & verify these proofs on Sepolia today. Request Bonsai access and check out the Bonsai Quick Start page to start building.
Q: Is this ready to use in production?
No. Although you can do experiments and proofs of concept with on-chain verification of RISC Zero proofs today, this is not ready for production. In particular, we have not yet performed a trusted setup ceremony for the Groth16 circuit in our STARK-to-SNARK wrapper, so our use of Groth16 should not yet be considered secure.
Q: What’s next?
We're continuing to optimize the performance of our prover -- check out our datasheet.
We're working with OP Labs to add zk to their stack.
We're working to stabilize our APIs for the zkVM and Bonsai.
For now, why not start building? You can request access to our remote proving service here, and you can start building using local proving options right now. As always, find us on Discord if you have questions or want to connect.
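If you'd like to see what local proving looks like end to end, here is a rough host-side sketch. `METHOD_ELF` and `METHOD_ID` stand in for the identifiers generated from your guest crate in the RISC Zero project templates, and the `risc0_zkvm` host API may differ between versions.

```rust
// Hypothetical host program using local proving: build the inputs, run the
// prover, then verify the receipt. METHOD_ELF / METHOD_ID are placeholders for
// the identifiers the RISC Zero project templates generate for your guest;
// names are assumptions and the host API may differ between versions.
use methods::{METHOD_ELF, METHOD_ID};
use risc0_zkvm::{default_prover, ExecutorEnv};

fn main() {
    // Serialize the guest's input (here, the batch from the earlier sketches).
    let orders: Vec<u64> = vec![3, 7, 11];
    let env = ExecutorEnv::builder()
        .write(&orders)
        .unwrap()
        .build()
        .unwrap();

    // Run the guest inside the zkVM and produce a receipt (the proof plus journal).
    let receipt = default_prover().prove_elf(env, METHOD_ELF).unwrap();

    // Check the proof locally; on-chain, the Solidity verifier plays this role
    // for the SNARK-wrapped receipt.
    receipt.verify(METHOD_ID).unwrap();

    // The journal holds the public outputs committed by the guest.
    // (How the journal is exposed varies by risc0_zkvm version.)
    let total: u64 = receipt.journal.decode().unwrap();
    println!("proved total = {total}");
}
```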