Exploring the potential of TEEs
ICYMI, TEEs are the latest crypto craze, following the dunking on $100 million raises and the complicated social relationships within the ETH clique.
After explaining TEEs more broadly in a recent blog, we decided to dive deeper and invite speakers who are already using TEEs to share their experiences.
You can listen to the recording of the space here, or read on for all the insights.
On TEEs in general
Our speakers explained that TEEs, short for Trusted Execution Environments, are a type of hardware-based security. In their most common form, a chip contains a secure compartment, and all data and computation inside that compartment are inaccessible to anything outside it, even the host. This isolation of data and compute from the rest of the chip is a key characteristic of TEEs.
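To make that isolation concrete, here is a minimal, hypothetical sketch of remote attestation, the mechanism that lets an outside party check what code a TEE is actually running. The report fields and the `vendor_verify` callback are simplified stand-ins; real TEEs (Intel SGX/TDX, AMD SEV, and others) use vendor-specific report formats and certificate chains.

```python
import hashlib
from dataclasses import dataclass

# Hypothetical, simplified attestation report. Real TEEs use
# vendor-specific formats and certificate chains.
@dataclass
class AttestationReport:
    measurement: bytes       # hash of the code loaded into the secure compartment
    report_data: bytes       # arbitrary data the enclave binds to the report
    vendor_signature: bytes  # signature rooted in the hardware vendor's key

def verify_report(report: AttestationReport,
                  expected_code: bytes,
                  vendor_verify) -> bool:
    """Accept the report only if the enclave runs the code we expect
    and the report is genuinely signed by the hardware."""
    expected_measurement = hashlib.sha256(expected_code).digest()
    if report.measurement != expected_measurement:
        return False  # the enclave is running different code than expected
    # vendor_verify stands in for validating vendor_signature against the
    # hardware vendor's certificate chain.
    return vendor_verify(report)
```

The important point is that the check bottoms out in the hardware vendor's signature over the code measurement, not in trust in whoever operates the machine.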
Interestingly, TEEs are not a new technology; they have been around for decades. As Dan mentioned, Apple has used the term “Secure Enclave” to talk about TEEs in many of its keynotes for years to assure users of their privacy.
Why are TEEs becoming popular in crypto?
On that topic, Esli explains that as the AI and crypto hype continued, we quickly realized that ZK technology was too expensive and wouldn’t scale for verifying compute. Fortunately, TEEs have become widely available and can easily be integrated with other cryptographic technologies like MPC (multi-party computation) or ZK proofs.
“It’s not new tech. We’re already using it whenever you open an app on your ledger or unlock your phone using your fingerprint.” - Nukri
What’s more, TEEs aren’t just widely available; they are also much easier for developers to implement, while allowing more complex computations to run inside them. For example, existing open-source code can generally run inside a TEE without modification.
So, in short, TEEs are popular in crypto now because:
- Widely available
- Affordable, especially compared to ZKPs
- High performance
- Verifiability for general compute
- Fast learning curve
- Ability to integrate with other cryptographic tech
- Necessary to make AI x Crypto work
David added to the AI topic that there was a lot of excitement around using autonomous agents in Web3, and rightfully so. However, once you entrust an AI agent with your data, you’ll want reassurance that the models are running as they are supposed to. TEEs then become a quick and cheap way to offer these guarantees without having to trust a centralized company. Beyond AI agents, any RAG (retrieval-augmented generation) setup will also benefit from the verification TEEs provide. You can read more about RAG in the context of SQD here.
On using TEEs in Products
Currently, SQD has not yet integrated TEEs. But it’s on our roadmap, and there are three areas where we’ll likely rely on them:
- Trustless ingestion: allowing other participants to add onchain data to the SQD data lake
- Verification of query results: by running queries inside of TEEs
- Trustless indexing: covering the scenario where one user wants to rely on someone else’s indexer without needing to trust it.
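As a rough illustration of the query-verification idea (a sketch, not SQD’s actual design): a worker runs the query inside an enclave and signs the result with a key that never leaves the TEE, so a client only needs the enclave’s attested public key rather than trust in the worker. The function names below are hypothetical, and the example uses the `cryptography` package for Ed25519 signatures.

```python
import json
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# --- inside the enclave: the signing key is generated and kept in the TEE,
# and its public half is published via the attestation report ---
enclave_key = Ed25519PrivateKey.generate()
attested_pubkey = enclave_key.public_key()

def run_query_in_enclave(query: str) -> tuple[bytes, bytes]:
    # Placeholder for actually executing the query against the data lake.
    result = json.dumps({"query": query, "rows": []}).encode()
    return result, enclave_key.sign(result)

# --- on the client: check the result against the attested public key ---
def verify_result(result: bytes, signature: bytes) -> bool:
    try:
        attested_pubkey.verify(signature, result)
        return True
    except InvalidSignature:
        return False

result, sig = run_query_in_enclave("SELECT * FROM transfers LIMIT 10")
assert verify_result(result, sig)
```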
For investors, as David explains, confidentiality appears to play a big role, but what’s often lost is the verification aspect. With ZK, we could prove that computation was done correctly, but it was expensive and slow. TEEs offer a faster, practical way to verify that data hasn’t been tampered with. This opens up two categories:
- Co-processing: where smart contracts are hooked up with off-chain sources, expanding their capabilities, basically making them smarter
- Trustless computing: an example is Super Protocol, which allows users to run apps in a decentralized cloud, offering verifiable trustlessness.
Speaking of Super, Nukri shared some areas where its customers have found a decentralized compute platform useful, including intellectual property management, digital marketing, and AI.
At Marlin, TEEs ensure the integrity of computation, giving devs a way to verify both on- and off-chain that a computation has happened, while adding an additional layer of confidentiality.
Beyond deAI...
So far, you could be left thinking that TEEs are only helpful for AI x crypto use cases, but there is much more, from block building to decentralized frontends. Decentralized frontends are a topic that makes a comeback every so often, but this time, chances are they’re here to stay.
“An increase in exploits and compromises of large dApps has led to an increasing popularity of decentralized frontends.” - Esli
After all, the piece that gets exploited is usually the most centralized part, and for most dApps that is the frontend hosted in the cloud. While previous iterations of decentralized frontends were complex, with TEEs one can easily host entire dynamic websites, allowing users to verify that they are interacting with the correct app without compromising UX.
Surely there are downsides, though?
As Dan shares, there have been multiple exploits on Intel TEEs in the past, which has given rise to the perception that TEEs are vulnerable. In the end, everything has trade-offs.
“There’s always a need to weigh the cost of attacking the TEE against the value of what it’s protecting.”
Still, TEEs are already used in a billion devices, and with increasing attempts to build open-source TEEs, developers will have further ways of mitigating attacks. One can always throw some MPC or ZKP at the TEE for added security.
What does the future hold?
Esli hopes for fully open-source TEEs and for TEEs to become the base layer for anything done off-chain. Nukri foresees the rise of fully compliant Web3 infrastructure and of genuinely useful applications. He’s most interested in AI agents on private data, big data, and IP management.
David believes we’ll see a lot of innovation in open-sourcing chips, which will affect how private keys are created; for example, the hardware owner, rather than the manufacturer, could generate the key.
For Web3, novel functionality should unlock the experimentation necessary to discover new use cases. Think of DeFi, which started as a big experiment and remains the largest sector of Web3 so far.
The future for TEEs x Crypto is bright. The current hype is justified.
Thanks to our speakers and everyone tuning in.