
zkCoprocessors and Intelligent DeFi


Space Summary

A summary of the Twitter Space "zkCoprocessors and Intelligent DeFi," hosted by EigenLayer with EigenDA AVS. Participants delve into the significance of zkCoprocessors, the role of open innovation, and the concept of infinite-sum games. Collaboration, community building, and future trends in Intelligent DeFi are key topics, along with the applications of zkCoprocessors beyond DeFi and the importance of integration for scalability and security in DeFi protocols.


Space Statistics


Total Listeners: 42

Questions

Q: What is the significance of zkCoprocessors in the DeFi sector?
A: zkCoprocessors play a crucial role in enhancing security and scalability in DeFi applications.

Q: How does EigenDA AVS contribute to open innovation?
A: EigenDA AVS provides a platform for collaborative discussions and knowledge sharing in cutting-edge technologies.

Q: Why are infinite sum games relevant to the discussion?
A: The concept of infinite-sum games promotes long-term strategies and cooperation for sustainable growth.

Q: What are some key benefits of integrating zkCoprocessors in DeFi protocols?
A: Improved efficiency, enhanced security, and scalability are among the key advantages.

Q: How can collaboration drive innovation in zkCoprocessors and Intelligent DeFi?
A: Collaboration enables cross-pollination of ideas, leading to breakthrough advancements in technology.

Q: What sets EigenDA AVS apart in the realm of advanced technologies?
A: EigenDA AVS stands out for its focus on open innovation, community engagement, and cutting-edge insights.

Q: What potential applications can zkCoprocessors have beyond DeFi?
A: zkCoprocessors can be utilized in various sectors beyond DeFi, such as cybersecurity and network optimization.

Q: How can participants engage in the ecosystem of zkCoprocessors and Intelligent DeFi?
A: Engaging in discussions, collaborating, and staying informed on the latest developments are key to being part of the ecosystem.

Q: What role does open collaboration play in advancing technologies like zkCoprocessors?
A: Open collaboration fosters creativity, accelerates innovation, and drives progress in emerging technologies.

Q: Why is it crucial for the DeFi industry to explore concepts like zkCoprocessors?
A: zkCoprocessors offer solutions to challenges faced by DeFi platforms, ensuring their sustainability and growth.

Highlights

Time: 00:15:42
Introduction to zkCoprocessors: Exploring the fundamentals of zkCoprocessors and their role in DeFi security.

Time: 00:35:19
EigenDA AVS Collaboration Insights: Insights into how EigenDA AVS fosters open collaboration in advanced technology discussions.

Time: 00:55:08
Infinite Sum Games Concept: Understanding the concept of infinite-sum games and its implications for innovation.

Time: 01:15:33
Applications of zkCoprocessors: Exploring real-world applications of zkCoprocessors beyond traditional DeFi usage.

Time: 01:35:57
Community Building for Innovation: Highlighting the role of community in driving innovation in zkCoprocessors and DeFi.

Time: 01:55:42
Future Trends in Intelligent DeFi: Discussing the potential future trends and developments in the sphere of Intelligent DeFi.

Time: 02:15:29
Collaborative Ecosystem of EigenDA AVS: Examining the collaborative ecosystem facilitated by EigenDA AVS for technology enthusiasts.

Time: 02:35:18
Scalability Challenges in DeFi: Addressing the scalability challenges faced by DeFi platforms and the role of zkCoprocessors.

Time: 02:55:05
Integration of zkCoprocessors: Insights into the integration process of zkCoprocessors in DeFi protocols.

Time: 03:15:48
Innovation and Security in DeFi: Exploring the dual focus on innovation and security in the DeFi ecosystem.

Key Takeaways

  • zkCoprocessors and Intelligent DeFi are at the forefront of innovation.
  • EigenDA AVS offers unique insights into advanced technology and open collaboration.
  • The space emphasizes the importance of open innovation for sustainable growth.
  • The infinite-sum games concept is explored in the context of technology development.
  • Collaboration and knowledge-sharing are essential for progress in zkCoprocessors and DeFi.
  • EigenDA AVS provides valuable information on cutting-edge technologies and advancements.
  • Participants learn about the applications of zkCoprocessors and Intelligent DeFi.
  • The space fosters a community focused on innovation and collaboration.
  • Innovative concepts like zkCoprocessors are vital for advancing DeFi solutions.
  • The discussion highlights the potential impact of zkCoprocessors on the DeFi ecosystem.

Behind the Mic

Introductory Remarks

Otherwise, no one will be able to speak. Yeah. Join the space so that I can invite you. I cannot invite without you joining, guys, so. Okay, sounds good. I will. I am joining right now. Let's see. I'll make you the co-host, Ishaan. Otherwise, no one will be able to speak. Perfect. Join the space so that I can invite you. I cannot invite you without you joining us. Oops. Oopsie daisies. Ishaan. Wait a second.

Audio Check and Participant Confirmation

Can everyone hear me? Okay, I think Ismail is here. Jay is here. Can you see everyone? And I. Brevis is not a speaker. Okay, there we go. Now Mo's here. How's it going, guys? Gm. Gm. Doing well. Pretty good. Sorry for the technical difficulties. My first time joining a Twitter space, this was a bit chaotic. All good. Let's hope all of our audio holds up. Yeah, yeah. For sure. For sure. Okay.

Getting Started

I think we can give it, like, 30 seconds to a minute to get started, just for more people to trickle in. Let's see. But, yeah, we have quite a few exciting topics to discuss. Where are you guys all calling in from right now? Cause I feel like we're coordinating time zones from, like, all quarters of the planet. I'm in New York. Jay, you're in California, right? I know. Mo, you're in Singapore. I'm in Singapore. Yeah, yeah. Jay, where are you? Los Angeles. Nice. Nice. I'm in SF, so we coordinated Pacific, for sure.

Recorded Sessions and Main Topics

Yeah, I think we can get started, guys, and then it'll be recorded so more people can come in and watch. But, yeah. Really excited to talk to you guys about ZK coprocessors and intelligent DeFi. So we'd love to start with doing just quick intros and, like, why this topic is relevant for everybody here. Happy to start briefly. So, I'm Ishaan. I'm a protocol researcher at Eigen Foundation, and I recently wrote the article about intelligent DeFi. It's a blog post that dives into ten different use cases of DeFi right now that tap into trustless off-chain compute and trustless off-chain data that I'm really excited about.

Introduction to ZK Coprocessors

And one of those major key unlocks that I read about during the process was ZK coprocessors. So wanted to. Yeah. Really excited for this, but, yeah. Ismail, do you want to get started? Yeah. So, my name is Ismail. I'm one of the co-founders of Lagrange. We build a ZK coprocessor, as is hopefully a little bit clear in the name of this space right here. We frankly love the article that Ishaan wrote. I think it's one of the best kind of encapsulations of a trend in the space that we're very excited about seeing, in terms of material DeFi adoption of emerging off-chain compute primitives.

Computation Accessibility through ZK Coprocessors

And I think, you know, to put it very simply, the way we think of kind of what we do is as a way to expand the types of compute smart contracts and DeFi apps can access. And that's done through verifiable compute, verifiable off-chain compute that smart contracts can request. Oh, I guess I can go next. Hi, everyone, this is Michael from Brevis. Great to be here. And we are definitely a very big believer in data-driven and intelligent DeFi. And we build Brevis. Brevis is a smart ZK coprocessor.

DeFi Computation Shifts

Now, a coprocessor is a way that you can migrate very heavy and sometimes data-driven computation from an on-chain environment to an off-chain environment. One very good example of that kind of computation is to allow smart contracts to access historical on-chain data, such as transactions, events, and states, and then do arbitrary computation on top of that to drive very intelligent decisions and parameter adjustments and all that in a DeFi protocol, for example. So, yeah, that's us. Excited to be here.

Participant Introduction and Insights

Awesome. I guess I can go. Hi, everybody, I'm Jay, one of the co-founders here at Waymont. Waymont is one of the developers behind the Royco protocol, which is a new protocol going live in a couple of weeks for basically deploying incentives and liquidity in the most efficient manner. Have been going pretty deep in the ZK coprocessor rabbit hole for the last few months here. And just with my background in DeFi, very excited to be here. Yeah. So, long story short, I got three chads to come speak with me, and I am vastly underqualified with respect to these three.

Discussion on ZK Coprocessors

So I'm gonna try to get these guys to talk as much as possible. But first off, I know we have two people who are building ZK coprocessors, and one who is using a ZK coprocessor, hopefully in a v2. So for people who don't know, what is your definition of a ZK coprocessor? Why do you actually need it? What does it unlock? Maybe Mo, want to start? Sure. Yeah, I can start.

Understanding ZK Coprocessors

You know, what is a ZK coprocessor? Well, the reason we need ZK coprocessors is because we are building in the blockchain space. Blockchains, whether layer two or layer one, are fundamentally limited in terms of the computational resources available, because fundamentally they are all rooted in the underlying consensus mechanisms. We have been scaling blockchains with layer twos and alternative layer ones, building new consensus protocols, building optimistic rollups and ZK rollups. But those approaches are scaling blockchain storage and computation.

Understanding ZK Coprocessors

This is why you can move this type of computation off chain and allow smart contracts to actually have this kind of visibility and access. So, yeah, that's kind of a quick explanation. Yeah. And so to add to that, I think that was a great explanation from Michael about what ZK coprocessors do. I think the one thing I'd want to add is, in general, people make the question of what a ZK coprocessor is a lot more complicated than it needs to be. In reality, if you were building a web2 application and you needed to do something that was computationally intensive, you would likely invoke a different server, you'd make an API call to it, or you would invoke a multi-threaded process. Async/await, API calls, mutexes: all of these are properties or techniques within building applications that span across multiple servers or multiple threads. And what a ZK coprocessor functionally is, is a way to take a blockchain like Ethereum, or any L2 that's single-threaded, and give it the ability to have asynchronous computation that's larger in scale than what it can do in the smart contract.
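The async analogy above can be sketched as a request/callback pattern: a contract emits a computation request, an off-chain prover fulfills it, and a verified callback delivers the result. This is a hypothetical toy model, not any real coprocessor's API; all names here are illustrative.

```python
# Toy model of the asynchronous "coprocessor call" pattern described above.
# In reality the request is an on-chain event and fulfillment includes a
# ZK proof verified before the callback runs; both are elided here.
from dataclasses import dataclass, field


@dataclass
class CoprocessorRequest:
    query: str          # e.g. "total fees paid by wallet X over blocks A..B"
    callback: object    # invoked once the (mock) proof is accepted


@dataclass
class MockCoprocessor:
    pending: list = field(default_factory=list)

    def request(self, query, callback):
        # analogous to a smart contract emitting a request event
        self.pending.append(CoprocessorRequest(query, callback))

    def fulfill(self, compute):
        # the off-chain prover runs the heavy computation; on-chain proof
        # verification would gate the callback in a real system
        for req in self.pending:
            req.callback(compute(req.query))
        self.pending.clear()


results = []
cp = MockCoprocessor()
cp.request("total volume of wallet 0xabc", results.append)
cp.fulfill(lambda q: 12345)  # stand-in for the real proved computation
print(results)  # [12345]
```

The key property the speakers describe is exactly this split: the chain stays single-threaded, while arbitrarily heavy work happens elsewhere and only a small verifiable result returns.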

Developer Perspectives on ZK Coprocessors

And so from a developer standpoint, that generally means there are certain types of things that you couldn't build today on Ethereum or on an L2, because they wouldn't be gas efficient, storage reads would be too expensive, this wouldn't be optimized, that you can now build with a ZK coprocessor. And, you know, I think some teams like Royco and Waymont are kind of on the forefront of this, as well as some other folks in the space that are adopting these primitives to really bring new value to users. Yeah, I'll kind of add on here. Like, this is a whole entire new primitive. We unlock the ability to do computation at a level that we've never been able to do on chain. Even the intelligent DeFi post was the first post that actually put together some novel new use cases that take advantage of this new level of compute. The really cool thing is, probably for years to come after this, we're just starting the domino effect. For years to come, there's going to be some really insane applications that we can't even conceptualize today, because this technology is brand new to us. Right? So for sure there's going to be a bunch of, like, very rudimentary things built in the next few months, like being able to do incentives on chain and kind of compute Merkle roots for whoever's providing liquidity, right?

Revolutionizing DeFi with New Innovations

But over time, I'm very excited for just what the new generation of DeFi is that isn't inhibited by current blockchain systems. 100%. I think that this is such a cool primitive that people like Ismail and Michael have built, and quite a few other teams as well. I'm excited to see what is out there, but also, what are some use cases that we're currently identifying as possible with ZK coprocessors that we couldn't do beforehand? Michael discussed looking at previous on-chain state, which, if you're not a developer, you don't realize exactly how expensive this can be, because we don't index that on chain. And it's only when you actually try to do that, you see how complex this is: that you have to look at the current state, and maybe look at a sequence of previous transactions in the past, and then say, okay, maybe make a diff between the current state and the past five days' worth of blocks and see, okay, how many times did Jay, say, swap on this Uniswap pool, and then do different things accordingly? But I know people have lots of thoughts on what is possible with coprocessors. So would love to hear, what are your favorite use cases, DeFi related or not DeFi related?

Challenges and Insights

Yeah. So I'm going to keep this very brief, because I'd frankly love to hear from Jay here, because he's a real application builder. But from my standpoint, if we look at what we can do on chain today, storage reads are a large bottleneck on applications. You pay 2,100 gas to access a single storage slot, let alone storage writes. And this means there are certain things that you want to do, that result in having to read a lot of data, that you can't do. So think of anything time based, like any type of yield-bearing instruments, any type of incentive instruments, and then anything that has to do with a large cardinality of storage slots in a given contract. So, GameFi or NFT based: if I wanted to determine all of the NFTs owned by Ishaan in a given contract, and it was 10,000 NFTs, I can't spend 20 million gas to try to iterate through all of those. It just wouldn't be computationally possible. And so what you get to with coprocessing is being able to kind of manipulate and play with data in new ways.
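The arithmetic behind the 10,000-NFT example is worth making explicit. A back-of-envelope sketch (the 2,100-gas figure is Ethereum's cold storage read cost per EIP-2929; the 30M block gas limit is approximate):

```python
# Why iterating 10,000 storage slots on-chain is infeasible.
COLD_SLOAD_GAS = 2_100          # gas per cold storage slot read (EIP-2929)
BLOCK_GAS_LIMIT = 30_000_000    # approximate Ethereum block gas limit


def scan_cost(num_slots: int) -> int:
    """Gas needed just to read `num_slots` cold storage slots."""
    return num_slots * COLD_SLOAD_GAS


cost = scan_cost(10_000)
print(cost)                     # 21_000_000 gas for the reads alone
print(cost / BLOCK_GAS_LIMIT)   # 0.7 -- most of an entire block
```

That single scan would consume roughly 70% of a block before doing any actual computation, which is why the speakers push this work off chain.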

Enhancements Through Coprocessing

What you get from that is things like VIP programs for DEXs, asymmetric fee markets on AMMs, incentive programs that can be very targeted to users, the same way you would see in analogous web2 products. Yeah. Well, just to add on that, first, I guess it's worth explaining why we call this kind of thing intelligent DeFi, because if we call the new generation of DeFi intelligent, we somehow have some implicit assumption that current DeFi is not that kind of intelligent. Why is that? Well, because, as Ismail also rightfully pointed out, right now a smart contract is not that smart, in the sense that it cannot actually see into the past. Because if you want to actually access historical data on chain, it involves a very heavy computation to reconstruct the block history using a large number of hash functions, which is impossible to do on chain. And, you know, this is why, using a ZK coprocessor, you can offload this type of very heavy computation off chain and gain insights about user behavior.
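The "reconstruct block history with a large number of hash functions" point can be illustrated with a minimal hash-chain sketch. This is a simplified stand-in (SHA-256 over a toy payload, not real Ethereum headers or RLP): proving a fact about an old block means re-linking every header back to a trusted hash, one hash per block.

```python
# Minimal sketch of why reading old chain state is expensive: headers form
# a hash chain, so verifying a claim about block 0 from the tip requires
# one hash check per intervening block. A coprocessor does this off chain
# and posts a single succinct proof instead.
import hashlib


def header_hash(parent_hash: bytes, payload: bytes) -> bytes:
    return hashlib.sha256(parent_hash + payload).digest()


def build_chain(n: int) -> list[bytes]:
    hashes = [b"\x00" * 32]            # genesis placeholder
    for i in range(n):
        hashes.append(header_hash(hashes[-1], str(i).encode()))
    return hashes


def verify_link(chain: list[bytes], i: int) -> bool:
    return chain[i + 1] == header_hash(chain[i], str(i).encode())


chain = build_chain(1_000)
# 1,000 hash evaluations just to trust one historical fact
print(all(verify_link(chain, i) for i in range(1_000)))  # True
```

On chain, each of those hashes costs gas; off chain, the whole walk collapses into one proof verification.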

Developments with ZK Coprocessor

User activity, and so on. Now, for Brevis, our SDK is publicly available and production ready. And in fact, for some of the use cases mentioned here, we already have partners building and launching products in production, live on mainnet today. For example, recently we launched the first perpetual swap VIP fee rebate program with Kwenta, one of the largest perpetual swap aggregators on Optimism. And, you know, we're already seeing very exciting results. People are starting to trade more, especially market makers who want these differentiated fees, and get fee rebates at a later stage, to help build a more efficient market, so that the uninformed trade flow can flow into the protocol more efficiently. And in the end it boosts the protocol revenue. That's the end goal of this. Like, why are we building this? It's that we want to build a better DeFi ecosystem for both users and also the protocols, right?
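A VIP fee rebate program like the one described reduces, at its core, to mapping a proved historical trading volume onto a rebate tier. A toy sketch, where the tier thresholds and rebate fractions are invented for illustration and the volume is assumed to arrive as a coprocessor-proved value:

```python
# Illustrative VIP fee-rebate tiers keyed on proved trading volume.
# Thresholds and rebates are assumptions, not the real program's numbers.
TIERS = [                    # (min 30-day volume in USD, fee rebate fraction)
    (50_000_000, 0.40),
    (10_000_000, 0.25),
    (1_000_000, 0.10),
]


def rebate_for(proved_volume: float) -> float:
    """Return the rebate fraction for a trader's proved volume."""
    for threshold, rebate in TIERS:   # sorted high-to-low
        if proved_volume >= threshold:
            return rebate
    return 0.0


print(rebate_for(12_500_000))  # 0.25
print(rebate_for(200_000))     # 0.0
```

The interesting part is not the lookup but the input: without a coprocessor, the contract could not affordably verify a trader's historical volume at all.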

Hackathons and Collaborations

So, and recently we concluded a collaborative hackathon with the Uniswap Foundation on enabling Uniswap v4 hooks using a Brevis-powered ZK coprocessor. Some very interesting examples built there are not covered in this amazing article. For example, idle liquidity management: you can have trust-free idle liquidity management to essentially park liquidity in lending protocols based on the different lending protocols' reward and earning rates. And the earning rate can be transparently and trust-freely proved using Brevis. And you can also do very complex computation for this kind of dynamic fee. We talked about dynamic fees using trading volume; we actually built the first demo last year on Uniswap to do that. But you can also do very complex, Bollinger-band-based volatility hooks for both LPs and traders, to have dynamic fees based on the volatility of the market, to minimize the loss-versus-rebalancing issue that is very prevalent and hurts the performance and benefits of liquidity providers today.

Innovative Uses and Initiatives

And you can also do mirror trades. You can essentially allow other users to delegate certain funds to a mirror trading protocol to mirror certain market trends, and be able to prove that these mirroring actions are actually done correctly, without manipulation and malicious intent. And we recently also concluded a hackathon together with PancakeSwap, specifically to build Brevis-powered hooks and plugins for PancakeSwap ahead of the upcoming PancakeSwap v4. There were 27 submissions, all very amazing. This is all public; you can look at what people have built. And you can do things like a new type of LP reward program that is not really tied to the timelines of the LP deposits, but really based on the amount of fees earned by each of the LPs. Or you can do other very interesting things to kind of create new ways to incentivize and retain liquidity.

User Acquisition and Identity

And also, one very interesting thing is on the user acquisition side. So you can generate a proof for the user, or the user can generate a proof, that, hey, I'm actually an active and heavy user on some of the other DeFi protocols. Can I get a kind of VIP status in this new DeFi protocol that is being launched, or in this new pool that is being launched? Well, you can do that with Brevis, because, again, you can generate a user identity and user profile based on the historical interactions of users in this protocol and in others. Because for ZK coprocessors there is no limit on the data you can access; you can basically access anything that is available on chain. So there are so many things you can build. And, you know, for Brevis it's production ready. You can just directly use the SDK to try it out.

Future Possibilities with ZK Coprocessors

I'm just going to summarize before I hand it off to Jay to hear his ideas, because those are a lot of cool ideas. You said in production you have VIP fee tiers, which make market makers want to participate more, and that's live on Kwenta. You said you have Uniswap and PancakeSwap hooks which are. Sorry, those are public hackathons. Let me just point that out. Yeah, those are public hackathons, collaborated with the Uniswap Foundation and Atrium in the Uniswap case. And for PancakeSwap, we collaborated with the PancakeSwap team directly to build this hackathon together. Awesome. Yeah.

Innovative Strategies for DeFi

For those who have hooks that are depositing to the highest-yield lending protocol, and other hooks that are putting dynamic fees to really solve that LVR problem once and for all, I think that's fantastic. Jay, I know you also had a bunch of ideas. We were talking yesterday. Yeah, yeah. Cool. Like, ahead of this, I tried to spend some time yesterday in the shower thinking about, okay, what are some cool new DeFi primitives, and how can we actually change the foundation of some of these DeFi protocols using ZK coprocessors? I came up with a few interesting ideas that I haven't heard discussed yet. My favorite one is the following. When you think about lending markets today in DeFi, the biggest problem that they have is the creation of bad debt.

Addressing Bad Debt in DeFi

They don't want bad debt, because it cripples any lending market in DeFi. Now, what creates bad debt is not having liquidity to repay loans. The basic idea would be: what if you could use a ZK coprocessor to basically scan all of the DEXs in real time to ensure that there's liquidity to unwind loans, right? And by building a lending market from the bottom up that uses a ZK coprocessor basically to create a debt ceiling, it will ensure that all of the loans can be repaid without creating bad debt in the market. Now, let's say that there's not enough liquidity to repay all of the loans. The lending market could decrease the debt ceiling the same way that Maker does. This would enable us to really have basically fully permissionless, long-tail lending markets, and not really have to worry about bad debt being created, which is really cool. And I haven't heard anybody talk about this use case.
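The liquidity-aware debt ceiling sketched above can be expressed in a few lines. This is a toy model under stated assumptions: the pool balances, the 50% slippage haircut, and the borrow check are all illustrative, not part of any real protocol.

```python
# Toy version of the idea: cap total borrow at what could actually be
# unwound through DEX liquidity (as scanned/proved by a coprocessor).


def debt_ceiling(dex_liquidity: dict[str, float], haircut: float = 0.5) -> float:
    """Sum sellable liquidity across DEXs, discounted for slippage."""
    return sum(dex_liquidity.values()) * haircut


def can_borrow(current_debt: float, amount: float, ceiling: float) -> bool:
    # If liquidity shrinks, the ceiling drops and new borrows are blocked,
    # analogous to Maker lowering a debt ceiling.
    return current_debt + amount <= ceiling


pools = {"uniswap": 4_000_000.0, "curve": 1_500_000.0, "balancer": 500_000.0}
ceiling = debt_ceiling(pools)        # 6M total liquidity, halved
print(can_borrow(2_500_000, 400_000, ceiling))  # True
print(can_borrow(2_500_000, 600_000, ceiling))  # False
```

The coprocessor's role is supplying `pools` trustlessly: the contract never iterates DEX state itself, it only verifies a proof that the aggregate is correct.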

Centralization Challenges in Yield Aggregators

One of the other ones that I was thinking about last night was: look, most yield aggregators today have centralized rebalancers, meaning that it is on a centralized server to basically determine the optimal allocation strategy, whether it be into lending platforms, into AMMs, whatever. Basically, it's a centralized server that's determining the optimal strategy, and then it's basically sending that on chain and rebalancing between the whitelisted strategies that it can choose from. But you can do that optimization on chain using a ZK coprocessor. You could even determine the rates of the deposits that you are about to create using a ZK coprocessor, computing the post-deposit interest rates and ensuring that you are basically always optimizing for the highest yield in a completely trustless manner. These are two of my favorite ones that I came up with.
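The "post-deposit interest rate" point deserves a concrete sketch: in utilization-based lending markets, your own deposit dilutes the supply APY, so an optimal allocator must price its own impact. The linear rate model and all numbers below are illustrative assumptions, not any specific protocol's curve.

```python
# Why the quoted APY is not the APY you get: depositing lowers utilization,
# which lowers the supply rate. A coprocessor could compute (and prove)
# this post-deposit rate across many venues before allocating.


def utilization(borrowed: float, supplied: float) -> float:
    return borrowed / supplied if supplied else 0.0


def supply_rate(borrowed: float, supplied: float, slope: float = 0.10) -> float:
    """Toy linear model: supply APY = borrow rate * utilization."""
    u = utilization(borrowed, supplied)
    return (slope * u) * u


pool_borrowed, pool_supplied, deposit = 8_000_000, 10_000_000, 2_000_000
before = supply_rate(pool_borrowed, pool_supplied)           # quoted APY
after = supply_rate(pool_borrowed, pool_supplied + deposit)  # realized APY
print(before, after)
```

Here a 2M deposit into a 10M pool drops the supply rate noticeably; a trustless optimizer would run this comparison across every whitelisted venue.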

Decentralizing AMM Strategies

Another one is, hey, look, we have a lot of AMM strategies across DeFi that are pretty centralized. When I say AMM strategies, I mean strategies that sit atop an AMM where you trust a centralized player for execution, such as Definitive's TWAP feature. You should be able to go and use a ZK coprocessor to basically run the calculations on the optimal TWAP strategy, and be able to do so trustlessly, and anybody would be able to compute what the TWAP should be on a real-time basis. So these are three ideas I had last night that I think are pretty cool. That's fascinating.
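The "anybody can recompute the TWAP" claim rests on the TWAP being a deterministic function of on-chain price observations. A minimal sketch of that computation (the observation data is invented for illustration; real AMM oracles store cumulative values rather than raw samples):

```python
# Time-weighted average price from (timestamp, price) observations.
# Deterministic, so a coprocessor can compute it off chain and prove it,
# and any observer can recompute the same value.


def twap(observations: list[tuple[int, float]]) -> float:
    """observations: (timestamp, price) pairs, sorted by timestamp."""
    num = den = 0.0
    for (t0, p0), (t1, _) in zip(observations, observations[1:]):
        num += p0 * (t1 - t0)   # price p0 held over interval [t0, t1)
        den += t1 - t0
    return num / den


obs = [(0, 100.0), (60, 110.0), (180, 105.0), (240, 105.0)]
print(twap(obs))  # 106.25
```

Because the result is reproducible from public data, execution no longer needs to be trusted to a centralized strategy operator.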

Engagement in Yield Strategy Development

The yield aggregator idea is something that we've engaged very heavily with Beefy and Yearn on historically. It's interesting, because the current Yearn strategies are rebalanced by contract-level checks, and it's something that's very limiting in terms of what they can effectively do.

Excitement About DeFi and Liquidity Management

Totally. Yeah, definitely. And I think for the AMMs, especially rebalancing of liquidity, actively managed liquidity positions, is something that is also super exciting. And I was just looking at the submissions; one of the submissions is doing exactly that. And you're right, these are all public hackathons. Anyone can participate. And I was just looking at the submissions last night after the deadline passed. So it's very exciting that people can use the publicly available SDK to build this kind of stuff already. And I think, Jay, you're right, the entire DeFi paradigm is going to be changed by this.

Real-Time Liquidity for Long Tail Assets

Jay, that was brilliant. I am very excited about looking at the state of real-time liquidity, and especially applying that to these long-tail assets. So hopefully Mudang won't be long tail for long. It's gonna be one bil soon, NFA. But for now, we can start lending on it if we use the ZK coprocessor to determine how much liquidity is there, so that you can only take on x amount of bad debt and then repay it, which, I think that's, yeah, that's super fun. Going deeper into this lending paradigm, one thing I wrote about was this idea of personalized loans. So really taking the intelligent DeFi idea and saying: what if we could add more user-specific loans?

Personalized Loans and the Default Risk

If Michael is super good at repaying his loans, he's done that every time he's taken a loan, and each loan is a small portion of his net worth, so we know there's very little incentive for him to not repay, then can we give him better lending rates as a whole, since the default risk is just so low? Curious to hear your thoughts on creating liquidity profiles for individual users and creating a new generation of personalized finance. Excited to hear you guys' thoughts, and also how this extends into applications besides lending.

Skepticism on Personalized Loans

Yeah, I think that's. Oh, go ahead. I was just going to say, like, I know there's a lot of startups working on this. I don't know if this is a hot take, but I'm, like, pretty bearish on personalized loans. I think the idea of, like, Sybil resistance and all of this stuff remains to be seen. Right. Any lender is going to underwrite based on the collateral being provided. If the collateral is not good enough, they aren't going to provide the loan. I think factoring in, like, a credit history isn't going to change that decision in an on-chain environment much.

Collateralizing with Other Solutions

What I think, and this might be where you're going with the question, is being able to collateralize, like, other things, right? Whether that be via zkTLS or whatever, that seems to be more exciting. But I don't think the idea of, like, credit history on chain, or rather just history on chain, makes a lot of sense, given the hostile nature of the chain. I agree with Jay there. I think zkTLS, very broadly speaking, if you can bring off-chain, Sybil-resistant credit history into DeFi, does have applications.

Challenges of On-Chain Payment History

I think it's tough to underwrite a loan based on payment history on chain, because it's too easy to fake whatever parameter set is being used to underwrite, at scale, across a bunch of wallets, and then just execute on that. Right? Where, if we know that we're going to underwrite based on three years of payback history, you just today make 10,000 wallets that pay back all the loans on time, and then three years from now, you go after the protocol. Yeah, well, I 100% agree with both of my friends here. So, you know, but I do think this.

Credit Based Loans in a Modern Context

You know, but if you. If we think about one caveat here: if you think about the modern world, the real non-crypto financial world, credit-based loans are not really based on trust. They're essentially based on the fact that lenders can underwrite a certain kind of cash flow. Right? So now, Jay, I think you're absolutely right there, that we cannot just look at historical data and historical repayment history to do personalized loans.

Using Cash Flow for Collateralization

But maybe what we can do is collateralize some sort of cash flow, to basically say: okay, you can prove that you have the expectation of a stable cash flow, through, maybe, restaking on EigenLayer, for example, which would be a good example. And collateralize that portion by committing the cash flow you can get from the EigenLayer restaking ecosystem to the lending protocol, so that if you fail to pay, then your future cash flow will go directly to the lending protocol itself. So, yeah, just want to add one note on top of these.

Exploring Protocol Level Loans

That's actually a really interesting point, because I think there's a product I heard of a couple of years ago, and I don't know where it went, but the idea was to underwrite protocol-level loans, where you can look at the average fees that some DeFi protocol would accrue over an increment of time, and then collateralize the loan based on those cash flows, where the cash flows would be routed through whatever the payback instrument was. Yeah, that's interesting.

Challenges in On-Chain Cash Flows

I feel like there have been many protocols that have tried that, but, like, protocols didn't want to adopt this, because how much cash flow is there actually going on chain? Like, yeah, you have restaking, but to be honest, the yield is marginal in terms of the difference it makes to a loan, and underwriting that is probably going to be more costly than it's worth.

LP Underwriting with ZK Coprocessors

But I do think one area where ZK coprocessors come in really interesting is, like, in underwriting Uniswap LP positions, right? There's billions of dollars in Uniswap LP positions currently, and it's extremely hard to calculate how much those Uniswap LP positions are worth. Right. We built oracles for this, and we had to basically cut down the value of them by 80% because of how volatile it is. But with a ZK coprocessor, you can actually calculate these LP positions and the value of them in real time, and be able to underwrite loans against them.
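Valuing a concentrated liquidity position is exactly the kind of deterministic-but-heavy computation a coprocessor could run and prove. A hedged sketch using the standard Uniswap v3 liquidity formulas (token amounts from liquidity L, current price p, and range [pa, pb]); the numbers are illustrative only, and real positions also accrue uncollected fees, which this omits.

```python
# Value a Uniswap-v3-style LP position from L and its price range,
# using the whitepaper's amount formulas. Fees are ignored for brevity.
import math


def position_amounts(L: float, p: float, pa: float, pb: float):
    """Token amounts held by liquidity L at price p within range [pa, pb]."""
    sp, spa, spb = math.sqrt(p), math.sqrt(pa), math.sqrt(pb)
    if p <= pa:                                  # entirely in token0
        return L * (1 / spa - 1 / spb), 0.0
    if p >= pb:                                  # entirely in token1
        return 0.0, L * (spb - spa)
    return L * (1 / sp - 1 / spb), L * (sp - spa)


def position_value(L: float, p: float, pa: float, pb: float) -> float:
    amt0, amt1 = position_amounts(L, p, pa, pb)
    return amt0 * p + amt1                       # valued in token1 terms


# illustrative numbers: L=1000, price 1.21 inside the range [1.0, 1.44]
val = position_value(L=1_000.0, p=1.21, pa=1.0, pb=1.44)
print(round(val, 2))
```

Computing this for every live position, block by block, is far too expensive on chain, which is why current lending oracles apply the blunt haircuts mentioned above; a proved off-chain valuation could be much tighter.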

Untapped Liquidity in the Lending Landscape

Right. So, like, bringing a ZK coprocessor into this, like, lending landscape, I would more look at, like, hey, where is the untapped liquidity that we can go and service? And I think, looking at it through that lens, you'll see that, like, a big portion of it is Uniswap LPs that can go and get loans at the most efficient rate, with the most efficient oracles, if it's powered by ZK coprocessors.

LPFI and Its Importance

This is a great idea, Jay. Yes. Yeah, I love LPFi. I think LPs are some of the unsung heroes of our industry. First off, I don't know why they're LPing if people say it's so unprofitable, but maybe Guam and others can fix that. But, yeah, there's a surge of LPFi happening now with, like, lending protocols like Infinity Pools. I just talked to the founder, and he said that Infinity Pools is coming out hopefully soon, which can lead to a surge, and at least LPs are getting an interest rate on top of the LP volatility fees.

Enhancing Capital Efficiency

But being able to take out loans just makes it so much more capital efficient. And you can underwrite a combination of both assets as well as the current value. And I think that sounds super powerful. We lost you. Oh, no, sorry. If anyone has any follow-up after that; if not, happy to go back to that personalized loans example.

Future Technologies in Personalized Loans

I know. Jay, Ismail, Michael, you said you're bearish, but to summarize, there are two technologies you said you need for this to work. One is anti-Sybil resistance technology, so let's say World and Coinbase verifications. Let's assume for now that one day, maybe a year from now, maybe three years from now, that's a solved problem and we have anti-Sybil tech.

Combining Technologies for Enhanced Applications

The second thing you guys mentioned is bringing in outside data via something called zkTLS. I read the "crypto's AirTag moment" article by Nascent, and I think this combination of outside compute and outside data via zkTLS is really powerful. So first, can someone define zkTLS for us? And then we'd love to hear how you think that combination can make things ten x better.

Understanding zkTLS

Sure. Yeah. Well, people have been talking about zkTLS and MPC-TLS a lot. What that means is that it basically allows you to prove a certain session with any web2 website. So take Google, for example: when you're trying to log into Google, there is a process to establish a secure connection and authenticate yourself when you put in your password, so you can communicate with Google and see your Gmail.

The Role of zkTLS in Security

That protocol, on the web layer, or the application layer, is called HTTPS, and the underlying cryptographic protocol is called TLS. The idea of zkTLS is pretty simple: you have someone else sit in the middle between you and the server you're trying to access, and generate a zero-knowledge proof that you are indeed the owner of a certain account with certain properties. Now, the immediate question you might ask is: what happens if this middleman just steals all of your information?

Multi-Party Computation and Security

Well, this is why most zkTLS protocols are also based on MPC: you and the middleman form a multi-party computation so that no single entity can see through the entire session. Only the two of you combined have visibility into the session and the information it contains. And with properly designed protocols, you can generate a zero-knowledge proof to show any web3 or web2 service that you own a certain web2 account and have certain properties stored in that web2 service.

Bridging On-Chain and Off-Chain Data

For example, your Binance balance, a certain token holding on Binance, even your bank account balance would be possible if you establish the TLS session through this MPC-TLS process. So this is a great way to bridge many insights from the off-chain world to the on-chain world. The simplest application for this is anti-Sybil: you can say, okay, I am a KYC'd user on Coinbase, or a KYC'd user on Binance, and I can prove to an on-chain contract that I am a unique user, not duplicated in the entire service pool.
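
To make the "prove a property without revealing the data" pattern concrete, here is a toy commit-and-check sketch. It is not zero-knowledge and not a real zkTLS protocol (the verifier sees the opening); in real protocols an MPC-generated ZK proof replaces the opening step. All names and values are illustrative:

```python
import hashlib
import secrets

def commit(data: bytes, nonce: bytes) -> str:
    """Hash commitment to private session data (a stand-in for the
    transcript commitment a zkTLS protocol would produce)."""
    return hashlib.sha256(nonce + data).hexdigest()

def make_claim(balance: int, threshold: int):
    """Prover side: commit to a private balance and claim balance >= threshold.
    In a real zkTLS protocol, a ZK proof replaces revealing (balance, nonce)."""
    nonce = secrets.token_bytes(16)
    c = commit(str(balance).encode(), nonce)
    return {"commitment": c, "threshold": threshold}, (balance, nonce)

def verify_opening(claim, opening) -> bool:
    """Verifier side of the toy scheme: the opening must match the
    commitment AND satisfy the claimed predicate. (With ZK, the verifier
    would check a proof instead of seeing the balance at all.)"""
    balance, nonce = opening
    return (commit(str(balance).encode(), nonce) == claim["commitment"]
            and balance >= claim["threshold"])
```

The structure mirrors what the speaker describes: the predicate ("balance above a threshold", "KYC'd and unique") is public, while the underlying account data stays private.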

Identity Binding for User Acquisition

You can of course integrate MPC-TLS-based solutions with on-chain data proofs. One use case we just saw at the PancakeSwap public hackathon, built with the Brevis SDK, used an MPC-TLS solution for user acquisition: you want to acquire new users based on their on-chain history, to see that a user is a power DeFi user and therefore give them some reward, some preferred treatment in your newly launched protocol. But purely on-chain traces are subject to a Sybil attack, because one person can have multiple addresses, all performing lots of interactions and transactions on chain.

Challenges with On-Chain Identifiers

The way to solve that is to bind your on-chain identity to your off-chain identity, a real human identity: okay, I'm using this real human identity, with a certain threshold on my Binance deposits and a certain threshold on my off-chain trade history, to show that these two accounts bind together as one real human. That attaches to the fact that you're acquiring a real user instead of one user with multiple accounts. So this direction is something we think can be very interesting.

The Importance of User Identity in DeFi

And people are building these types of applications as well. Yep. Yeah. I'll keep it quick because I want to make sure everyone has a chance to jump in. What I think is interesting about zkTLS is that it's really the first time in crypto that we can bring off-chain data that is private in nature on chain, in a way that both retains privacy and retains safety over aspects of that data.

ZK Oracles and Privacy in Data Management

To put that concretely, look at oracles. They're fantastic instruments for bringing public off-chain data on chain, things like price feeds, volatility, or random oracles, but they don't serve very well for bringing private data: what your credit score is, PII, properties of your identity, where you're located, or anything that would be used as a Sybil-resistance mechanism. zkTLS provides a substrate to do that which still retains user privacy, and it unlocks a lot of applications that require these types of things as primitives.

Expanding Creativity of Developers

And so I think, by extension, the creativity of what developers can build extends rather substantially as the amount of data they can access increases. We're seeing examples on popular zkTLS protocols of bringing in things ranging from employment history to income statements to even details of somebody's power bill, used as part of protocols like Daylight. The result is that the design space of what people are building is increasing as well.

The Impact of zkTLS

Yeah, just adding on to that about why I'm extremely excited about zkTLS: everybody talks about, hey, the institutions are coming with the bitcoin ETF and everything, and everybody gets so excited about that. The much more exciting thing, in my opinion, is how we can increase the GDP of what's on chain naturally, without going to the institutions. It seems like the best way to do that is by expanding to the Internet's real estate, basically all of the value on the Internet. And zkTLS, fundamentally, is a way to take all of this value that exists on the entire Internet and bring it on chain. Right? So in terms of the downstream effects for DeFi and for the entire space, the value on chain could one day approach the GDP of the entire Internet. I think that's so cool, and it makes this idea of, hey, we're going to onboard these institutions, look like nothing in comparison.

Incorporating Private Data Securely

Yeah, yeah. Once you can include people's private data in a secure manner, you can use it to give more accurate, personalized loans. Honestly, I'm very unsure of the full extent of zkTLS applications out there, so I want to give you guys some time to talk more about the applications when you combine off-chain compute and data, if you have thoughts, and then some time for the customer acquisition discussion.

Complexities of Off-Chain Data

I think one of the complexities with off-chain data is that it still has to be queryable from a smart contract. If the amount of off-chain data you're bringing on chain is too large for a given contract to compute over, you run into the same issues you face without a coprocessor when you're trying to compute over strictly on-chain state. To make that very concrete: if all of my users KYC'd on my application, so I have their age and their height, and I wanted to determine everyone within some age bracket who's under 5'8" who used my app, I would still have to query a lot of data about my application from my contract in order to do something on chain predicated on it. So irrespective of whether the data my application accesses is on chain or off chain, if I store it in my contract so that it's accessible at the point of execution, I still have to ensure that the computational tools I use for that execution are scalable enough to handle that data.
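
The speaker's age-and-height example is trivial as off-chain code, which is exactly the point: the loop below is cheap in any web2 runtime but prohibitively gas-expensive to run inside a contract over many records, which is the gap a coprocessor (compute off chain, verify a proof on chain) fills. The records and field names are purely illustrative:

```python
# Hypothetical KYC-derived records accumulated in contract storage.
users = [
    {"addr": "0xa1", "age": 25, "height_in": 66},
    {"addr": "0xb2", "age": 34, "height_in": 70},
    {"addr": "0xc3", "age": 29, "height_in": 67},
]

def eligible(users, age_lo, age_hi, max_height_in):
    """The aggregate query from the example: every user in an age bracket
    who is under a given height (5'8" = 68 inches). Cheap off chain;
    costly to iterate in a contract, hence the coprocessor."""
    return [u["addr"] for u in users
            if age_lo <= u["age"] <= age_hi and u["height_in"] < max_height_in]
```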

Handling On Chain Data Efficiently

So as we talk about moving this lake of off-chain data on chain, to be able to access the GDP of the entire Internet on chain, we still have to answer the question of how we handle that data. In web2 we've developed a suite of tools to handle big, Internet-scale data, ranging from Apache Spark to data storage like Snowflake and Redshift to data pipelines like Fivetran, and we have yet to have that type of meaningful data infrastructure on chain. I think coprocessors specifically, things like what Lagrange does, what Brevis does, what teams like RISC Zero do, which focus on taking well-defined web2 programming languages and letting you run them over on-chain data in verifiable ways, let developers move data on chain from a web2 context and then query it and write computation over it in web2 languages. And I think it really improves the developer ergonomics of data access for the smart contract.

zkTLS Computation Considerations

Ismail, I have a question for you on that, actually, on the zkTLS. Let's say you're getting the data off chain. Why would you do that computation in a ZK coprocessor versus outsourcing that computation off chain, if you're using zkTLS? It depends on the query you want to run on the data, right? If I'm storing in my contract whether a user has passed a KYC check, so that, say on a swap, I don't have to call that zkTLS endpoint every time, then I have retained state in my contract that I store with every single zkTLS call. As such, if I want to compute some aggregate property over that data on chain, I'd either have to do that in my contract or do it with a coprocessor. And that's the situation where it matters. I think when we think of zkTLS in its most primitive application, it's really every single user action making the call to zkTLS.

Retain State for Effective Data Processing

We wait for that data to come back, we store nothing, we use the result of that call, and then we move to the next user. But in a lot of these applications, you're going to want to retain state, and as you retain that state, you in essence build a data asset in your contract that you then need the ability to query. Yeah, I think overall that makes a lot of sense. I also want to shift the discussion, because I know we're running short on time, to customer acquisition. One of the biggest core tenets of DeFi, starting with liquidity mining from Compound in DeFi summer (I think this was 2020), is figuring out how to incentivize and onboard users. I'm curious how we can target customers more precisely, whether via ZK coprocessors or not, to decrease customer acquisition costs and maybe even increase their lifetime value.

Targeting Customers Efficiently

Jay, you are the natural person for this. Yeah, totally. I think ZK coprocessors are great here, because now you can get extremely fine-grained in terms of the things you want to incentivize. You can very easily write a SQL query for all Uniswap LPs in a specific range, and for all we know that range is moving over time; you can get hyper-specific with a ZK coprocessor. Which means that when we get to incentives and where the incentive paradigm is going, we can get so much more granular. Instead of Arbitrum giving a bunch of money to tons and tons of projects, Arbitrum can give projects hyper-specific incentivized actions. That's great, right? Then you can actually see the results of these things happening.
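
The "SQL query for all Uniswap LPs in this specific range" idea can be sketched directly. The schema, data, and tick values below are illustrative, not any protocol's actual tables; in practice a coprocessor would run a query like this over historical chain state and prove the result:

```python
import sqlite3

# Toy table of Uniswap-style LP positions.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE lp_positions "
    "(owner TEXT, tick_lower INT, tick_upper INT, liquidity INT)"
)
con.executemany("INSERT INTO lp_positions VALUES (?,?,?,?)", [
    ("0xa1", -100, 100, 5_000),
    ("0xb2",  200, 400, 9_000),
    ("0xc3",  -50,  60, 2_000),
])

# "All Uniswap LPs in this specific range": positions whose tick range
# contains the target tick, ordered by provided liquidity.
target_tick = 0
rows = con.execute(
    "SELECT owner, liquidity FROM lp_positions "
    "WHERE tick_lower <= ? AND ? < tick_upper "
    "ORDER BY liquidity DESC",
    (target_tick, target_tick),
).fetchall()
```

Because the target tick is a parameter, the same query tracks a range that "is moving over time," which is the fine-grained targeting the speaker describes.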

Creating Negotiable Marketplace

Now, when you connect that with a marketplace like what we're building at Royco, you get a two-sided marketplace where somebody can say, hey, I'll bring a billion dollars, or I'll do this action, if you give me this in return. Now you have negotiation over these hyper-precise actions, and that, in my opinion, is the future of incentives on chain. And to add something briefly to that: I think Jay put it very well, with hyper-precise user targeting from the ZK coprocessor decreasing customer acquisition cost. The flip side of that, which we have talked about, is LTV, the lifetime value of a customer. A ZK coprocessor also meaningfully helps there, because it allows you to differentiate the experience of an existing user by tailoring how they interact with your application to the history of their interactions.

Tailoring User Experience for Improved LTV

The same way we do in web2 with almost any app we use. This is asymmetric fees in AMMs; this is hyper-precise incentives for existing or regular users; this is discounts if a user returns after being gone for a certain period of time. There are a lot of ways you can design programs to do that. If we were evaluating a fintech business, we would typically evaluate it based on the CAC of its customers and the LTV of those customers; we look at the growth of that business as an optimization problem, trying to decrease CAC and increase LTV. And I think a coprocessor has an application to both of those things: through the lens of Royco, with very specific acquisition strategies that target customers through action incentives and proof-based attribution of those actions, or through a continued incentive model where you optimize the existing user's experience so that they have the lowest churn possible.
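
The CAC/LTV framing the speaker uses can be made concrete with a common first-order model: LTV as average revenue per user divided by churn (expected lifetime of 1/churn periods). The numbers below are purely hypothetical:

```python
def ltv(arpu_per_month: float, monthly_churn: float) -> float:
    """First-order lifetime value: ARPU divided by churn rate,
    i.e. ARPU times an expected lifetime of 1/churn months."""
    return arpu_per_month / monthly_churn

def ltv_cac_ratio(arpu: float, churn: float, cac: float) -> float:
    """The optimization target the speaker describes: raise LTV relative
    to customer acquisition cost."""
    return ltv(arpu, churn) / cac

# Halving churn (e.g. via tailored incentives for returning users)
# doubles LTV, and the LTV:CAC ratio, at the same acquisition cost.
base = ltv_cac_ratio(arpu=10, churn=0.20, cac=25)
better = ltv_cac_ratio(arpu=10, churn=0.10, cac=25)
```

This illustrates why the speaker frames coprocessors as acting on both levers: targeted acquisition lowers CAC, while history-tailored experiences lower churn and raise LTV.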

Web 3 Potential in User Acquisition

Yeah, absolutely. Just to add on top of that: I think web3 has the potential to fundamentally change how user acquisition works. In web2, user acquisition mostly works through buying ad positions. Let's say you have a very popular game, and you sell your advertisement slots to other applications so that users download those applications. There is always some middleman mediating the deal, and this middleman is very opaque. Oftentimes that's Facebook or Google, selling ads from publishers to advertisers and charging a large fee in between. And it is extremely inefficient in the sense that, from the advertiser's point of view, after I acquire a user from one of these ad platforms, I just don't know whether this user is actually going to be valuable or not.

Value Transparency via Blockchain

I'm paying upfront, $10 per user in the US for any kind of game that wants to market in the US, but I don't know what the LTV of each of these users will be. Using blockchain, if we build this type of new user acquisition platform for web3 applications, you can transparently track, as both publisher and advertiser, the LTV of each individual user. And you can change how the value flows: instead of all the value flowing into a centralized and very opaque middleman, we can have transparent marketplaces where the actual long-term value of each acquired customer is shared between the user acquisition channel and the application itself. I think what Jay is building is an amazing example of that for DeFi.
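
The transparency claim boils down to a per-channel revenue aggregation that anyone can recompute from public traces (or that a coprocessor can prove over history). A minimal sketch with hypothetical events, where each event attributes a user's realized revenue to the channel that acquired them:

```python
from collections import defaultdict

# Illustrative on-chain events: (acquisition_channel, user, revenue) triples
# that a transparent marketplace could attribute, instead of trusting an
# opaque ad broker's reporting.
events = [
    ("channel_a", "0x01", 12.0),
    ("channel_a", "0x02",  3.0),
    ("channel_b", "0x03", 30.0),
    ("channel_a", "0x01",  8.0),
]

def ltv_by_channel(events):
    """Aggregate realized revenue per acquisition channel: the per-channel
    LTV that both publisher and advertiser can verify independently."""
    totals = defaultdict(float)
    for channel, _user, revenue in events:
        totals[channel] += revenue
    return dict(totals)
```

With totals like these on a shared ledger, the revenue split between channel and application can be enforced by the marketplace rather than negotiated blind.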

Expanding User Acquisition Beyond DeFi

I think this is something that has been missing for a long time, but it goes even beyond that. Any kind of web3 application can have this type of functionality to acquire users through a transparent marketplace, with continuous revenue and value sharing between the user acquisition channel and the application itself. And the way you can actually do that is by using a ZK coprocessor to prove what a user is worth, by computing over the historical data and traces of the user's interactions with the application.

Concluding Insights

I feel like I've learned so much in this hour. I know we're at the top of the hour, so people have to head out, but thank you guys so much for teaching me, and hopefully a few others, about the intersection of ZK coprocessors and intelligent DeFi. The main takeaway on my end is that right now DeFi, and blockchains generally, have been operating within specific constraints: you have twelve-second block times; you can only access the current global state, nothing in the past; you don't know who your users are, whether they're repeat users or what they've done before. ZK coprocessors, and zkTLS as well, break down all of these barriers, so that we can build more exciting, useful applications that are better optimized, just like more general web2 applications are.

Vision for the Future of On-Chain Applications

And so when I think of my mission (sorry, EigenLayer, to all of my coworkers here, but I don't say I work at EigenLayer; I say I work for crypto, and I just happen to be at EigenLayer), for the mission of bringing the world on chain, this is game-changing. If we have two or three minutes, happy to have a few people come on for questions. If not, then Ismail, Jay, Michael, thank you guys so much for your time. That was very well put. Ishaan, thank you for having us.

Final Acknowledgments

Thank you, Ishaan. Thank you. Okay. Yeah, I don't see any questions. I see Nader and fatigue in the audience; I feel like they have to have a question. Let's see. Okay, all good. I think we can call it. It's been a fun hour. Thank you guys so much for your time today, and I'm really excited for the future of on chain after this conversation. Bullish. See you, guys. Thank you, guys. Thanks.
