

Space Summary

The Twitter Space "The Lagrange Expedition: Web3 Scalability with ZK Coprocessing," hosted by lagrangedev, explored how hyper-parallel computation over blockchain data can unlock data-intensive and interoperable applications. With a focus on scalability, efficiency, and innovation, the Expedition highlighted the role of ZK coprocessing in addressing the challenges Web3 environments face, and offered insight into the future evolution of blockchain technology and the importance of data scalability.

For more spaces, visit the Infrastructure page.

Questions

Q: What is the primary focus of the Lagrange Expedition?
A: The Lagrange Expedition focuses on Web3 scalability using ZK coprocessing techniques.

Q: How does ZK coprocessing contribute to scalable Web3 applications?
A: ZK coprocessing enables hyper-parallel computation over blockchain data, enhancing scalability.

Q: Why is interoperability crucial for future blockchain applications?
A: Interoperability ensures that different blockchain networks and applications can communicate and share data effectively.

Q: What impact does data-intensive computing have on blockchain technology?
A: Data-intensive computing opens up new possibilities for powerful applications that require processing large volumes of data.

Q: How does the Lagrange Expedition showcase the potential of ZK coprocessing?
A: Through practical demonstrations and discussions, the Expedition highlights how ZK coprocessing can revolutionize Web3 scalability.

Q: In what way does the Lagrange Expedition address the challenges of Web3 scalability?
A: The Expedition addresses scalability challenges by introducing innovative ZK coprocessing mechanisms for efficient blockchain data computation.

Q: What role does big data play in the evolution of blockchain technology?
A: Big data necessitates scalable solutions like ZK coprocessing to handle large volumes of information efficiently on blockchain networks.

Q: How can ZK coprocessing improve the efficiency of Web3 applications?
A: ZK coprocessing boosts efficiency by enabling parallel processing of blockchain data, enhancing overall performance.

Q: Why is the integration of ZK coprocessing considered innovative in Web3 environments?
A: ZK coprocessing is innovative due to its ability to significantly enhance the scalability and processing speed of blockchain applications.

Q: What benefits do data-intensive and interoperable applications bring to the blockchain ecosystem?
A: Data-intensive and interoperable applications open doors to a wide range of use cases, improving the functionality and utility of blockchain technology.

Highlights

Time: 08:17:45
The Lagrange Expedition Introduction: Exploring the significance of ZK coprocessing for Web3 scalability.

Time: 09:25:12
Practical Demonstrations of ZK Coprocessing: Showcasing real-world applications of ZK coprocessing in Web3 environments.

Time: 10:40:59
Interoperability and Data Scalability: Discussing the crucial role of interoperability and data scalability for future blockchain advancements.

Time: 11:55:28
ZK Coprocessing Efficiency: Delving into how ZK coprocessing can enhance the efficiency of Web3 applications.

Time: 12:30:17
Future Prospects of Data-Intensive Blockchain Applications: Examining the potential growth of data-intensive use cases in the blockchain ecosystem.

Time: 13:45:02
Innovation and Web3 Scalability: Exploring how innovative solutions like ZK coprocessing drive scalability in Web3 environments.

Time: 14:20:49
Q&A Session with Experts: Engaging with industry experts to answer key questions on ZK coprocessing and Web3 scalability.

Time: 15:10:37
Closing Remarks and Future Outlook: Summarizing key insights from the Lagrange Expedition and discussing future developments in ZK coprocessing.

Time: 16:00:15
Networking Opportunities for Web3 Innovators: Highlighting networking possibilities for those interested in Web3 scalability and ZK coprocessing.

Time: 17:05:29
The Impact of ZK Coprocessing on the Web3 Ecosystem: Analyzing how ZK coprocessing is reshaping the landscape of Web3 applications.

Key Takeaways

  • Hyper-parallel ZK coprocessing offers scalable solutions for Web3 applications.
  • Blockchain technology is evolving to support big data-scale computation.
  • Innovative ZK coprocessing methods unlock interoperable applications.
  • The Lagrange Expedition showcases the potential of ZK coprocessing for data-intensive tasks.
  • Web3 scalability improves with the integration of ZK coprocessing techniques.
  • Interoperability and scalability are critical for the future of blockchain-based applications.
  • Efficient computation over blockchain data is essential for advancing Web3 capabilities.
  • Embracing ZK coprocessing can revolutionize how blockchain data is processed and utilized.
  • The Lagrange Expedition highlights the importance of data scalability in Web3 environments.
  • Innovative solutions like ZK coprocessing drive the evolution of blockchain technology.

Behind the Mic

Welcome to the session

This is Lagrange Labs and, oh, thanks for joining, everyone. Yeah, we're just going to take a few minutes and welcome others who would like to join and get started soon. Hi everyone. For those who are just joining, we're just going to give a few minutes for others to join. So hang tight. Welcome everyone. Yeah, we're just waiting for more people to join. We'll probably start at the five minute mark. In the meantime, if you have questions that you want us to talk about during this space, feel free to go to our Discord. There's a place for you to submit questions there. Or, you know, if you want, you could also quote-tweet the announcement tweet and we'll take a look and address any questions there.

Updates and introductions

Okay. Yeah, it looks like we have some people joining still. I see some community members. Thanks for joining. We're excited to share more. Okay, Ismail, just double-checking that you are in here and are able to speak. Ismail, is that you? Oh, yep, there you are. Great to see you. Great to see everyone here. I love such a turnout for the first space. Yes, this is our first official Twitter Space, so very exciting. Okay, let's go ahead and get started. Yeah. So first of all, welcome everyone. This is the Twitter Space to talk about Lagrange, our ZK coprocessor, and our roadmap towards mainnet. But we'll be addressing different questions from the community as well near the end. To start off, I'm Sherry, I lead marketing at Lagrange Labs. And Ismail, if you want to do a quick intro as well.

Introduction to Lagrange

Yeah, so my name is Ismail, I'm the founder of Lagrange. Hopefully you guys have seen some of our tweets or have been on our Discord. But for those of you who haven't, we build decentralized proving, ZK coprocessing, and a bunch of very innovative technology in the ZK space. And so, yeah, very excited to be here and to see some of you for the first time, and to see some familiar faces in the audience as well. Yes, definitely. Okay, so to kick it off, I just wanted to give an overview of where Lagrange is in our journey, under this theme of the Lagrange Expedition. In the last few months we've been in the Euclid testnet. We've launched two protocols, the state committees and our ZK prover network, on EigenLayer as AVSs.

The journey of Lagrange

And we are working really hard to launch what we're calling ZK Coprocessor 1.0, and we're on this very rapid and exciting journey towards launching mainnet for our ZK coprocessor. And so yeah, maybe on that thought: this Lagrange Expedition is basically a way for community members to join in and help us build towards this exciting mainnet launch. There are different ways to participate depending on your interests and experience. There are community quests that we are releasing on a weekly basis throughout August and September and onwards, with different tasks you can do on Galxe to help us test, give us feedback, and learn about Lagrange's protocols. Another way you can participate is by running provers. We have launched our ZK prover network, which currently has more than 60 operators, I believe, including really top teams like Coinbase, OKX, P2P, et cetera.

Invitation to collaborate

And so if you are interested in becoming an operator for our prover network, you can actually do that as well, and you can sign up on our website. And then lastly, obviously, building with Lagrange: the ZK coprocessor, our state committees. We're talking with a lot of cool partners right now who are integrating our tech to unlock different use cases. If you have an idea or some pain points that can be addressed by our ZK coprocessor, there is also a way to reach out to us and we can help onboard you with our products as well. So with that said, maybe to kick off, Ismail, I was hoping you could talk a little bit about the overarching mission at Lagrange and how specifically our roadmap plays into that mission.

Lagrange's mission

So Lagrange was founded almost two years ago at this point, with a very simple mission: how can we enable blockchain data to be used efficiently? It sounds like you're still muted, if you're saying anything. Oh, you can't hear me? Can you guys hear me? Hello? Can you guys hear me? We're still not hearing you, if you're trying to say anything. You can't hear me? Okay, you're good. So it sounds like maybe some technical difficulties. Let's give it a few; some back-channeling, it sounds like. So either the issue is on my audio or it's on Sherry's audio. So let's just figure that out real quick and then we'll get back into the meat of this.

Technical difficulties

But okay, it looks like folks can hear me; Kashish from the audience and some other folks can hear us. All right. So let me just dive into that, to Sherry's question about Lagrange. So Lagrange was started two years ago with a very simple vision, which is: how do we scale what we can do with blockchain data, on-chain and between chains? For about ten years, we've been asking the same question in blockchains. Every L1, every rollup asks the same question, which is: how do we enable developers to write more expressive, larger, more data-intensive applications on the chains that they're already on?

The Narrative of Chain Migration

And everyone has, for the last decade, answered that question the same way, which is: come, leave your chain, come to my chain, and build your application on my rollup, on my L1. Leave Ethereum and come build over here. This is the story that Aptos sold, the story that Sui sold. It's the story that all the L2s on Ethereum sell. And what Lagrange believed, foundationally, is that there was another way: developers could stay on the chains that they're on and access more computation where they already are, using zero knowledge proofs.

Zero Knowledge Proof Utilization

They could sit on Ethereum, request zero knowledge proofs of computation, and verify those proofs back on Ethereum. They could sit on Optimism, request zero knowledge proofs of computation, and verify those proofs back on Optimism. In essence, they could scale their applications where they already are. And so, in the journey of Lagrange, we started with this very simple vision. We've been growing fast since; we've been working with a lot of the top projects in the space.

Collaboration and Growth

Hopefully some of the people here have seen some of our work with EigenLayer, with Mantle, with Coinbase, with OKX, with Kraken, with many of the top players in the space. And what we're excited about now is this inflection point where we're opening up the Lagrange journey to include, hopefully, everyone here, to be part of what we're building and to be part of changing the future of crypto. And so this is part of the reason why we're very excited for everyone to be here today.

Deeper Engagement

And so let me now give a little bit of a deeper dive into what is next. And so Sherry, I think, can ask the next questions. Awesome. Yeah, sorry, some technical difficulties here. Yeah. And in terms of the key use cases that you envision Lagrange's ZK coprocessor impacting or solving, can you talk more about that?

Expanding Use Cases

So, foundationally, what we let you do is take your favorite existing DeFi application or NFT application on the chain you're on and do more with it. So how can the NFT applications on Ethereum let users do new things? For example, how can I build loyalty programs on-chain? How can I reward my most loyal users with incentives? Or for DeFi protocols: how can I build yield-bearing assets, the things we all come to DeFi to use, the things that we love? How do I offer discounts and loyalty programs?

Opportunity for Defi Innovation

Indexes? The beauty of Lagrange's coprocessor is that it's a zero-to-one for a lot of DeFi applications in terms of what they can offer their users. So we have a big launch coming very soon in our journey, in the next couple of weeks, that will really show some of the best use cases that many of our users, top teams in the NFT and DeFi space, will be able to build. And to put it concretely, we recently announced partnerships with EtherFi as well as Renzo, two of the top LRT protocols in DeFi right now.

Partnerships and Incentives

And both of them are very excited to use Lagrange's ZK coprocessor for incentives and rewards programs. So as you use these DeFi protocols, you get additional yield and additional benefits that are powered on the backend by Lagrange's scalable zero knowledge proofs. We see a future where the whole industry runs on top of zero knowledge proofs generated by Lagrange.

Prover Network Launch

And so I can also talk a little bit about... Yeah, and you know, we recently launched the ZK prover network on EigenLayer and made a big splash within EigenLayer and our respective communities. So what would you say is next for the ZK prover network? So at its foundation, you can think of a prover network as the next generation of L1s. It is a base layer that offers something to the vast majority of the crypto market.

New Paradigms in Proof Selling

L1s sold block space. Prover networks sell proofs. And so Lagrange's prover network is the first production-grade, production-ready prover network in crypto, currently operated by many top operators like Coinbase, OKX, and Kraken, and delivering proofs across both our state committee and our ZK coprocessor products. These proofs are integrated into protocols like LayerZero, Axelar, Polymer, many of the top LRTs, Mantle, et cetera.

Expanding Network and Capabilities

And so what we're excited about next is how we can expand the prover network, both with high-quality operators from institutional-grade operating groups and with operators at home, like hopefully many of you here, and how we can expand the types of proofs we serve to applications that are delivering differentiated zero knowledge compute. And this ranges from rollups to zkML to zkTLS protocols.

Vision of the Future

At their core, we anticipate a future where the entirety of the ZK space rests on top of prover networks like Lagrange's. Yeah, definitely. And so on the topic of operators joining the network, I think we actually had a question from the community submitted earlier: are we willing to accept even smaller operators into the Lagrange network?

Community and Inclusivity

Yes, we are. The way I think of Lagrange, and what we are building in Lagrange, is hopefully a community different than anywhere else in crypto. There's this trend I have seen over the last six months where a lot of communities are migratory. They follow a company until that company releases a token; once the token is out, they leave and go somewhere else.

Building a Lasting Community

If we think back to the best communities in our space, the communities from the last cycle, the communities that still exist today, Ethereum, Solana, and some of the ones that don't, like Cardano to some extent, Polkadot to some extent, what they had were people who stuck with those communities as they scaled and were richly rewarded for doing so, who became fixtures and components in the ecosystems they were members of. They were not just passersby.

Commitment to Community Centricity

And so everything Lagrange does is going to be, and will continue to be, community-centric. And because of that, we are very excited to welcome anybody who wants to run a prover, not just institutional operators, but anyone at home. One moment, folks. I think we're going to bring on Travis. Awesome. Yeah. Okay.

Engaging with the Community

I think right now maybe we can go into other questions from the community. Let's see here. Okay. Yeah. Beyond the technical capabilities of the ZK coprocessor, and I think we talked a little bit about compelling use cases or applications earlier, how do you plan to expand the Lagrange ecosystem to attract developers and users from outside, maybe, the traditional blockchain space?

Integrating Zero Knowledge into Broader Tech Landscape

So I think it's important to think of zero knowledge proofs as a new component of the technology landscape. You know, I think the applications of zero knowledge proofs are not solely isolated to crypto. The ability to have verifiable compute over data, one of the things that Lagrange offers, is a foundational primitive in a bunch of applications in Web2. I started my career actually working for a financial services company, Manulife John Hancock, a multinational insurance company based in the US and Canada.

Challenges Faced in Traditional Industries

And one of the major issues they faced was: how can we effectively share data with other major Web2 or TradFi players?

Customer Overlap in TradFi

If I wanted to determine the overlap of my customers and your customers, and we're two different TradFi companies, how do I do that without showing you my whole customer list, without showing you all the Social Security numbers? It just doesn't work. And so at the core, a network to trustlessly generate zero knowledge proofs has vast applications within Web3, Web2, and TradFi. And so when we talk about the future of Lagrange, you're talking about a network that has the capacity to deliver value not just in an isolated portion of the technology ecosystem, but across all major companies leveraging technology. And so, yeah, we expect to be able to pull in a broad community across Web3, Web2, and TradFi.

Technical Engagement with Community

Cool. We have here maybe some more technical questions. Just a note, we plan to host technical office hours on our Discord soon, so please feel free to attend those as we announce them. Those are a great place to talk to our engineering team and ask any specific technical questions. But here's one: can you please provide more technical details about the storage database than what's shown in our docs? For example, which ZK-friendly hashing is applied, what subset of storage slots is selected, what types of ZK queries it currently supports, and so on?

Core Components of the Coprocessor

Yeah, that's a great question. And so thank you guys for asking technical questions, because this is our favorite thing to talk about. So at the core, you can think about our coprocessor as using two primary components. The first is a preprocessing of data, and the second is a verifiable query over the data. The way the preprocessing works is we prove that the storage of a smart contract within a given block, and then over a range of blocks, is equivalent to a ZK-optimized structure that's efficient to compute proofs over. Put simply, if I were to compute a bunch of inclusion proofs and computation directly over the Ethereum state, it would be way too expensive.

Challenges of Traditional Provable Systems

The proving time would be massive, it would be very expensive, and nobody would want to use it. This is why a bunch of other projects in the space doing ZK coprocessing have never been able to gain users; it's just too expensive for these teams. And so what Lagrange realized a while back is that one of the things we can do is take the data over a series of blocks and verifiably prove that data is equivalent to a new data structure. In this case, instead of the Ethereum storage structure, built with an MPT and, obviously, a linked list of blocks, it is simply a giant binary Poseidon2 Merkle tree that is provably equivalent to the subset of storage you want to compute over on Ethereum.

Efficiency of ZK Queries

Once we have that equivalent structure, which is a Poseidon2 binary Merkle tree, we're able to structure very large computations over it that don't have to deal with the ZK-unfriendly, or finite-field-unfriendly, structures of Ethereum. And so, put simply, what this lets you do is have very performant, very efficient ZK queries that any user can specify in SQL, a very easy-to-specify language, and that run at scale cheaply for many of the major applications in the space. This is a fundamental innovation from our research team, led by Babis Papamanthou, Shravan Srinivasan, and Dimitris Papadopoulos, who have invented many of the core mechanisms that have let our team leverage this type of compute.
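
As a rough illustration of the "ZK-friendly equivalent structure" idea, here is a minimal sketch of re-indexing storage values into a binary Merkle tree. This is not Lagrange's code: it uses SHA-256 as a stand-in for the circuit-friendly Poseidon2 hash, omits the equivalence proof entirely, and the slot values are made up.

```python
# Minimal sketch: re-indexing storage values into a binary Merkle tree.
# NOT Lagrange's implementation. A real system would use a circuit-friendly
# hash such as Poseidon2 and would prove this tree equivalent to the MPT
# storage of a contract over a block range; here we only build the tree.

import hashlib

def h(left: bytes, right: bytes) -> bytes:
    # Stand-in for Poseidon2; SHA-256 is not ZK-friendly, used here for brevity.
    return hashlib.sha256(left + right).digest()

def build_tree(leaves: list[bytes]) -> list[list[bytes]]:
    """Return all levels of a binary Merkle tree, leaves first."""
    levels = [leaves]
    while len(levels[-1]) > 1:
        lvl = levels[-1]
        if len(lvl) % 2:                 # pad odd levels by duplicating the last node
            lvl = lvl + [lvl[-1]]
        levels.append([h(lvl[i], lvl[i + 1]) for i in range(0, len(lvl), 2)])
    return levels

# Hypothetical storage slot values (e.g. token balances) pulled from a block range.
slot_values = [v.to_bytes(32, "big") for v in [100, 250, 0, 42, 7]]
tree = build_tree(slot_values)
print("root:", tree[-1][0].hex())
# Openings in a binary tree like this need only log2(n) hashes, and with a
# circuit-friendly hash they are cheap to verify inside a SNARK, unlike MPT paths.
```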

Transition to Travis

Great. Yeah, I think I'm still having some technical, excuse me, technical issues. So I've brought on my colleague Travis, who is a co-host now. Travis, did you want to take over the question and answer portion? Yeah, sounds great. One sec. Awesome. So, yeah, Ismail, we'd love to hear how the Lagrange protocol ensures the security and integrity of cross-chain state proofs, specifically when handling submissions from potentially untrusted users.

Zero Knowledge Proof and Security

Yeah, so I think the beautiful nature of a zero knowledge proof is that the security is derived not from the person who submits the proof, but from what the circuit proves and what the public input to that proof is. And so in the case of our ZK coprocessor, the public input to that proof, the thing that you're proving computation over, is in this case the block header of the chain in question. So if you're in a single-chain context, you're on Ethereum and you're proving computation over Ethereum state with our ZK coprocessor, it's entirely based on the block header of Ethereum that you can access from within Ethereum.

Assumptions and Cross Chain Security

So there are no additional security assumptions about who submits the proof. The only security assumption is that Ethereum is honest and Ethereum is accurate, which is the foundational assumption you have to make if you're building on Ethereum. Now, as Travis mentioned, in the cross-chain context, in the case of our state proofs, the beauty is that we have restaked security behind attestations to what the state of rollups is, which we often sell as proofs to cross-chain protocols.

Collaborations and Security Representation

We work with LayerZero, we work with Axelar, we work with Polymer, and with all of these partners the core thing that we offer is a proof that a sufficient amount of restaked voters have signed the state of a blockchain or rollup at a point in time. And so what the proof tells you at the end of the day is: hey, there's $7 billion of security that has been committed to ensure that this state is accurate. And so what that gives you is as secure a representation of state as you can get in crypto today.
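
To make the restaking argument concrete, here is a toy sketch of the kind of check that conceptually sits behind such a state proof: confirm that the stake backing the operators who signed a given state root clears a threshold. The operator names, amounts, and data layout are hypothetical, and a real system performs this check with cryptographic signature verification inside a circuit rather than in plain Python.

```python
# Toy illustration of stake-weighted attestation checking.
# NOT Lagrange's implementation; all names and numbers are made up.

from dataclasses import dataclass

@dataclass
class Attestation:
    operator: str      # operator identifier (hypothetical)
    staked_wei: int    # restaked amount backing the attestation
    state_root: str    # the rollup state root the operator signed

def sufficient_stake(attestations: list[Attestation],
                     claimed_root: str,
                     threshold_wei: int) -> bool:
    """True if the stake behind signatures on `claimed_root` meets the threshold."""
    backing = sum(a.staked_wei for a in attestations if a.state_root == claimed_root)
    return backing >= threshold_wei

# Three hypothetical operators; one attests to a different root and is ignored.
atts = [
    Attestation("operator-a", 3_000 * 10**18, "0xabc"),
    Attestation("operator-b", 2_500 * 10**18, "0xabc"),
    Attestation("operator-c", 1_000 * 10**18, "0xdef"),
]
print(sufficient_stake(atts, "0xabc", threshold_wei=5_000 * 10**18))  # True
```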

Backed by Security

$7 billion behind the state of a chain or rollup at a point in time is about as secure as you can get. And that's how, in the context of state proofs, we can guarantee security. Awesome, thank you for that. The community has some more questions, specifically on the types of capabilities that Lagrange is going to build out. What kind of capability will there be to decentralize the technology in the future?

Decentralization of Technology

Yeah, it's decentralized today, and that's the important thing to think about here. Lagrange launched decentralized from day one, because decentralization, we believe, is a foundational principle in building blockchain applications. So our prover network that generates the state proofs and the coprocessor proofs is already decentralized. This is why many of you can run provers at home. This is why Coinbase, Kraken, and OKX are already running provers. So when you get a proof from Lagrange, you aren't getting a proof that Ismail computes himself; you're getting a proof that's computed by a decentralized group of provers.

Safety and Computation

And so that's something that we're very excited to offer to the community, because we believe it means that both the safety of the computation you get from Lagrange, which is based on our proofs, and the liveness of the computation, the liveness of the proof, meaning you actually getting the proof when you ask for it, are not guaranteed by us. They're guaranteed by a decentralized network of some of the best operators and provers in the space. Awesome. Yeah, that's important to keep in mind: the technology is already decentralized.

Understanding the Project's Goal

Some community members want to make sure that they understand exactly how the technology works. One person asks: do we understand correctly that the goal of the project is to connect all blockchains into one? And if not, what is the goal? Please tell us more about specific examples and use cases, for example in the DeFi sphere. Yeah, that's a great question.

Clarifying the Project's Objectives

So I wouldn't say the goal is to connect all blockchains into one. I think that would be the goal of maybe a messaging protocol, a bridge, or an aggregation protocol. Our goal is to generate the zero knowledge proofs that will power the next generation of applications in crypto. In the case of our coprocessor, that's generating proofs for data-heavy on-chain applications that need robust computations over historical transaction, storage, and receipt state; for the state proofs, that's generating proofs of the state of rollups; and for other applications, that'll be us generating other types of proofs as well.

Foundational Infrastructure for Zero Knowledge Proofs

But at our core, what we build is foundational infrastructure to power the future of zero knowledge proofs on blockchains. And some great applications of this are messaging protocols and bridges that use our state proofs, the aforementioned LayerZero, Axelar, Polymer, and many other great teams; or DeFi protocols that want to build rewards or incentive programs on top of our proofs, think EtherFi or Renzo; or even NFT protocols, some tier-one, world-class NFT protocols that we'll announce very soon, who are working with us and want to build rich and enjoyable on-chain experiences for their users.

Use Cases for Zero Knowledge Proofs

All of which are powered by zero knowledge proofs generated by Lagrange.

Introduction to the Technology

Awesome. Thank you for that explanation. And another question regarding the technology. So the main technology supporting off-chain data processing is ZK MapReduce; is that correct or not, in terms of Lagrange's ZK coprocessor? And can you tell us a bit more about the research and protocols, and whether there are other things in the pipeline in development? Yeah, so we used to call it ZK MapReduce; that's an older name for our underlying proving system. Right now, it can be much more simply thought of as ZK SQL, or verifiable SQL. Simply put, you can write an SQL query, the same thing you would write over a database, prove that computation over blockchain data, and verify that computation back on-chain. That is foundationally the best way to think of it, and to talk about the research behind it.
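
As a purely illustrative sketch of what "verifiable SQL over blockchain data" means from a developer's point of view, here is the shape of the workflow: submit a query, receive a result together with a proof, and hand both to an on-chain verifier. The client function, table name, and columns below are hypothetical, not Lagrange's actual API.

```python
# Hypothetical client-side flow for a verifiable SQL query over on-chain data.
# None of these names are a real API; this only illustrates the workflow shape.

from dataclasses import dataclass

@dataclass
class ProvenResult:
    rows: list[tuple]             # query output
    proof: bytes                  # ZK proof that `rows` is the correct result
    block_range: tuple[int, int]  # blocks the query was evaluated over

def submit_query(sql: str, contract: str, blocks: tuple[int, int]) -> ProvenResult:
    """Stand-in for a coprocessor client call; a real system would dispatch the
    query to a prover network and return the result with its proof."""
    raise NotImplementedError("illustrative only")

# Example: average balance held by an address over a range of blocks
# (hypothetical table and columns).
QUERY = """
SELECT AVG(balance)
FROM erc20_storage
WHERE holder = '0x1234...abcd'
  AND block_number BETWEEN 20000000 AND 20001000;
"""

# result = submit_query(QUERY, contract="0xToken...", blocks=(20_000_000, 20_001_000))
# The (result.rows, result.proof) pair would then be passed to an on-chain verifier
# contract, which checks the proof against the chain's own block headers.
```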

Research Team and Current Projects

So Lagrange has arguably one of the strongest research teams in the zero knowledge space right now. Our chief scientist, Babis Papamanthou, is one of the chairs of the cryptography lab at Yale. Dimitris Papadopoulos is a professor on our team at HKUST. Shravan was a former PhD student of Babis, and we have a number of other PhD students on our team right now focusing on core cryptography, mechanism design, and distributed systems research. And so the first paper that many people might be familiar with that Lagrange published is something called Reckle Trees, which is a new authenticated data structure with updatability properties for MapReduce computation, and which is one of the things currently powering our coprocessor. We have other papers, one called Perseus that we've talked a little bit about previously, coming out very soon, as well as some interesting work on mechanism design for prover networks and some other interesting designs in recursive SNARKs that we really think are going to be a massive unlock for the speed, the performance, and the developer experience of building on our products.

Technical Questions and DDoS Mitigation

Awesome. Great. We have one more technical question. One user would like to know your point of view on the issue of counteracting toxic data that has the capacity to DDoS verification nodes through complex operations. So, to make sure I understand the question correctly, I think it is: what stops you from blocking a prover by requesting some computation that's too large for the prover to do, effectively DDoSing the prover? Keep in mind that to request a proof, you have to pay a fee to the prover, and the fee that you pay is proportionate to the number of constraints in the circuit. If you want the prover to prove something that's too large, has too much data, or is too intensive, the amount you would have to pay is proportionate to the size of that proof. So if your proof takes 50 hours to compute, you're going to have to pay a lot of money. Fortunately, the things that every user, or mostly every user, wants to do don't take 50 hours; they take a few seconds.

Community Engagement and Upcoming Plans

But if you try to DDoS the network, you would have to pay an inordinate amount of money and you would quickly bankrupt yourself. Moreover, some of the interesting mechanism design we have, which we'll be releasing soon, means that the network can dynamically price itself based on a bunch of heuristics over proof request volume, so the network's resource pricing will continue to scale as someone DDoSes it, quickly bankrupting that person. The example I would give you would be something like Ethereum, right? You structurally cannot DDoS Ethereum, because the more blocks you try to fill, the higher the base fee goes, and eventually you'll run out of money. That makes sense. Thank you for that explanation. Well, yeah, that was the bulk of our community's questions in regards to technical information. We have some community-specific questions that I think we can answer on this space, so I'll just go through these, and if anyone has any more questions, feel free to add them below the space itself.
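
To illustrate the economic argument with numbers, here is a toy model, not Lagrange's actual pricing mechanism, of a fee that is proportional to circuit constraints and whose per-constraint base fee ratchets up when demand exceeds a target, in the same spirit as Ethereum's EIP-1559 base fee. All parameters are made up.

```python
# Toy model of demand-responsive proof pricing (not Lagrange's real mechanism).
# Fee is proportional to circuit constraints; the per-constraint base fee adjusts
# upward when recent demand exceeds a target, EIP-1559 style.

TARGET_CONSTRAINTS_PER_EPOCH = 10_000_000   # hypothetical capacity target
MAX_ADJUSTMENT = 0.125                      # at most 12.5% change per epoch

def next_base_fee(base_fee: float, constraints_used: int) -> float:
    """Raise or lower the per-constraint fee based on utilization."""
    delta = (constraints_used - TARGET_CONSTRAINTS_PER_EPOCH) / TARGET_CONSTRAINTS_PER_EPOCH
    delta = max(-1.0, min(1.0, delta))
    return base_fee * (1 + MAX_ADJUSTMENT * delta)

def proof_fee(base_fee: float, constraints: int) -> float:
    """Cost of a single proof request: proportional to circuit size."""
    return base_fee * constraints

# A spammer flooding the network with oversized requests drives the base fee up
# epoch after epoch, so sustaining the attack gets rapidly more expensive.
fee = 1e-9
for epoch in range(10):
    print(f"epoch {epoch}: 50M-constraint proof costs {proof_fee(fee, 50_000_000):.4f}")
    fee = next_base_fee(fee, constraints_used=50_000_000)  # spam-level demand
```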

Participation in Lagrange Expedition

So some people have joined the Lagrange Expedition, which is a campaign that launched around our build-up to mainnet. Some people are interested in understanding how beginners who are not technically inclined can join the testnet, in the hope that all or most of the tasks will be fine to try without much developer knowledge. And for that, yes, we will definitely have a variety of quests, and we already do have a variety of quests for the Lagrange Expedition on Galxe, so you can check that out. A lot of the quests are there to test your knowledge of Lagrange and to explore the ecosystem. So we've made sure it's very friendly for our non-technically inclined friends, and we are adding new quests regularly. So stay tuned on Twitter and Discord to hear the latest updates.

Recognition and Community Contributions

And another community question we have was wondering if the previous Galxe quests, before the Expedition launch, will count as contributions for rewards and for community recognition. So we've gotten a few questions around this, because we know that we included some Galxe quests prior to the Lagrange Expedition. To clarify, any and all contributions to the community, whether during the Euclid testnet or the Lagrange Expedition now, will be part of the consideration for community recognition. So we appreciate our community's efforts and will definitely make sure they're recognized. Yeah, and I think that is kind of the bulk of the community's questions for the Expedition. Ismail, if you have anything else to add in regards to what we're excited about developing and the path ahead.

Inclusivity in the Community

Yeah, so one of the things I wanted to add for everyone here, which I think is a very important point to recognize, is that despite Lagrange being a highly technical product, you don't have to be technical yourself to be part of the community. We want anyone who finds the vision of building a future of crypto based on verifiable, trustless compute interesting to be part of our community, irrespective of whether they're technical or non-technical. In terms of the roadmap of Lagrange, many of the products that we build are highly technical. Much of the research that we put out is acclaimed both in the engineering context of the space as well as in the academic context of conferences and publication venues. And so Lagrange is very confident on those points.

Future Announcements and Closing

But the community that we build, we want to encompass everyone, regardless of how technical they are. And then in terms of the future of Lagrange, I would recommend everybody stay tuned for some very big announcements we have coming very soon. From a commercialization standpoint, our company has been taking massive strides and partnering with some of the largest teams in the space. Many of those have yet to be announced; they will be coming to fruition very soon. So we look forward to announcing them and to having many of you hopefully celebrate alongside us as Lagrange continues to scale. Wonderful. Thank you so much. Yeah, we're all very excited about the future ahead. Thank you, everyone, for joining, and feel free to add any other questions that you have below this space and we will do our best to answer them in a reply.

Wrap Up and Future Engagement

Hope everyone has a wonderful day. Thank you, everyone. Thank you for coming. Yeah, thanks so much. So I think that wraps up our first Twitter Space. Thanks for joining and bearing with us through some technical difficulties, but yeah, lots of great questions. If you have anything else that you're wanting to ask, we'll be doing more of these in the future, and you can always reach us on our Discord. Feel free to reach out and let us know what you're thinking about and what you'd like to see more of. We look forward to hearing from you and building towards the mainnet launch of Lagrange's ZK coprocessor together.
