
The Rise of Onchain Gaming #CryptoDaily w/@BCGameOfficial


Space Summary

This Twitter Space, The Rise of Onchain Gaming #CryptoDaily w/@BCGameOfficial, was hosted by MarioNawfal. It explored the burgeoning landscape of Onchain Gaming within the crypto sphere: how blockchain technology, NFTs, and community engagement are shaping the future of gaming. The discussion covered key topics such as smart contracts, Play-to-Earn models, and blockchain interoperability, the challenges and opportunities that arise from merging gaming experiences with blockchain technology, and the future trends awaiting the Onchain Gaming community.



Questions

Q: How does blockchain technology improve security in Onchain Gaming?
A: Blockchain ensures transparent and secure transactions by recording all gaming activities on a decentralized ledger.

Q: What role do NFTs play in Onchain Gaming?
A: NFTs enable unique ownership of in-game assets, fostering a new economy of digital collectibles within gaming.

Q: Why is community engagement vital for Onchain Gaming platforms?
A: Communities drive user adoption, feedback, and overall growth, creating vibrant ecosystems within Onchain Gaming.

Q: What are the benefits of integrating smart contracts in gaming?
A: Smart contracts automate processes, reduce fraud, and increase trust among players by executing predefined actions based on conditions.
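
A toy sketch can make the "predefined actions based on conditions" idea concrete. This is illustrative Python, not an actual on-chain contract; the `TournamentEscrow` class and its rules are invented for the example:

```python
# Hypothetical sketch of smart-contract-style logic for a game tournament:
# predefined conditions gate predefined actions, with no trusted operator
# deciding case by case. On a real chain this would be written in the
# network's contract language and move real tokens.

class TournamentEscrow:
    """Holds entry fees and pays the winner automatically."""

    def __init__(self, entry_fee: int):
        self.entry_fee = entry_fee
        self.pot = 0
        self.players: set[str] = set()
        self.winner: str | None = None

    def join(self, player: str) -> None:
        # Condition: no double entry; action: fee goes into the pot.
        if player in self.players:
            raise ValueError("already joined")
        self.players.add(player)
        self.pot += self.entry_fee

    def settle(self, winner: str) -> int:
        # Condition: registered player, not yet settled; action: pay out once.
        if winner not in self.players:
            raise ValueError("winner must be a registered player")
        if self.winner is not None:
            raise ValueError("already settled")
        self.winner = winner
        payout, self.pot = self.pot, 0
        return payout
```

Because the join and settle rules are fixed up front and enforced by the code itself, players do not have to trust an organizer to collect fees fairly or pay out correctly, which is the fraud-reduction property the answer describes.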

Q: How can Play-to-Earn models revolutionize gaming?
A: Play-to-Earn models enable gamers to earn real value from their playtime by participating in in-game activities that generate rewards.

Q: In what ways can blockchain interoperability benefit gamers?
A: Interoperability allows gamers to use their assets across different gaming platforms, enhancing their overall gaming experience.

Q: What are the challenges in merging gaming experiences with blockchain technology?
A: Challenges include scalability, user experience, and regulatory issues that need to be addressed for seamless integration.

Q: What future trends can we expect in the Onchain Gaming space?
A: Future trends may include more immersive gameplay, advanced NFT integrations, and increased adoption of decentralized gaming platforms.

Q: How do DeFi protocols enhance the gaming experience?
A: DeFi protocols provide innovative financial tools within games, allowing players to interact with decentralized financial systems in a gamified manner.

Highlights

Time: 00:15:40
NFTs Revolutionizing In-Game Assets: The impact of NFTs on ownership and trading of unique in-game assets in decentralized environments.

Time: 00:25:17
Play-to-Earn Model Discussion: Exploring the disruptive potential of Play-to-Earn models in reshaping traditional gaming reward systems.

Time: 00:35:29
Smart Contracts Ensuring Fair Play: The role of smart contracts in automating processes and maintaining fairness in onchain gaming interactions.

Time: 00:45:55
Blockchain Interoperability for Seamless Gaming: Discussing the advantages of cross-platform interoperability for gamers across different blockchain-based games.

Time: 00:55:42
Community Engagement and Growth: The importance of community-driven initiatives for the sustainable development of Onchain Gaming platforms.

Time: 01:05:18
The Future of Decentralized Gaming: Envisioning a future where blockchain-based gaming leads the industry towards new levels of decentralization and player empowerment.

Key Takeaways

  • Onchain Gaming presents new opportunities for decentralized experiences within the blockchain space.
  • Integration of blockchain technology enhances transparency and security in gaming ecosystems.
  • NFTs play a significant role in revolutionizing in-game asset ownership and trade.
  • Community involvement and engagement are crucial for the success and sustainability of Onchain Gaming platforms.
  • Smart contract implementation in gaming minimizes the risk of fraud and enhances trust among players.
  • Blockchain technology opens the potential for interoperability between different gaming platforms.
  • The gamification of DeFi protocols provides innovative ways for users to interact with financial systems.
  • Emerging trends like Play-to-Earn models are reshaping the traditional gaming landscape.
  • The Space discussed the challenges and opportunities of merging gaming experiences with blockchain technology.
  • It offered insights on the future of Onchain Gaming and its impact on the gaming industry.

Behind the Mic

The Opening

Hello? Hey, Gallon, how are you doing, man? Good, how are you? I'm good. Okay, we got two minutes, so let's dive right in. I am logged in as host under the Zero G Labs account. What next? Okay, perfect. I'm going to log in as myself. Just give me a second to do that. And then once I do that, you just have to make me co-host. Okay, sounds great. Basically, I'm just gonna quickly introduce everyone. I know we need to introduce you. I don't think we need that, because you don't have to introduce me and I can introduce Michael. Okay. Okay. You want to do that? Yeah, yeah, I got it covered. No worries. We can just hang out until enough folks trickle in and we officially start, at which point I will do the introduction.

Technical Issues

So. So we'll cover. Hello, are you there? I think you're muted. Oh, I need to. I need to mute my mic. That's so strange. Can everybody hear me? Hey, yeah, I can hear you. Good to have you here. Hello? Yes, can you hear me? Hello? Yeah, I can hear you. Good to have you here. Just give me one second. We'll make you a co-host. Hope you're able to stay cool in the 80-degree weather we've been having. Let's see. Okay, I think we just got started and folks should be joining very soon. Hey, how about now? Yes, I can hear you. Yes, now we're on. Good to have you. Oh, yes. Hope you're able to stay cool in the 80-plus-degree weather that we're having.

Talking About the Weather

It's actually. It's been kind of nice. It's been a really easy transition from Singapore and Hong Kong. Yeah, yeah. Temperature parity achieved there, easy migration. A little unusual, but I'll take it, for sure. Yeah. There were a couple weddings up in the Bay Area over the weekend, and all the photos came out with their faces red. But I guess we're immune from all the Singapore time that we spent. Yeah. I don't know about you, but I'm freezing right now. Who would have known? Simple stuff like AC coming in so handy. Yep. All right, should we get started? Let's do it.

Introduction to Zero G

All right, folks, welcome. Thank you so much for joining this episode of the Zero G AMA, where we're reintroducing Zero G, the first decentralized AI operating system. And today we're really lucky to have Michael Heinrich, the co-founder and CEO of Zero G Labs, to take us through the first look at Zero G's DAIOS, the world's first decentralized AI operating system. Hey, Michael, how are you doing? Doing well. Excited to be here. Nice to be back in San Francisco and get to see my daughter again. She is turning eight weeks old this weekend. Time flies. And she must have missed you. I know that you have been on the conference circuit chatting with AI builders and researchers for the past few months across Korea, Hong Kong, New York, Singapore and SF.

Perspective on AI Industry

Can you give us an update on what you're seeing in the AI industry and what's been going on with you and Zero G in the past few months? Yeah, first of all, it's super exciting to be part of this time, because web three AI just feels so early. From a stage perspective, there's a lot of excitement, a lot of builders coming into the space and building across the entire AI stack, I would say. So it starts on, let's say, the data generation side. You've got companies like, for example, Grass or Mizu doing synthetic data or enabling new sources of data, where the end users can actually monetize their own data versus it being monetized on them, to new data labeling approaches like HEfT AI or Public AI, where they can actually decrease the cost by about 5x, which is pretty significant, because that represents, if I remember correctly, about 30% of the overall AI costs for training one of these really large LLMs.

Building and Innovations

And if you're talking about, you know, five to $10 billion for the next generation of LLMs, that's a pretty significant cost savings. So that was exciting. Of course, there are model-building companies like Sentient and, you know, Pond and so on, building in the space with various approaches, whether it's an LLM or a large graph model, to then kind of new forms of inference. For example, I met a company that's doing FHE ML, probably the first that I've heard of. So that was super cool. Then there's zkML, opML, TEE ML, all of those different flavors. And then there are also agentic frameworks building in the space, like Theoriq and Talus and many others. So a lot of excitement all across the AI stack.

Challenges in Building

Now, the challenge with that is: how do you as a builder actually start focusing on something within the AI space? Like, which part of the stack do you actually use? Because with OpenAI, it's very easy. You go to OpenAI and you have a bunch of API calls you can do, and you've got your entire execution layer basically done for you. But it's not so easy where web three meets AI. And so that's where we really come in, so we can chat a little bit more about that. That's why we're essentially introducing, or reintroducing, Zero G as the first decentralized AI operating system, because we're really seeing this need to be the connective tissue between different web three and AI companies, so it's very easy to basically get up and running and start building in the web three AI space.

Introduction to Data Infrastructure

So just as a way of kind of reintroducing: we started at the data infrastructure level because. Sorry, go ahead. No, we're good. Please. Okay. Yeah, just at the data infrastructure level, we basically didn't see any good solutions for storing AI models and AI workloads and having the kind of data retrieval and ingestion rates that you need for AI to be fully on chain. So we built a full decentralized storage network. On top of that we built a data availability system, so that we can do things like on-chain model training in the future, where you need tens if not hundreds of gigabytes per second of data throughput, which is not available in current data availability systems, which generally top out at about ten megabytes per second or so.
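
The throughput gap described here can be checked with simple back-of-envelope arithmetic. The batch size and rates below are illustrative assumptions, not measured Zero G figures:

```python
# Back-of-envelope check of the data-availability gap described above.
# All figures are illustrative assumptions, not benchmarks.

def transfer_time_s(data_bytes: float, throughput_bytes_per_s: float) -> float:
    """Seconds to move a batch at a given sustained throughput."""
    return data_bytes / throughput_bytes_per_s

GB = 1_000_000_000
MB = 1_000_000

batch = 100 * GB        # e.g. one large training data shard
current_da = 10 * MB    # ~10 MB/s, where DA systems top out today
target_da = 100 * GB    # ~100 GB/s, the scale on-chain training wants

slow = transfer_time_s(batch, current_da)  # 10,000 s, roughly 2.8 hours
fast = transfer_time_s(batch, target_da)   # 1 s
print(f"{slow:.0f} s vs {fast:.0f} s ({slow / fast:.0f}x gap)")
```

At these assumed rates the same shard takes hours versus seconds, a four-orders-of-magnitude gap, which is why a purpose-built data availability layer matters for fully on-chain AI workloads.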

Future Developments and Rebranding

Then with the DAIOS, soon we'll be releasing a serving framework that serves as our foray into the execution layer and becomes that connective tissue that we want to be. So we'll be releasing different ways of using TEE ML, fast data retrieval processes and so on. So it's going to be really exciting once that gets released. And that's why we wanted to rebrand Zero G into the first decentralized AI operating system, because it really pays homage to this broader vision of making AI a public good and keeping AI safe for all of humanity, so that AI can consistently be in alignment with human needs and build the kind of world of abundance that we want to see, versus a world where AI wakes up and decides human beings are no longer necessary.

Importance of AI and Blockchain Vantage

And for that type of world, we absolutely need to have AI on blockchains. So sorry, I spoke for quite a bit, but wanted to give a good overview and structure to the conversation. Roy, if you're speaking, I can't hear you. Okay. Twitter glitches again; distrust in centralized systems. But yes, I'm glad that you laid a strong foundation on the current landscape. There's almost a choice paradox with the number of tools and platforms that are popping up, leading to confusion for developers on how to choose among all of these options. As we dive into that: you touched on, hey, can we have a single point of contact? Can we leverage some of the cost advantages with this single point of contact? And you landed at the end on what this means in the larger picture of a whole-suite AI operating system that is aligned and trustworthy.

Vision for AI Systems

So we'd love to dive in there with you: where do you see the underlying values that drive the DAIOS, and what's the vision that we're heading towards in five to ten years? The first step is to get to parity with centralized AI systems, and there's still quite a bit of work to be done, because if we want to train large-scale LLMs in a decentralized fashion, it's currently not possible. We still have to figure out how to do batch and asynchronous training, for example. So there's a good amount of work just to be done there to get to parity. And then after we get to parity, maybe in the next two to three years is my hope, then we can start looking at, well, what's beyond parity.

Blockchain Technology in AI

How can we use the benefits of blockchain technology to really impact the AI space as well? And there are a number of areas where that really makes sense. So, for example, if we can figure out how to unlock all of the latent compute, GPUs and so on, that is sitting in consumer devices, and I think there are roughly 200 million or so of those, then we can open up the available compute for training some of these larger-scale models. So that's, I think, a pretty interesting and exciting direction, but it's still in a research phase. Once that's figured out, then perhaps we can train even larger LLMs than would be possible with centralized data centers.

Fair Rewards Distribution in Decentralized AI

So that's one exciting thing to look into then two is how do we as end users actually get fair rewards distribution? Right now, centralized companies, they're basically paying for data on Reddit and other sources, or just basically scraping the data and then using it to train these models. And so the people that win in that case are the data providers, which are basically monetizing your data, and then the model builders. But what if there's a world where essentially you contribute to a model? That contribution is called an inference request or fine tuning request, and then you get compensated for it. So there's some really interesting things to explore there as well.

Security and Alignment in AI Development

And I know, for example, Story Protocol and others are looking into that space. So that's another superpower blockchains can bring. Then there are aspects of security, or alignment, with AI models, and alignment can happen at different levels of the AI stack. It could happen at, for example, the data ingestion side: can we prevent data injection attacks from happening in real time, for example, by utilizing blockchains? Because blockchains were built for adversarial environments; it's a trustless environment. You need to prevent things like that from happening.

Enforcement and Alignment Research

And you can use slashing or economic incentives to support or not support certain actions on particular blockchains. And then you could also do it while the model or the agent is running. If you see model drift, or if you see agents misbehaving, then you can enforce slashing conditions or incentive conditions on them. And so the whole field of alignment, I think there's a lot of really ripe research there that we can look into with our decentralized AI operating system aspects. And then there's a lot more things like censorship, resistance, and how to actually ensure that AI systems are claiming specific things and that we can actually verify that those claims are true.

Verifiability and Trust in AI Systems

So what version of the model are you running? Was it actually run on an A100 GPU? All of those verifiability claims become really important when you're dealing with societal-level issues. So for example, if in the next ten years we start having AI agents running complete societal systems, like logistics systems or administrative systems or other types of systems, we really want to make sure that those systems live on a blockchain so that we can independently verify that something has happened. If a model, for example, just claims, hey, I did this thing, but actually it didn't, and instead created a bunch of deepfake messages and broke into a centralized database to change entries, then we need systems that prevent that from happening.
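
One of the verifiability claims raised here, "what version of the model are you running," can be sketched as a simple content-hash commitment. The registry below is just a Python dict standing in for a chain, and all names are hypothetical:

```python
# Illustrative sketch: commit a model's fingerprint so anyone can check
# which version a provider is actually serving. A real system would put
# the commitment on chain; here a dict stands in for the ledger.

import hashlib

def model_fingerprint(weights: bytes) -> str:
    """Content hash that uniquely identifies a model version."""
    return hashlib.sha256(weights).hexdigest()

onchain_registry: dict[str, str] = {}  # model name -> committed hash

def commit(name: str, weights: bytes) -> None:
    """Publish the fingerprint of the claimed model version."""
    onchain_registry[name] = model_fingerprint(weights)

def verify_serving(name: str, served_weights: bytes) -> bool:
    """Does the served model match what was committed?"""
    return onchain_registry.get(name) == model_fingerprint(served_weights)
```

The point is only that a public, immutable commitment turns "trust me, it's v2" into a check anyone can run; proving which hardware ran the inference needs heavier machinery (TEE attestation or ZK proofs) than a hash alone.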

Alignment with Human Interests

And so that's especially important when you're dealing with the physical world. Let's say you have a lot of humanoid robots running around in a smart city and doing work; it's very important that those are fundamentally aligned to human needs and interests. So blockchains, in short, provide a lot of superpowers to AI and to us. There's just no future in AI without blockchains. Absolutely. Recently at Token2049, Olaf from Polychain presented this idea of a programmable economy, aligned with the vision that agents will ultimately operate in the real world but run on programmable web three rails.

Technical Breakthroughs and DAIOS

And as we understand more deeply the values and advantages of launching AI apps on Zero G's DAIOS, we're now appreciating the access to larger compute resources, up to 200 million consumer devices, the cost advantage, and the rewiring of incentivization and alignment. All of this is really based on technical breakthroughs that have been enabled just in recent months. What I mean by this is that Zero G has been building traceable and programmable data storage and data availability layers. How do these components and technical breakthroughs feed into the possibility of a DAIOS?

Provenance and Traceability in Decentralized AI

And what are some of the other components we should get excited about? So, provenance and data traceability, absolutely essential if we want to have decentralized AI, because how do we know what went into a particular model? A very crass example I give is if you train a child on how to build a bomb, then what is a child going to grow up and do, like build bombs, basically. And so, similarly, if you train models with bad data, then they're likewise going to make poor decisions off of that bad data. Or that data could also have had certain censorship things injected into them, or somebody might want to willingly corrupt that specific data along the chain.

Importance of Traceability

And so we need to think about all these cases and prevent that. And therefore, traceability is super important. And that's why when you have a decentralized storage layer, that traceability is built into the functionality and feature set, so you can independently verify what happened to the data at any point in time. Otherwise, you have to trust a centralized authority to basically say, like, hey, well, this data went through these specific passageways, and here's what happened to it. But maybe that centralized authority actually changed something about the data and just didn't tell you about it.
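
The built-in traceability being described can be sketched as a hash chain of provenance records, where each step commits to the previous one so later tampering is detectable. This is an illustrative toy, not Zero G's actual storage protocol:

```python
# Toy provenance chain: each record links to the hash of the previous
# record, so changing any past step breaks verification downstream.

import hashlib
import json

def record(prev_hash: str, action: str, data_hash: str) -> dict:
    """Create a provenance entry sealed by its own content hash."""
    entry = {"prev": prev_hash, "action": action, "data": data_hash}
    entry["hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()
    ).hexdigest()
    return entry

def chain_is_valid(chain: list[dict]) -> bool:
    """Recompute every hash and check every back-link."""
    for i, entry in enumerate(chain):
        body = {k: entry[k] for k in ("prev", "action", "data")}
        expect = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if entry["hash"] != expect:
            return False
        if i > 0 and entry["prev"] != chain[i - 1]["hash"]:
            return False
    return True
```

With this structure, a party cannot quietly rewrite what happened to the data "along the chain": any edit to an earlier record changes its hash, and every later record's back-link no longer matches.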

The Role of Decentralized Systems

And so having it be in a decentralized system, it's completely clear, visible, transparent and verifiable where the data came from, what happened to the data, and how it's being used in a particular model. So it's absolutely essential to get that as a basis. Beyond that, of course, there are the actual models themselves and the compute, which are really important. And so on the compute side, we're definitely going to be doing research on how we tap into consumer devices and other latent compute in data centers that isn't being used. But again, that's very much in a research phase, not an implementation phase.

Collaborative Research and Development

And then on the model side, I think there are, as I mentioned, a lot of really interesting builders in the space trying to build some cool models. And so we definitely want to work with them to utilize our storage layers so they can get full traceability, and then embed them into the Zero G ecosystem. Fundamentally, if you think about it, what does the DAIOS really do, or what does an OS do? It's really administering and aligning hardware and software resources together so that you can then run applications very easily without needing to worry about, oh, is this computer running out of memory, or are my graphics cards being overutilized?

Operating System for AI

Is there enough garbage collection? And so on. So the OS abstracts all of that away from you, so that you as a builder can just deal with: hey, I just need to do this inference call for my application, or I need to store something here, and it's all done for me. So lots to be excited about. That's neat. So the DAIOS is sort of like Windows or macOS, but instead of coming out of a metal box, it's an operating system for AI that sits in the cloud and runs on blockchain rails. Yes, exactly. Yeah, I mean, cloud is something that's more used in the kind of centralized world.

Clarifications on Infrastructure

So I would just say it runs on a blockchain. But agreed with you, that was an accurate description.

The Evolution of OpenAI

Right, right. This is sort of the next phase in the trend toward open AI. And today's chat is actually super timely, because just yesterday Nvidia announced NVLM, their first fully open source LLM. And that's a signal of the entire industry trending towards a more open building environment, open community and open ownership. With this, we've also heard of several other initiatives within the crypto and blockchain ecosystem that are trying to build out certain components or certain layers, layer one or layer two, geared towards open AI. How is Zero G different?

The Contrast of AI Approaches

So it's a little ironic that traditionally closed-source companies are moving open source in the AI world, while the company that was supposed to be open from the beginning is moving closed source and for-profit. So it's kind of an ironic state of affairs if you just take a moment to think about it. But it highlights the stark contrast between centralized AI, or what we call closed AI, where you have to trust a specific company to basically do all those things that I mentioned: where did the data come from? How was the data labeled? What were the weights and biases of the model? How did the model converge? What was the backpropagation, and how is the model being served? Is the inference verifiable? All of those things you have no clue about; you can't verify them. Basically you're just trusting some authority to have done that for you.

Decentralized AI and its Benefits

And then on the other hand is decentralized AI, where all of that is transparent, independently verifiable, and you enjoy all these superpowers that I mentioned earlier. So that's why our fundamental belief is that there's no future in AI without blockchains. And so that's how we're different. There is closed AI, and then somewhat more open AI with kind of the open source collaboration that's coming out. And then the most open is decentralized AI, because it fully lives on a blockchain. You can verify it. There are a lot of security features that can be enjoyed as a result. And we're really excited about building that future for humanity. That is tremendously invigorating as we move towards this vision and future of open AI.

AI Components and User Choices

Now, as I understand it, the DAIOS is also sort of like a configurable Tesla, right? Except it's for AI apps to build on top of, where you can pick different AI components that are built already on decentralized rails. So as users dive into the DAIOS, what are some of the provider and component tooling choices we anticipate they will have access to? So what you're highlighting is that we're building it in a modular way, because different people need different ways of interacting with applications. Some people really care about performance, because let's say there's a fully on-chain game and it wants to do some inference calls for non-playable characters; that needs performance, because otherwise the game experience totally sucks.

Supporting Diverse Use Cases

So that's one use case. Another use case may be I am safeguarding billions of dollars of assets and I don't care about the latency that it takes. I just need to make sure it's as safe as possible. It's as secure as possible, and I'm happy to wait not to mess up the assets that are sitting on the chain. And so we need to support all these different use cases. So if you just have one stack and it just supports one of the use cases, then you're not being true to all the builders. So we want it to be done in a modular way, and we want it to be done in such a way that we can bring in the broader ecosystem and web three AI and support builders with the different needs that they have.

Provider Selection for Different Needs

So if, for example, you care a lot about performance on the inference side, then utilize something like TEE ML, for example. So we're working with, for example, Phala Network and ISO on some implementations there. If you need more on the security side, then zkML may be the right solution for you, and there are some providers that we work with there, like Modulus Labs, for example. So we just want to make sure that builders all across the stack can be supported. And we're going to be thinking through how we make it a really easy, simple user experience so that you can choose different parameters.
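
A minimal sketch of the modular provider choice being described: pick an inference track by what the builder optimizes for. The provider labels echo the flavors mentioned in the conversation, but the scores and the selection API are entirely invented for illustration:

```python
# Hypothetical provider-selection sketch. The relative scores (higher is
# better) are made up to illustrate the trade-offs discussed above, not
# real measurements of any provider.

PROVIDERS = {
    "tee-ml": {"performance": 3, "security": 2, "cost": 2},  # fast inference
    "zkml":   {"performance": 1, "security": 3, "cost": 1},  # strongest proofs
    "opml":   {"performance": 2, "security": 2, "cost": 3},  # cheapest
}

def choose_provider(priority: str) -> str:
    """Return the provider track scoring highest on the given priority."""
    if priority not in {"performance", "security", "cost"}:
        raise ValueError("priority must be performance, security, or cost")
    return max(PROVIDERS, key=lambda p: PROVIDERS[p][priority])
```

An on-chain game tuning for latency would land on the TEE track, while an app safeguarding large assets would land on zkML, which is exactly the two use cases contrasted in the conversation.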

Understanding User Preferences and Needs

Do you care about performance? Do you care about cost? Do you care about security? So there are going to be different tracks where different types of providers make the most sense for what it is that you're building. And so we'll make it really simple and easy to access that over time as we build out the DAIOS. That's super exciting in simplifying the developer experience, from having to stitch together all these different components, whether it's TEE ML or zkML, into a single pane of glass. And as we chat about where we are today, I'm also wondering: why now?

The Timing for AI Evolution

So we're probably at the beginning of a multi-decade AI evolution as it changes the course of human history. You know, we could have done this two years earlier, just at the onset of agent development. And frankly speaking, they're not really agents today; they're really copilots, missing some critical functions like long-form memory and proactive tool access. And we could have also waited until some of these specific capabilities are fleshed out. Why is this important now? We have to get started at some point before we reach parity with centralized AI systems.

AI Safety and Human Existence

But there are a number of scientists and politicians and so on that have raised the alarm: hey, if we don't start thinking about AI alignment and AI safety, we may end up in a world where only AI exists, no human beings. And so it's absolutely important to be part of this future, because centralized AI companies will at some point figure out how to get to AGI. And if you believe that most of human commerce can be automated, that is, an agent or a model can be plugged into any work process, then the cost of intelligence and work drives down essentially to zero, basically close to whatever the hardware costs are to serve it. Then large swaths of industries will be completely automated with just AI agents.

Concerns Over Centralized Control

And what happens in that world? Do we let a handful of companies basically control all of commerce? That would give these companies more power than nation states, or states in general. And so that's a very dangerous state of affairs. So we need an opposing system that balances this amount of power, a system that has abundance built into it from the get-go. Because in the former case, with centralized companies, governments have to basically step in and figure out, well, do we give some universal basic income? How do human beings contribute to the system? Do they contribute at all? What do we do with society? So you have very different kinds of questions that get raised in that type of environment.

The Importance of Immediate Action

And so it's absolutely pertinent to start now before it's too late. Got to start now and prepare as people, as human beings also towards this future where the cost of intelligence and work drives down to zero. What do people do? What's our role as individuals in the future where most of the means of production are automated away, and we sort of have these AI workers and AI chore takers to do away with the mundane tasks that kind of hold us back? What do people do? Well, I certainly hope we can create a type of utopian society as a result.

A Vision for a Utopian Society

I was just talking about this yesterday with Miko from gumi Cryptos and Anand from Canonic on a panel for SF Tech Week. But I really hope it creates abundance, because once human beings are less stressed, have more time, and don't have to worry about financial means, then we could really move towards actualization and really do things that are actualizing us, whatever that may be. Maybe for me it may be that I want to just go meditate in a cave for a month and not have to worry about whether I can afford it or not. Or for others it may be, hey, I really want to have this experience of traveling the world, for example. So there could be quite a bit that gets created.

The Path to Abundance

Miko's point was also we need abundant, clean energy in order for that to happen. I mean, not only does the marginal cost of intelligence need to go to zero, but hopefully the cost of energy so that it's just 100% abundant at all points, which then means that agriculture and other means of production also become fully abundant. So it would be a pretty awesome world, because then we can really choose what we want to do in our daily life. We don't have to, quote unquote, work for a living anymore. We can live and enjoy the work that we actually do because it brings us meaning. And so I'm excited about building towards a world like that.

Reflections on a Bright Future

That will be fantastic. And to your point earlier, there's this irony of the purportedly open AIs being the most closed. Recently, Sam Altman shared his vision of what people could do in the future and shared his experiences from Burning Man. And his version of that is like, well, we all could just ride bikes wherever we want all day and dance. I was like, that sounds great, Sam. Back to the basics. Great, dance and bike. But what he's not telling us, of course, to your point earlier, is that he's just raised $7 billion from the Middle East and is building the world's most centralized AI system while we're all kind of gallivanting away, feeding our data, our usage and our attention into this very centralized system.

The Future of Open Source Development

So that's really important to realize. You know, we could be at the cusp of a divergence as we go down a very closed or a very open source future. What are some other developments in open source, in open rails, that you're excited about? Sorry, repeat the last part of the question again. Yeah. What are some developments in open source and in open rails that you're excited about? I mean, in general, I like seeing a lot of the creativity that's coming out of the open source AI model movement. So, for example, Meta's new video generation engine was pretty awesome; it's definitely better than Sora. And it's nice to see this kind of arms race with the open source community behind it, but then you still have to realize that Meta is behind it. And if at one point they basically say, hey, I'm no longer going to fund this, then how do we as a community keep competing? And so those are the types of things that I think about as well. But yeah, just in general, better-trained models that are starting to get to photorealistic video and images. It's very impressive.

The Emergence of New Roles

Which, of course, already means that the most in-demand job for the next five years is going to be, quote unquote, prompt engineer. And prompt engineer can mean movie director, can mean ad commercial director, given all of the power that we now have with some of the models that are coming out. So I think that's super exciting, just seeing all of the creativity being poured into these different modalities. Right? Make your own Netflix, make your own TikTok. Everyone's a creator. Exactly. And one of the other things that is barely talked about at the moment in the Web3 AI circle is this whole idea of embodied AI. There's a lot that needs to be trained for specific visuospatial systems like humanoid robots, because they will need to learn about the world very differently. And so I think there are going to be a lot of open source initiatives to support gathering that data from individual users as well as other sources.

Training Humanoid Robots

So I'll give you an example. If you want a humanoid robot to make the best latte ever, you can feed a bunch of training videos from some of the best baristas in the world to that humanoid robot. So where do you get that data from? I think that kind of evolution, or revolution, however you want to call it, with humanoid robots starting to take on parts of our work, is coming, and it's going to be here sooner than we think. Yes. Spatial technology would pave the very last mile for AI-enabled robots to finally participate in the real world. And of course, the godmother of AI, Fei-Fei Li, recently raised $200 million to launch her spatial-AI-focused startup. So fingers crossed she's going to do it in an open source way.

Opportunities for Developers

Speaking of open, well, there's an opportunity for all the Web3 AI builders. When one door closes, another window opens. And so maybe that's a blessing in disguise for developers who are looking to build in an open way. And speaking of open, we have quite a few guest members from our community here today. So guys, if you have specific questions that you would like to ask Michael, take advantage of this time that we have with him: go ahead and raise your hand and we'll call on you so you can hop in and ask your questions. I saw one in one of the threads that I can maybe address. One person asked: from a community investor's point of view, does this transition affect mainnet deployment and the TGE? I would say the bigger impact would be the market itself.

Market Conditions and Deployment

If the market is very shaky, that's more of an impact, because we want to launch mainnet together with our token, versus having a broader vision and narrative. So yeah, this shouldn't impact our go-to-market and mainnet timeline. That's great. And on that note, are we ready to share some pointers on when we're anticipating mainnet, the token listing, and some of the future milestones that are coming up? Yeah, market dependent, Q1 to Q2 of next year is what we're targeting. Amazing. And during the next couple of quarters, how can the community participate and contribute, and hopefully be incentivized too, by being part of the 0G ecosystem? Absolutely.

Engaging with the Community

So you can participate on the supply side, running things like nodes on the network, whether it's validator nodes, storage nodes, or DA nodes. That's all definitely possible, and we're always really excited to see all of the traction on our network. I know some people are running hundreds of nodes at the same time and even doing some interesting dual-mining applications with other GPU providers, so that's also really cool to see. Then on the demand side, you can build an app chain or an L2 and utilize us for DA. You could utilize our storage layer for different AI applications. For example, if you have a specific data set that you want to store and make open to the community, that's a great way to do it, but really it could be for any use case.

Possibilities for Developers

If you have an NFT platform and you need to store data about it, you can use our storage network. So the possibilities are really endless from that perspective on the storage and DA side. And they'll get even larger once we introduce our serving framework within a month's time, so that you can start enjoying inference and fine-tuning support as well for all of the AI applications that you want to build in a fully verifiable, decentralized manner. So super excited about all of that. There are so many ways to engage. Where does a developer, a project builder, or just a user who really wants to contribute towards open AI start?

New Documentation Release

Well, funny you should ask. We are going to release the next version of our documentation, which will make it much easier to get started. Big thanks and shout-out to our tech team, to Michelle, who is our product lead, and to everybody that's contributed from the ecosystem side as well. I know we have some really active community members, Moonhill Capital for example comes to mind, and many others in our Discord channel who have spent many hours actually recording videos to support the community. So I really wanted to give a big shout-out for all of your support and impact in making the documentation better. All of that's going to live on the 0G website, and if you look at the developer documentation there, you'll soon see version two come out. I don't know if we have an exact release date, but it's going to come in short order.

Appreciation for Community Support

And really, that's the culmination of a lot of community support, so I just feel really grateful that we have such a wonderful community supporting us. We have such a prolific and active community. And guys, you might just get a chance at a shout-out at the next AMA. Start with the 0G website and check out the next version of the documentation. And Michael, as we come towards the end of our time together today, what's the one thing that you want the audience to take away? I'd say, first of all, the most important part is why we've now started to call ourselves the first decentralized AI operating system. It's because of this broader vision, and to really clarify what direction we're going in.

Future Vision

And so there are going to be quite a number of components built on our side to actually make a decentralized AI operating system function. We're super excited about it and about engaging with our community on it. And then, two: we wouldn't be here without our community. I've had a lot of fun recently interacting on X and Discord with some of our community members. Every day I log in, there are six to eight different content submissions, and some people are super active as well. For example, I consistently see Myoku and Manchik creating content, and various others. And so the second thing to take away is that the only way we're going to succeed is with our community.

Building Together

And so you make us work harder; you make us feel alive when you interact with us. We're super excited about building this together and making sure that in the end, it's going to be a very rewarding journey for all of us. Yes, a huge shout-out and appreciation to our community, who have been along with us to this day, as we unveil the next chapter of the decentralized AI operating system. Thank you so much, guys, for joining our chat today. And thank you, Michael, for showing us where we're headed. Thank you, everyone.
