Exponential: A Nexus Podcast

Episode 29: From Nodes to Agents


Mrudul Gole, Head of Business Development at NodeOps, joined Exponential to trace the arc of a company that started as a blockchain validator and is now building the infrastructure layer for the AI era. 

The conversation covered the evolution of decentralized compute, the unlikely synergies between blockchain and AI, and a bold prediction about what dev teams will look like in six months.


SPEAKER_02

Welcome to Exponential, a Nexus podcast, where we talk about people, code, and capital. I'm Daniel McGlynn, and in this episode I talk to Mrudul Gole, Head of Business Development at NodeOps, about building infrastructure for the agentic era. Okay, welcome to Exponential. Thank you so much for joining. It's great to have you on the show today.

SPEAKER_00

Thanks, thanks for having me.

SPEAKER_02

Yeah, so I really wanted to talk to you about NodeOps, and also about some of the developments in AI. I know you all are working hard in that space and have some interesting products there. Given all the headlines we've seen over the past month or so about the explosion of AI, its impact, and how it's changing everything, I thought it'd be a great time for us to chat about your work and what you're up to. So with all that in mind, maybe a great place to kick off the conversation: could you just tell us, what is NodeOps? What is it all about, and how did the company start?

SPEAKER_00

Sure. To give you a quick background, NodeOps has basically been an infra layer for teams building in AI and blockchain. But the way it all started was probably three or four years ago, when Naman, Pratik, and JD started out as validators on different blockchains, more like Genesis validators, predominantly in the Cosmos ecosystem. Around the same time, we saw a lot of projects coming up with this whole idea of spinning up a node and earning rewards on top of it. Suddenly there was a very big market of projects like XAI, then the Aethir node sale happened, and a lot of other projects followed, like Lumos and CARV. And one problem we noticed was that for any retail user, it became a painstaking process to spin up their own nodes. They were buying licenses to these guardian- or checker-type nodes but weren't able to spin them up. At the same time, the people who could do it ran into the problem of monitoring and security: if a protocol ships a new update, how do you make sure these nodes are always up and running, and up to standard? So we saw a gap in how retail folks were handling these nodes, and we thought, why not make it very, very simple for anyone to spin up a node, specifically retail-focused nodes? That's how the whole idea of the NodeOps console came into the picture: for any protocol we support, people can simply come in, deploy a node in one click, and pay us a very minimal fee of five or ten dollars, depending on the resources those nodes use.
And yeah, we started seeing a lot of users come in. We currently have around 70 to 80,000 paid customers on our platform who actively run nodes. But on the technical side, we were deploying these nodes on different VPS providers — imagine AWS, GCP, Azure, and a bunch of others. We saw there was a reliance on these VPS providers, and we wanted to create an economy where these deployments become more decentralized. That's how the whole idea of decentralized compute came in. We onboarded over 80,000 compute providers across the globe, so that any workloads we received were routed down to these providers, which created a very sustainable economy: people provide compute to us, and we return value to them. A lot has not changed. What has changed is mainly the node side of things — that meta is not as prominent now — but we still run legacy validator nodes for a bunch of other projects, and we have good enterprise folks running nodes with us, including the likes of Maven11, the Spartan Group, Bankless Ventures, BitScale Capital, and a bunch of others. But since there was a decline on the retail side, and AI was taking over the globe, we wanted to make use of the infra we already had. We had the providers, and the idea was to build products that not just Web3 folks can use, but anyone from Web2 as well. That's how the gradual progression happened, and we launched CreateOS, a platform where you can go from idea to production in one continuous loop, without switching context or switching tools.
So you create by coding, you deploy in one-click fashion, and we take care of everything: front end, back end, database. We've got services like messaging queues in place. And if you're an AI team trying to, I don't know, use GPUs to train your own SLMs or work on your own model, you can rent GPUs as well. We've started working with a lot of different GPU partners of ours to make those GPUs available. It's also a very sustainable discovery platform for anyone using our product, so we can support them through our application store. So yeah, that's CreateOS — that's where we are. We've also started NodeOps Router, which is more of an LLM gateway for anyone to get access to open-source models, which you'd otherwise essentially have to run on a GPU yourself. We provide them through our platform — imagine models like DeepSeek, MiniMax, Qwen, and a bunch of others. So yeah, that's where we are and what we're doing.
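An LLM gateway like the Router described above typically looks like an OpenAI-compatible API from the caller's side. As a rough sketch — the endpoint path, model names, and response shape here are common conventions and assumptions for illustration, not NodeOps' documented interface:

```python
import json

# Sketch of the client side of an OpenAI-compatible LLM gateway
# fronting open-source models. Endpoint paths and model names
# below are illustrative assumptions, not NodeOps' actual API.

def build_chat_request(prompt, model="deepseek-chat"):
    """Build the JSON body for a /v1/chat/completions-style call."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def parse_chat_reply(response_body):
    """Pull the assistant's text out of a chat-completions response."""
    data = json.loads(response_body)
    return data["choices"][0]["message"]["content"]
```

In practice you would POST `build_chat_request(...)` to the gateway's `/v1/chat/completions` endpoint with an API key, which is what lets one gateway swap between models like DeepSeek or Qwen without the caller changing code.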

SPEAKER_02

Yeah, that's fairly cool. I want to talk to you more about CreateOS, but before we move on from the origin story of NodeOps and that initial problem you were trying to solve — making it easy to deploy nodes so anyone could do it in one click — I'm curious about that evolution from providing decentralized compute as a service to, I'm sure, seeing the trend of people starting to use that kind of compute for more AI applications. The demand side is maybe a little different for compute networks right now — like a lot of things in the decentralized or Web3 space, it comes in peaks and valleys. But could you walk us through more of that: from looking at decentralized compute, providing nodes, and building that infrastructure, to deciding, hey, we could use the same infrastructure but provide this easy-to-onboard AI experience? What did that all look like?

SPEAKER_00

Yeah, so we were definitely seeing a lot of trends — I think this would be over a year ago — when AI-built products, or I'll say AI writing code, and people actually facing the problem of deploying that code, started popping up here and there. And we had that vision: since we're progressing toward an era where people can, at any point in time, vibe code and deploy applications — earlier it was mainly about testing your MVP or getting initial feedback — we always knew it would just keep getting better, and our bet was that deployment, or DevOps, would definitely become an issue for any developer in the space who's building things. And we eventually saw that too. If people are building, let's say, anything on ChatGPT or Gemini and you ask ChatGPT about deployment, it'll just tell you, okay, you have to create a Git repo, upload these files, go to AWS or GCP, and follow XYZ steps. That takes a lot of effort for someone who's just vibe coding and wants to quickly see whether something works or not. So we saw that, and time also helped: MCPs came along, and now skills are there. It's becoming very easy for anyone to use a product — build it in any IDE or any LLM chat interface and just simply deploy.
So that's why we decided we want to be the infra layer for anyone who's building, and at the same time provide the best value, with the margins and the orchestration of decentralized compute we've created — it ends up being much cheaper than paying AWS, and much easier as well, because you can deploy in one click. That was the idea behind the transition, and our vision of how quickly people would be able to build applications over time.

SPEAKER_02

And do you think — I mean, you kind of just mentioned this, but it's something I've been thinking about recently with the rise of agentic commerce and finance and, I guess you could say, agentic everything — people are enabling agents to do all kinds of things and experimenting with different agentic flows. I'm wondering what that does for the demand for things like decentralized compute. Is that something that will grow? Will agents thrive in that kind of decentralized network, or do you think we'll need more structured compute networks for agents to actually perform better? It's something I've been kicking around, because it could be really interesting for people like NodeOps, or other companies building this infrastructure layer, who had already been building it for the previous generation of technology. I'm curious how you all are thinking about that.

SPEAKER_00

I think on the agentic layer side of things, not much would change, because if you're deploying an agent today on, let's say, AWS, it would function the same way as if you deployed it on any decentralized compute layer — it'll end up doing the same job. It's more about where your agents run and who holds your data: a centralized entity, or a more decentralized ecosystem, which is definitely a bit cheaper, but also, I'd say, safer. If any one of the nodes or compute providers goes down, we have multiple failover practices in place so that your node, or your data, doesn't get lost at any point in time. So I think it comes down to the mindset of the end user and what works best. Managing a decentralized network definitely takes a lot of time, because you have to make sure you're incentivizing these providers at every point — you have to reward them even when there's no workload on their compute, because if they don't get rewarded, why would they keep giving you their compute, right? So those things definitely come into play, but VPS has its own challenges too: building your own layer on a centralized compute provider comes with its own difficulties. At the end of the day, it's about what seems feasible to you, because anywhere you deploy, it's going to do the same job.
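The failover idea described here — if the provider running a workload goes down, reschedule it on another healthy one — can be sketched minimally. Provider names and the health model below are invented for illustration; this is not NodeOps' scheduler:

```python
# Minimal sketch of failover across compute providers: place a
# workload on the first healthy provider, falling through the list
# when providers are down. All names here are hypothetical.

def pick_provider(providers, healthy):
    """Return the first healthy provider, or None if all are down."""
    for p in providers:
        if healthy.get(p, False):
            return p
    return None

def schedule_with_failover(workload, providers, healthy):
    """Place a workload, failing over until a live provider is found."""
    p = pick_provider(providers, healthy)
    if p is None:
        raise RuntimeError(f"no healthy provider for {workload}")
    return {"workload": workload, "provider": p}
```

A real network would layer health checks, data replication, and provider incentives on top of this, but the core loop — detect a dead provider, re-place the workload — is the same.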

SPEAKER_02

Okay, interesting. And I'm curious — I wanted to talk about blockchain development and AI, the impact of AI on blockchain developers — whether you're seeing any interesting projects on that front: people using blockchain in a different way, or building products and services that didn't exist prior to this explosion in AI, or agentic AI. Anything top of mind that you're seeing, specifically people building with AI and blockchain?

SPEAKER_00

I think there are two areas where I'm seeing a lot of movement, mainly in terms of blockchain — and I'll include exchanges in this as well. The first is agentic payments: I'm seeing a lot of them being managed through different protocols and products built around agentic payments, because there's definitely a lot of area to explore. Let's say someone has created an agent that does the whole job — you just create a shopping list, and on the basis of that, it makes the payments. It's much easier to do those as crypto payments via USDC or USDT than with any fiat, for that matter. And fractional payments also become an interesting point, because with all the APIs coming into play — we've always had the pay-as-you-go model, but now you're giving agents access to everything in your product, and an agent interacting with it uses certain parts of it, so you bill or charge users for exactly that. So agentic payments will definitely play a bigger role. The second one is about exchanges: you've seen all the major exchanges, including Binance, MEXC, and other centralized exchanges, exposing some of their APIs through MCPs. Now, this hasn't happened for a very long period of time — it's been a closed box. Getting access to your exchange and using it to push orders or do trading was locked down, and they've now made it available so you can use AI with them. So that's a very interesting piece we're seeing today: agentic flows have come to exchanges, and people are thinking about building products where their agents could perform transactions or do trading with these exchanges.
So those are the two interesting areas where I'm seeing a couple of projects building — still early days, but yeah.
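The fractional, pay-as-you-go billing idea above — meter each call an agent makes against your product and charge only for what it used — can be sketched in a few lines. The endpoints and per-call rates here are made-up illustrations:

```python
from collections import defaultdict

# Sketch of fractional, usage-metered billing for agent traffic:
# count each call an agent makes per endpoint, then invoice for
# exactly that usage. Endpoints and rates are hypothetical.

RATES = {"search": 0.002, "checkout": 0.01}  # dollars per call

class UsageMeter:
    def __init__(self):
        self.calls = defaultdict(int)

    def record(self, endpoint):
        """Count one agent call against an endpoint."""
        self.calls[endpoint] += 1

    def invoice(self):
        """Total owed, in dollars, across all metered endpoints."""
        return sum(RATES[e] * n for e, n in self.calls.items())
```

Settling such micro-invoices is where stablecoin rails fit naturally: sub-cent line items are awkward over card networks but trivial as an on-chain transfer.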

SPEAKER_02

Yeah, that's definitely something we're watching and talking about a lot here at Nexus: agentic trading, and how that seems to be the future, because you have this combination of high frequency and, in a lot of ways, more efficiency — you set up your parameters and just let your agents handle all the decision making and actually make the trades. And it gets interesting when you start thinking about exchanges for agents, and how it won't just be one agent doing trading, but agents trading among themselves to make trades. When you start thinking about it, you're like, wow, this could get interesting fast — and pretty cool too, I think, in some regards. So yeah, that's definitely a piece we're keeping a close eye on.

SPEAKER_00

Yeah, and there's another thing, right? Even in Web3, whatever number we usually quote around the industry — that this many people are now in the Web3 ecosystem — the majority are still interacting with an exchange at some level. They essentially don't use Web3 products in their day-to-day life; they buy certain tokens, keep them with the exchange, do some trading, and so on and so forth. So I'm very happy, in a sense, that centralized exchanges have made this call of letting people get more interactive with AI, or any LLM, and plugging their systems into that space so that people can interact more with those exchanges.

SPEAKER_02

Yeah, I think that's a good way to frame it — as the next level of interactivity. It'll be interesting. So along those same lines, I'm curious how you're thinking about some of the biggest challenges or opportunities of building with AI and blockchain specifically. Where do you see the parts that make a lot of sense, maybe in terms of efficiency, or just the items on a developer's day-to-day to-do list? And also, where are the places we need to be more cautious when it comes to building with AI and blockchain?

SPEAKER_00

Yeah, I think there are a few things. One is that smart contract development is becoming more accessible now. Writing in Solidity or Rust has always required a pretty specific skill set, and AI coding tools are getting good enough that smaller teams can move much faster without hiring for specialist roles for every piece of it. And on the infrastructure and ops side — for us, that's monitoring nodes, responding to incidents, handling deployments — AI agents are starting to own more of that. With that said, we're building in that direction ourselves: the idea of an agent handling routine ops and only pulling in a human when something actually needs judgment. That's not a distant-future thing now, right? But I'll be honest, a lot of the AI-blockchain conversation right now is still concept level. Stuff that actually shows up in production is probably a couple of years out still. I've seen certain projects building good, decent products, but the motivation of making it a token-driven thing always works against the development cycle, or the zeal to create a really nice AI project — that's always at stake when it comes to AI and blockchain. So that's one thing. And I think there's a lot more with security that is now very much accessible, because we keep seeing people build smart contracts that still have issues, bugs in them, yeah.
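The split described here — the agent auto-remediates routine incidents and escalates only when something needs human judgment — is essentially a triage rule. A minimal sketch, with incident types and the severity threshold invented for illustration:

```python
# Sketch of the "agent owns routine ops, human handles judgment"
# split: known low-severity incidents get an automatic remediation,
# everything else pages a human. Playbook entries are hypothetical.

ROUTINE = {
    "node_down": "restart node",
    "disk_full": "rotate logs",
}

def handle_incident(kind, severity):
    """Auto-remediate known, low-severity incidents; escalate the rest."""
    if kind in ROUTINE and severity < 3:
        return {"action": ROUTINE[kind], "escalated": False}
    return {"action": "page on-call human", "escalated": True}
```

The interesting design choice is the escalation boundary: anything outside the playbook, or above the severity threshold, goes to a person, so the agent can never take a novel destructive action on its own.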

SPEAKER_02

Yeah.

SPEAKER_00

Even though they've been audited by these crazy good audit firms — but I think finding an error is now much easier than it was before.

SPEAKER_02

Yeah, and on that continual basis, right? I think that's something an agent would excel at, where a human would get tired of reviewing lines and lines of code, but you could have your agent continually monitoring those lines, or the smart contract, I guess. It's all interesting. And I think the cool thing about that space — blockchain and AI, and the synergies there — is that it's moving so incredibly quickly. If we were talking a few months ago, we might be saying different things. And like you say, a lot of the stuff is still conceptual, but it's concepts that are moving fast, right? So it'll be interesting to see what comes to market first, and what that enables for the things we were talking about earlier, like exchanges, or people building new kinds of financial apps. Okay, so another question I had: given that you're watching the space emerge, and, with CreateOS, watching AI builders take these tools in hand and make stuff out of them — are you seeing any trends or observations that are maybe underreported or just not well known yet, that you think are interesting or might indicate where this is headed?

SPEAKER_00

There are definitely a couple. One thing that caught me off guard is that the performance gap between frontier models and open models is way narrower than most teams assume. The cost gap, though, is massive. I see this directly through the router we've created: for a huge chunk of use cases, a well-prompted open model produces output you can't tell apart from something like GPT-4o or GPT-5. The frontier models are still good, but the open ones are very comparable. So teams that actually test this end up cutting model costs significantly. And latency matters way more than people admit. Everyone makes model decisions based on benchmarks, but in production, what users actually feel is response time, and I've seen teams switch models not because quality improved but because it felt faster. That's a real purchase driver for a lot of these folks. Early builders we talked to were very much "no, we just want frontier models," and we told them, guys, try the open models, you'll cut your costs drastically — and they found certain models much more relevant to their use case. And this whole idea of model evaluation and routing is something no one is really discussing. Getting the right responses is only one part of using an LLM; there's also this whole issue of model brittleness, or LLM brittleness — change the wording a little and you get a totally different answer. Evaluation becomes a very big problem the moment you hit scale in production. With early users it's fine, but if you're doing, I don't know, 10,000 or 20,000 sessions a day, that changes a lot.
So not a lot of focus is given to how a production-level application should function — not just the front of it, but how your LLMs will respond to the different queries people put in. I think there's some progress being made there, and people have started adopting it as well.
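The routing idea here — send traffic to a cheap open model when it fits the latency budget and fall back to a frontier model otherwise — reduces to a small selection rule. Model names, per-token costs, and latencies below are made-up illustrations, not real benchmark numbers:

```python
# Sketch of cost/latency-aware model routing: among models that
# meet the latency budget, pick the cheapest. All figures and
# model names are hypothetical.

MODELS = [
    # (name, dollars per 1M tokens, typical p95 latency in seconds)
    ("open-model-a", 0.30, 0.8),
    ("open-model-b", 0.60, 1.5),
    ("frontier-model", 15.00, 2.5),
]

def route(latency_budget_s):
    """Cheapest model whose typical latency fits the budget."""
    fits = [m for m in MODELS if m[2] <= latency_budget_s]
    if not fits:
        return "frontier-model"  # nothing fits; fall back anyway
    return min(fits, key=lambda m: m[1])[0]
```

A production router would also score quality per task and re-evaluate as models change, but even this naive rule captures why the cost gap, not the quality gap, often decides the routing.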

SPEAKER_02

Yeah, that consistency issue is really interesting on so many levels, whether it's working at scale, like you're talking about, or just working across a team. If everybody has the same tools and you think the context is shared, but you're getting different results — it'll be interesting to see how that problem gets solved, what that looks like, and what it all entails. And like you were saying earlier, at what cost? Does it slow things down? Does it come at a cost of speed, or a cost of actual financial resources? So yeah, a lot still to be figured out, I guess. And the last question I have for you today: looking ahead, if you had to make a big, bold bet or prediction about the future — particularly about what we've been talking about, this intersection of AI and compute and blockchain — what would it be?

SPEAKER_00

I think predicting something a year out would be too far to think about right now — things change so quickly. Blockchain was like this a couple of years back too. But what I'm seeing right now is the agentic layer coming into the picture pretty quickly with small teams. Most small dev teams will have at least one AI agent running as a permanent part of their workflow — not as a tool they open occasionally, but an actual participant on the team. It's watching your infrastructure, handling deployments, flagging issues, escalating to a human only when something genuinely needs a decision. So agents working directly with you as part of the team, not as a tool you subscribe to. There's a lot of push toward creating agentic workflows, but enterprises will take their own sweet time. Small dev teams will adopt it quickly, and we've seen this happen with OpenClaw and the kinds of repos people have created around it — a lot of AI consumption and working-it-into-your-own-workflow is happening there. So yeah, that's my bet: in the next six or seven months, people will start using agents as part of their own team, and not as a separate tool they keep switching to.

SPEAKER_02

Yeah, I think that'll be a really interesting switch, work-wise — the way we handle work when you have that persistent agent who, like you say, is kind of your teammate or your partner. You're shutting down for the night, taking a rest or whatever, your agent's still at it, and you come back in the morning and, okay, it got a lot done. Right now we're used to interacting with AI in a way that has that on-off switch, that beginning-and-end kind of time. I agree — I think it'll be really interesting to watch when all of a sudden it becomes persistent, on all the time, working all the time, and what that means for how we work as humans and the work we're doing as humans. I definitely feel that, just reading about all the really interesting stuff people are doing with OpenClaw and the things people are building for workflows. So yeah, maybe we talk in six months and we'll have our agents filling us in on what's been going on. Well, thank you so much for your time. I really appreciate you walking us through what you're building at NodeOps, and CreateOS, and all those observations. It's really interesting. Hopefully we can do it again sometime.

SPEAKER_00

For sure. And thanks again, Daniel, for having me. It's always good chatting about what's happening at the intersection of AI and blockchain — that's my favorite topic, because I end up talking to a bunch of folks in both spaces, and there are nice things that come out of what people are building and how we can support them on the infra and agentic side of things. But yeah, thanks a lot for having me here today.

SPEAKER_02

Yeah, for sure. Thanks again. Talk to you soon. Thank you for listening to this episode of Exponential. We'll be back next week with another show featuring people, code, and capital. Please like, review, and share Exponential wherever you listen to your podcasts. And be sure to visit app.nexus.xyz to see what we are building.