SNIA Experts on Data
Listen to interviews with SNIA experts on data who cover a wide range of topics on both established and emerging technologies. SNIA is an industry organization that develops global standards and delivers vendor-neutral education on technologies related to data.
The Power of Edge AI and Data Management
Will AI revolutionize the way businesses operate and the way we live? Hear AI pioneer Dr. Rita Wouhaybi share her excitement over the rapid advances in AI technology. She explains why she is obsessed with the concept of Edge AI and how Edge breaks down some of the fundamental assumptions that we have had about AI for years.
Looking beyond large language models and generative AI, she raises questions on all that we still need to learn about AI and shares her anticipation of amazingly rapid innovation, highlighting two critical abilities that we have yet to expose to their full extent.
Rita describes real-world deployments of Edge AI in manufacturing today, explaining how these technologies are dramatically enhancing efficiency, quality, and sustainability, and she challenges the popular notion that "data is the new oil," arguing instead that data is more like water: perishable and complex.
About SNIA:
SNIA is an industry organization that develops global standards and delivers vendor-neutral education on technologies related to data. In these interviews, SNIA experts on data cover a wide range of topics on both established and emerging technologies.
Welcome to the SNIA Experts on Data podcast. Each episode highlights key technologies related to data.
Speaker 2: All right, welcome to the SNIA Experts on Data podcast. My name is Eric Wright. I'm the Chief Content Officer and podcaster in residence here with GTM Delta and the SNIA Experts on Data podcast. Super excited today, because I'm blessed to be joined by Rita Wouhaybi, who, looking at your impact on the industry and the folks that we share connections with, it's great to be able to chat. You've got such an exciting story, lots of background with SNIA and all things technology. So, Rita, if you want to just do a quick introduction for folks that are new to you, and then we'll talk about, I'll say it, as AI comes in, everything old is new again. There's a lot of interesting things about how storage and memory and compute are changing, but it's actually kind of just coming back around.
Speaker 3: Yeah, thanks, Eric, for the intro. It's great to be here today. I am excited for our chat. My name, like you said, is Rita Wouhaybi.
Speaker 3: I am an AI Fellow at Solidigm.
Speaker 3: I have been doing things in AI for decades, before it was cool. I'll date myself: back when it was called neural networks and everybody thought that it over-promised and under-delivered. So I'm excited to be here.
Speaker 3: For the last eight years or so, first at Intel and now at Solidigm, I have been focused on AI software, especially at the edge. I started this journey looking at manufacturing as one of the domains or market segments where edge AI is transforming what is happening, and it was, and still is, such an amazing journey and really a humbling experience. I came in as an AI expert, and factories started opening their doors for me, such as Audi manufacturing, and I learned a lot from them. I'm really honored, and I can't tell you how excited I am at being able to help make such an impact in how people produce products, to be, you know, greener, more efficient, smarter, higher quality and so on.
Speaker 2: Well, I tell you, such a beautiful thing to remember, which we often forget when we talk about technology, is the line from Halt and Catch Fire: computers aren't the thing, they're the thing that gets you to the thing. And AI, as you said, when you began your journey in that world, starting in academia, it wasn't known as AI, and we often AI-wash a lot of stuff that's going on. Now, you know, AI is the new sticker that we slap on every piece of software. I believe it's normal, as part of the industry, that we will capture marketing momentum from the excitement around it, and then it also creates the fun part. As we know, in some of the analyst world we call it the trough of disillusionment, where all these AI-sounding, or AI-labeled, startups come out, and then natural consolidation begins in the industry. And we saw even edge.
Speaker 2: When you described just the phrase edge, we sort of struggled with even how to define what it really is, but I think we've settled it. Originally, people thought, like, oh, you're going to run full, large-scale compute on small devices; it's impossible. And you're like, well, we've actually developed a lot of hardware capabilities that allow us to run at extremely low power in CPU and memory footprints. And the principle of, what are we trying to achieve? Because if we just look at it incrementally, faster horses, et cetera, all the sayings we know. But in each generation we go back to first principles. It's such a beautiful opportunity to now say, we've had so many changes that we can do things we never thought we could do, because of an amalgam of innovation that's occurred. So I'll ask, Rita: today, what do you see as some of the greatest enabling factors that put AI onto the forefront?
Speaker 3: Yeah, you know, a lot of things are happening that are super exciting in AI. Every day, somebody from my team comes to me and says, hey, you know what? What do you think? I found this new thing. And it's like, oh, how exciting that is. The amount of innovation that's happening in AI is just amazing and, honestly, I know this is overused, but that's okay, I'm going to use it: AI is still in its infancy. We're still going to find a lot of cool things to do, and it's going to grow in ways that we are not expecting today.
Speaker 3: Whoever tells you, I have the crystal ball and I'm going to predict things, I think there's a 99.99% chance they're wrong. So that's part of the excitement. As a matter of fact, I was talking to a colleague the other day. He's new to AI, and he asked me, he says, so, Rita, why? I mean, AI engineers, why are they different than software engineers? They're software engineers, and I can, you know, interchange them with software engineers. What do you think? And I said, no, no, no. Big, big, big mistake. That actually gives you a perspective of what's happening in AI. I know you want specifics from me, so I'll circle back to that.
Speaker 3: But to a large extent, if we start thinking of AI as just software, we are not paying enough attention. That's what I told my colleague, in a little bit more diplomatic words. I actually even did the analogy. I said, my engineers, my AI engineers, my data scientists, my AI software developers, are on a treadmill, and we are running as fast as we can on this treadmill. And I have noticed that the successful individuals who are driving a lot of the AI innovation love being on the treadmill. They are like the crazy people you watch in the neighborhood at 5 am when it's freezing cold, you know, putting on all their clothing and going out for a run. That's what they love to do. So if you start thinking of them as just general-purpose software engineers, you will lose them. And it's not just about money. It's about being part of that innovation and those new things that are getting created every day. So, speaking of new things, I am no different from a lot of people.
The Evolution of Data and Technology
Speaker 3: Obviously, LLMs and Gen AI are super exciting. I am excited about them for some applications that are starting to become more popular, but it's not what most people think about, and that is: they have two abilities that we have not exposed to the full extent yet. The first one is the ability to predict things in general. If you think about it, what an LLM is doing when you prompt it is trying to predict what a very knowledgeable human, based on the data, would have said to you. So it's that next token. What if my next token is not predicting a chat with a human, or writing a piece of code, or generating an image? What if it's something like, is this device going to fail? What is the weather forecast? Historically they weren't that great at numeric manipulation, but that is changing rapidly these days. So I'm excited about this ability. That's one of the things. The second one is, you know, us as humans, we are super duper lazy. We sometimes like to hoard things and then we can't find them, and that is so true.
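Rita's point that next-token prediction generalizes beyond chat can be sketched in a few lines: discretize a sensor trace into a small token vocabulary, and even a crude bigram model "predicts the next token" of the device's behavior. This is a hypothetical, minimal illustration (the binning scheme and the bigram counts are editorial assumptions, not anything discussed in the episode); real forecasting models are far richer, but the framing is the same.

```python
from collections import Counter, defaultdict

def discretize(series, n_bins=4):
    """Map raw sensor readings onto a small token vocabulary (the 'alphabet')."""
    lo, hi = min(series), max(series)
    width = (hi - lo) / n_bins or 1.0
    return [min(int((x - lo) / width), n_bins - 1) for x in series]

def fit_bigram(tokens):
    """Count next-token frequencies: the crudest possible next-token model."""
    counts = defaultdict(Counter)
    for cur, nxt in zip(tokens, tokens[1:]):
        counts[cur][nxt] += 1
    return counts

def predict_next(counts, token):
    """Greedy decoding: return the most frequent successor of `token`."""
    return counts[token].most_common(1)[0][0]

# A toy temperature trace that ramps up and resets; 'what comes after 24?'
trace = [20, 21, 22, 23, 24, 20, 21, 22, 23, 24, 20, 21, 22, 23, 24]
tokens = discretize(trace)
model = fit_bigram(tokens)
print(predict_next(model, tokens[-1]))  # prints 3 (the top bin persists)
```

Swap the token stream for words and the counts for a transformer, and this is exactly the "predict the next token" machinery she is describing.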
Speaker 3: It's the same with data. As technology has made it very easy to generate data at a way more rapid rate than we can consume it, or even know what to do with it, or even understand what this data is, we are right now sinking in data. There are a lot of data islands that people can't reach. And LLMs, or in general not just LLMs but transformer technologies and other techniques growing out of that area, are going to get to a point where they help us find our data, even before we say we're looking for something: anticipate some of our needs, bring some of this data in front of us in ways that we are able to consume it.
Speaker 3: When LLMs started making the big splash last year, I was at a manufacturing event, and many of the customers, many of the factories, what we call end users, came to me. I was sitting on a panel and I voiced some very strong opinions. As you can tell, I'm very opinionated. They came to me later and wanted to chat more, and there was a common theme: these people are very worried that they're going to miss out on this innovation called LLMs if they keep hiding their data. Yet their ability to collect more data is growing at a rapid rate, and I think there is a lot of anxiety that, once they collect this data, what do they do with it?
Speaker 3: A few years back, when I was at Intel, I was called into an executive meeting one time. They brought me in to answer some questions related to data and data management, and one of the execs, who has since retired, said, Rita, you know, data is the new oil. I saw it on the front page of The Economist. And I said, data is not the new oil.
Speaker 3: Data is more like water, right? It's a byproduct of so many things. You get it and it's really hard to move. It's heavy. You need to figure out how to deal with it, and if you store it for very long, it starts to smell funny. So this is not the new oil. All by itself it's not going to give you a lot of money and a lot of value. But, like water, if you use it right, you get a lot of amazing things out of it. You get life from it. So to me, how we deal with our data and how we're able to digest it is really crucial, and it is going to become very, very challenging, even more challenging than it is right now.
Speaker 2: The interesting thing you brought up there is the idea that a software engineer doing AI, it's not sort of, you know, we just slide you over here and change your title. It's a combination, number one, of obviously general knowledge and skill set. But beyond just the programming language itself, I would say the discipline is now much more cross-discipline, right? Because you have to understand much more, including networking and storage and memory. We used to not have to worry about packing too much, you know, in memory. We had the idea of in-memory databases, we had some specialists that worked on how to use that, and so we started to leverage those things.
Speaker 2: But it was never thought of, like, how much do we really want to maintain in memory? It was like, let's just get it as quickly as possible, flush it. But it was a great way to have transition space, or, you know, fast processing. Really neat stuff happened around eviction algorithms, lots of neat stuff that's still innovating today. I mean, I see patents weekly on new methodologies for eviction and optimization. But now we have this incredibly new pattern, and even AI itself doesn't have a pattern; it has a plethora of patterns around training and ingestion and, you know, transforming. So let's talk about the impact on the people, but then, what are we doing at the technology layer that's really going to shake things up?
Speaker 3: You know, I don't know if anyone ever said that to you, Eric, but you have a skill set where, when you're talking, you're making me want to talk about so many different things. So that's great. I'm glad with how you're running this conversation, and I'd actually like to build on it. I'm going to choose a couple of things, and then, if you have additional questions, let me know. It's interesting that you're talking about networking and memory and the intelligence, more than just AI itself. Right? AI is changing what compute looks like.
Speaker 3: As a matter of fact, right before I left Intel, I had a very awesome debate with an awesome colleague of mine. He was super excited about taking some of those large models and getting very creative and crazy about how you divide one to fit on multiple devices so that you can parallelize it, and I basically decided to take a very provocative stand and said, that's never going to work. It's never going to work because it reminds me of when I was pursuing my PhD, when some of the smartest people in computer science were pursuing a problem called multithreading. I was going for my PhD at Columbia, and Columbia, in the city of New York, would attract a lot of people who were about to graduate and would come in and give talks, sometimes job talks, sometimes just practice for the job talk. We got a plethora of folks who are unbelievably smart, way smarter than I am, who would come in and discuss these amazing ideas about multithreading and how you can get sophisticated. Literally, PhDs were created based on multithreading. And then what happened?
Speaker 3: We kind of, as a community, shoved multithreading under the carpet and instead said, well, let's do containers and concurrency, and solve that problem in different ways, with different assumptions. And if you think about it from a technology and intellectual perspective, multithreading is superior. Right? I mean, I've never been in awe of containers. The concept is simple. I love it, I use it every day.
Speaker 2: It's like when people say, oh, it'll make it faster. You're like, no, it's literally impossible. Every abstraction layer you put on is degrading performance. It's like saying that shampoo is good for your hair. No, it's not.
Speaker 3: Right, it works, but I think the key difference between something like multithreading, which is technologically a superior software solution, and containers, which are, you know, almost a hack, a packaging hack, is the fact that all the software that has been built doesn't need to be rebuilt, and that is not trivial. So I'm going to go back to the hardware. It's interesting that you said that innovation is happening all over the place. I would argue that some of the most interesting innovation, from a going-back-to-the-fundamentals perspective, started happening in networking, and that is the day the smart NIC appeared, and then the DPU. Because what happened there? Basically, the networking folks said, why would I just send the packets to the CPU if I can do something smart with them ahead of time? I can improve the efficiency. And they kind of, for the first time, broke with the CPU being at the center of the universe and said, I'm going to do something interesting here. And I think we're going to see more and more of that, not just in the network. Because if I have a very smart network and a bunch of compute nodes, I can divide a very large deep learning network completely transparently to the software. The software doesn't even know that it's actually running on multiple nodes; they will all look like one big node. I think compute is becoming more prevalent across all the pieces in the network, whether it's the NIC, the switch, the DPU, whatever we're calling it, whether it's in storage, whether it's in different kinds of compute nodes that can run particular tasks faster: a GPU, an FPGA, a DSP accelerator, a TPU. We're going to see a lot of diversification. There is so much amazing innovation happening. But also, some of the innovation is starting by saying, I can add intelligence across my entire pipeline.
I don't have to, you know, put all the intelligence in one place and then just serve the intelligence. I can actually distribute it, and that brings me back to the edge.
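The idea of splitting one large network across nodes, transparently to the caller, can be sketched with plain functions standing in for layers. Everything here (the layer functions, the contiguous partitioning scheme, the two "nodes") is a simplified editorial assumption; real pipeline parallelism moves tensors over a fabric, but the shape of the idea is the same: the caller still sees one model.

```python
def make_layer(scale):
    """A stand-in for a neural-network layer: just scales its input vector."""
    return lambda x: [scale * v for v in x]

layers = [make_layer(2), make_layer(3), make_layer(0.5)]

def partition(layers, n_nodes):
    """Assign contiguous slices of layers to nodes (a crude pipeline split)."""
    k, r = divmod(len(layers), n_nodes)
    stages, i = [], 0
    for n in range(n_nodes):
        size = k + (1 if n < r else 0)
        stages.append(layers[i:i + size])
        i += size
    return stages

def run(stages, x):
    """Forward activations stage to stage, as a smart fabric might."""
    for stage in stages:
        for layer in stage:
            x = layer(x)
    return x

stages = partition(layers, 2)
print(run(stages, [1.0, 2.0]))  # prints [3.0, 6.0], same as the unsplit model
```

The point of the sketch is the invariant: however you partition, `run` produces what the single-node model would, which is the transparency being described.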
Speaker 3: To a large extent, one of the reasons why I got so obsessed with the concept of edge two years ago is the fact that it breaks down some of the fundamental assumptions that we have had for decades in AI. For decades in AI, we said, in order to train a model, in order to do anything interesting, I, or a node somewhere, has to have full knowledge of the data, full knowledge of the problem, has to have the full data set. I can't hide things from it. It cannot happen in a distributed fashion. The data all sits in one place, and then I can solve the problem. And that's okay; sometimes we have to make those assumptions.
Speaker 3: What edge is doing today is saying, well, what if you don't? What if your compute is not in one place? I was having a conversation with my boss a couple of weeks ago, and I said to him, as a community, we computer scientists spent the last several decades moving data to compute, going to the cloud, and right now we're swinging the other way. We're moving compute and distributing it back to the data. But that's not just a simple change of location, right? Because, again, what does it mean to train on data that is fully distributed and then be able to act on it locally, without the full knowledge and without even the full system?
Speaker 3: So, anyway, so many exciting things are happening in this space. You're right that at first people thought, well, do I bring big compute down, and that's how I do edge? Or do I trim the software and optimize it so that I can run it on lesser compute? But in reality, the use cases, and how people actually value this work (because if I don't have someone who's valuing it, I might as well not do it), are pushing the boundaries and having us go back to the drawing board and say, well, I no longer have access to all the data. What do I do now?
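One common answer to "train without access to all the data" is federated learning: each site trains on its own local data, and only model parameters, never raw data, travel back to be averaged. A minimal sketch under editorial assumptions (a one-weight linear model, made-up learning rate and round counts; real systems like federated averaging over deep networks are far more involved):

```python
# Hypothetical federated-averaging loop: three sites, disjoint local data.

def local_update(w, data, lr=0.01, epochs=50):
    """One site's training: gradient descent on y = w * x, local data only."""
    for _ in range(epochs):
        grad = sum(2 * x * (w * x - y) for x, y in data) / len(data)
        w -= lr * grad
    return w

def federated_round(w_global, sites):
    """Server step: broadcast w, collect locally trained copies, average."""
    local_ws = [local_update(w_global, data) for data in sites]
    return sum(local_ws) / len(local_ws)

# Three factories, each observing y = 3x on a disjoint slice of inputs;
# no single site (and not the server) ever sees the full data set.
sites = [
    [(x, 3.0 * x) for x in (1, 2)],
    [(x, 3.0 * x) for x in (3, 4)],
    [(x, 3.0 * x) for x in (5, 6)],
]
w = 0.0
for _ in range(5):
    w = federated_round(w, sites)
print(round(w, 2))  # prints 3.0: the shared model converges anyway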
Speaker 2: And the interesting thing is, I'll say, this is what we used to think edge is, forgetting that this is edge. Edge is a place; I'll say it's a form factor, or a distribution factor, really, and also the hardware capabilities. When we first said edge, we pictured this type of thing, you know, a small single-board computer, a Pi or an Arduino, and we considered that, like, oh, it's going to be something that's hanging on a telephone tower. It's like, well, what about the one that's in your pocket?
Speaker 3: Well, edge is actually way more diversified than that. I held those beliefs until I got humbled by factories. I remember one of my trips, and I'll use Audi because they went public on the fact that we did a lot of work with them, so what I'm going to say is not confidential; if you look online, you'll find this information quite easily. If you go to a factory floor today and you open, if they are kind enough to let you, the cabinet that is the robotic controller (because this machinery will look like cabinets, metal cabinets), you'll find that this robotic controller is actually almost a mini rack of servers inside. You've got, you know, hefty CPUs, you've got GPGPUs, you've got some Arm controllers that are controlling the different joints and motors and levers in your robotic arm. So it is sometimes a lot of compute.
Speaker 3: I don't think it's only, you know, your phone and a smart camera. Those are all interesting, but there is a lot of edge that has quite a bit of compute. What's interesting about it is that the data is localized, and many times you have real-time requirements. I can't have a robotic arm hanging in open space waiting for a decision to come back from the cloud. That's not acceptable.
Speaker 2: Yeah, if I think of it, I'm trying to think in my head of the easiest way to explain it. I'd say the edge is being closer to the task, not necessarily to compute or data, whatever it is. Obviously there are other things, but we used to say it has to be closer to data, closer to CPU, closer to GPU. It's more that it's closer to the activity, and we've done other things to enable that. It is such an interesting thing.
Speaker 3: I don't disagree with your definition. There is another one that I like to use because it's, you know, that simple, and that is: if it is on-prem, if it is on a campus, I'm going to call it edge. It could be an on-prem data center, a rack of servers, it doesn't matter. It still has a lot of the properties of being local, having real-time requirements in many cases, right? It doesn't have access to all the data, and so on.
Speaker 3: So I think, from the AI perspective, in many of these applications the on-prem boundary is super simple and actually works in the huge majority of cases. That's what I have used. And then there is a spectrum there, from the racks of servers to the hefty robotic controller, or even stuff that sits in an MRI machine, to little tiny sensors, to smart cameras. As compute is becoming more powerful and more specialized, we're seeing lots of innovation. The smart cameras I started seeing in 2023 and 2024 are able to run a good part of your pipeline, and we were doing a use case with a European partner.
Speaker 3: As you know, privacy is very big in Europe, so we were able to actually blur all the faces. It came as a setting on the camera, using AI, and it worked beautifully. It didn't blur the hands, it just blurred the faces, and we're like, whoa, okay, two thumbs up, I can do that. So, really, intelligence is spreading across the entire spectrum, from, you know, the far-edge super-light devices, to hefty things on-prem, to the network infrastructure and the cloud, which is exciting. Really exciting to be part of this big wave of innovation.
Speaker 2: Yeah. Now for the people side. As a new practitioner coming in, you know, you've seen waves, I've seen waves. I've been at this for a while, as the frown lines around my eyes and my grayish hair tend to tell; I've spent too many hours in a data center over the years. But part of what I learned as an architect was: talk to every team, understand every connection, understand every impact, and understand the business impact most of all. Now, as a new software developer coming into the world and saying, hey, I want to explore AI, what are the other factors and areas of study that you would tell folks they should explore around creating truly optimized, enterprise-scalable AI, as a first principle?
Speaker 3: You know, I think I'm a little bit of a rebel, in the sense that I disagree with many people who are now feeling that AI is a threat to education and, you know, that you have to stick to the roots. I actually remember, when I was a teenager and wanted to use a calculator, my dad getting really mad at me.
Embrace Innovation and Flexibility
Speaker 3: Look at us right now: no one cares that I can multiply numbers in my head. Honestly, I think it was a waste of time, and I think education at every level, not just for software engineers, is going to change. However, one thing remains really, really important, and it's a common thread: be inquisitive and get your hands dirty. I think those are the two skill sets that any software engineer, or any person really, should adopt. For AI and software engineering it's more so, because, again, things are moving very, very quickly these days, but these remain very true. Be inquisitive, keep up, read things. Whatever I tell you as advice today, right, go look at this new version of Llama, or, you know, that paper from OpenAI that talks about general intelligence, those are going to have an expiration date. So my advice should be: keep up. Keep up with these technologies as they're changing. Be humble, be inquisitive, ask people around you what they're doing, and then try things with your own hands. That's how you learn, right? I think some of the times I've learned the most were when I stayed up till 3 am trying to fix a bug, found that bug and fixed it, and, you know, went to sleep and didn't sleep well because I was still all hyped up about it. This is how you get your best learning: by getting your hands dirty. And getting your hands dirty is going to change, because now we have at our disposal things like Copilot, we have the, you know, 70-billion and 100-billion and 400-billion-parameter LLMs. So how we work with them is going to be different. But yeah, I mean, I think those are the two things.
Speaker 3: The other thing that served me well over the years is to be flexible. Sometimes it's a little bit of a random walk to get to your destination and what gets you excited. When I finished my master's degree, I had taken all the classes on neural networks and published in this space, and I was very excited to find a job. I went on the job market between my master's and PhD, and everybody thought I was a weirdo for even talking about neural networks. So I switched to computer networking. I joined a startup for five years in computer networking, and I learned a lot. I learned tons of things, right? I can probably still configure a Cisco router while half asleep. And then I went back to my PhD just as the field was starting to gain popularity. So it's about being flexible, learning from different domains, widening your horizon, and, again, getting your hands dirty, which I think is very important. I've met a lot of engineers who read stuff but never try it themselves.
Speaker 2: Right, give it a shot, try it. Yeah, it's like reading 100 self-help books but never applying a practice from them. We can consume a lot of potentially meaningful information, but it's only when it's actually enacted or tested that you find your own edges.
Speaker 2: Now, thinking about SNIA, it's funny to see the practice areas develop: network, security, storage, memory, you know, cloud, HPC. And then, as we hit HPC, it was like, well, HPC is really storage, memory, networking. Okay, cool, it's a few different things. And then suddenly security had to get invited to every working group, because, well, there's security everywhere. And then there's programmability, and then there's, you know, Swordfish, which integrated all of these different layers.
Speaker 2: And suddenly we realized we are a community, a town of people with different community centers in which we gather for different sports, but the same players may play volleyball and basketball. Maybe they understand memory and, you know, networking, and then threading and distributed systems. So we see all these things. What I found in SNIA as a community is that I can go and find amazing folks like yourself, practitioners in all sorts of different areas, working on some of the deepest, coolest R&D, and then be able to evangelize and socialize ideas and ask questions. Like you said, it's this curiosity that allows it. So when you look at the folks that we know in our peer group in the SNIA community, what is the advantage, as a human, you know, of being among those people and those communities?
Speaker 3: Oh, tremendous, absolutely tremendous. There is a lot of power, obviously, in, you know, talking to your own community; they teach you a lot of things related to you. But there is even more power in talking to people who are different than you, and I actually want to touch on a couple of things.
Speaker 3: So, years ago, I was a fairly new engineer who joined Intel, coming in with a PhD. Obviously I had some pent-up energy and I wanted to write papers. But in the industry, as a researcher, it's not like academia; you don't write as many papers. One of my mentors one day said, you know, why don't you write patents instead? And I was like, yeah, I can do that.
Speaker 3: And then I started writing some, and all of a sudden the patent group invites me to a harvest session. They used to call them think tanks, and they teamed me up with three other individuals. One was a security guy, one was, you know, cell phones and mobile devices, and the third guy was very much a cloud guy. At that point in time I was interested in recommender systems, and I was, like, down the rabbit hole. And I was so pissed at first. I was like, why would they do this? I have nothing in common with those guys. And it was amazing: the four of us, I think, ended up writing more than 50 patents together. It was like riffing on each other's ideas. I was so hesitant at first, really stupid and perhaps a little arrogant; I didn't even want to share my ideas. And then I shared one, and they were like, oh, and then you can do this.
Speaker 3: And I was like, these guys are interesting. I would have never thought about that, which was absolutely amazing and so enriching, and it taught me that there is so much value in talking to people who are not like me, because they ask me questions that I wasn't expecting. Right? They push the envelope, they get me to take a step back and re-evaluate some of my assumptions, kind of like what I said about edge and AI. It's a big step back to re-evaluate: hey, we made those assumptions, whatever, 50, 60 years ago, and it turns out that here is something that's pushing on the boundaries. That's where that tension, that healthy tension, creates new innovations in ways that you haven't thought about.
Speaker 3And and I, you know obviously I'm not the only one that says this or even snea if you look at a lot of the you know open, where open ai started, and some of the innovation that they brought together, they have on purpose, hire people with a very different skillset, and so on. Um, so there is so much value in this. Now I want to share with you something. As a grad student uh, towards finishing up my PhD I got obsessed with this concept of social networks and how social networks gets built as a graph. I had definitely an unhealthy obsession with graphs and general graph theory as a grad student and my advisor knew that, so he had to manage it.
Speaker 2He's like, okay, that's cool, but every now and again he'd say, Rita, not everything is a graph. And I'd go, no, it is, trust me.
Network Analysis and Innovation Insights
Speaker 3So, anyway, I got obsessed with that and I wanted to be able to capture a social network. And there was a social network, I'm not going to mention their name since they're no longer popular, but they used to list everybody's friends on a website, so you would potentially be able to write a crawler. I remember one time, like many grad students, I was geeking out with a bunch of other grad students in the lab and we were talking about it, and one of them said, well, that would be a lot of work. And, as is my custom, I took this as a challenge. It was a Friday afternoon, and by Saturday morning I actually had a crawler that was able to crawl that network. And I crawled it, and it was roughly around 2 million users with their interconnections.
Speaker 3There were a lot of exciting things that I learned along the way, but what was fascinating is that I started drawing pieces of that graph. Every time I would take someone, let's say I took you, Eric, and you're the center of your own universe, and I would draw your friends, and then your friends' friends, and so on. So I would do, you know, n equals two, three, four, and what I would always get are these awesome-looking flowers that had almost petals, but then some of the petals had connections. It's like, what the heck is that? So I dug a little bit deeper, and I found that most people have a few interests, and each interest will immediately bubble up, either as one group or multiple groups, with what we call cliques, right.
Speaker 2Right yeah.
Speaker 3Highly, highly highly connected group of people that know a lot about each other and friends of friends, and the friend is also friend with all the friends, and so on. Right, and these are communities based on topics and these will be like very thick petals, and then I would see thin petals and most of these were things like very thick petals. And then I would see thin petals and most of these were things like family members, friends from high school. Right, they're not as strong, they have other interests, they don't have as many affinity to each other, and then, every now and again, these groups will have some connectivity between them that the, the social networking community, calls bridges right.
Speaker 3So yeah let's say you are interested in golfing and playing tennis, so you become that bridge between the golfers and the tennis players, and then there will be communities that are completely separated, that have no connection whatsoever, and these were controversial topics related to politics, without going into the details, and it was fascinating to me, and I, you know, plotted several individuals and did some analysis on them. I never published that work, by the way, because my advisor was hustling me to graduate, which is fine, no regrets there. But it was interesting seeing the diversity as a graph, the people who are interested in very different things, and how that acted, how that not only reflected on them but it reflected on the network and how they became a stronger bridge if they were interested in things that are very different from each other and that made this whole concept of interest and diversity of skill set, even put a picture on it, which was pretty darn cool.
Speaker 2And I think that is the wonder of what we have. We've been given so much capability with innovation and technology. And it's funny, I sort of started off thinking we would go back in the way-back machine, talking about, you know, HPC, which then had multiple waves and is now sort of cool again because we just call it by a different name.
Speaker 2But in the end, to go to this very thing that we talked about at the start, the thing that gets you to the thing, that's it, right? We're not here to discover the things we know, but to find the edge cases and find those anomalies, and then realize that those are not as anomalous as we believed, and that there are patterns to the anomalies. And in fact, a different way of measurement and prediction, as you say, is sort of the buried lede of LLMs. We believe that it can complete a sentence, it can write a paragraph, a blog, a book, but in the end it's like, how and why does it do that? And then we take a look at the other applications, and what allowed us to get here, as you said, is hardware, software, people, innovation, breaking barriers. And I think the acceleration now, this flywheel effect, is incredible, because, like you said, we're going to use AI to build AI.
Speaker 2We never had that before. We had we wrote on paper. You know, I've got a friend, he, and he said he was a student of John Nash and he says, amazing thing, he's like we used to write our code on paper and he would just look at it and go wrong. I know that he was your compiler and because you had to find somebody who was that good to be able to move that fast. But now I can just pick up a phone, I can download an app and I can be that fast absolutely, and and think about what will that would do for innovation right right because only the people who were able to write code on paper that would pass the John Nash test were coding.
Speaker 3So now the entire world can code, with prompting and bringing their own creativity and personality. Think also about the innovation. I remember the days when you had to be someone who can write code in C or even Java to call yourself a developer. And then this happened right, and everybody were writing apps. And gosh darn it. Apps that they write are so much better than the apps that I would write, because my running joke is that I'm an engineer and you ask me to give you an interface. I'll hand you a command prompt or a Jupyter notebook, and nobody wants that from you. But imagine all the creativity and the awesome apps that people have come up with. These are not your c, c plus plus developers, and thank goodness for that. So now we're even opening the aperture more, and it's going to be super exciting it's, uh, it's a beautiful time.
Speaker 2I always say every day is the greatest day to be super exciting. It's a beautiful time. I always say every day is the greatest day to be in the industry we're in and in the world we're in. There's obviously plenty of challenges in the world and there's plenty of challenges in technology, all of which are being advanced and dealt with and thought about and acted on by amazing people like yourself and your team. And so, rita, I could literally do this all day. I'm only saddened because we don't have enough time to keep going. But, yeah, let's chat more. For sure, and for folks that do want to connect with you and learn more about your work and maybe you know, obviously, obviously amongst Nia, we've got lots of opportunities to connect to folks through that way but what's the best way to reach you, rita, to find out more?
Speaker 3Yeah, you can definitely find me on LinkedIn. I have such a unique last name, it should hit just me. But I'm also Rita Wouhaybi at solidigm.com, and I would love to hear from everybody and geek out and collaborate.
Speaker 2Yeah, as somebody who has the most common name ever, this is how I ended up with DiscoPosse as my moniker. Because Eric Wright is Eazy-E, you know, there's an author in the US, an author in Canada, there's a football player. If you look up Eric Wright, I'm like the 19th page of results, but if you look up DiscoPosse, that's it, there's only one of those. Fantastic, Rita. Thank you very much.
Speaker 2And again for folks, definitely check out the work that Rita's doing, both in and out of the SNIA community, and all the amazing stuff. We've got lots of great podcasts like this with amazing people of interest, and folks you can meet in person as well, because we've got events like SDC, the Storage Developers Conference, and I'd say it actually should be the SNIA Developers Conference, the Innovation Developers Conference; it is so much more than just storage, so check out more. We've of course got the audio version of this podcast, and there's a video version too, in case you're listening and want to see all the cool stuff we're talking about: head on over to the SNIA YouTube channel. So, thank you very much, Rita, it's been a real pleasure.
Speaker 3Thank you, bye-bye.
Speaker 1Thank you for listening. For additional information on the material presented in this podcast, be sure to check out our educational library at snia.org/library.