The Fat Pitch

Data Ownership - A Fat Pitch with Alexander McCaig

July 12, 2023 · Season 1, Episode 9 · Clint Sorenson and Paul Barausky / Alexander McCaig

Alexander McCaig, Co-Founder & CEO of TARTLE discusses the transformative potential of data in the context of AI and society. McCaig shares his journey and insights into creating a technology that empowers individuals to take ownership of their data and use it as a valuable asset. 

The conversation delves into the historical significance of data and its evolution in the digital age. McCaig highlights the benefit of shifting from a coercive-based model (where data is used to drive people in their behaviors) to a needs-based model (where individuals are empowered to control and benefit from their own data). He also touches on the broader implications of data ownership and its potential to drive decentralization and democratization. What's the fat pitch in data? Tune in and find out.

RECORDED JUNE 30, 2023

Transcript


Paul Barausky:

Thanks for joining us on this episode of The Fat Pitch podcast. As always, I'm your host Paul Barausky, Chief Distribution Officer of Sealy Investment Securities, and I'm joined by my co-host Clint Sorenson from WealthShield. Clint, want to tell everyone who this week's guest is?

Clint Sorenson:

Yeah, this is awesome. This is one I've been looking forward to, just because of everything going on in AI. You know the Ds we talk about: decentralization, deglobalization, decarbonization, but the most important D is data. So we've got Alexander McCaig here to talk to us about why the fat pitch is in data, and in particular data ownership. So Alexander, welcome to The Fat Pitch podcast, and thanks for joining.

Alexander McCaig:

Thanks, guys, I really appreciate you having me on, and I'm pretty excited to be talking about data itself. It's going to be fundamental for society for a long time going forward, and the key is going to be figuring out how we use it properly to continue to evolve the systems we have in place. So I'm very happy to talk about it today.

Paul Barausky:

Yeah, we're excited to have you, and kudos. Nobody listens to this in real time, but here the three of us are, committed to talking about data on the Friday before the long July Fourth holiday. I think you're out in Santa Fe, Alexander?

Alexander McCaig:

Yes, beautiful Santa Fe.

Paul Barausky:

I'm in Dallas, so we've all chosen to stay inside in the air conditioning. But yeah, I'd love to learn more about your origin story, right? How you ended up where you are now.

Alexander McCaig:

Yeah, most definitely. And I'll preface this with how we think about developing systems. Many of us out there have basic skills and half-decent thought processes, and if you apply hard work to that, and a little bit of luck, and being in the right place at the right time, things start to evolve naturally on their own. And it cascades into something rare, like, wow, this has turned into a thing that is actually going to be helping people, or other systems, depending on what you design it for. So many moons ago, back around 2015, there was a lot going on with things like Cambridge Analytica, the Experian hack, stuff like that. And there were brief sprinklings about how data was being used and essentially lost, because there was bad regulation around how people were storing data, this intangible thing, and how companies were really using it to understand people like Paul or Clint: what is their profile, how does their identity mirror what they do in their day-to-day life? Back during that time, my background is in finance, and a good friend of mine, Jonathan Chelan, and I, we share the same alma mater, though many years apart from when we graduated, and I'd say he's probably the more handsome and more intelligent of the two of us; I'm just a good hard worker over here. So we were sitting around in New York City, in a steakhouse. At that time I did eat meat; I'm a vegan now, and data, right, leads me down the path to making other decisions. And there were some issues that we both, in what I would call a larger, giving-back sense, wanted to solve. We saw that of the people who participate in the global economy, about 60% don't have access to specific financial products. They don't have access to bank accounts, they don't have the proper paperwork, the things that would allow them to be a part of what we are so accustomed to here in our Western, developed society. Does that make sense so far? And I'll show you how this rounds back toward data, and please stop me, because I can monologue all day long. Now, I was writing some papers at the time, essentially on the interconnectedness between all the different nodes that made up the system, where things kind of fell off, and where regulation falls short of allowing people to participate in this thing that lets us thrive over here and live a rather fine material existence. So Jonathan and I are in discussion, we're looking around this steakhouse just off Rockefeller Square, and everybody was on their phone. Which, first of all, is a travesty, because when you're out to lunch you should be having a nice social conversation over a good meal, but everybody had their heads buried in these devices. And at that lunch, as we were discussing these disparities in the global financial system, I said, hey, what about the thought of underwriting people on their data?
It's all well and good that you can collateralize assets, right, to say, oh, if I want to take out a loan, what do I have to put up against that risk if I'm the bank or the underwriter? And for many of the people you want to participate, they don't have cars, they don't have heavy material assets, they don't have a portfolio of these things. But what they do have is information, which they're generating in droves; at that time there were about 3.4 billion smartphones on the planet. People in the poorest nations do have access to them, and they will charge their phones outside specific city shops or off commercial light poles, whatever it might be, just to make sure they're on the grid. It was something very pervasive. So it's like, how do you use the power of that to benefit people, to bring them into this system where they can participate in the global economy? So we're like, well, let's underwrite them on the data they're creating. Let's allow them to use that to get access to financial products. Awesome idea, great, why don't we do that? And then, wait a minute, hold the phone, we've got a problem. How would you know how much this data is worth? Is it like a truckload? What does an asset look like? Where are the limitations? How do you define how much there is, what the price is, how fungible this object could be, and what's the value to the people doing the underwriting? And when we looked at that, we realized that truly, when it comes down to the behaviors, the thought processes, the emotions, perspectives, and feelings of a human being, that can be tremendously valuable for understanding their flows and really de-risking that person if you were to lend them something. Because essentially, Paul, this is my first time meeting you, but over time I develop data through a relationship with you, my mind is composing that, my feelings, my emotions, and so the next time I engage, it's like, this relationship I have with you, Paul, is quite trustworthy, and I'd be willing to do things I otherwise wouldn't do right out of the gate, unless I'm just not that risk-averse and I'll do anything under the sun, right? So with that, we realized we needed to create a technology that took data which people were generating in droves and turned it into something that was packaged, fungible, secure, had the ability to transfer itself peer to peer instantaneously, and appropriated a value to it. And what we found is that we created a marketplace technology that allowed normal people all over the world to freely participate in taking their data down off the internet and putting new data into this thing, which was a Data Vault. So in order for us to get to that point... go ahead.

Paul Barausky:

Did you say a Data Vault? I want to make sure I heard that correctly.

Alexander McCaig:

Yes. Much like you think about a bank where you're storing your assets, take that same perspective into the very intangible notion of being online, and you start storing your information. Then you can pick and choose when to withdraw those things and use them essentially as payment, right? I'll share this with you and you can give me something in return. And what we found is that by taking this step back, it unlocked a world of possibilities for how this data can be used, removing the silos around where data was sitting. If you can channel that back down to the core person who's generating it, it reduces the friction in the flow of information, so that societies, systems, and enterprises can take this, make better decisions, and do a lot less guesswork. You guys follow me on this?
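To make the vault idea concrete for readers, here is a minimal Python sketch of a consent-gated personal data store. The class names, fields, and pricing rule are hypothetical illustrations of the concept, not TARTLE's actual data model or API.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class DataPacket:
    """One packaged, shareable unit of personal data (hypothetical structure)."""
    category: str              # e.g. "health.habits" or "finance.income"
    payload: Dict[str, str]    # the data itself
    asking_price: float        # value the owner attaches to sharing this packet

@dataclass
class DataVault:
    """A person's store of data packets; nothing leaves without explicit consent."""
    owner: str
    packets: Dict[str, DataPacket] = field(default_factory=dict)
    ledger: List[str] = field(default_factory=list)

    def deposit(self, packet: DataPacket) -> None:
        """Put a new packet into the vault."""
        self.packets[packet.category] = packet

    def share(self, category: str, buyer: str, offer: float) -> Optional[DataPacket]:
        """Release a packet only if the buyer meets the owner's asking price."""
        packet = self.packets.get(category)
        if packet is None or offer < packet.asking_price:
            return None  # no consent granted: the data stays in the vault
        self.ledger.append(f"{buyer} paid {offer:.2f} for {category}")
        return packet

# Usage: the owner deposits data, a buyer makes an offer, the owner gets paid.
vault = DataVault(owner="paul")
vault.deposit(DataPacket("health.habits", {"smoker": "no"}, asking_price=5.0))
print(vault.share("health.habits", buyer="acme_health_plan", offer=6.0))
print(vault.ledger)
```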

Paul Barausky:

Yeah, very much so. One of the questions that comes to mind is that you're creating, essentially, and I'm going to use the term product for lack of better vernacular, the question running through my head is, how do you find the value? The old saying is, something's worth what the market will bear. So in the infancy of this thought process you had, is it just price discovery? You put that data out there in the early stage, and obviously then it matures and markets become more efficient. But it sounds to me, and as Clint knows, I was in the half of the class that made the top half possible.

Alexander McCaig:

You were the base supporting them, yes.

Paul Barausky:

I was helping them achieve greatness. So I'd be very curious, since you were clearly at the onset of this. All we heard about was, hey, Cambridge Analytica is convincing people on Brexit and Trump and everything else. But you, on the other hand, were out there looking to create a market for this data early on. How do you find the value?

Alexander McCaig:

Yeah, that's a really good question. For so long, companies were mining tons and tons of data, and they weren't truly sure what to do with it. And the most powerful things in the world are the things that actually control people. If you can control them in droves through behavior, I mean, what better technological weapon in warfare to use than data itself? Why launch a missile when you can get people to eat themselves alive? So that model of data aggregation and analysis turned into something of a coercion-based system, where you're using data to drive people in their behaviors without them knowing it. Now, the technology we designed flips that thing on its head and turns data into a needs-based model, where people define themselves and tell others what they need. Then the market can meet them with the proper products and services to match where those gaps are, and better understand us as a society, or what we're voting on, or anything of the sort. And you'd asked about price discovery. Why would anyone pay for something they were seemingly getting for free before? I mean, we're stealing it, so why would we pay for it? We've got a great business, this is fantastic, it's like the IRS, you know. And to be clear, I have no political views here, this is just objective observation, folks. So troves of this data were being aggregated and stored and put into algorithms to say, this is what I think Paul is like, and if I do a couple of tweaks, or show him something, it may nudge him behaviorally in a certain direction. And if you nudge people half a degree, but do it very often, they find themselves on a totally different path than the one they otherwise would have been taking. But that's all historical information, and these algorithms were taking best-guess approaches. That was essentially the basis of machine learning and deep neural networks: let's take a bunch of inputs, a bunch of weights, a bunch of biases, and we're going to say that this is how a human operates, and depending on the data they put online, we're going to predict the outcome they're going to end up at, and we want to drive that outcome. That's where the coercion-based model comes in. But what if we could refine this into the future and make it totally needs-based, where I know exactly what people's intentions are going forward? So rather than collecting data about the past, which is only 50% of what's going on, let me take something real-time, plus the intention of where someone is headed, and then measure the delta between the two, and that's where the value sits. That's where the market gets made. That's why companies, even when they were seemingly getting all this stuff for free, still come to us and say, hey, can you help us get something directly from people to let us know where the future is headed, rather than us guessing about it?

Clint Sorenson:

That reminds me of The Social Dilemma. If you watch that movie, they said, if it's free, you're the product. Well, the product was the data, right? They were taking it, stealing it, as Alexander said, stealing data from people to manipulate them into transacting or using the platform.

Paul Barausky:

The transaction could be buying the new irons because you'll hit the ball 20 yards further, or voting to leave the EU. It doesn't matter, or in Alexander's example, eating yourself alive. I'm going to tell you, Alexander, I think you'll like this quote, you might use it. The late, great Jim Morrison of The Doors said, he who controls the media controls the mind. He could have added, for today, social media in particular, prescient as he was back in the 70s, when he saw the...

Clint Sorenson:

The data, the data.

Paul Barausky:

It's funny, today he'd say he who controls the data, because back then all you had was major media. That's all there was when he wrote that quote, which has always stuck with me. And I think you've advanced that to data.

Clint Sorenson:

And I love the whole idea of flipping that on its head, right? That's what you said earlier. If you think about that comment, if it's free, you're the product, what you're doing is essentially giving data ownership to the individual, which is like the purest form of decentralization that I can think of. And that's what excites me the most about what you're doing. Because I think ultimately you have this collision course, right? AI compute spend, AI is going to gobble up so much data. If you just think about it from a supply and demand perspective, that demand is going to be exponential, and that supply is going to get more valuable. At least it should; I mean, I'm throwing traditional economics out the window lately, but that's at least how it should work.

Paul Barausky:

Somebody is rolling over in his grave.

Clint Sorenson:

That's right. So I mean, is that kind of what you're doing? If you think about it, that's such a beautiful thing, in my opinion. But, you know, if you don't mind, elaborate on that.

Alexander McCaig:

Yeah, you're exactly correct. And I would say that, in a way, I'm not personally doing it; what we did is create a technology that allows the world to do it themselves. And I think there's a foundational change that's going to happen in society, especially in a Web3 world, where we become a little more self-responsible for many of the things we're generating. Thematically, if we look at the Manhattan Project, and I'm going to take you a little bit through history and human evolution here, when we first realized there was a ton of energy in the efficient process of splitting an atom, we were like, let's put that thing in a bomb. It's like, come on, guys, are you serious, this is the first thing we jump to? We were so unevolved in our processes, so barbaric. And then things evolve in the world, technology becomes a little more delicate and intangible, not so brutish and straightforward. And with the development of these new things around AI, well, AI has been around for quite some time. Gosh, there's so much to get into here. I'm going to walk you through it.

Clint Sorenson:

Yeah, roll with it.

Alexander McCaig:

Yeah, no problem, I'll get in the flow. So I kind of nerd out about this one specific object. I like to fly, I'm a little bit of a pilot here, and there's probably one thing I'll never have the opportunity to fly unless I were some ridiculous billionaire: the SR-71 Blackbird. Lockheed Martin designed this thing in 1962, with a slide rule. All well and good, fantastic aerospace engineering, spycraft, really quick, Mach 2.5, probably a little bit under that. And the real beauty of it is that they commissioned a company to design a smart system, essentially a computer. Now at that time, IBM had computers filling up entire rooms, okay, you've got punch cards, you're trying to do advanced trigonometry to figure out how you're going to land a lunar module on the moon. But Lockheed Martin was like, hey, why don't we use this really cool gyroscopic accelerometer with a camera on it, have it look at the sky, but make it tight enough to fit inside this airplane. So this thing that looked kind of like a trash can, like R2-D2, used to sit in the back of this aircraft with a viewing port. Now, this system could tell the pilots where they were anywhere in the world at any time, regardless of cloud coverage above them, in under a minute and ten seconds, and it could completely autopilot the plane itself, in 1962. So it had a system that was essentially learning, a computer, a fast-paced one, responding very quickly to the elements happening outside and the current positioning of the stars in the celestial sphere. Pretty remarkable. So this thing is taking an input from the outside, right? It's viewing, it's observing, taking this data and crunching it really quickly to essentially fly an aircraft at high speed, operate cameras, and make sure you're headed on the right path. Now, computing at that point was quite remarkable, and the government for some time had realized there's a lot of power in computation beyond just building the missiles themselves, or new devices to blow stuff up or shoot things, or a new ship. How can we analyze massive amounts of data to better understand where things are headed, or what's going to happen, before the enemy does? And so computation in aviation and the growth of computers started to advance so drastically that when the internet came out, after it was developed in California, and we started to apply more inputs into the systems beyond just aeronautical engineering, put it into basic computers with a heck of a lot more data, systems got really, really smart. And so with that, we continued to develop and advance things, and AI came about much, much earlier than we recognize it. If we think about OpenAI today, the company has been around for years, but it takes so long for the public to actually see and perceive the value of what's occurring. Even in our case, people are like, what are you talking about, my data, I can sell this thing, it's worth money? It only took six years for people to start figuring that out. It's a long process, but when it hits, people are like, oh yeah, it makes totally obvious sense now. So as things continued to develop further and we had better computational power, these devices could crunch and crunch and crunch, and our ability to do statistical analysis became tremendously efficient.
And predictive statistical analysis with these things called regression-based models allowed us to build this thing called machine learning, which is essentially the base of artificial intelligence, and if you go a layer deeper, you have neural networks. That is to say, if I want to create something artificial and have it map onto how my brain currently fires with its neurons, I need a computer to essentially do that same process mathematically. Now, when these things were in their infancy and these programs were running, the processes and algorithms generate this waste, call it trash, okay, I couldn't name it any better. It's all so dependent on the accuracy of the data set and how quickly it can process vast amounts of information and infer from it. It finds this thing called a local minimum, which is like, what's my most efficient course to determine, with the highest predictive aptitude, that I have 90% confidence the image this thing is taking in is the number nine, or that this constellation is Ursa Major if I'm flying a plane, right, because you want to make sure the plane's on the right course. So we've developed these processes so much, reducing the amount of trash and finding those local minimums as quickly as possible, and we've now begun to apply those principles to the things that are AI: natural language processing, audio inputs, video analysis, whatever it is, you name it, or essentially the motor functions of a robot. If you want to get its joints to move properly, it's like, how do we know if it's going to be moving in this direction, or using a certain amount of energy? It can do those calculations and say, I'll last this long, or make this large of a leap, right, if you're Boston Dynamics. Now, where does the real efficiency in AI grow? Where's the real growth, beauty, and goal of this thing? Because for so long, from the early 60s to where we are now, all we've done is drive this thing into a weaponized technology. And if you mirror that in the commercial space, data has been used to drive consumers to do things they otherwise wouldn't want to do in the first place, or to drive whole behaviors of nations. So when we begin to reverse that, throw it on its head like Clint had said, democratize it, decentralize it, put the power and ownership back in the hands of people, it offers a large amount of tangible benefit for the companies, which are, one, using the AI to create new sorts of systems that give us output and create efficiencies, like Goldman Sachs with their 1.5% number, right?
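For readers who want to see the "local minimum" idea he is gesturing at in code, here is a tiny, generic gradient-descent sketch in Python on a loss curve with two dips: depending on where it starts, the optimizer settles into the shallower local minimum instead of the deeper global one. This is a textbook illustration, not anything specific to TARTLE or the systems discussed.

```python
def loss(x: float) -> float:
    # A bumpy one-dimensional "loss" curve with a shallow dip near x = -1.5
    # and a deeper (global) dip near x = +1.7; the numbers are illustrative.
    return 0.1 * x**4 - 0.5 * x**2 - 0.2 * x + 1.0

def grad(x: float, eps: float = 1e-6) -> float:
    # Numerical derivative of the loss at x.
    return (loss(x + eps) - loss(x - eps)) / (2 * eps)

def descend(start: float, lr: float = 0.05, steps: int = 500) -> float:
    # Plain gradient descent: keep stepping downhill from the starting point.
    x = start
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# Starting on the left, descent gets trapped in the shallower local minimum;
# starting on the right, it reaches the deeper global minimum.
left, right = descend(start=-2.0), descend(start=+2.0)
print(left, loss(left))     # ~ -1.48, higher loss (local minimum)
print(right, loss(right))   # ~ +1.68, lower loss (global minimum)
```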

Paul Barausky:

Or your productivity enhancer, enhanced productivity. For those of you listening, we talked about this ahead of recording. I asked Alexander who appointed Goldman to magically come up with 1.5% productivity. Any comment before we move on?

Alexander McCaig:

Honestly, it's remarkable they can even come up with that, which tells me the process was something like moving whatever the work was from one desk to the next, and if you cut maybe 30 seconds off that process across 10,000 people in your workforce, you probably get to something like 1.5%. I don't know how they did the calculation.

Paul Barausky:

Or what the multipliers were. I didn't mean to derail, but you said it, so let's assume it's a rough approximation.

Alexander McCaig:

No, it's all good. For so long, when these systems were designed, whether to be weaponized or to coerce people into doing things, all the power, all the reliance, the chip processing, everything, was to say, let's take all this data, lacking the input and context of the people or systems we're getting it from, and let's determine what's best for them. And so you're doing so much guesswork, assuming, Clint, that this is something you're going to want, because we can tell by your past. If you always take a left-hand turn, you go to work at this time, you're probably going to want to do this, or change your tires at some point, I don't know, it doesn't matter, whatever thing you want to advertise to him. But the difference is, one of those days you choose to take a right-hand turn, one of those times you choose to take time off from work because you're not in the proper emotional state that day. Those systems don't know that. All they were doing was saying, let's take humans, put them into a box, statistically analyze them, and say this is the minimum, this is how we can make their lives more efficient, without any context or input from them, without them being a part of it. Now, if we put the power of data back in the hands of the people, they can give that layer of context. They can actually tell you what they need, when they need it, what their intentions are, how they feel, what they perceive. So it's not about a system on the outside guessing and saying, this is what the world looks like for Clint. Clint defines it himself. In that definition is the sweet spot, because the systems that are looking for that efficient local minimum, saying how do we get to that path quickest to make this best for Clint with the least amount of trash in his life, can now understand the global minimum. Let's take a true measurement of people all over the world, using a technology like TARTLE to source this information, run it through our systems to enhance our AI algorithms with something that's truly a proprietary data set, understand what their needs are, and deliver something that is fundamentally robust and beneficial for the human being and their evolution. Technology has been evolving for the sake of evolving technology, and what we've created here allows people to be responsible for what they're creating and use it to help evolve themselves. And I think that is the great tide change and turning point for humanity going forward.

Paul Barausky:

So, question for you. Thematically, I love what you're saying. Now, application: real life, for an individual. One of the things that's hot right now is health, and the ability, so instead of the healthcare industry selling me things, I can control my data and predict things that I may need, so that I can have a higher quality of life and better health. Is that an application for data? Am I getting this right?

Alexander McCaig:

So let me preface this: I have no PhD, I am no doctor, none of those things. But what I do care about is the health outcomes of human beings. I want people to live a long time, I want them to be healthier, and I want all of these systems to be used properly, with the resources going to those who are truly in need of them. Now, in the healthcare space, and I'm glad you brought up this example, 23andMe was swabbing everybody's mouth, okay, and collecting a lot of data, so much data; a raw genome file is a couple of gigabytes, right, it's just a lot of information. And with that, if you're in the biopharmaceutical realm and you have that info, it can help you with drug development, or with specifically identifying people and what a possible outcome of taking a drug may be, which is an interesting application of what AI is doing now, right? How will this body react to this new chemical structure if ingested, without actually having to run the trial right out of the gate? There's a huge amount of value in that. So 23andMe started selling people's genome sequences, or facets of them, chunks of that information, to biopharma so they could develop drugs, or find people: you are perfect for a clinical trial for us. Now, they started doing that without consent. So that's an issue, and when people found out, it was, okay, we've got to stop the ball on that one, we can't be sharing that; they got a good old slap on the back of the hand. So where do you find the ability to then go unlock that healthcare data, whether it be genome sequences, or a very hot one today, which we help a lot of insurance plans with, called social determinants of health? In the healthcare industry, we've measured everything about when people are already sick in the hospital, when they get out of it, and when they return to it. The one thing we've missed is all the habits, behaviors, and everything else that leads someone to get to that state of illness and actually end up in a hospital. If you can prevent it and catch it early on, there are massive economic cost savings for the hospitals and the plans themselves, and for the people, their savings is that they're healthier, right, which I think is a huge benefit to life itself beyond just money. So these healthcare plans or hospitals will come to us and say, hey, we have a whole bunch of members out here, and we're doing a really poor job of engaging these individuals to build a profile on them and understand where our risk sits. They're all out there, but we don't know much about them. We'd all love to think that our healthcare plans and our doctors know us from soup to nuts and that they're going to tell us the right thing when we walk through the front door, but truly, most of these plans don't even know what language is spoken in the home. So how are you ever going to get to the point of knowing what someone is eating, whether they smoke, whether cancer is part of the family history? All those things, if you can collect them ahead of time, allow you to cordon off all your members into buckets: here's the high-risk group, here's medium, here's low, and here are the resources needed for each. So we help them actually collect that information from people.
People are generating all this healthcare data, healthcare profiles, social determinants of health, inside their TARTLE Data Vault, and these plans can come to them and say, hey, these elements about you, we would like to ask you to share those with us, and we'll pay you in return. Now, depending on what that plan is doing, and many of them are state-mandated to collect this information to better serve you, Paul, or you, Clint, they also have a certain amount of money they can pay to incentivize a member to actually share that information with them or do something, whatever it might be, and some of them are up around $80. So if you think about that context, somebody who has very little, or has trouble feeding themselves, could be at quite a high risk, whether in a mental or physical sense. You can help financially alleviate their current situation, you can collect data from them so that you can reach back out to them with the proper resources, and then you as a plan can intervene at points that prevent them from actually getting to a state of true illness. That's where the real beauty of this thing starts to happen. That's where the healthcare system stops being reactive and becomes proactive. And studies are starting to come out showing that there are massive economic benefits and also long-term health outcomes for these people that otherwise weren't possible.
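For a concrete picture of the "buckets" he mentions, here is a small Python sketch that scores members on a few self-reported social determinants of health and sorts them into high, medium, and low risk tiers. The fields, weights, and thresholds are invented for illustration and are not clinically validated or drawn from any real plan.

```python
from typing import Dict, List

# Hypothetical weights for a few self-reported social determinants of health.
RISK_WEIGHTS: Dict[str, int] = {
    "smoker": 3,
    "family_cancer_history": 2,
    "food_insecure": 3,
    "no_primary_care": 2,
    "language_barrier": 1,
}

def risk_score(answers: Dict[str, bool]) -> int:
    """Sum the weights of every risk factor the member reports as true."""
    return sum(w for factor, w in RISK_WEIGHTS.items() if answers.get(factor, False))

def bucket(score: int) -> str:
    """Map a score onto illustrative high / medium / low tiers."""
    if score >= 6:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

def triage(members: Dict[str, Dict[str, bool]]) -> Dict[str, List[str]]:
    """Group member IDs by tier so outreach resources can be allocated."""
    tiers: Dict[str, List[str]] = {"high": [], "medium": [], "low": []}
    for member_id, answers in members.items():
        tiers[bucket(risk_score(answers))].append(member_id)
    return tiers

# Usage with two made-up member profiles shared (with consent) from a data vault.
members = {
    "m001": {"smoker": True, "food_insecure": True, "no_primary_care": True},
    "m002": {"language_barrier": True},
}
print(triage(members))  # {'high': ['m001'], 'medium': [], 'low': ['m002']}
```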

Clint Sorenson:

Think about homelessness and mental health, right? I think about this all the time: hey, if we could print $9 trillion in a matter of two months to essentially do a rolling bank bailout, why can't we print a trillion dollars for proper mental health care, right?

Paul Barausky:

I live in the middle of it. As Clint knows, I'm in an urban area in Dallas, in a high-rise, and I'm close to an area that is continually seeing more vagrancy, and the mental illness is apparent.

Clint Sorenson:

Proactive data gathering is at least a nice attempt to try to preemptively solve some of these issues, right, to get to these people before it becomes too difficult to really turn back, before help is no longer on the way. And so I love the strategy, Alexander.

Paul Barausky:

What other areas? I mean, healthcare jumped out to me as obvious, and apparently you've put a lot of thought into it. What else is really a highlight for you, other industries or areas you're really seeing right now?

Alexander McCaig:

Banking and financial services. I'm sure this one will hit home for you guys.

Clint Sorenson:

Yeah. Oh man, I promise everyone, we did not plan this. We decided to do this raw and organic, and this isn't Paul teeing up Alexander.

Alexander McCaig:

Yeah, I don't know what you guys are going to ask me, so I'm just going to respond as I go here. We all know that information is the edge, right? Whether that's finding trends in the market, or the next stock to pick, or whether it's insider trading, information is the edge. So what happens when you open up the ability to see where markets are going to move, again, intention-based, and start to source data from the people who are using the products and services? Does it actually, fundamentally match up with what's going on in the financial markets? Is someone drumming this up? Is there actually a weakness to it? Is it more marketing than real, tangible value? So I'll give a couple of examples. Banks will come to us and say, hey, we've realized that Western banking regulations don't really work well in developing nations. People don't have things like a labor card, or a passport, or a driver's license, depending on where they are. But they're working hard, they store money under their mattress, and we want to offer them financial products that can really help spur economic activity in the area, and also, you know, the bank gets to make its nut off a couple percent of whatever the product is. So they'll say, can you help us start sourcing information from people in, say, geographies like West Africa or the Philippines, and help us develop a risk score? For them, it's like, I don't have any documents on Clint, you know, Clint in Mozambique. He's got his kid in the United States going to school, which is something that happens all the time, and your kid, Clint, could send you money. But because you are an individual of the nation of Mozambique, you're automatically disqualified, because you're in a high-risk area. It doesn't mean you're a bad person. So again, the world's starting to change here. It doesn't have to take just that big macro view; we can go micro, because of computational power, and assess individual risk, not statewide risk. So the banks come to us and say, hey, we want to do this individual risk, start to measure it on people, and offer them directly the products or services that match, depending on the risk level they're at, and have them share these documents and data, so we can say, okay, our algorithms and AI have said these are the things this person qualifies for. Now let's open up that level of engagement and offer them those things. So banks will do that, and they can do that in a bunch of different areas to try to get that edge. But let's go a little deeper here. Let's talk about alt data. Quantitative hedge funds: their whole game is measuring delta, right? In their case, they'll come to us and say, we want to ask a question or source a piece of data, one specific thing, from a vast number of people. We want to do it a lot, very frequently, and then we want to see it change, and then we want to go back to those individuals and ask them why it changed. That right there is the bread and butter of how those funds get their leading edge with information: understanding why the delta occurred. So measure a lot of it, get confidence in the measurement, and then ask why that change happened. Now, in the old paradigm, they didn't have the ability to do this, right?
You could have a guy sitting out there counting ships, or a person going into a factory, doing any of this work, but now you can use the whole world as your data workforce to train your models, and give all that very proprietary information to your traders so that you can have true, legitimate insight into something and say, this is really what we should be acting on, even if it's contrarian to the rest of the world.
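As a loose illustration of "measure a lot of it, get confidence in the measurement, then ask why the change happened," here is a small Python sketch that compares two waves of answers to the same question and flags a delta large enough to warrant going back to respondents. The threshold and numbers are invented; a real desk would use proper statistical tests rather than this toy rule.

```python
from statistics import mean
from typing import Dict, List, Tuple

def wave_summary(responses: List[float]) -> Tuple[float, int]:
    """Average answer and sample size for one wave of the same question."""
    return mean(responses), len(responses)

def flag_delta(prev: List[float], curr: List[float], threshold: float = 0.10) -> Dict[str, object]:
    """Compare two waves and flag when the relative change exceeds a toy threshold."""
    prev_avg, prev_n = wave_summary(prev)
    curr_avg, curr_n = wave_summary(curr)
    delta = curr_avg - prev_avg
    relative = delta / prev_avg if prev_avg else float("inf")
    return {
        "prev_avg": round(prev_avg, 3),
        "curr_avg": round(curr_avg, 3),
        "delta": round(delta, 3),
        "follow_up": abs(relative) > threshold,  # trigger to go back and ask "why?"
        "samples": (prev_n, curr_n),
    }

# Example: weekly answers to "how much do you plan to spend on X next month?"
week_1 = [120, 95, 110, 130, 105]
week_2 = [90, 85, 100, 95, 80]
print(flag_delta(week_1, week_2))
# A flagged delta is the cue to re-contact the same respondents and ask why it moved.
```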

Paul Barausky:

It really is. And what's so exciting, I think, gets lost on people until they consider it: you started out using the example of phones, but the reality is the Internet of Things has exploded and multiplied beyond even Moore's law. So you are getting data from everywhere.

Alexander McCaig:

Correct. So TARTLE has people from 248 specific locales across the world right now. That's more places than there are countries, which is interesting; I never knew that was possible until it started to happen. And this thing, again, multiplies on its own. You've given the world an opportunity it didn't previously have, it's something that's free for them to use, and they can annuitize something and create value that didn't exist in the first place. It's a narrative that works really, really well. And if you can design the technology to be efficient and simple for them to interact with, everybody under the sun is going to use it, they're going to tell their family members, they're going to tell grandma, and it blows up like Facebook did, right?

Paul Barausky:

So this is where Clint normally plays the smart guy to my dumb guy as we wrap up. I think the three of us could listen to Alexander talk for three hours on this stuff, Clint, but we'll wrap up in the next few minutes. My question is, the show is called The Fat Pitch, right? Ted Williams famously waited and looked for the fat pitch, and Buffett made it more mainstream in investing. From an investor's standpoint, what's the fat pitch in data, in your mind? Is it the control of it by the individual? Is that the main frame? Is there something that stands out to you when you're having a short conversation with someone?

Alexander McCaig:

Yeah, most definitely. The fat pitch in this entire thing is proprietary datasets. In a gold rush like this, it's not about going to get the gold. There's so much investment in AI, and people say, yeah, let's put it into the mine, into the data centers, into everything else. But really, it's the behaviors that cause people to do the gold mining. It's understanding the thing that drives people to take that action. That's the real value in this entire thing. You're seeing people invest a ton of money into chips, you're seeing our United States government put ships out in the South China Sea to protect Taiwan because of the chip manufacturing over there, because AI is on the rise. The real thing, though, is that it's all well and good to have all those chips and all that stuff, but if you don't have information, if you don't have the fuel that goes into it to make that stuff valuable, you've got nothing. So your biggest, fattest pitch is going to those places and investing in the things that allow proprietary data to fuel this entire system and the fire we're seeing right now.

Clint Sorenson:

That reminds me of the Gordon Gekko quote in Wall Street, when he tells Bud Fox, the most valuable commodity I know of is information. It's funny to think about; when was that movie made, '82? '92?

Paul Barausky:

I had fun with it. I'm older than Clint; I had the phone, like...

Alexander McCaig:

I can see you: Blue Horseshoe loves Anacott Steel. You know what I mean?

Paul Barausky:

Boy, I may or may not have had an Armani double-breasted, Pat Riley coaching the Knicks. So this is absolutely fascinating stuff, and like I said, we could do a three-hour podcast on it. I think we'll be so bold as to ask you to come back, maybe more than once, if you'll regale us with more stories, because you've really only hit the tip of the iceberg here, and my mind is just starting to race with applications across so many industry sectors and virtually every part of the planet. So it's fascinating stuff. How do folks find you? You've got a lot of tremendous thought leadership out there. Are you active on Twitter, anything short-form?

Alexander McCaig:

You can find me on Twitter, LinkedIn, all over the internet. I publish a ton of material. Like I said, there are probably a thousand podcasts that I've done, and it affords me time to speak to experts and individuals like yourselves, so that I can learn more, not only about me, but about the rest of the world.

Paul Barausky:

Well, it's fantastic. I've got to tell you, for a guy who hangs his hat in the industrial warehouse space, this stuff's pretty interesting. I get to talk about a parking lot with four walls and a flat roof; this is so much more exciting. But I will tell you, one of my wife's family friends created some of the first neural networks for on-demand energy, so to speak, at his company, collecting data from warehouses and offices and things like that, which seems quaint now, but it wasn't quaint when they sold it for several hundred.

Alexander McCaig:

I don't know, but just real quick: think about turning on an AC unit; people would flick them all on at the same time. So these systems, this learning network you're talking about, they were analyzing it and realized, wow, the power draw is insane, and if we do this every day for multiple years straight, our electrical bills are huge. But what if we stagger how we turn these things on? And that's sort of the beauty in automation and machine learning that we find in those applications.
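A minimal Python sketch of the staggering idea: if every unit switches on in the same time slot, the peak draw spikes, while spreading the start times flattens it. The unit counts and kilowatt figures are made up for illustration, not drawn from the company Paul mentions.

```python
from typing import List

def peak_draw(start_slots: List[int], unit_kw: float = 3.5, slots: int = 12) -> float:
    """Peak power draw (kW) given the time slot in which each AC unit starts."""
    load = [0.0] * slots
    for slot in start_slots:
        load[slot] += unit_kw
    return max(load)

UNITS = 24  # hypothetical number of rooftop AC units on one building

# Everyone flips their unit on in the first slot: the whole load lands at once.
all_at_once = [0] * UNITS

# Staggered: a simple schedule spreads the starts evenly across the slots.
staggered = [i % 12 for i in range(UNITS)]

print(peak_draw(all_at_once))  # 84.0 kW peak: 24 units x 3.5 kW simultaneously
print(peak_draw(staggered))    #  7.0 kW peak: only 2 units share any one slot
```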

Paul Barausky:

Heavy industrial grid, which has been in danger of breaking every day here in Texas for the last week. Clint, any final words before we sign off?

Clint Sorenson:

Alexander, thank you. As always, I love our conversations, love talking with you about all of this. We'll definitely have you back on to talk maybe more about decentralization next time. But yeah, fascinating. Thank you. Everybody, check out TARTLE and check in with Alexander on all the different socials. Paul, thank you for being incredible.

Alexander McCaig:

Thank you very much, everybody. Thank you, guys.