Edge of Excellence: Empowering People to Shape the Future

Data Before AI: Why Strong Data Foundations Unlock Scientific Innovation

iuvo LLC Season 1 Episode 12


Behind every successful AI initiative is something far less glamorous: strong data fundamentals.

In this episode, we talk with Jacob Hesterman, Ph.D., Chief Data Officer at Ratio Therapeutics, about how his team built a structured, governed data ecosystem from the ground up and why that foundation is now accelerating scientific discovery, efficiency, and collaboration across the organization.

Jacob shares what it took to bring scientists from different disciplines around a shared data model, why full buy-in took two and a half years, and how that effort created a feedback loop that changed how the entire company operates. We also explore real examples of lab data flowing into analysis systems, how Ratio is positioned for AI because they invested in fundamentals first, and why IT and data teams need to work hand in hand.

Whether you're a data leader, a CIO supporting research teams, or just trying to figure out where AI fits into your organization, this conversation is a great place to start.

Learn more at iuvotech.com.

SPEAKER_00

This is The Edge of Excellence, empowering people to shape the future. Let's inspire, innovate, and explore together. Welcome to The Edge of Excellence. I'm Jess DeForge, and today we're diving into a topic that sits at the center of innovation across nearly every industry: data.

SPEAKER_01

I'm Brian Beilman, and joining us today is Jacob Hesterman, Ph.D., Chief Data Officer at Ratio Therapeutics. I'm proud to say they're one of our clients and a leader in the industry. Jacob is a physicist by training and now leads data strategy for a company pushing the boundaries of radiopharmaceutical innovation. Jacob, welcome to The Edge of Excellence. Thanks so much for having me. Excited to be here.

SPEAKER_00

Yes, Jacob, thank you for joining us today. I'm really excited to have you on. I'd love to start by having you tell us a little bit about your background and how you ended up in your role as Chief Data Officer.

SPEAKER_02

Sure. By academic background, I have a PhD in optical physics, and I got my start in nuclear medicine imaging hardware: medical imaging systems used to image radioactivity. Several of us then came to Boston to start an imaging software and services company, also centered around nuclear medicine imaging, and did that for a number of years. During that time we recognized that the field of radiopharmaceutical therapy was really emerging, and so I pivoted from the services space into the drug development space. That's how I ended up here at Ratio Therapeutics, where we're doing radiopharmaceutical therapy drug development. In my role here, I'm in charge of our data science, but also, because of my background, imaging and radiation dosimetry.

SPEAKER_00

Very cool. Can you tell us a little bit more about what Ratio Therapeutics does and what problems you're working to solve?

SPEAKER_02

Sure. So really, what we're working on is developing drugs for cancer; we're an oncology drug development company. In case people don't know radiopharmaceutical therapy: maybe some folks are more familiar with things like external beam radiation therapy, a more mature technology in which a collimated beam of radiation is typically used externally to irradiate tumors and try to kill them. Radiopharmaceutical therapy is much more of a chemistry problem. Compounds are designed to target some kind of receptor, typically on the surface of a cell, and they're injected, typically intravenously and systemically, where they circulate in the body and bind to those receptors on the tumor cells. Attached to that compound is radioactivity. So when the compound is attached to the cell and the radioactivity decays, it emits a huge amount of energy, which through a variety of mechanisms ends up killing the cell. In this way we can have very targeted, sustained delivery of radiation to the tumors and try to help cure the cancer.

SPEAKER_00

I mean, that just gives me goosebumps hearing you talk about it. It's such important work, so impressive, and just so cool to hear you describe it. One of the things that really stood out to us when we were chatting on our pre-call was the leadership mix that you have there at Ratio. I believe your CEO is an applied mathematician, you're a physicist, and your CSO is a radiopharmacy expert. Is that correct?

SPEAKER_02

Radiochemist and radiopharmacist, yeah. We have a real mix at Ratio, a team across a variety of disciplines, and I think it speaks to the field we're in. I mentioned imaging and radiation dosimetry, but it's also really a chemistry problem. The name of the company is actually Ratio because it's so much of an optimization problem, which speaks to mathematics: when you're injecting these radiolabeled compounds, it's all about trying to maximize the ratio between how much of that radioactivity goes to the tumor versus how much goes to other tissues like the kidneys. And to go through these phases, internally we have medicinal chemistry groups figuring out how to design and make the compounds themselves, radiochemists who figure out how to attach the radioactivity, and biologists running all the experiments in the development space. Then we have to translate clinically, so we have medical doctors to help oversee those processes, as well as all of the operational disciplines underneath to support it. So even just from a scientific and medical standpoint, it requires a real multidisciplinary team, and I think we've built that in a really exciting way here.

SPEAKER_00

No, 1000%. And how do you think that impacts the way your company approaches research, data, and decision making, having that interdisciplinary team in leadership?

SPEAKER_02

It's helped us a lot. I started here almost four years ago, and we were fewer than 10 people at that time. One of the first things that was my role walking in the door was to say: even though we're small, we're clearly already starting to generate a lot of data. And it became apparent that, of course, in any space where you're doing something new, there are one-off experiments, where you try something and say, oh, we've got to look at this. But if you're going to scale and do things repeatedly, inevitably you're going to run certain experiments and tasks over and over and over again, the same ones. They become your bell cows. So from day one, we sat down with the heads of all those different disciplines on the scientific and medical side and asked: What are the experiments that you really run all the time? What do you do every day? And how can we define or characterize those experiments, with their standard inputs, outputs, and the metrics you're after? This was the very beginning of what became our internal scientific data model, which we've used for structuring and harmonizing all of our internal scientific data.
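To make the idea concrete, here is a minimal sketch of what "characterizing an experiment by its standard inputs, outputs, and metrics" could look like in code. This is purely illustrative: the field names and the example assay entry are assumptions, not Ratio's actual data model.

```python
from dataclasses import dataclass

@dataclass
class ExperimentType:
    """A recurring assay, characterized once by its standard 'shape'."""
    name: str            # e.g. "biodistribution" (illustrative)
    inputs: list[str]    # the standard inputs the scientists agreed on
    outputs: list[str]   # the standard outputs each run produces
    metrics: list[str]   # the headline metrics the team is after

# A hypothetical entry in such a shared scientific data model:
biodist = ExperimentType(
    name="biodistribution",
    inputs=["compound_batch", "radiolabel", "animal_model", "timepoints"],
    outputs=["organ_uptake_table"],
    metrics=["tumor_to_kidney_ratio"],
)
```

Once every routine assay has an entry like this, every run of that assay can be stored, queried, and compared in the same shape, which is exactly the harmonization Jacob describes.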

SPEAKER_01

So, Jacob, what occurs to me is this: I've talked with other customers and other people who have a bunch of like-minded colleagues. They all have, say, a biochemistry degree, they're all trying to solve a problem, and they've all been trained in a certain methodology. Here you have different disciplines. Did you find that that mix allowed people to think out of the box a little, or think differently, where you'd say, oh, that's a good point, I didn't think about it that way? Do you see that happening?

SPEAKER_02

For sure. And this was very much a two-way street. It was funny, some of those early conversations. I would say the chemists, maybe a little more than the biologists, and maybe there's something about the word "structure," you could make a play on words there, were right away like, hmm, we understand, that makes sense. The biologists tended more toward, well, no two of my experiments are alike, I don't know if this will work. And so that led to more of a discussion, but it helped in both directions. We could help all the groups recognize that, actually, we can in many ways structure and characterize their data, but we also had to learn on the data team that, right, we didn't think of that, or these are nuances or special cases or dimensions we need to factor in that hadn't occurred to us from the beginning. So it was a dialogue, and it required a little salesmanship too, internally, I would say, to get buy-in.

SPEAKER_00

Yeah, no, it usually does. It sounds like you did it really well though. Um, so from the beginning, you had placed a strong emphasis on structured data and governance. Why was that such a priority for you?

SPEAKER_02

Maybe a slightly glib answer, but not necessarily a wrong one: copy-paste can be the death of a lot of things. And it's amazing how quickly, even without very many people, you end up with everyone working off individual spreadsheets in their own SharePoint areas, where you come into a meeting and somebody opens a spreadsheet and says, well, this is what I have, and somebody else says, well, this is what I have, because one person made an edit somewhere and didn't save it somewhere else. At least for me, I would think of that as a big-company problem, but those things happen very quickly with not many people. So we had a strong sense on a couple of fronts. One was just making sure we had things like a single source of truth and provenance, so that we knew where these data came from and the official place where they live. But we also felt this would help us scale, and set the stage to take advantage of more modern data science tools that are out there. I'm tiptoeing a little around words like AI and machine learning, but clean data is what really facilitates the use of those tools. And one other thing: it helps from a quality perspective, because it pushes people toward compliance and completeness when entering data. If you want data to get into a structured environment this way, and people enter it inconsistently, or not according to the right ontology, or incompletely, then you set up gates that automatically push back a little and say: can't take that until these issues are addressed. That helps both set a habit of strong data integrity and compliance, and ensure the data going in are as clean as possible.
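The "gates" Jacob describes can be sketched as a simple validation pass: a submission missing required fields, or using a term outside the agreed ontology, gets a list of issues back rather than being silently accepted. The field names and the ontology here are hypothetical, chosen only to illustrate the pattern.

```python
# Hypothetical required fields and assay ontology for a submission gate.
REQUIRED_FIELDS = {"compound_id", "assay", "operator", "date"}
ASSAY_ONTOLOGY = {"biodistribution", "binding_affinity", "radiolabeling_qc"}

def validate_submission(record: dict) -> list[str]:
    """Return a list of issues; an empty list means the record is accepted."""
    issues = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        issues.append(f"missing fields: {sorted(missing)}")
    if record.get("assay") not in ASSAY_ONTOLOGY:
        issues.append(f"unknown assay: {record.get('assay')!r}")
    return issues

# An incomplete record comes back with actionable feedback instead of
# quietly landing in the database:
problems = validate_submission({"compound_id": "RTX-001", "assay": "biodist"})
```

The point is not the specific checks but the habit they create: the gate pushes back at entry time, so the data that does get in is complete and consistently named.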

SPEAKER_00

No, absolutely. Now, what did that look like in practice when the company was first starting out and building its data infrastructure?

SPEAKER_02

So, yeah, the very beginnings were actually just structured tables, even in Excel, because that is very accessible and visible and people can understand it. It didn't take very long before we introduced a relational database to hold all of these data, and then very quickly we started thinking: if we're moving into the sort of data environment that will let us grow in the right way, it might not be as accessible to everyone in the company. So we very quickly started thinking about what we now call internally our Ratio analysis platform, a web-based, read-only interface through which anybody in the company with the appropriate role and permissions can access all of that data in a safe way.

SPEAKER_00

Yeah.

SPEAKER_02

We had to build towards that, but those underpinnings actually started getting put in place very early.

SPEAKER_00

Yeah, I think setting that foundation is so important. And we've talked about the different groups of people you're working with. You have scientists from different disciplines: chemists, biologists, and others. How did you bring them together around this shared data model?

SPEAKER_02

A couple of examples come to mind. Internally, our biology team uses one commercial electronic laboratory notebook, whereas our chemists use a different one. And one of the first problems we had to work on, from a design standpoint, was deciding to make the chemical structure the top level of the data hierarchy. The structure is sort of a theoretical entity. But then when you physically make the compound, you have a batch, which is a real physical thing, and it's going to be assayed in some way from a quality perspective. Then you take that batch, maybe add radioactivity to it with the radiochemists, and run the biology experiments. They're using all those different tools, but everything clearly links to that same compound, which is the linchpin at the top of the hierarchy. So one of our first tasks was to figure out how, using APIs and these tools, to connect all these data sets within the database, because that gave us a very obvious linker between biology and chemistry. We could say: great, we have now directly associated this medicinal chemistry experiment with this radiochemistry QC with this biology output, and everybody can peer in and see it together in one spot without having to go into their individual ELNs. We're still pulling all the data from there, but this put everything into one location where everybody could look at it together.
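The hierarchy Jacob describes, a theoretical structure at the top, physical batches hanging off it, and records from separate ELNs linking back through it, might be sketched like this. Every type, field, and identifier here is illustrative, not the actual schema.

```python
from dataclasses import dataclass

@dataclass
class Compound:          # theoretical entity: the designed structure
    compound_id: str
    smiles: str          # hypothetical structure representation

@dataclass
class Batch:             # physical entity: a made-and-assayed lot
    batch_id: str
    compound_id: str     # foreign key up to the structure
    qc_passed: bool

@dataclass
class ExperimentRecord:  # pulled from either ELN via its API
    source_eln: str      # "chemistry" or "biology"
    batch_id: str        # links down to the physical batch

def records_for_compound(compound_id, batches, records):
    """Everything in one spot, regardless of which ELN it came from."""
    batch_ids = {b.batch_id for b in batches if b.compound_id == compound_id}
    return [r for r in records if r.batch_id in batch_ids]

# Toy data: two batches of two different compounds, three ELN records.
batches = [Batch("B-1", "RTX-001", True), Batch("B-2", "RTX-002", True)]
records = [
    ExperimentRecord("chemistry", "B-1"),
    ExperimentRecord("biology", "B-1"),
    ExperimentRecord("biology", "B-2"),
]
linked = records_for_compound("RTX-001", batches, records)
```

Because both ELNs ultimately reference a batch, and every batch references a compound, one query on the compound id surfaces the chemistry and biology records side by side, which is the "linker" effect Jacob describes.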

SPEAKER_01

Yeah, you mentioned a single source of truth there. And what I'm noting is that, because we have many different kinds of customers across all different disciplines, this is a problem everywhere, not just one you've solved for Ratio. And the fact that you've had this discipline so early on, in a company that isn't very old, is very interesting. I think our listeners are really going to take this in, because it's a great problem to solve, and a great method for solving it.

SPEAKER_00

Yeah, no, absolutely. And I have to say, Jacob, just as an aside, I noticed this when we talked on our pre-call, and I'm noticing it again now: I'm not a technical person, but the way you speak makes it so easy for someone like myself to understand. I think that's really going to resonate with a huge range of listeners and viewers, so I want to give you some kudos for that. It's very nice to actually be able to comprehend what we're talking about here. Now, you mentioned one thing I thought was interesting: that it took about two and a half years before you really saw universal buy-in from the organization. Why do you think adoption takes time, even when the value seems so clear?

SPEAKER_02

Yeah, so a couple of things. I think universal is a good word; obviously different people came to appreciate the system we were putting in place at different times. But a couple of things. One, practically, everybody's busy. We were a startup; it's go, go, go all the time, and everyone has a lot to do. So when you go to submit the data for your experiment and some system kicks back and says, uh oh, this is missing, or this is not quite right, it's hard not to think: do I really have to do that? That's inevitable, and it's not a fault of anyone. It seemed like an extra step to some people, and it felt that way because it was an extra step, in a lot of ways. Now, we do put energy into recognizing what is time consuming and whether there are things we can do to make it less painful, but you can't eliminate some of those steps completely. So that was one thing that took a little time. And two, part of it is just the passage of time, and why we had to do this. Part of what we said early on was: two years from now, we're probably not going to remember this experiment that's fresh in our minds right now. And if we don't have a system for storing it so we can recall or query it really easily, then we're going to be hunting through old spreadsheets and PowerPoints, or relying on memories that might seem correct to us but often aren't. That's part of the reason it took time: early on we were small, there weren't that many experiments, and everybody could kind of hold them in their heads.

SPEAKER_00

Yeah.

SPEAKER_02

But after a couple of years, it would come up: people would say, do you remember that experiment we ran? And I'd say, I don't really, but give me 10 seconds and I can pull it all up in exactly the structure and format we're used to looking at, and we can compare these things directly. I think that was a little eye-opening. To give another specific example, one of our scientists asked a question about mouse body weight and one output parameter we look at: I wonder if there's a relationship across all of the studies we run. One of my data guys, within about an hour, was able to present him the answer, pulling up the experimental data for every experiment we'd run that way and presenting it all together. That was another little instance of, ah, I see now. And that fed into a real positive feedback loop. Once one or two little value propositions like that came forward, it got people's brains turning: if you could do this, could we do this? Could we do that? Sometimes we'd say, absolutely, that's something we could do. Admittedly, there are times we're like, that would be nice, but no. But then, as we grew as an organization, we noticed a paradigm shift: from us on the data team needing to chase scientists down, saying, hey, we found out you're running these experiments, we need to get that data into the data model, to scientists coming to us and saying, hey, we're going to be adding this new assay in the lab, and we want to make sure it gets into the data model in the right way. That's a subtle distinction, but it made all the difference in the world; it represented buy-in across the organization.

SPEAKER_00

No, absolutely. It sounds like you essentially earned their trust, right? Getting that buy-in, and that aha moment where they fully understood why you were doing things the way you were, built that trust. And then all of a sudden they're coming to you asking if there are ways your team can help them implement different things, which is really cool. And, similar to what Brian was saying, it's applicable to all industries and all businesses: how important that communication and trust-building is in order to move these processes along.

SPEAKER_02

One other example springs to mind, and sometimes you stumble into these things a little bit. I was walking by the office of one of our senior scientists, and I saw them putting together a PowerPoint by hand: using our platform to download the data, but then putting in this graph and this structure and this table, and then going to the next slide and doing it again. I asked what they were doing and found out that, for them and their whole team, this was a really convenient way to look at a lot of data systematically. And I said, well, that's something we can help with. Now, because we have everything structured like this, on Tuesday mornings, ahead of a weekly meeting, we kick off a job that automatically builds a PowerPoint in exactly that structure for each of the programs that team wants to look at. They can sit down at 8 a.m. with their cup of coffee and open up a PowerPoint that's been fully built out for them to review, saving them literally hours of manual effort every week.
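The core of the Tuesday-morning job can be sketched in a library-agnostic way: for each program, assemble a slide specification in the fixed structure the team reviews, then hand the specs to a rendering library (python-pptx is one real option for producing .pptx files). Everything here, the function, the field names, the example data, is an assumption made for illustration.

```python
def build_deck_spec(program: str, experiments: list[dict]) -> list[dict]:
    """One slide per experiment, always in the same review structure."""
    return [
        {
            "title": f"{program}: {exp['name']}",
            "graph": exp["graph_data"],        # data for the slide's chart
            "table": exp["summary_table"],     # rows for the slide's table
        }
        for exp in experiments
    ]

# Hypothetical data pulled from the analysis platform for one program:
deck = build_deck_spec(
    "Program-A",
    [{"name": "biodist wk1",
      "graph_data": [1, 2],
      "summary_table": [["organ", "uptake"]]}],
)
```

Separating the spec from the rendering means the "exactly that structure" part is captured once in code, and a scheduler can rebuild the deck from fresh data every week with no manual copy-paste.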

SPEAKER_00

Yeah, you solved their pain without them asking for it. That's the best kind of service you can have. I feel like our subject matter experts at iuvo do similar things for us, except we're fully virtual, so it's a little harder; they can't walk by my office and say, let me help you with that. But we are trying to ask departments like mine: what are your pain points, and how can we make things better? And the knowledge you bring to the team, being able to see what they're working on, is key. To your point, they may never have come to you to say, you know, what's kind of painful is putting these slide decks together. But you having that knowledge base and seeing it in real time means you're able to bring that value. And I assume it helps expedite the science, because they're not wasting time on something like that and can put their efforts toward something more important.

SPEAKER_01

Yeah, what occurred to me is that a big part of trust is communication as well. You're allowing them to communicate in their own style. For some people, PowerPoint is how they communicate; some people want Excel; people look at data in different ways. And you recognized, boy, this is really the way they like to consume data. I've found that true in messaging too: some people say I want a video, some say I want Excel, and some want PowerPoint. Actually, I feel vindicated, because I like PowerPoint and I'm one of the few people here who does. But that's really great.

SPEAKER_00

So, Jacob, in our pre-call you shared an example involving laboratory biodistribution experiments that generate large volumes of data. Could you walk us through that process a little bit?

SPEAKER_02

You know, I used the word bell cow earlier. That's one of our main experimental assays for assessing new or existing compounds. It is very labor intensive for the laboratory staff and involves, without going into too many details, a lot of animal dissection, a lot of measurements, a lot of data, and then a lot of numerical processing on the back end as well. Just because you can do a lot of things doesn't mean everything's worth doing, but this was a case where it made sense to really optimize the workflow, because we use it so heavily and there's so much computation involved. And it's a nice example of the interconnectedness of our internal systems. Our laboratory scientists are measuring things on a scale, which we have automatically read into the electronic laboratory notebook. As soon as those data are complete, anyone with the right permissions can log into the web interface I mentioned and kick off an engine we built there that runs all of the analysis steps required for that process. It generates all kinds of output spreadsheets, but also graphs. And you might want to share those with a certain project team, so we have that connected to Slack, which we use internally: if someone kicks off that job, the output automatically gets pushed to the appropriate Slack channel, and everybody has real-time visibility.
It's a little funny now. All the scientists, and I count myself in that group, get so excited about the projects that are coming. On the back end we have full visibility, so we can tell when there's an exciting experiment: people are in the lab doing the measurements, and multiple people are sitting there running this tool over and over and over again, waiting for those data to hit so they can see the results. It's almost like a little view counter. We'll chat about it: oh, everybody's excited about something, because there are 12 different people trying to run this tool simultaneously. So I think that's a really nice example of something where there wasn't an out-of-the-box solution that worked, either. We always try to think of these problems with a make-buy-partner mindset. We're a small company, a drug development company, not a software development company, so we view ourselves as complementing these tools. First: is there something we could just license or buy to do this work for us, the way we want it done, optimally? If not, then all right, let's take the time to build it out ourselves. And this was a case where it was worth doing, because it's become a really useful tool for us internally.
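The final results-to-Slack step can be sketched with Slack's standard incoming-webhook pattern: build a small JSON message and POST it to the webhook URL. The channel naming scheme and message fields below are illustrative assumptions; only the webhook mechanism itself is standard Slack.

```python
import json

def slack_payload(program: str, results_url: str) -> str:
    """Build the JSON body announcing that an analysis run has finished."""
    message = {
        # Hypothetical per-program channel naming scheme:
        "channel": f"#proj-{program.lower()}",
        "text": f"Biodistribution analysis for {program} is ready: {results_url}",
    }
    return json.dumps(message)

payload = slack_payload("Program-A", "https://platform.internal/results/123")

# Actually sending would be an HTTP POST of this JSON to the webhook URL,
# e.g. with urllib.request (not executed here, since the URL is internal):
# req = urllib.request.Request(WEBHOOK_URL, payload.encode(),
#                              headers={"Content-Type": "application/json"})
# urllib.request.urlopen(req)
```

Pushing to a shared channel, rather than emailing individuals, is what creates the "view counter" effect Jacob mentions: everyone watching the channel sees results land in real time.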

SPEAKER_01

Well, what I've been impressed by is that you have this observability into the usage of the tool. Often people build a tool and don't really know; you have to survey whether anyone is using it. You have a real-time view, so you can say, okay, I think this has value. And then you can look at the people using it and maybe realize, okay, I need to increase resources here, or maybe that's the thing to focus on: how can we make this even better for you?

SPEAKER_02

It's a great point, because you try to have a high batting average on those types of tasks when people are asking for things, and inevitably we've all built things where you think, this is going to be great, and then it doesn't actually get adopted. So when you have visibility, and sometimes it's unexpected too, like what was maybe a throwaway comment, you say, yeah, we can set that up, and all of a sudden, whoa, five different people are using it. And exactly to your point, Brian, maybe that's where we say, oh, we need to put more resources into this and make it real, because it's clearly something people wanted, even though it hadn't really bubbled to the surface yet.

SPEAKER_00

Yeah, no, that's really cool. And what happens once the data enters the system through this process? So the data moves from the lab into your data model and analysis systems. Can you walk us through that a little bit?

SPEAKER_02

Sure. Well, we talked about it a little with the biodistribution tool; that's one example, and there are many assays that run similarly. I guess it's a little fit-for-purpose, depending on what the program is or how those data need to be used. We do have that web-based platform I mentioned, with a lot of different widgets on it, and different users might want to interrogate the data in different ways. So it's both available on demand for someone to call up, and we can run processes in the back end to connect it and report out on it, as I described earlier, as people need.

SPEAKER_00

So you mentioned it very briefly, the buzzword: AI. AI is obviously a major topic right now, but you have emphasized the importance of having strong data fundamentals first. Can you talk to us about why that is so important?

SPEAKER_02

I would I would say a couple things there. So, one, you know, I do still often think, just on my personal self, uh, you know, of more traditional data science and machine learning methods and doing things in a supervised way or acting on structured data. And I do think what we've done internally with with our scientific data has has helped there a lot, um, or really adheres to those types of principles primarily, but not exclusively. Um, actually, the success of that platform, and I didn't mention that yet, has led to us say, okay, and this this worked pretty well. Well, this isn't the only thing we're doing as a company. What are other areas where we can um potentially introduce the same kind of concepts? And we started looking at our operational data in a similar way and thinking about, hey, we have all of this data across finance and corp dev and legal and HR. And we structure and interconnect those data in similar ways. And there in that task, we also said, just we have a feeling we're we're using already a lot of, let's say, off-the-shelf machine learning AI tools for certain tasks, right? Like out of the box. Like, you know, I have a big background in image processing, and there are a lot of really nice tools in image processing that we use that that just exists. But at the same time, we felt maybe we're underutilizing LLMs, or we should at least be looking into that more. And that really was a beneficial thought process because what we recognized was I think both of those things were true. There was, in that, those things being one, there was an opportunity for us with our operational data to do a better job of structuring it, interconnecting it, you know, building out API connections, making more of an integrated data ecosystem there, just like we did in the scientific data. But also there was a real opportunity to leverage uh the ever-improving LLMs to better handle our unstructured data. 
You don't have to structure everything; you need the right tool for the right data. By looking for opportunities to introduce efficiencies and potential value by structuring data, we can do a lot. But a lot of these tools have now evolved to handle unstructured data, and we should use them in the same way. So at that time, and this comes back to the make-buy-partner comment, we brought in a commercial tool that could just go in, index all of our internal unstructured data, and make it very accessible to a larger swath of the company as well. That was another realization: we can build out some of these tools ourselves, but there's maybe always going to be a bit of a learning curve to those. We try to make the bar as low as possible, but for some folks it's still not enough. Whereas if there's something where you can just have that chat capability over all that indexed data, maybe that's an even lower barrier to entry for a larger swath of the organization. So for the last nine months or so, we've really been attacking that problem on both of those fronts.
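As a rough illustration of the "index everything and make it searchable" idea Jacob describes, here is a minimal sketch of an inverted keyword index over internal documents. The document names and text are hypothetical stand-ins; a commercial tool does this at far larger scale and adds the chat layer on top:

```python
from collections import defaultdict

# Hypothetical internal documents (stand-ins for PowerPoints, SharePoint pages, etc.).
docs = {
    "q3_budget.pptx": "q3 budget review for imaging program spend",
    "vendor_contract.docx": "master services agreement with imaging vendor",
    "hr_policy.pdf": "remote work policy and onboarding checklist",
}

# Build an inverted index: word -> set of documents containing that word.
index = defaultdict(set)
for name, text in docs.items():
    for word in text.lower().split():
        index[word].add(name)

def search(query):
    """Return documents matching any query word, ranked by match count."""
    scores = defaultdict(int)
    for word in query.lower().split():
        for name in index.get(word, ()):
            scores[name] += 1
    return sorted(scores, key=scores.get, reverse=True)

print(search("imaging budget"))  # best match first
```

The point of the sketch is the low barrier to entry: once documents are indexed, anyone can query them without doing any extra structuring work up front.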

SPEAKER_01

When you mentioned, earlier in the conversation, getting people to realize the value of putting in the structured data, and now you're advancing to a bit more unstructured data, and finance and operations and so forth: did you need similar buy-in, or did they say, hey, we already saw this success over here, we want to do the same thing? Was there still some of that resistance? Or did you just say, this is unstructured data, we're going to let LLMs do a little more of the work here? I'm curious.

SPEAKER_02

Yeah, I think both of those things were true. With the commercial product I mentioned, which I won't go into gory detail on, it was just a little faster, because nobody had to do anything extra. It was just, look, here's a bar you can type in. Now it knows all of your PowerPoints and all of your SharePoint, and you can find something more easily or ask it a question. And that was great. On the structuring of the operational data, we ran into a little bit of the same. I think pushback is too strong a word, because there had been a proof of concept and people could see there was something there, but they did say, look, we have to at least sit down and think this through. I'll give a specific example. We wanted to connect all of the contracts in our contract management software to all the budget line items in our budgeting software. That did require a certain amount of elbow grease; there was no way around it. But it did two things. One, it made those connections for us, which everybody then saw: oh, okay, now this can read directly from here, and these things are married. That was one of those little clicking aha moments: this is helpful. But it also helped establish a process that people could follow moving forward. Our head of legal was very happy about that, because he said, this is now going to help ensure compliance when new contracts or new budget items come in. By getting all these data structured this way, it essentially guided what our process should be for new things coming in, and now I can make sure those processes are adhered to much more seamlessly.
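The contract-to-budget linkage Jacob describes can be sketched as a join between exports from the two systems. This is a hypothetical illustration; the field names, the matching key (a shared project code), and the records are assumptions, not Ratio's actual schema:

```python
import pandas as pd

# Hypothetical export from a contract management system.
contracts = pd.DataFrame({
    "contract_id": ["C-001", "C-002", "C-003"],
    "vendor": ["Acme Labs", "BioSupply", "Imagix"],
    "project_code": ["P-100", "P-200", "P-300"],
})

# Hypothetical export from a budgeting system.
budget_items = pd.DataFrame({
    "line_item_id": ["B-10", "B-11", "B-12"],
    "project_code": ["P-100", "P-200", "P-999"],
    "amount_usd": [125000, 40000, 9000],
})

# Join the two systems on the shared key; an outer join keeps gaps visible.
linked = contracts.merge(budget_items, on="project_code",
                         how="outer", indicator=True)

# Budget items with no matching contract become compliance follow-ups,
# which is the process-guiding side effect described above.
unmatched = linked[linked["_merge"] == "right_only"]
print(unmatched[["line_item_id", "project_code"]])
```

The outer join with `indicator=True` is the "elbow grease" made explicit: matched rows marry the two systems, and unmatched rows surface exactly the items that a new-contract process needs to catch.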

SPEAKER_00

Now I'm curious about how you see AI and traditional data science working together moving forward.

SPEAKER_02

It's a good question. I think a lot of people are thinking about this right now, and there are a lot of answers, and we have to keep working at it. I'm thinking about a couple of conversations I had internally just a couple weeks ago around some of the tools, like Codex, that are emerging more and more. This is maybe specific to our needs right now in one specific area, but what we found almost immediately was: for all the user interfaces we're going to write in the near term, we should be relying on these tools heavily. They work so well, they're really clever, they really know how to do this. For some of the more niche scientific and numerical calculations we have to do internally, though, it became clear we have to be a little more cautious, or provide a little more oversight, when using these tools with those types of processes, because they're often very confident in what they're telling you, but a little more prone to error. That helped us recognize where to devote our human resources in a more pointed way, versus where we could rely on some of these tools, not completely blindly, but with a little more trust, from a software development standpoint, right out of the gate. And of course, they're always good for things like code review and documentation; they're really strong there. So I'm speaking a little more to the creating side of new tools.

SPEAKER_00

So I'd like to ask you, what advice would you give organizations that want to build stronger data foundations?

SPEAKER_02

Speaking from the experience I've hopefully laid out this hour, a big part of it is communication. You have to really sit down with all the different stakeholders and try to understand them, to the extent that you can. They are subject matter experts in their area for a reason, and it's not that you're going to supplant them in that role, nor do you want to. But you do need to take time to understand why they're doing what they're doing and what's important to them, so that you can tailor your data infrastructure to hopefully meet their needs. And it's a two-way street: you also need to work with them so they understand they may have to flex a little to help support your needs. So establishing that communication paradigm is really important. Another thing we tried hard to do was to build this as a progression. You're not going to go from zero to 60 overnight. So what is the number one rate-limiting step, or the number one most important thing? Tackle that first, and build out the sophistication one step at a time. Also recognize that some things don't need to be forced into this box; try to understand what's worth taking the time to build into this infrastructure versus what isn't. Sometimes, yes, in a perfect world these data, these experiments, this operational data, whatever it is, would fit into the ecosystem, but we have to recognize that time and resources are limited. So this is something where we'll accept a slightly more manual process, or accept that it's going to be an outlier, and that's okay. Those are some things worth bearing in mind.
I would also answer selfishly, as someone who has to manage the team building a lot of these things: there's a bit of a balance. You start receiving a lot of requests, which is great; you want requests. But I think people don't always appreciate the development work required to fulfill some of those requests versus the amount of time the task actually takes the way it's currently being done. And so, and this comes back to communication, you need enough comfort to be able to say: look, I'm really glad you brought this to me, but this is maybe not something that makes a lot of sense to build out right now; that other thing you brought was perfect. You want requests to come to you, but you also have to have the ability to push back a little sometimes. It comes back to that idea: we can build almost anything, but we can't build everything.

SPEAKER_01

Do you find, and what we found a little bit, since you mentioned LLMs and AI, is that they're good at presenting data, but you have to be careful, because it's not always accurate. But I find that things that used to take six weeks we can sometimes do in a couple of days, because of how well they handle writing code, and even applying security constraints and all of the checks. Do you find the same thing? Are you able to give your company more, faster, than you could have in the past? I, yes. I mean, the short answer is yes.

SPEAKER_02

If I think back to what we've been putting together, had we tried to do this 10 or 15 years ago, the size of the team would probably be a little different. Or the team would be the same size, but the output would just be less. It has certainly helped with that, especially with a lot of those peripheral pieces around the edges. I do feel like, at least here, we still focus a lot of our developer time on the core of those tools. But for some of those pieces, like code review, documentation, and, as you just said, security aspects, like, oh, I need to set up some sort of authorization or login for this tool, we're increasingly relying on these AI tools to help build those out.

SPEAKER_00

Now we're we're getting close to the end of our conversation here, but I would love to know what excites you most about the future of data in scientific research.

SPEAKER_02

We start here in the drug development phase, in preclinical or discovery, however you want to call it. It's a research space, and there's less formality from a regulatory and documentation standpoint. So we can really sandbox easily and connect a lot of things, and I think it teaches us a lot. And, appropriately, it takes more time for those types of techniques and methods and tools and analyses to mature so that they're used in more of a clinical setting, whether that's a clinical trial or actual standard of care. This came to mind also because radiopharmaceutical therapy, which is our space, has actually been around a long time in some ways, but it's experienced a big resurgence in the last few years. So there's a lot of very active development there, and we see it with regulatory agencies too; they're trying to sort it out. One of the things I'm most excited about is seeing some of these methods and tools that we're developing in this early-phase research space mature so that they actually get adopted in clinical use, benefit patients, and help increase efficiencies in a lot of those areas where I think all of us sometimes get frustrated, because we feel like, why are these systems so archaic? Or why can this new thing I read about not be applied to me? I'm hoping we really accelerate the pace at which those kinds of things can be deployed practically in the real world.

SPEAKER_00

No, absolutely. And given what you all are working on, I'm excited for that as well. Yes. Brian, I want to pass it to you if there are any questions you would like to ask before I ask our final questions, Jacob.

SPEAKER_01

No, I think I'm still kind of soaking it all in. This has been great.

SPEAKER_02

Well, I would feel remiss if I didn't mention: we talked a lot about data, but clearly our partner in all this internally is the IT team as well. In a lot of places, appropriately, those groups are maybe one and the same. Here we have a data science group and an IT group, but they work hand in hand, and I think that's been really valuable. When I think of my team, because IT reports up into me as well, I actually present it as a spectrum, which is a lot of fun. When I lay out the team left to right on a slide, I have IT on one side and some of my, let's just say, pure scientific data folks on the other, laid out in a way that shows: these are people technically on the data science team who could easily be on the IT team, and these are people on the IT team who could easily be on the data science team, because it's really a continuum of skill sets. It's really neat within that group to see how they connect that way, going from something superbly technical all the way through highly scientific, with all the connection points in between. So that's been really fun to build out, and to see those interactions.

SPEAKER_01

That's great to hear. We're fans of IT.

SPEAKER_00

And I think it's really powerful what you're saying, Jacob, to lay it out in that way, so that across departments everyone can see how you're all working together and how one contributes to the other. It's easy to get siloed sometimes in what you're working on, and having someone in a leadership position who zooms out and makes it a point to help all the departments see how they're working toward the same purpose is really important and powerful for any industry. So we like to ask first-time guests a question at the closing of the episode: could you tell us something interesting about you that may surprise people?

SPEAKER_02

Okay. Well, actually, for anyone internal to Ratio, this probably won't be a surprise. I'm a lifelong birder, and I'll take this opportunity to make a plug, especially for those in Massachusetts or the Northeast, for Mass Audubon. I'm a board member there. And it's spring, so migration is coming; this is the time to get outside and bird watch. So that's maybe a personal item, but I would also recommend that everybody do anything they can to get outside and enjoy the migrating songbirds.

SPEAKER_01

Well, yesterday I was talking to Jess, and she said, you have a lot of birds; she could hear them, and I didn't even have the window open or anything. So I went out earlier. I'm not a professional birder, but I do have a book, because I have bird feeders and I want to identify them. It's great that you do that, because I think this is a great place to have birds.

SPEAKER_02

You know, I hear that a lot: I'm not a birder, I'm not a birder. And then someone says, but I have a book and I try to work out what I'm seeing, and I'm like, you are a birder. Sorry to break it to you.

SPEAKER_00

Yeah, you can't have the book and not be a birder.

SPEAKER_01

I know. Well, I do like birds, so maybe I'm more of a birder than I thought. Exactly.

SPEAKER_00

I am not a birder, I do not have the book, but I'm curious: is there a special tool set that you have as a birder? Is it just binoculars? What does that actually look like?

SPEAKER_02

Binoculars are a must-have. But if we have two minutes and want to tie this back to the AI conversation, the tool that's really become indispensable for a ton of birders, and has helped a lot of new birders get exposed to bird song, is called Merlin. It's put out by Cornell. Cornell is the big bird university, not the only one, but the big one. And it's essentially like Shazam for bird song.

SPEAKER_00

Oh, that is so cool.

SPEAKER_02

Yeah, it will recognize where you are geographically and what time of year it is, record the bird song it hears outside, and tell you what you're hearing singing.

SPEAKER_00

I feel like you're converting me; I want to try this out. That is such a cool idea. And actually, Jacob, I'm going to link all that in the show notes for anybody that's curious. Yeah, absolutely.

SPEAKER_01

Oh, awesome. So, Jacob, for our listeners who want to know more about Ratio Therapeutics and the work your team is doing, where would you recommend they go?

SPEAKER_02

They can go to our website, ratiotx.com. We're here in Boston, in the Seaport, but the website is probably the best starting point for anybody to learn more, and we'd be happy to hear from you.

SPEAKER_00

Jacob, thank you so much for joining us today. Today's conversation reminds us that the most powerful technology strategies don't start with tools, they start with foundations. Data governance, structure, and collaboration across teams create the conditions where innovation can thrive. If you enjoyed this episode, be sure to subscribe and visit iuvotech.com for more conversations exploring leadership, culture, and technology at the edge of innovation.

SPEAKER_01

And as AI continues to evolve, the organizations that invested early in these fundamentals will be the ones best positioned to take advantage of what comes next. Thanks for listening, and until next time, stay curious.

SPEAKER_00

Thank you for tuning in to The Edge of Excellence. We hope today's insights empower you to shape your future and rise to your full potential. Let's continue to grow, innovate, and lead by pushing the boundaries of excellence.