Infinite Curiosity Pod with Prateek Joshi

What it Takes to Build a BI Platform | Colin Zima, CEO of Omni

Prateek Joshi

Colin Zima is the cofounder and CEO of Omni, a data platform that combines the consistency of a shared data model with the speed and freedom of SQL. They recently raised their $69M Series B led by ICONIQ Growth. He was previously the Chief Analytics Officer at Looker.

Colin's favorite book: Blink (Author: Malcolm Gladwell)

(00:01) Introduction
(01:10) What Is a Data Model and Why It Matters
(03:27) Gaps in the Modern Data Stack
(05:38) The Staying Power of SQL
(07:29) Origin Story: Why Omni Was Created
(10:13) Lessons from Building the MVP
(12:48) Go-to-Market Insights: Zero to Ten Customers
(16:02) Founder-Led Sales and Marketing Tactics
(18:58) Company Building: Recruiting and Product Challenges
(21:34) Product Positioning in a Crowded Market
(23:26) Design Philosophy in Enterprise Software
(28:21) Omni's Tech Stack and Development Strategy
(28:57) Real-World Use of AI Inside the Company
(31:01) Future of Data Tooling and Role of AI
(33:49) Rapid Fire Round

--------
Where to find Colin Zima: 

LinkedIn: https://www.linkedin.com/in/colinzima/

--------
Where to find Prateek Joshi: 

Newsletter: https://prateekjoshi.substack.com 
Website: https://prateekj.com 
LinkedIn: https://www.linkedin.com/in/prateek-joshi-infinite
X: https://x.com/prateekvjoshi 

Prateek Joshi (00:01.854)
Colin, thank you so much for joining me today.

Colin (00:04.663)
Of course.

Prateek Joshi (00:06.364)
Let's start with the basics of data exploration, which, especially in the world of AI and LLMs, is becoming even more important. So in an enterprise setting, what are all the tasks involved in exploring data?

Colin (00:24.482)
I mean, it's pretty straightforward. It's sort of like any supply chain. You have to collect it, so usually you have to get it out of its source systems, because you can't really do as much as you want with data isolated in all these systems. You have to collect it. You have to store it. You theoretically have to do some transformation on it, because it's probably not in the shape that you want. And then you have to share it with people. So I think that tends to be the core of the stack: some sort of extract and load, transform, and then present.

Prateek Joshi (00:52.734)
Right. Actually, I love that supply chain analogy. It makes it easier to visualize. Okay. Now, at Omni, you talk about the data model. So before we dive into that, can you just explain what a data model is?

Colin (01:10.636)
Yeah, I mean, a data model is really just a representation of your business with data. So kind of back to that pipeline of data, your data is sitting in all sorts of systems: Salesforce, your operational database, your web logs. The way that it's stored is often not the way that users think about it. There isn't a clean table in your Salesforce that has your ARR.

There isn't a customer list probably. There isn't even really a sense of a purchase. There's a whole bunch of tools that together collect those things, but you actually need to manipulate them and change them and sort of define the corners of what they mean for an end user to say, how many customers do I have? Or for, you know, the sales leader to say, how much ARR did I close this quarter?
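
A minimal SQL sketch of the kind of modeled definition Colin is describing; the table and column names (raw_salesforce.accounts, annual_contract_value, and so on) are invented for illustration and are not taken from Omni or Salesforce:

    -- Hypothetical modeled view: one row per paying account, so "customers"
    -- and "ARR" come from one governed definition instead of ad hoc queries.
    CREATE VIEW customer_arr AS
    SELECT
        a.account_id,
        a.account_name,
        SUM(s.annual_contract_value) AS arr
    FROM raw_salesforce.accounts a
    JOIN raw_salesforce.subscriptions s
      ON s.account_id = a.account_id
    WHERE s.status = 'active'             -- the "corners": what counts as a customer
    GROUP BY a.account_id, a.account_name;

    -- "How many customers do I have?" now has one shared answer:
    SELECT COUNT(*) FROM customer_arr;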

Prateek Joshi (01:59.444)
And the reason we need to have a shared data model, is it for consistency? Is it for speed? Why does the need exist?

Colin (02:09.75)
It really is consistency at the core. The ultimate reason that you're doing all this is to try to make decisions in the business. And for people to sort of collect together and make good decisions, they need to be looking at the same things and sort of agreeing on the foundation of the decisions. So I think there's some subtlety here, because at the margin, I think in certain product areas, for example, true consistency doesn't matter. If you go look at web traffic,

Does it matter if you had a million people or like a million and five people on your website? It literally doesn't. It does matter if your web traffic doubled or if it's going up in given regions and things like that. And so I think when you sort of think about consistency, the sort of quality level is actually different in different areas. Like I have to present out my financials to my investors on a regular basis.

It's very good if those are consistent over time. Because if I hand them Q2's numbers, and then I hand them Q2's numbers again in Q3, and they're different, that's actually a little bit of a problem. It's also why people bother you about your expense reports. And so the data model is really just the vehicle for creating that alignment so that people have the foundation and can actually start understanding what the business actually means.

Prateek Joshi (03:27.444)
That's great. Data and analyzing data have existed for a long time, and we have many sources of data, databases, and many tools to extract and load and transform. So if you had to explain the current landscape of tools across that whole supply chain, how would you describe it, and where are the gaps?

Colin (03:45.656)
Yep. Yep.

Yeah, it really goes back to sort of those levels. So I think about the core as: there's extraction tools. Probably the most common one that people are familiar with is Fivetran, but Stitch, where our CTO came from, was also in the space; they were a competitor. And in the past you had things like MuleSoft and Informatica: things that take data from one place and put it in another place. Then you've got the data warehouse, things like Snowflake, Databricks, BigQuery, Redshift, but also transactional databases like Postgres and, you know, Microsoft SQL Server,

things that hold data. Increasingly, transformation has become a pretty key part of the stack: actually taking those tables in the warehouse and turning them into other tables that make more sense for a business context. dbt is kind of the canonical example there, and you've got things like Coalesce and Tobiko, which does SQLMesh; they do transformation in the warehouse. And then above that, you've got the BI layer. And we are unfortunately one of

probably 10,000 different BI tools that exist, but kind of the most canonical examples are Power BI from Microsoft, Tableau, and Looker. There's a lot of tools besides Omni in that layer as well. And then I would say you've got a whole suite of other solutions, because there are other things in the stack. So you've got observability tools; Monte Carlo is an example of those,

or Bigeye, or Datafold; there's a ton of those. You've got cataloging, so what data exists, and some of the major vendors do things like this, but now you've got a whole bunch of other things. Reverse ETL is another one, which takes data out of the database and pushes it into other tools. So I would say the core is extract, database, transformation, BI, and then you have a sprinkling of these other tools that help monitor and observe and sort of tie together those layers.
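
As a rough illustration of the transformation layer mentioned here, a dbt-style model is just a SELECT statement that the tool materializes as a table in the warehouse. The source name, columns, and business rules below are assumptions for the example, not anything specific from the episode:

    -- models/orders_cleaned.sql (dbt-style; {{ ref(...) }} resolves another model)
    SELECT
        o.order_id,
        o.customer_id,
        o.ordered_at::date AS order_date,   -- Postgres/Snowflake-style cast
        o.amount_usd
    FROM {{ ref('raw_orders') }} AS o
    WHERE o.status NOT IN ('canceled', 'test')  -- business rules live here once,
                                                -- not in every downstream query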

Prateek Joshi (05:38.128)
And in your materials, you talk about the speed and freedom of SQL. Obviously SQL has been around for a long time. What makes SQL great? Why is its staying power so strong?

Colin (05:43.661)
Yep.

Colin (05:51.02)
Yeah.

It's actually sort of the staying power that has made it strong. I think the beauty of just creating a standard is that everyone learns the standard and can know it. English is a great parallel. Is it the best language in the world? I feel like objectively it's been decided that it's not a very good language. There's lots of exceptions, it's a tricky language to learn. But

a lot of the world knows English now. And so in some ways it's become an international language and thus it's a standard. And when people have tried to replace or adjust SQL, it hasn't stuck; SQL is just the language of data in organizations now. And obviously Excel is actually another language of data in sort of a different way. But I would argue that spreadsheets and SQL are the two canonical languages of data. And so the kind of tongue-in-cheek answer is that it doesn't really matter if SQL is a good language. It's just the language.

And so you kind of have to go with it, and trying to fight it is dangerous, because it's the language that everyone knows and the language that everyone works in. So it makes it easy for BI tools, because we have to write SQL ultimately. It makes it easy for databases and transformation, because they're going to create SQL as the input. It's just become a very strong standard.

Prateek Joshi (07:07.828)
That's a very good point. I think it almost doesn't matter whether it's objectively the best or the worst. It doesn't even matter, because it's used by so many people and it's been here for so long that it's just the language. Okay. Now I'd love to go back to the launching of Omni. How did you validate the need before launching it?

Colin (07:29.314)
Yep. So I was at a company called Looker for eight years before this, kind of one of the previous generation of winners in the space. We were acquired by Google. And Looker's core value prop was based around data modeling and

centralization. So what I would call very high-governance BI, the most enterprisey of enterprise data. And actually, back to the SQL-Excel balance, my career started in finance. And so I loved a lot of the Excel direct interface, the touch-and-feel-your-data components. And there was always this frustration as we were building Looker that I wished we could do a little bit more of those things. And

Frankly, we never really acted on it. We were a very centralized tool. That was our promise. That was what we worked on. But as Looker got acquired and we had some time to reflect, I had always wanted to build a product that could look and feel like Excel or SQL, very fast and decentralized, while still incorporating a lot of the core tenets of Looker. And the core vision of the product was like, let's just do that.

I think that product needs to exist. The challenge is, does the market agree with you? But for, you know, the first year of building, we were sort of just building a product for me, which is wonderful. It's very easy to evaluate the product when you're building just for yourself, because it's either great or it's not, and if it's not, then you go adjust it. So we were building the product that I really wanted. And in terms of why I thought the market wanted it:

A lot of it was just because I was in the space and had seen all of the different BI tools. And I think when some people are building, they sort of dismiss competitors or think that the way other tools do it is bad. I think I'm very far in the inverse: I want to learn about the really nice thing that every other company is doing and bring it all into one place. And over my time at Looker,

Colin (09:31.138)
There were so many good things that the Modes and Periscopes of the world were doing on SQL, you know, Tableau for visualization, obviously Excel has sort of been there forever, but also things that Looker was doing around governance. And the really simple thesis was: why can't we just put these all in one place and make them talk to each other? And that's, you know, what we set out to go do.

Prateek Joshi (09:50.138)
Amazing. I love that framework: if somebody is doing something really well, just bring it in. You can centralize it, add your own spin, and it becomes something great. Okay, so now that you validated the need, how did you decide what goes into the MVP? Because obviously you could build until the end of time, but you have to ship. So how did you decide?

Colin (10:13.218)
I mean, the core idea was we were hoping that people would use it like two months in. I was actually one of Looker's first customers, and I used Looker before they had dashboards. They barely had visualization; you had to build the data model in the command line. It was a very young product. And so I was like, maybe we can build a young product also and just give it out to people. So literally for the first year, I probably did 200 calls with everyone that would talk to me. And I said, hey, come try this.

And so we put it in people's hands. And I think what we slowly learned was that what we were hoping was an MVP was not actually an MVP to customers. And part of this was, in a lot of markets, you kind of want to come in, what I say is, come in the side door. You don't want to replace an existing thing; you want to be an add-on and exist and then sort of bleed into the core. And we were hoping that that's how we would enter the market. The feedback that we got from customers was: it's interesting what you're doing,

I see the potential of what you're talking about, but unless you replace the existing tool or tools that I have, it's not worth my time. And so the MVP got bigger the more people we talked to. We realized that we needed to build a whole BI tool. We needed to build dashboards, a SQL experience, and a modeling experience. We had to build a full BI tool that could go replace the one that you had. And really,

for the first 12 months, we didn't really have a customer. I would say maybe at the 10-month mark, there was a former product manager at Looker; I got her to turn off her BI tool and start using Omni, essentially by force, by guilting her into it. And so she was our first, you know, daily user, not paying us. And she gave us a lot of really good feedback that was very honest. You just need someone that is going to create a feedback loop with you

and be honest about whether the product is good or not. And then I would say over the next six months, we cleaned up a lot of the product. I thought that we were going to even win deals in that six-month period. We really didn't. Then it just so happened that in May of 2023, about 15 months into the company, we had like six trials going at the same time, people that were using the product, and all of them closed-won. We just won them all. And that was the moment that people sort of said,

Colin (12:37.184)
Yeah, this is enough product that it's a BI tool now. And so my strong belief is: get things out there and your customers will tell you. But customers were just telling us for 18 months that we didn't have enough product.

Prateek Joshi (12:48.916)
Right. Amazing nugget. I think the first 18 months, it's brutal. It's a grind. And so if you look back on your journey, zero to 10 customers is always a grind. You just mentioned the product learnings. Looking back in terms of go-to-market, what are the key takeaways if you were guiding someone now who's in their zero-to-10 phase? What are the learnings?

Colin (13:09.238)
Yeah. I mean, I really just think that you need to talk to literally as many people as will talk to you. One thing I love to cite is actually one of our, not really a competitor, but not really a partner either, a company called Hex. They make notebooks and it's a very nice product. Barry, their CEO, wrote a post that I point everyone to called

commitment engineering, I think it is. And it's effectively a process of asking for a little bit more out of your customer over and over again. You say, hey, will you give me 30 minutes? Hey, will you give me an hour? Hey, can I watch you use the product for two hours? Hey, will you pay me money? And I really think winning your early customers is just a process of convincing people to give you more of their time and effort until you have shown them that the product is worth their time and effort. They're really just paying you in

their attention, and you need to talk to as many people as you can and find as many people as you can that will give you their attention. So for us, our advantage was that I had just been in the BI space for 10 years, along with most of our company. 50% of our company has come from Looker, out of the 90 people that we have now, and much of the other half was also in BI. We leaned heavily on an ecosystem of friends and partners and customers that we've built up over 10 years, who

we knew as people and who trusted us. And the way that we got into the market was: hey, these guys are smart and I've worked with them for the last 10 years; I want to see what they are doing and I will give them my attention. I think there could be a very different advantage for a different type of person. If you're 22, just out of school, you don't have a network of 5,000 former customers to go talk to, but you probably have another set of advantages that you can use to go find people to give you time

to go prove whether you have product. You could show up at their door, you could do more work for them. I think you have to find the advantages that you have as an individual and lean on them as hard as you possibly can, and then get people to give you their time and be super honest about whether your thing is good or not, because people will happily lie to you that your stuff is awesome, and if they're not using it, then it's not awesome.

Prateek Joshi (15:26.505)
Right.

Right. And you made a great point about customer attention. In fact, attention is all you need, in the sense that in the first 18 months, you're asking for your customers' attention, and that's amazing. All right. Now, after the first handful of customers, clearly, as you mentioned, 18 months in, you know something is working here. After that, what sales and marketing experiments have you run past that initial phase, and what worked and what didn't?

Colin (16:02.604)
I mean, I think the one thing that we found that was really effective was almost mechanizing what I was just talking about. Our huge advantage as a company, or one of our huge advantages, is that our three founders have a very deep network in the data space. Jamie, Chris, and I have all been in the space for 10 years. We know a lot of people, and what we did was we really used our likenesses and our network

as the vehicle for scaling the company. So most of our outbound goes through one of the three of us. A lot of the email comes from me, and if someone responds, I am on the thread talking to that person. And similarly on LinkedIn, essentially all of our outbound is through one of the three founders and us having conversations. And it's not terribly scalable, but it is a way for us to drive people into our interest funnel, because we are trustworthy and we understand the space and we can have conversations with people.

So I would say, for us, it was using the founders as the marketing points more than the product even. It's really interesting to build a product and then not pitch your product to people. We were pitching our experience: hey, we're really experienced in the space, do you want to see what we have? Anyone could say that about any product in the whole world, but that was actually what we found was more resonant than explaining our product to people.

I think that was a really big one. And certainly, we hired a marketing leader from Ripley who had a very different style than I've seen in data analytics: much more programmatic, much more growth-hacky. And I think that approach has just been really effective as well. We ran a recent campaign where we sent people notes and told them to take their significant other out to dinner for Valentine's Day and sit on a demo for us.

It's a completely weird campaign, but we actually had really high response rates. And so I think we've just been good about trying things and getting people to talk to us. Yeah.

Prateek Joshi (18:01.522)
Right.

Right. I think that's a very underrated hack: the founders just doing outbound. Many times early-stage founders ask, do we need a demand gen person? The answer is you already have a demand gen person. It's called the founder, right? Meaning you've got to do it. And it's very effective. People want to respond to founders, because you built it and you're in the best position to talk about it. So I think that's wonderful. Okay.

Colin (18:30.9)
And it's not that expensive. If marketing is the most important constraint on your business, which if it's not, you have a great business, but I think it is for most businesses, then you should be spending time there, and it's not a lot of brain power to have conversations with people. It's sort of just, hey, I think this thing is cool, do you? And people are sometimes like, no. And sometimes they're like, tell me more about it. And that is a meaningful driver of the business.

Prateek Joshi (18:39.284)
Right.

Prateek Joshi (18:58.152)
Right, 100%. And outside of go-to-market, what are the biggest challenges you had to overcome to build this company in terms of team and product and just company building in general?

Colin (19:11.894)
I think that we've been very blessed. I've had an enormous number of unfair advantages. We've been really well funded, and a lot of our early funding was from former members of the Looker board. So again, our trust carried our funding. I mentioned 50% of our company came from Looker, and about 10% of the company came from Chris, our third co-founder's, company.

So even recruiting, I felt very blessed, because great people have called us up on the phone and said, hey, I would love to come work there. And we're like, yeah, you're great, please come work here. And even on the customer side, I would go back to: it is always hard, no matter how good your product is. I think we have the best product in the whole world already, I truly believe that, and it's still hard to convince people to try the product and to go find them.

So for me, it is 100% marketing; all the time, that is the hard stuff. And I would say the big thing is that the early period of ambiguity before you have fit is really challenging, because, at least for us, I thought that we had a great product before our customers did. And that's a really difficult moment, because you have to connect together why you think it's great but your potential customer doesn't yet,

and sort of be honest with yourself. And so we've had some bumps on the product side. One example is that in that period, one of the issues was just that the product was kind of buggy; we had a lot of error pages pop up and then resolve on refresh. And what I realized was that it didn't matter to me, because I knew the product was good underneath it and it was doing all the things, but it turned out it really mattered to customers. And so I think the intellectual honesty of

what are you missing and what is not good is the other really important part that you need to bring, because we made some mistakes building that we needed to go back and fix, and you have to be very honest with yourself about those.

Prateek Joshi (21:09.926)
And that's a great point. Now, when you think about positioning a product in this market: so much money is spent on business intelligence, but because of that, so many companies and tools exist. So when it comes to positioning a product like this in the market, how did you think about it on day one, and how do you think about it today?

Colin (21:34.902)
Yeah, I think you have some very important decisions to make. So going back to the earlier conversation: are you going to come in on a pocket and be really, really good in that pocket? Or are you going to go straight in the front door and try to do everything? And I think there's something very clean about doing something small really, really well.

If I'm making a new soda and my soda is sweeter than everything else, it's really easy for your branding to say, I am the sweetest soda, or something like that. You can control that position in the market; you can be really clear about it. At the same time, you have to decide whether or not that piece of differentiation is important enough to carry your product through. And I think the realization that we had was we weren't trying to do something at the corner of the market. We couldn't be like, we're better at this one little thing.

The pitch of Omni is essentially: we do what all of your other BI tools do, at the same time. And it's frankly terrible branding, because you're saying I do everything, and you don't specifically talk to any one person. But we were able to lean, again, on our experience: we know the space well, we do the Looker stuff, we do the Tableau stuff, we do the SQL stuff, all at the same time.

I like to say it was a very venture-y approach, which is: we're going to say that we're going to do the biggest possible thing, and if it works, it's great, and if it doesn't, it doesn't. But I think you want to make sure that you understand what bet you are taking. Ours was: we have a very diffuse brand across a lot of different things, but I think we can execute on it. And in another context, that could be a bad decision, and you'd really want to focus on your strength and double down there. I don't know.

Prateek Joshi (23:26.036)
I want to talk about the role of design when it comes to enterprise. The common observation is that consumer products are amazing, gorgeous, and beautiful, whereas enterprise products don't get the same attention to design. So how do you think about design, not just the look but the total product design, when it comes to enterprise products? And how do you make sure that customers continue to enjoy and love using your product?

Colin (23:55.47)
I think it is tricky. I think the reason for that divergence is that the buyer of enterprise software is not always the user of enterprise software. The buyer, and the attention, of consumer software is the consumer, so you're very deeply linked to your buyer. At Looker, I want to say that our first designer was something like our 150th employee.

And I think it always showed in the product. We got a lot of feedback about our design for the whole life of the company. At Omni, in our first 11 employees, we had two designers. So that was a very important adjustment that we made toward, I think, thinking that it's important. I mean, kind of to your point,

Prateek Joshi (24:18.43)
Mm-hmm.

Colin (24:40.218)
People like using software that feels good. And so I do think that over time software is improving in terms of its quality. If you look at modern versions of things attacking Salesforce, they are much better products than Salesforce. I think it's also important to remember that, where for a consumer product design could actually carry your whole product philosophy, in enterprise software that's not true.

We are very frequently in an RFP process where someone has 200 features, and probably not one of those 200 is "design feels good." It's "can make dotted-line graphs," "can alert in under two minutes." There's a very, very long feature checklist that you need to hit. And so I think the balance is that you really do want to make things that feel good, especially for us, where we're pitching that we do everything and, broadly, that we're a better-feeling product.

But you also need to find the right balance of who your consumer is and what they care about. A lot of our users care about power and depth behind the scenes. But I think nearly every enterprise company is underinvested in design as well.

Prateek Joshi (25:50.618)
And two designers in the first eleven, that's a strong commitment. In the early days, what should designers work on? Because everybody's doing everything and there's so much to do, what are they doing on a day-to-day basis?

Colin (26:04.269)
Yep.

Colin (26:07.598)
I mean, our designers actually ship a lot of code. So I think you need to figure out the way that your engineering, product, and design triforce works. In some contexts, you might have really design-conscious engineers who are able to ship beautiful things. In another context, you might have engineers that acknowledge weakness in the front end and need a lot of help from design.

Similarly, your product team could fill a lot of design gaps also. So I think it's going to depend on the strength of your team and what they do. The thing that we have seen, that I strongly believe, is that if your design team can't touch the front end of the app, they're very disadvantaged compared to a true experience-design person, because

it creates a sort of "I made a picture, you go implement the picture" dynamic. Whereas if you know what it costs to build your picture, you might draw the picture a little bit differently. And so I really think, no matter what, it's really important that your design team is intimately familiar with front-end coding, so they know where to cut corners from a design-perfection point of view

to make an app that builds quickly and can ship things. So, I mean, they were doing a mix of experience design, but also, we demo product every single week. If you go look on our website, we have about 700 videos; every week we cut up about 10 of them on things that we're working on. Our designers ship just as much stuff as our engineering team. Sometimes it's small little things, but they're doing the same thing our engineers are doing.

Prateek Joshi (27:50.046)
Amazing. I love that. And I agree. I think even the people who work at big companies now are very used to good-looking software. There was a time when any big company could ship junk and you just had to use it. But now people expect at least a minimum level of, hey, it should look and feel like this. Moving on to the tech stack that you've used to build Omni. Can you walk me through

the technology stack behind Omni. What have you used and what has worked?

Colin (28:21.374)
So the front end is React. The back end is Kotlin. Honestly, beyond that, I probably can't talk very much about it. This is where I just turn things over to the engineers and say, build modern and use the tools that make sense to you. I know we use Remix. I'm not the guy to answer those, sadly.

Prateek Joshi (28:37.178)
Mm-hmm. Right, right. Yeah. Right.

Yeah. All right. Maybe let's talk about internal use. Across sales, product, and marketing, where do you use AI internally for your own work inside the company?

Colin (28:57.114)
So there's a couple of really good examples of what we do internally. We don't have a legal team; I serve as mostly the legal team, and our legal process is to put all of our redlines through ChatGPT, compare them to the base contracts, and have it explain all the differences. So I think a great example is NDAs. In data they are everywhere; you always have to sign an NDA to work with someone. They're

one of the most pointless pieces of paperwork in the whole world. I don't even know why they are different; they're just full of words that all mean the same thing. And so our NDA process, rather than handing that to a lawyer and going back and forth on redlines, is: put our NDA in, put their NDA into ChatGPT, and have it explain the differences. If there are meaningful differences, we will go back; I think we've done that once. Otherwise we can, you know, get comfortable with it and sign it. And we do that with contract creation and things like that.

Certainly for legal, it's been a huge streamliner. There's also a ton of it in the product. I think our point of view has been to sprinkle it in as many places as possible. For example, we use Excel in the product as our calculation engine, and the internet is amazing at writing Excel functions, actually. So we have a small GPT that can essentially write Excel functions for you, and things like that. I mean, I

Prateek Joshi (30:14.196)
Right.

Colin (30:19.936)
I know for me personally, I use it tactically to do all sorts of little things that would otherwise take a more substantial amount of time: turn this into SQL, take this table and turn it into another format, little utilities where the API is messy. I think those are the things they are best at. So, kind of a little bit everywhere, honestly.

Prateek Joshi (30:44.596)
All right. All right. I have one final question before we go to the rapid fire round. How do you see data tooling evolving in the next two years? And also, what AI advancements are the most exciting to you as you build Omni?

Colin (31:01.762)
So the simplest one is going to sound the most trivial. I just think that AI getting faster has a more meaningful benefit than people realize. When folks use BI tools, or frankly a lot of software, there's a gigantic difference between a 10 millisecond, 100 millisecond, 1 second, and 10 second response time.

And for the most part, a lot of AI tooling is kind of in the 10-second range, whereas for a lot of end-user use cases, you want it in the hundred-millisecond to one-second range. And as these models get smaller but stay intelligent enough to do the things they need to do,

shrinking that time is going to be absolutely massive for being able to drop AI into specific places. The example I love to give is, when you use a BI tool and you have to select a filter, you have to go find the field, click on the field, filter the field. When you filter something through ChatGPT, you just say, filter this for the UK, and it finds the UK for you and drops the UK in. Unfortunately, that second one probably takes four or six seconds or something like that.

And once that four to six seconds, which is still actually pretty good and feels nice, becomes 500 milliseconds, I think it will change how all software looks, because you can then put a search interface on anything. It's going to change the experience. So I think that's probably the single biggest thing. In terms of how AI in general changes data, I think it's really just going to be this expectation that everything is a little bit easier.

And I think there's a little bit of a weird balance here, because it's not actually that hard for AI to write SQL. Again, as we talked about, it's a very standard language. But it turns out that writing SQL is not actually the hard part; it's the modeling and semantics of your business. I'll give you an example. Salesforce spits out all deleted records. So if you pull things through the API and you don't clean it, every query that you write will have deleted records in it.

Colin (33:04.246)
Now, certainly an AI could go learn about things like that, but that's just a small example. Every business has these gotchas at the corners of all the data they collect. Maybe it's canceled orders, maybe it's tests, maybe it's users in a certain region, maybe one thing converts currency and another thing doesn't. And if you are not able to actually contextualize and model all that stuff, you can't do any of the AI stuff on top of it. You need to be able to map those things in. So I think the biggest impact in data

is really going to be the sort of resurgence of semantics and modeling and sort of teaching your database what the business means. And I think most tools are going to need to be doing things like that over the next two years to take advantage of all this stuff.
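
A small sketch of what baking those gotchas into one modeled layer can look like, so every downstream query, human- or AI-written, inherits them; the view, field names, and rules here are illustrative assumptions, not Omni's implementation:

    CREATE VIEW opportunities_clean AS
    SELECT
        o.opportunity_id,
        o.account_id,
        o.amount,
        o.close_date
    FROM raw_salesforce.opportunities o
    WHERE o.is_deleted = FALSE           -- the API hands back deleted rows too
      AND o.stage_name <> 'Test';        -- an example of a company-specific rule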

Prateek Joshi (33:49.332)
Fantastic. All right, with that we're at the rapid fire round. I'll ask a series of questions and would love to hear your answers in 15 seconds or less. You ready? All right, question number one: what's your favorite book?

Colin (33:57.752)
Sure.

Colin (34:02.402)
I really like Blink by Malcolm Gladwell. It's kind of an old one, but I love pop psychology, and I also think people undervalue the subtle human underpinnings of logical decision making. And that is just an amazing book for thinking about those things.

Prateek Joshi (34:22.44)
Yeah, I love that book. Which historical figure do you admire the most and why?

Colin (34:28.94)
I struggled with this one because you handed this one to me before. I want to say I'd pick an athlete, like Michael Jordan or Kobe. I'm not even a Kobe fan, but I really respect him, and I think it's because of the intensity of focus on doing something. I respect that so deeply: picking a thing that you're going to be outstanding at and then taking the repetitions to go be so good at that thing.

I just respect that enormously. So it's honestly a lot of athletes.

Prateek Joshi (35:02.162)
What has been an important but overlooked AI trend in the last 12 months?

Colin (35:07.926)
I think the biggest one is how universally adopted these things are. When you look back at the adoption of the cell phone, for example, that came in over a pretty long period of time, like years. My parents use ChatGPT, and my wife is an attorney and she is a ChatGPT junkie. And my kids, I have three, but my 10-year-old has sort of normalized this idea of being able to have a conversation with a computer. And I think the breadth of uptake is underappreciated, especially among people that are late adopters in tech.

Prateek Joshi (35:46.44)
What's the one thing about data and data tooling that most people don't get?

Colin (35:53.4)
How many corner cases there are. Just the difficulty of date handling and what nulls mean is so unbelievably painful. And it feels completely unimportant, but every single person runs into one of these problems, and they're all different for every single person. It sounds like you could wave your hands and ignore that problem and still build good data tools. But it turns out

Prateek Joshi (36:06.526)
Right.

Colin (36:17.816)
that you actually need to have really good null handling and really good date handling to build BI tools.
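
To make the corner cases concrete, a tiny assumed example (an orders table with amount and created_at columns) where both of those decisions show up:

    SELECT
        DATE_TRUNC('day', created_at) AS order_day,   -- a "day" in which time zone?
        COALESCE(SUM(amount), 0)      AS revenue,     -- should a NULL roll up as zero?
        AVG(amount)                   AS avg_order    -- AVG silently skips NULL rows
    FROM orders
    GROUP BY 1;

    -- And the classic trap: WHERE amount = NULL matches nothing;
    -- it has to be WHERE amount IS NULL.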

Prateek Joshi (36:23.252)
The null handling. It's funny how every person you've told...

Colin (36:27.37)
Is a null zero? Like, I don't know. Why do people have opinions on this? But people do. Like, it's...

Prateek Joshi (36:34.704)
It's funny. It's funny. All right, next question. What separates great AI products from the merely good ones?

Colin (36:43.424)
I think it's the ability to interact with the results. When AI is used as a pure black box, it's an extraordinarily frustrating process. An example is any time you've tried to use an old voice input, like when you tried to call Bank of America 10 years ago and it could not route you properly. Those are some unbelievably frustrating experiences, because there's no ability to adjust what is happening.

The really, really good ones, when you look at something like Cursor or GitHub Copilot, have a feedback loop where you can touch and feel and manipulate the result set and actually interact with it back and forth. And so I think when it's not a black box, even when it's doing all the black-box computation, but it lets the human intervene, those are the most powerful moments for it.

Prateek Joshi (37:32.212)
That's a great observation. I agree. I think many times the frustration comes from: I said the words, and the voice system didn't understand them, and now what? There's nothing to do. No, that's amazing. All right, next question. What have you changed your mind on recently?

Colin (37:52.398)
I think, actually, ironically, sort of AI for data analytics. I was a little bit dismissive of whether it could do a very good job of interactive querying, so, show me my top performing customers, or things like that. And the reason is because that's a fairly ambiguous question. What is "top"? Can it actually do it?

Does it create more work for you than it solves by actually giving you an answer? And I think I'm starting to believe that with enough modeling and semantics underneath it, you can actually teach it to do really smart things. Just telling it how to use certain filters in the application, really training these objects, I'm starting to believe that analytics will actually change more significantly.
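
For instance, "show me my top performing customers" only becomes answerable once a semantic layer pins down what "top performing" means; the definition below (trailing 90-day revenue, refunds excluded, over assumed customers and orders tables) is one arbitrary choice among many:

    SELECT
        c.customer_id,
        c.customer_name,
        SUM(o.amount) AS revenue_90d
    FROM customers c
    JOIN orders o
      ON o.customer_id = c.customer_id
    WHERE o.ordered_at >= CURRENT_DATE - INTERVAL '90 days'  -- "recent" is a choice
      AND o.status <> 'refunded'                             -- so is excluding refunds
    GROUP BY c.customer_id, c.customer_name
    ORDER BY revenue_90d DESC
    LIMIT 10;                                                -- and so is "top 10"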

Prateek Joshi (38:46.142)
What's your wildest AI prediction for the next 12 months?

Colin (38:50.449)
Oh man, my wildest prediction.

Colin (38:57.038)
I think that all of this "$20,000 AI engineer for your company" stuff, people are going to realize is quite poor. So I think that's probably my biggest one.