UX for AI

EP 103 - Art, AI, and the Innovators Who Will Rule the Future w/ Scott Jones

Bonanza Studios

Send us a text

Discover Scott Jones' journey from improv and filmmaking to the frontlines of AI product management. This episode explores responsible AI, neurodivergence, spiritual insight, and why taste may be the most valuable skill in tech. A raw, thoughtful look at how art and intuition are shaping the future of work.


You can find Scott here: https://www.linkedin.com/in/scottjeezey/

Interested in joining the podcast? DM Behrad on LinkedIn:
https://www.linkedin.com/in/behradmirafshar/

This podcast is made by Bonanza Studios, Germany’s Premier Digital Design Studio:
https://www.bonanza-studios.com/

00:00:00:00 - 00:00:11:17

Welcome to UX For AI.

00:00:11:17 - 00:00:13:21

I think in the age of AI and all these like

00:00:13:21 - 00:00:16:16

crazy models like Grok 4. I'm using that. It's so...

00:00:16:16 - 00:00:17:16

It's so like

00:00:17:16 - 00:00:20:08

it's not even comparable to o3 Pro,

00:00:20:08 - 00:00:24:08

for certain. Like, I'm using it for contract creation and the result is,

00:00:24:08 - 00:00:26:13

far better. I would say far better,

00:00:26:13 - 00:00:31:12

I think. I think in this age, people like you will thrive exponentially.

00:00:31:12 - 00:00:32:07

Like you have a

00:00:32:07 - 00:00:36:08

bachelor's in film and economics. So very intuitive,

00:00:36:08 - 00:00:36:22

like

00:00:36:22 - 00:00:38:07

gut feeling

00:00:38:07 - 00:00:39:07

artsy.

00:00:39:07 - 00:00:44:14

Now you are an AI leader. How can that happen?

00:00:44:14 - 00:00:45:21

Without a plan

00:00:45:21 - 00:00:49:07

like you had? Had I planned this, I don't think it would have worked out.

00:00:49:07 - 00:00:50:21

Generally, I've been this way

00:00:50:21 - 00:00:55:08

since birth, and there's been kind of the hero's journey of going into kind of

00:00:55:08 - 00:00:59:06

a dark journey in my adolescence, but from birth, kind of coming out with this,

00:00:59:06 - 00:01:02:01

creative energy that I didn't really understand.

00:01:02:02 - 00:01:06:23

I didn't have to understand. It just allowed me to express myself in, like, improv comedy.

00:01:06:23 - 00:01:14:23

Wanting to be a filmmaker. So making videos all the time, drawing all the time, art classes, creating sculptures in various ways,

00:01:14:23 - 00:01:16:14

but just always kind of creating.

00:01:16:14 - 00:01:24:11

And then in college, I saw that my, my brother, my family actually all went to the same school, which is pretty funny, but my brother was four years ahead of me.

00:01:24:11 - 00:01:31:09

He was an econ major, and I saw that that gave him a pretty cool career path into things like investment banking, which

00:01:31:09 - 00:01:33:14

I had no knowledge of. But I was like, well, that sounds

00:01:33:14 - 00:01:35:07

lucrative or whatever, so

00:01:35:07 - 00:01:44:10

maybe I'll study the art stuff and economics. And so that was just like a little nugget of insight presented to me that

00:01:44:10 - 00:01:45:09

take it or leave it.

00:01:45:09 - 00:01:52:04

And I was like, well, that that seems meaningful to me. I'm going to work with that. And that's sort of what I just described is really how all of this is unfolded.

00:01:52:04 - 00:01:55:01

I've just kind of gone on a path of this balance of

00:01:55:01 - 00:02:00:10

being very intuitive. For me, it's not just... maybe even just a pause here.

00:02:00:10 - 00:02:05:03

I don't have like a sort of materialist view of reality. I think,

00:02:05:03 - 00:02:07:11

I think we are energy.

00:02:07:11 - 00:02:14:05

Even further than that, I can get pretty woo-woo pretty fast with it, but I generally think all of this isn't even real.

00:02:14:05 - 00:02:20:21

I think this is, this is all vibratory information going really slowly to give the illusion of it being solid.

00:02:20:21 - 00:02:25:17

I think the real reality is behind the scenes in the energetic layers of how the universe works.

00:02:25:17 - 00:02:33:07

I think that's where the real us is. And I'll say, I have a guru. It's this guy I'm working with.

00:02:33:07 - 00:02:38:16

And he describes how the real you is the electricity, the power; the body is the light bulb.

00:02:38:16 - 00:02:39:18

You are not the body.

00:02:39:18 - 00:02:45:02

And it's those sorts of insights I've been working on that is all kind of compounded over time,

00:02:45:02 - 00:02:47:16

where now I just kind of, to use that analogy,

00:02:47:16 - 00:02:50:16

I try to honor the electricity that wants to come through

00:02:50:16 - 00:02:59:10

and that'll manifest as ideas, insights, things I'm saying right now. Like, I'm not picking my words, I'm just kind of riding this wave of information that wants to come through.

00:02:59:10 - 00:03:08:21

And that's how I approach everything. So we talked about how I'm a musician, I'm an artist, and I work in AI. I don't compartmentalize it. I approach it all the same way.

00:03:08:21 - 00:03:12:22

It feels like channeling where you follow your feelings and,

00:03:12:22 - 00:03:19:02

and that, I know I'm babbling a bit, but it also ties into another insight I've had, that I've,

00:03:19:02 - 00:03:27:19

shared recently with a friend in LA who's like a very leading edge sort of futurist, and he's like a very sophisticated, fine artist.

00:03:27:19 - 00:03:30:19

And one day he said he saw what I was doing with Claude.

00:03:30:19 - 00:03:40:15

And he said, you're an artist, man. You're approaching this like art. It's the same thing. There is no right answer. It's about, I'm trying to feel something, or I do feel something,

00:03:40:15 - 00:03:47:19

and I want to create something that expresses that feeling. And I want to get it in front of somebody and ideally have them feel the same thing,

00:03:47:19 - 00:03:51:01

or feel something that causes them to take action and say,

00:03:51:01 - 00:03:52:23

wow, yeah, I do have that problem.

00:03:52:23 - 00:04:00:12

I do want to use this or wow, I do have that feeling. This art makes me feel that feeling like it's all the same thing.

00:04:00:12 - 00:04:04:14

There's no right answers. It's just taste. And it's a point of view.

00:04:04:14 - 00:04:10:22

And in this world of AI, as you said before you hit record and as we're talking about now,

00:04:10:22 - 00:04:12:03

it's not enough to just.

00:04:12:03 - 00:04:20:19

Oh, I can use the tool. It's like, the tool gives you something back and it's on you. What is your taste? What is your point of view? What does that make you feel? Is this good enough? What does that even mean?

00:04:20:19 - 00:04:21:05

That's.

00:04:21:05 - 00:04:25:10

Yeah. I mean, so there's too many things that I can talk about right now.

00:04:25:10 - 00:04:30:18

First of all, you have this brain with a lot of neurons firing in milliseconds.

00:04:30:18 - 00:04:31:15

I kind of keep up,

00:04:31:15 - 00:04:32:08

but

00:04:32:08 - 00:04:32:21

which is

00:04:32:21 - 00:04:42:07

and second of all, for audience reference, you're not just working in AI. You're working at the cutting edge of AI, like, you're working on such advanced stuff.

00:04:42:07 - 00:04:44:04

Like literally, I think

00:04:44:04 - 00:04:47:12

you're the first person I've ever come across that could

00:04:47:12 - 00:04:50:04

help me define and understand

00:04:50:04 - 00:04:51:17

responsible AI.

00:04:51:17 - 00:04:52:12

What is it?

00:04:52:13 - 00:04:56:16

What could it do for us, for the good of humanity? That's what you're doing?

00:04:56:16 - 00:04:58:20

Yeah. And you're coming all the way from

00:04:58:20 - 00:04:59:13

art

00:04:59:13 - 00:05:00:05

and

00:05:00:05 - 00:05:02:05

economics. That's your major. Like,

00:05:02:05 - 00:05:03:02

what you studied,

00:05:03:02 - 00:05:03:14

to

00:05:03:14 - 00:05:05:05

cutting edge AI?

00:05:05:05 - 00:05:08:06

Like, I just cannot understand it. And I think

00:05:08:06 - 00:05:09:14

it's a testament

00:05:09:14 - 00:05:14:13

or it's the evidence because like you, you're not the first person also to talk to me about

00:05:14:13 - 00:05:20:06

using Claude for art, or using Grok for art, or using Cursor for art,

00:05:20:06 - 00:05:20:23

I think,

00:05:20:23 - 00:05:23:19

I think this is the era of tastemakers,

00:05:23:19 - 00:05:25:14

people with taste and the

00:05:25:14 - 00:05:28:04

crazy, unique intuitive energies.

00:05:28:04 - 00:05:29:21

I think they are going to

00:05:29:21 - 00:05:42:23

become the most valuable folks in the marketplace. I agree. I have a lot of conversations with people about neurodivergence. It's nice to have that word now, because when I was growing up, people would just say, Scott, you're really weird.

00:05:42:23 - 00:05:47:07

Now we have words for it. And I have this conversation a lot, too.

00:05:47:07 - 00:05:52:15

This is probably the era not just of taste, but of people who are empirically always assumed to be weird.

00:05:52:15 - 00:05:55:13

Like, this is our time to step up because the normal people,

00:05:55:13 - 00:06:04:13

are just conditioned to say, tell me what to do. Tell me the canonical order of this task, and I'll do it. And then you'll pay me and I get to look smart or whatever

00:06:04:13 - 00:06:06:07

the era we're in now,

00:06:06:07 - 00:06:09:12

you can wait for people to figure out and tell you what to do.

00:06:09:12 - 00:06:13:10

But even what they're going to tell you is irrelevant because everything's moving so fast.

00:06:13:10 - 00:06:14:19

More so. It's,

00:06:14:19 - 00:06:19:23

just to use the art analogy, there's a bunch of paint, there's a bunch of colors, there's some tools,

00:06:19:23 - 00:06:21:09

there's a lot of white space.

00:06:21:09 - 00:06:21:20

Yeah.

00:06:21:20 - 00:06:24:09

So what are you going to do with it? No one's going to tell you.

00:06:24:09 - 00:06:29:05

And if you just kind of want to play it safe and be like, oh, I'm going to doodle or whatever, that's fine. But

00:06:29:05 - 00:06:33:23

there's so much white space. And as we're talking about, it's people who are

00:06:33:23 - 00:06:38:01

probably a little weird, probably have kind of voices in their head, a different perspective.

00:06:38:01 - 00:06:41:17

Who will see this opportunity and say, well, no one's really painted over there.

00:06:41:19 - 00:06:46:08

Yes. This this looks kind of cool. Let me see what I can create. And that's exactly how this whole,

00:06:46:08 - 00:06:51:03

responsible AI angle has come to be. It's been,

00:06:51:03 - 00:07:01:05

what we've talked about, kind of being an artist, etc. I think that maps very closely to being an entrepreneur, where you have a vision, whether it's I'm painting this, or I see a vision of a problem I can solve,

00:07:01:05 - 00:07:02:05

and the idea of

00:07:02:05 - 00:07:09:23

what tools do I have available to start kind of making that picture that makes me feel something, and then get that in front of customers and say, like, do you feel this too,

00:07:09:23 - 00:07:11:08

and then expand from there?

00:07:11:08 - 00:07:16:00

That's been exactly how this responsible AI journey has unfolded. I learn in increments.

00:07:16:00 - 00:07:22:07

I say, wow, based on what you just told me, here's how I'm differentiated with data provenance, for example. Like, I have

00:07:22:07 - 00:07:29:10

ethically sourced training data for face models that no one else has at a scale no one else has. With GDPR compliant consent.

00:07:29:10 - 00:07:32:02

I have a customer who tells me I'm highly regulated.

00:07:32:02 - 00:07:36:22

I can't put machine learning models into production without that data provenance, and I don't have it,

00:07:36:22 - 00:07:38:18

but you have it. That's amazing.

00:07:38:18 - 00:07:49:01

Now I know. Cool. I have this thing that makes people feel something about responsible AI because they're highly regulated. So I can show that picture to other customers like them and say, I got you.

00:07:49:01 - 00:07:50:01

We're cool here.

00:07:50:02 - 00:07:54:10

And similarly, I found other sort of white space where the market is telling me,

00:07:54:10 - 00:08:03:01

you guys are uniquely positioned for this, for this responsible AI situation, tied to, like, demographic fairness. All the other models seem racist because they don't have data on

00:08:03:01 - 00:08:09:01

darker skin tones. But you've already demonstrated from a big tech company that you are leading edge in that

00:08:09:01 - 00:08:09:16

area, too.

00:08:09:17 - 00:08:11:10

I can show that painting to people.

00:08:11:10 - 00:08:19:13

We're not racist. Our models are actually leading edge in demographic fairness. And more stories like that unearth themselves to me and become part of this

00:08:19:13 - 00:08:22:17

responsible AI art piece. And then I can show people.

00:08:22:17 - 00:08:24:21

So basically, you are a SpongeBob.

00:08:24:21 - 00:08:29:22

You're a sponge that could absorb as many contradictory signals,

00:08:29:22 - 00:08:32:05

bring them all inside your

00:08:32:05 - 00:08:33:02

energy

00:08:33:02 - 00:08:33:20

field

00:08:33:20 - 00:08:37:21

and make sense of them and create something that everyone that

00:08:37:21 - 00:08:38:17

has given you.

00:08:38:17 - 00:08:40:05

those insights could use.

00:08:40:05 - 00:08:44:20

That's... yeah, I don't watch SpongeBob enough to know, but that's my experience, yes.

00:08:44:20 - 00:08:50:00

And I would argue we're all like that. We're all doing that. It's just, some of us might... like, I've been,

00:08:50:00 - 00:08:54:14

as we mentioned, kind of being weird. I've been leaning into this for a long time.

00:08:54:14 - 00:08:58:21

I think I started really waking up in my spiritual journey like 15 years ago.

00:08:58:23 - 00:09:07:07

And that led to kind of the kernels of this where the inner voice would be saying, hey, you've got an insight here. This plus this equals that. And I wasn't comfortable saying it.

00:09:07:07 - 00:09:12:05

And I started getting comfortable saying these things like 15, 14, 13 years ago

00:09:12:05 - 00:09:16:07

and just kind of coming out with it, but you don't have confidence in it, you're not owning it.

00:09:16:07 - 00:09:20:18

And people would be like, well, that was a weird thing to say. I mean, you're right, but yeah, who cares?

00:09:20:18 - 00:09:28:15

But then you realize, well, actually, there was an insight there, you guys. I connected dots and, like, changed the perspective. So thanks for validating that. I know it was weird, but

00:09:28:15 - 00:09:30:11

whatever. And just over time

00:09:30:11 - 00:09:33:08

you get more in the habit of it. It's just like public speaking.

00:09:33:08 - 00:09:33:19

Like

00:09:33:19 - 00:09:36:04

I'm really scared to do it. But if you just keep doing it

00:09:36:04 - 00:09:38:16

all right, it's not as scary. Same here, to

00:09:38:16 - 00:09:45:01

follow your inner voice, this intuition, this energy, this like wisdom that wants to come through to help us figure out.

00:09:45:01 - 00:09:50:11

And apparently this is how humanity has innovated for thousands of years. Like, you don't have to consciously,

00:09:50:11 - 00:09:52:06

oh my God, I have to think of the solution.

00:09:52:07 - 00:09:56:14

It's like the energy is there. It wants to come through and solve problems. You just honor it by

00:09:56:14 - 00:10:00:15

listening to it, saying, what needs to be said and then seeing where it goes.

00:10:00:15 - 00:10:07:14

And I think the sponginess is an aspect of that, where I don't think I'm consciously thinking, oh God, I have to connect these dots, but rather,

00:10:07:14 - 00:10:09:01

hey, an insight just came up,

00:10:09:01 - 00:10:10:02

was thrown at you.

00:10:10:02 - 00:10:10:19

What do you think?

00:10:10:19 - 00:10:13:10

Oh, you think that insight's terrible? Cool, learning.

00:10:13:10 - 00:10:16:17

Or you think that insight's cool? Right, let me see where it goes next. Wow.

00:10:16:17 - 00:10:17:07

So,

00:10:17:07 - 00:10:23:16

two follow-ups I wanted to basically dive deep with you on. The first is that I think

00:10:23:16 - 00:10:25:07

now we are on this topic

00:10:25:07 - 00:10:25:17

or

00:10:25:17 - 00:10:35:12

AI is redefining what work is, like, basically touching upon every industry, every vertical, every job,

00:10:35:12 - 00:10:36:14

and redefining it.

00:10:36:16 - 00:10:39:07

Like when Excel got introduced,

00:10:39:07 - 00:10:44:21

accounting and finance was up for grabs. But this is just across the board.

00:10:44:21 - 00:10:45:15

So,

00:10:45:15 - 00:10:53:01

I had a conversation... I think I was listening to a podcast, and someone said, like, AI is going to basically replace all the,

00:10:53:01 - 00:10:55:02

governmental jobs.

00:10:55:02 - 00:10:57:18

And I think the guest was saying something interesting

00:10:57:18 - 00:10:58:06

that

00:10:58:06 - 00:10:59:22

these governmental jobs like,

00:10:59:22 - 00:11:03:01

you know, writing, signing contracts or like, you know,

00:11:03:01 - 00:11:05:14

creating folders or whatever, they were not

00:11:05:14 - 00:11:07:06

jobs to begin with.

00:11:07:06 - 00:11:10:23

We just... these are, like, artificial definitions of what we think

00:11:10:23 - 00:11:13:04

a job is or a role is.

00:11:13:04 - 00:11:15:23

So my question on a broader sense,

00:11:15:23 - 00:11:20:21

the white-collar jobs that we have defined, were they jobs to begin with,

00:11:20:21 - 00:11:22:01

or rather,

00:11:22:01 - 00:11:22:11

were

00:11:22:11 - 00:11:26:11

they created because we didn't have the right technology at that time?

00:11:26:11 - 00:11:27:03

Wow.

00:11:27:03 - 00:11:29:23

That's a fascinating framing of it.

00:11:30:00 - 00:11:32:17

I guess we'd have to define what a job is.

00:11:32:17 - 00:11:33:05

Yeah.

00:11:33:05 - 00:11:34:22

Yeah. To me,

00:11:34:22 - 00:11:39:08

without having thought about this before, at a layer of abstraction, it's:

00:11:39:08 - 00:11:45:18

I, as an employer, have something I need done. I know these words are too simple, but I have, like, value to be

00:11:45:18 - 00:11:47:05

created or unlocked,

00:11:47:05 - 00:11:50:22

and I'm willing to pay you some share of what that value is worth to me.

00:11:50:22 - 00:11:51:18

But not all of it.

00:11:51:18 - 00:12:04:18

And maybe that boils down to a helpful equation. So it's like taking the value of what you can do and, like, what cut you'd get for that. And you can imagine very quickly that that simple framework would give you this, like, step function

00:12:04:18 - 00:12:07:19

of different roles and the value and what you might be

00:12:07:19 - 00:12:08:22

compensated for it.
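
To make that "value times cut" framing concrete, here is a minimal sketch with made-up roles, values, and shares; nothing in it comes from the conversation itself.

```python
# Hypothetical illustration of the "value times cut" framing above.
# Role names, values, and shares are made up for this sketch.
roles = {
    # role: (annual value unlocked for the employer, share the worker captures)
    "paper filer":     (40_000, 0.60),
    "analyst":         (150_000, 0.50),
    "product manager": (400_000, 0.35),
}

for role, (value, share) in roles.items():
    pay = value * share  # what the employer is willing to pay for that value
    print(f"{role:>16}: value ${value:>7,} x share {share:.0%} -> pay ${pay:>9,.0f}")

# If automation drives the value of a task toward zero, the pay that task can
# support collapses with it, whatever the share.
```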

00:12:08:22 - 00:12:12:10

And to your point, I've had a very similar conversation. For me,

00:12:12:10 - 00:12:19:19

the data point I go back to is I moved to New York City to join a band, and I also joined a nonprofit health care provider.

00:12:19:19 - 00:12:24:07

And I was like an internal consultant there, because I had previously been a management consultant.

00:12:24:07 - 00:12:26:08

This was a pretty interesting operation.

00:12:26:08 - 00:12:27:07

They had,

00:12:27:07 - 00:12:32:23

2500 nurses and therapists in New York City doing 20,000 home visits each day,

00:12:32:23 - 00:12:38:13

so helping patients in their homes so they wouldn't get re-hospitalized, so they could recover from surgery, stuff like that.

00:12:38:13 - 00:12:43:02

The business, the nonprofit was about 150 years old at that time.

00:12:43:02 - 00:12:47:14

And as you can imagine, and this was about 20 years ago that I joined, health care

00:12:47:14 - 00:12:50:17

forever has been like very transactional and paper based.

00:12:50:18 - 00:12:51:06

And,

00:12:51:06 - 00:12:58:03

the year before I joined this company, they had just gone digital. So after what had been, like, a hundred-odd years,

00:12:58:03 - 00:12:59:09

decades of,

00:12:59:09 - 00:13:04:11

nurses and therapists out in the field, like noting things on paper, bringing it back,

00:13:04:11 - 00:13:07:08

having it, like, go through different sort of waterworks,

00:13:07:08 - 00:13:11:17

all these different sort of pipelines of people moving stuff, saying, I need to photocopy that.

00:13:11:17 - 00:13:16:01

I need to carbon copy it, whatever this goes to insurance, this goes to a point of record.

00:13:16:01 - 00:13:18:00

This goes to our files, whatever.

00:13:18:00 - 00:13:21:09

They had an army of people for decades who just did that.

00:13:21:09 - 00:13:23:04

And if we go back to my like a

00:13:23:04 - 00:13:28:00

very simple equation, the value and, like, what I'm going to pay you for it, the value was, this is highly transactional.

00:13:28:00 - 00:13:32:01

We got paper everywhere. We need to move it through kind of the veins of this business

00:13:32:01 - 00:13:38:00

so everyone knows what's happening. We need to move it to the insurance payers so we can get paid. We need to move it to the government. Whatever.

00:13:38:00 - 00:13:42:05

And I need armies of people to do that. So they had literal armies of people

00:13:42:05 - 00:13:43:09

when they went digital.

00:13:43:09 - 00:13:48:11

All the nurses now had these laptops that were like very hardened. You could drop them and feel comfortable.

00:13:48:11 - 00:13:52:12

Their entire system was now put in these laptops. So no more paper.

00:13:52:12 - 00:13:54:08

Those armies, people were not needed.

00:13:54:08 - 00:13:56:15

It's like, well, I don't I don't even need that.

00:13:56:15 - 00:13:57:03

I mean,

00:13:57:03 - 00:13:57:21

I don't want to

00:13:57:21 - 00:14:02:08

diminish the value they gave or what they were doing as human beings with a purpose in life.

00:14:02:08 - 00:14:05:09

But that value you gave me of filing papers, like, it's gone.

00:14:05:09 - 00:14:10:10

I can't even pay you that, like, minimum wage I was giving you, because I have nothing like that for you anymore.

00:14:10:10 - 00:14:15:01

So an army of people left right before I joined. And there were still all these, like,

00:14:15:01 - 00:14:17:06

ancient filing cabinets, but they were empty.

00:14:17:06 - 00:14:18:03

So it's like, well,

00:14:18:03 - 00:14:19:09

it's very similar to me.

00:14:19:09 - 00:14:22:10

And when I had this conversation with someone else, he said, hey, did you know,

00:14:22:10 - 00:14:28:19

120 years ago when horses were still everywhere in cities, there were people who were just,

00:14:28:19 - 00:14:30:05

shoveling shit?

00:14:30:05 - 00:14:36:03

Sorry. Pardon my French. There were people in cities whose whole job was just shoveling horse poop.

00:14:36:03 - 00:14:37:13

And when we had cars,

00:14:37:13 - 00:14:38:09

it's the same deal.

00:14:38:10 - 00:14:40:21

Like, I'm not going to pay you to shovel horse poop anymore.

00:14:40:21 - 00:14:46:12

And again, I'm not trying to say these low-level jobs, or these white-collar jobs that are getting replaced by AI,

00:14:46:12 - 00:14:50:22

are the same as shoveling horse poop or filing papers. But it's the same sort of argument

00:14:50:22 - 00:14:53:03

of, this is the value you were giving me,

00:14:53:03 - 00:14:54:10

this is why I was paying you for it, and

00:14:54:10 - 00:14:58:11

AI now reduces what I need to pay to, like, nothing.

00:14:58:11 - 00:15:01:08

And that's the conversation that's happening.

00:15:01:08 - 00:15:04:00

And it's a really weird time right now where,

00:15:04:00 - 00:15:09:08

depending on who you talk to, our hiring markets are locked because a bunch of companies,

00:15:09:08 - 00:15:13:13

they can see this sort of model shaping up, but they don't know enough yet.

00:15:13:19 - 00:15:18:03

So you'll talk to hiring managers, you'll hear news stories, whatever.

00:15:18:03 - 00:15:19:05

Depending on like, the

00:15:19:05 - 00:15:23:15

data you can get one-on-one talking to someone, or just abstracting, looking at the news,

00:15:23:15 - 00:15:30:04

generally, the thought is, I know right now I need people, but if I hire them right now, how do I know

00:15:30:04 - 00:15:33:08

something won't change so that the leverage ratio is different?

00:15:33:08 - 00:15:36:19

I thought I needed two product managers, but it actually turns out I need

00:15:36:19 - 00:15:37:22

a quarter of one

00:15:37:22 - 00:15:40:09

because I can do all these things or,

00:15:40:09 - 00:15:42:01

I don't need anybody

00:15:42:01 - 00:15:50:09

or, the people I hired are completely irrelevant because this new system is going to come out in three months, and what I hired them for doesn't even exist anymore, and they don't know what to do.

00:15:50:09 - 00:15:53:01

So it's like a very weird time, where

00:15:53:01 - 00:15:57:22

I would argue no one really knows what to do. And so they're doing nothing just to, like, figure it out.

00:15:57:22 - 00:15:59:06

So that's... because

00:15:59:06 - 00:16:04:19

you probably have a much better read of the hiring market, especially on the employer side.

00:16:04:19 - 00:16:08:23

So is there going to be... because, like, I don't think there is going to be a

00:16:08:23 - 00:16:14:08

slowdown. I think from now on it is going to be only an acceleration of these models, and they're getting better.

00:16:14:08 - 00:16:14:19

And then

00:16:14:19 - 00:16:21:16

they are basically covering more use cases as they're like, you know, as part of their own core platform. Like when you look at the

00:16:21:16 - 00:16:22:17

ChatGPT

00:16:22:17 - 00:16:26:13

like, now it has Canvas that kind of, like, writes contracts for you.

00:16:26:13 - 00:16:32:11

You can edit it. So I think there are going to be more embedded use cases in these platforms.

00:16:32:12 - 00:16:38:12

Is there going to be a point of certainty in this acceleration that we think, okay,

00:16:38:12 - 00:16:42:06

based on what we are seeing now, we can commit to

00:16:42:06 - 00:16:45:07

adding 5% more to our headcount

00:16:45:07 - 00:16:47:03

or is there going to be,

00:16:47:03 - 00:16:48:20

an indefinite

00:16:48:20 - 00:16:57:07

state of uncertainty when it comes to who we can hire, what role we need, and what value could be created as part of hiring these roles.

00:16:57:09 - 00:17:00:12

Yeah, I can I can kind of point to my own experience.

00:17:00:12 - 00:17:07:13

I'm not representative of kind of the state of the art of AI product management as it relates to large language models. Like, there's,

00:17:07:13 - 00:17:10:16

so many podcasts going out right now, you can see like,

00:17:10:16 - 00:17:19:14

a really fascinating one. This gentleman, Aman Khan, who works at Arize, which to me presents this sort of control plane on large language models

00:17:19:14 - 00:17:27:13

where you can do, like, A/B testing and, really fascinating, you can compare prompts, models, etc. for certain use cases and A/B test them.

00:17:27:13 - 00:17:37:03

This one took 30 seconds and gave an amazing result. This one took two seconds and gave an okay result. I can live with the okay result; now let me look at the next thing I'm trying to solve for. Really fascinating.

00:17:37:03 - 00:17:47:03

That's the interface being used by like AI product managers at places like Uber and DoorDash where they're like getting really in the weeds on production systems.
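
As a rough illustration of the kind of side-by-side comparison being described here, this is a generic sketch; the configurations, scores, and quality floor are invented, and it is not Arize's actual API.

```python
# Generic sketch of weighing prompt/model configurations on latency vs. quality.
# The configs, scores, and quality floor are hypothetical; tools like Arize
# surface this kind of side-by-side view over real traces.
candidates = [
    {"config": "big model + long prompt",    "latency_s": 30.0, "quality": 0.95},
    {"config": "small model + short prompt", "latency_s": 2.0,  "quality": 0.80},
]

QUALITY_FLOOR = 0.75  # "okay" is good enough for this hypothetical use case

acceptable = [c for c in candidates if c["quality"] >= QUALITY_FLOOR]
best = min(acceptable, key=lambda c: c["latency_s"])  # fastest acceptable option

print(f"Chosen: {best['config']} ({best['latency_s']}s, quality {best['quality']:.2f})")
```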

00:17:47:04 - 00:17:54:07

I'm not living in that world. I'm living on commercializing leading edge computer vision solutions in ways that no one's done before.

00:17:54:07 - 00:17:59:21

But I'm supporting myself as a sort of super-IC, this, like, super individual contributor,

00:17:59:21 - 00:18:05:10

where I'm handling product sales, marketing, business development, partnerships all by myself.

00:18:05:10 - 00:18:08:10

And so I'm stringing together these solutions to help me.

00:18:08:10 - 00:18:10:08

So that's giving me the insight, like,

00:18:10:08 - 00:18:13:04

and that's the lens I can look through to say

00:18:13:04 - 00:18:14:01

in a

00:18:14:01 - 00:18:20:12

I'll use my guru's book here as a sort of, call it, organizational structure. Let's say this is an entire company

00:18:20:12 - 00:18:24:16

over here's marketing, here's like customer support, here's sales, here's

00:18:24:16 - 00:18:30:14

legal, here's whatever. As we're saying right now, you're going to start to see these use cases start to subsume stuff

00:18:30:14 - 00:18:31:11

like, oh, marketing.

00:18:31:11 - 00:18:42:17

I had an outbound team who used to do my email campaigns to find leads and stuff. I'm doing that now by myself, using Apollo for email addresses and Claude for

00:18:42:17 - 00:18:52:16

pretty snappy email copy. And then I'm sending it out by myself. So I'm starting to subsume those marketing functions just from what I'm doing.

00:18:52:16 - 00:18:54:12

And I'm not even doing it the most elegant way

00:18:54:12 - 00:18:59:19

you're talking about legal. So that legal box is going to start to get subsumed where it's like, hey, I don't need a lawyer

00:18:59:19 - 00:19:05:11

to draft this contract from scratch. I can now use this template or use this system, this canvas.

00:19:05:11 - 00:19:13:03

Draft a first version that looks really good, fine-tune it myself, and then realize, oh, this new thing we're doing is too different.

00:19:13:03 - 00:19:17:09

Hey, general counsel, can you look at that paragraph but not start the whole document?

00:19:17:09 - 00:19:19:03

We're at that era now,

00:19:19:03 - 00:19:25:04

and I think that's where this is going, where before you had teams of people doing stuff, now you're going to have sort of,

00:19:25:04 - 00:19:32:00

it's like the Steve Jobs play the orchestra. You can have single people playing an orchestra of AI solutions,

00:19:32:00 - 00:19:36:13

and in some cases, maybe you need a single operator helping or maybe not,

00:19:36:13 - 00:19:40:14

but it's kind of getting to that, where you get, like, new leverage ratios where

00:19:40:14 - 00:19:43:19

you're able to do things you didn't have the expertise for before,

00:19:43:19 - 00:19:46:17

and certainly would not have felt comfortable doing. But now it's kind of,

00:19:46:17 - 00:19:51:05

everything's opening up in an interesting way, but also a scary way for people

00:19:51:05 - 00:20:03:00

who are not adapting to that, who used to be inside that part of the box that's getting subsumed. So, back to your journey. So you were the commercial AI product manager at

00:20:03:00 - 00:20:09:11

Lenovo in 2021, before ChatGPT became known to the public audience.

00:20:09:11 - 00:20:12:13

GPT was known, especially the early versions,

00:20:12:13 - 00:20:14:12

but it was like not like, you know,

00:20:14:12 - 00:20:15:23

everyone would know

00:20:15:23 - 00:20:17:10

you needed to be in the

00:20:17:10 - 00:20:18:18

ins and outs of it

00:20:18:18 - 00:20:20:16

to know that, okay, this is coming,

00:20:20:16 - 00:20:23:13

but, so how did you go from

00:20:23:13 - 00:20:26:20

like, you know, being obsessed with art and music and then

00:20:26:20 - 00:20:31:18

gradually go to taking on PM roles, and then land an AI role

00:20:31:18 - 00:20:33:18

much earlier than everyone else,

00:20:33:18 - 00:20:38:17

where you were thinking about AI. And then to your current role, which is very exciting and

00:20:38:17 - 00:20:42:01

sort of segueing into responsible AI, at Realeyes?

00:20:42:01 - 00:20:44:08

no, no plan, totally,

00:20:44:08 - 00:20:45:05

totally

00:20:45:05 - 00:20:53:17

just following the muse. So it really goes back to music. And I am still obsessed. I'll just show your audience:

00:20:53:17 - 00:20:55:17

I'm living and breathing music every day.

00:20:55:17 - 00:20:59:23

So my office is also a music studio. And the idea was,

00:20:59:23 - 00:21:06:08

out of college. I was an analyst for several years in different capacities. So, like, using economics.

00:21:06:08 - 00:21:08:15

I studied film, as we mentioned, but

00:21:08:15 - 00:21:12:11

I didn't like working with my classmates on productions. So I,

00:21:12:11 - 00:21:16:22

I kind of steered into theory and criticism and just wrote papers for three years, and then

00:21:16:22 - 00:21:18:21

stretched my brain with economics.

00:21:18:21 - 00:21:19:19

Came out of

00:21:19:19 - 00:21:23:01

school thinking, you know what? I'm not sure I want to go be like,

00:21:23:01 - 00:21:29:02

a grunt in Hollywood. I think I'm going to try this econ path. So I was an analyst in different capacities and playing music.

00:21:29:02 - 00:21:35:13

I had an opportunity with a band. I was living in Chicago, and a band, some friends from school, asked me to join them in New York, so I did.

00:21:35:13 - 00:21:40:01

I still kept working as an analyst at that nonprofit health care provider.

00:21:40:01 - 00:21:49:01

Then we had an opportunity. The band that I formed in New York, no one in New York really understood what we were trying to do. It was kind of a unique sound we were going for.

00:21:49:01 - 00:21:54:17

But a producer in LA learned about us. He learned about us from a feature in a magazine that I got us,

00:21:54:17 - 00:21:56:20

and he called me and said, like, I get it.

00:21:56:20 - 00:22:00:18

I get what you're trying to do. I love your idea. I love your sound. I want to work with you.

00:22:00:18 - 00:22:04:02

So we flew out to record an album with him for a week,

00:22:04:02 - 00:22:13:17

and the album sounded amazing. The mixes he was sending us were amazing. There was a lot of energy and excitement and enthusiasm in LA, and we said, well, geez, we should move to LA,

00:22:13:17 - 00:22:14:21

so I did.

00:22:14:21 - 00:22:19:00

That was June of 2008. I got a job at an internet Yellow Pages provider.

00:22:19:00 - 00:22:24:15

And the other two guys were in finance and June of 2008 was when, like the Great Recession really hit

00:22:24:15 - 00:22:28:02

you know. Finance wasn't happening. So the band broke up when I moved,

00:22:28:02 - 00:22:33:02

but it led me to my destiny in multiple ways. I ended up meeting my wife at that job.

00:22:33:08 - 00:22:34:12

She was already there.

00:22:34:12 - 00:22:42:03

And it led me to getting into product management. So I was still an analyst and nothing to do with product. But after a year, I joined the search engine marketing team.

00:22:42:03 - 00:22:51:09

I got really deep in SQL as an autodidact, writing code that was governing spend on the order of, 3 to $4 million a month with the major search engines.

00:22:51:11 - 00:22:55:06

And just by virtue of doing that for about a year and getting technical through it,

00:22:55:06 - 00:22:57:22

just through drinking from the firehose,

00:22:57:22 - 00:23:01:15

one day the leadership team said, hey, Scott, you should be a product manager.

00:23:01:15 - 00:23:05:07

And I didn't even know what that meant. I was like, I never heard of that. What's that?

00:23:05:07 - 00:23:08:11

Sounds interesting. But it was my destiny, so, you know.

00:23:08:11 - 00:23:10:12

Thank you, universe. I didn't have a plan. But

00:23:10:12 - 00:23:15:14

immediately, that job gave me my first experience working with data scientists on AI.

00:23:15:14 - 00:23:22:02

So I've truly been an AI product manager for, like, since day one. I got ownership of a,

00:23:22:02 - 00:23:29:11

they kind of needed a product manager, and they handed me this product, and it was a premium ad product for an internet Yellow Pages play

00:23:29:11 - 00:23:31:01

owned by a phone company.

00:23:31:03 - 00:23:41:16

And the idea was providing guaranteed email and phone call leads, in addition to listings, to small businesses you could imagine, like Denver plumbers, or Orlando or Chicago lawyers.

00:23:41:16 - 00:23:50:15

And so I worked with the data science team on the optimization engine for that, where you'd have these businesses paying something on the order of like $2,500 a month

00:23:50:15 - 00:23:53:14

in exchange for a guaranteed amount of leads.

00:23:53:14 - 00:23:54:07

So the

00:23:54:07 - 00:24:01:08

data science team was working on the machine learning models, like, hey, I've got all these keywords I could be bidding on. Which ones are the most performant?

00:24:01:08 - 00:24:11:13

I've got all these advertisers and this curve of, like, the performance I'm trying to get, and they're not all going to hit their targets. I'm trying to get many of them in the middle, and some will lose and some will exceed.

00:24:11:13 - 00:24:22:06

I have a marginal dollar to spend on behalf of any of them right now. What should I spend it on? Should I use our owned and operated site where it's constantly loading impressions and I can do yield optimization?

00:24:22:06 - 00:24:27:11

Should it be these keywords I'm bidding on? Should I shut them off? I've got a network partner calling me.

00:24:27:12 - 00:24:29:05

Who should I show and in what order?
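
To make that marginal-dollar question concrete, here is a hedged sketch of one simple way to frame it, greedily spending the next dollar on whichever channel currently promises the most expected leads per dollar; the channels and numbers are made up, not the actual optimization engine being described.

```python
# Hypothetical sketch of the marginal-dollar decision: spend the next dollar
# wherever expected leads per dollar is currently highest. Channels and
# numbers are invented for illustration.
channels = {
    "owned-and-operated yield optimization": 0.012,  # expected leads per $1
    "search keyword bids":                   0.009,
    "network partner placements":            0.005,
}

budget = 1_000  # dollars to allocate in this static toy example
best_channel = max(channels, key=channels.get)
expected_leads = budget * channels[best_channel]

print(f"Spend the next ${budget:,} on '{best_channel}', "
      f"expecting ~{expected_leads:.0f} leads")
# A real engine would re-estimate these rates continuously and rebalance
# across advertisers to hit each one's guaranteed lead target.
```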

00:24:29:05 - 00:24:32:01

So right from day one, I got to dive into AI

00:24:32:01 - 00:24:34:15

without really knowing that it was setting me on this path.

00:24:34:15 - 00:24:43:12

Got really deep in APIs, got really deep in like, programmatic business logic. So like from day one, like the universe blessed me with this crazy opportunity to cut my teeth

00:24:43:12 - 00:24:46:10

and that led me on this path of just continually,

00:24:46:10 - 00:24:48:23

really using kind of fear as my guide.

00:24:48:23 - 00:24:52:19

Like, oh, that sounds scary. I'm going to do it. So that led me to,

00:24:52:19 - 00:24:58:08

to ad tech, martech, Internet of Things, Lenovo, as you mentioned, Hewlett Packard Enterprise,

00:24:58:08 - 00:25:06:09

and pretty much all of them have been at the intersection of leading edge technologies and artificial intelligence, just in different forms.

00:25:06:09 - 00:25:07:14

At

00:25:07:14 - 00:25:10:18

Lenovo was the first time it was officially in the title:

00:25:10:18 - 00:25:13:13

You are a commercial AI product manager.

00:25:13:13 - 00:25:22:01

And so that was exciting to, like, have the title tied to it. But all other things up to that point still had AI under the hood.

00:25:22:01 - 00:25:25:02

And it was really to your point about ChatGPT,

00:25:25:02 - 00:25:30:21

it was fascinating. At Lenovo, my first year was actually on the consumer side, and I was working on AI, where we had

00:25:30:21 - 00:25:33:18

agents running locally on consumer devices.

00:25:33:18 - 00:25:41:17

So when you fire up your new laptop, tablet, phone, or whatever, as it is with most manufacturers, they'll say, hey, do you want to share data with us for performance,

00:25:41:17 - 00:25:45:10

so we can optimize and understand bugs? And if you say yes to that,

00:25:45:10 - 00:25:53:10

this Lenovo agent measures how you use your device: what apps you use, what peripherals you attach, how you behave throughout the day.

00:25:53:10 - 00:26:02:09

And there was an AI team, a team of machine learning engineers, I worked with to take those observations and then have models that would say, given this telemetry,

00:26:02:09 - 00:26:07:20

I think this user is a hardcore gamer. I think this user's a business user. I think this user is a student.

00:26:07:20 - 00:26:09:19

Create these sort of audience profiles.
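
As a toy illustration of the profiling idea described above, turning device telemetry into an audience label, here is a sketch with invented features, thresholds, and labels; it is not Lenovo's actual models.

```python
# Toy sketch of turning device telemetry into an audience profile.
# Features, thresholds, and labels are invented for illustration only.
def classify_user(telemetry: dict) -> str:
    gaming_hours = telemetry.get("hours_in_gaming_apps", 0.0)
    office_hours = telemetry.get("hours_in_office_apps", 0.0)
    docked = telemetry.get("external_monitor_attached", False)

    if gaming_hours > 3.0:
        return "hardcore gamer"
    if office_hours > 4.0 and docked:
        return "business user"
    return "student / general user"

sample = {"hours_in_gaming_apps": 0.5,
          "hours_in_office_apps": 6.2,
          "external_monitor_attached": True}
print(classify_user(sample))  # -> business user
```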

00:26:09:19 - 00:26:12:15

And then I worked with that content management system

00:26:12:15 - 00:26:17:11

where we had SDKs tied to all these different Lenovo surfaces in the ecosystem.

00:26:17:11 - 00:26:19:15

And you could create these targeted experiences.

00:26:19:15 - 00:26:27:05

They didn't really know what they had. So I first had to prove how this worked, and then said, this is essentially like an internal audience network. Now you can innovate

00:26:27:05 - 00:26:28:17

and reach very particular

00:26:28:17 - 00:26:34:18

Lenovo customers with very particular messages tied to them just to add value, but ultimately to sell more stuff.

00:26:34:20 - 00:26:40:10

The challenge was content was created by hand. So this was like 2020, 2021

00:26:40:10 - 00:26:46:16

where, as you mentioned, people like OpenAI existed. They didn't call it ChatGPT yet. It was like GPT

00:26:46:16 - 00:26:48:05

one, GPT two,

00:26:48:05 - 00:26:51:14

you needed to know an investor in OpenAI to get access.

00:26:51:14 - 00:26:58:15

I didn't. I had just been trying to learn about this. I was trying to figure out, like, there must be AI that can at least create a first version of content

00:26:58:15 - 00:26:59:23

so that these companies

00:26:59:23 - 00:27:05:00

I was trying to figure that out in 2020, 2021, and I couldn't find anything,

00:27:05:00 - 00:27:05:21

but I was

00:27:05:21 - 00:27:08:15

like, my resume in that era is kind of Swiss cheese.

00:27:08:15 - 00:27:11:02

I had some interesting jumps due to opportunities,

00:27:11:02 - 00:27:16:08

but one was that I was poached by some friends from college to join a company.

00:27:16:08 - 00:27:19:03

They built it using low and no code tools.

00:27:19:03 - 00:27:25:10

It was a talent marketplace for hiring marketers. It's like Uber for hiring the world's best marketers on demand.

00:27:25:10 - 00:27:30:08

They knew OpenAI investors, so they had access to, like, GPT-2.

00:27:30:08 - 00:27:33:10

I believe it was in like 2021, 2022.

00:27:33:10 - 00:27:38:11

So they brought me on, to be their VP of product. And that's where I got closer to these systems,

00:27:38:11 - 00:27:39:20

in that era.

00:27:39:20 - 00:27:48:05

And then when I was interviewing, actually, a couple of years later, for the role I have now at Realeyes,

00:27:48:05 - 00:27:53:03

one of the co-founders is very much a futurist, and I met him in like July of 2022,

00:27:53:03 - 00:27:58:17

and he was telling me, hey, in a couple of months, we're about to see just explosions with large language models.

00:27:58:19 - 00:28:07:04

OpenAI is about to release something, Runway is releasing insane stuff, Midjourney as well with vision, or sorry, with image generation.

00:28:07:04 - 00:28:09:20

It's about to get very spicy. And I was like, well, I,

00:28:09:20 - 00:28:16:14

I don't see what you see, but okay. And I joined in October and like two weeks later is when ChatGPT was released and everything changed.

00:28:16:14 - 00:28:17:02

So

00:28:17:02 - 00:28:19:13

yeah, it's been

00:28:19:13 - 00:28:20:08

been a while

00:28:20:08 - 00:28:24:05

"A while" is a very shy adjective to

00:28:24:05 - 00:28:25:21

explain how far you have

00:28:25:21 - 00:28:35:06

come along. For real. That's a really exciting, basically spicy part of the podcast I want to get into. But before that,

00:28:35:06 - 00:28:36:14

I think you're the first

00:28:36:14 - 00:28:44:08

and the right person I could ask this question, because God knows how many keynotes I've been to at these conferences, and how many

00:28:44:08 - 00:28:47:21

I've listened to, watched, whatever the case.

00:28:47:23 - 00:28:50:10

And everyone talks about responsible AI, but

00:28:50:10 - 00:28:51:03

no one

00:28:51:03 - 00:28:52:09

most likely

00:28:52:09 - 00:28:57:09

has experimented enough with it, or they don't have a product that does that.

00:28:57:09 - 00:29:00:02

So their explanation, although

00:29:00:02 - 00:29:01:19

there are some merits

00:29:01:19 - 00:29:03:20

to it, is not really

00:29:03:20 - 00:29:04:22

hitting hard

00:29:04:22 - 00:29:07:04

enough that you could say, I get it now. But

00:29:07:04 - 00:29:12:11

probably you can give us a really good definition of what responsible AI is

00:29:12:11 - 00:29:15:12

and how it can contribute to a better world for humanity.

00:29:15:14 - 00:29:22:20

Yeah. This is something that's top of mind for me every day, as I learn more and more about what I'm doing and the problems I can solve.

00:29:22:20 - 00:29:29:23

I can. I can answer this question through the lens of what I'm working on. I don't know if it'll abstract to like large language models, for example, but perhaps it will.

00:29:30:01 - 00:29:38:07

The lens I'm looking through and I try to create this contrast where on the one side I paint a picture of kind of this gloomy dystopian

00:29:38:07 - 00:29:46:12

tech-bro future that I think a lot of people are trying to pull us towards. And then over here, and it's great, there's a sunny window over here, I'm painting

00:29:46:12 - 00:29:49:07

the sun through the clouds that I'm trying to see.

00:29:49:07 - 00:29:54:08

So over here is a world where you can steal data. You can just build models where,

00:29:54:08 - 00:29:58:08

it's not clear where the attribution came from. The models might not be fair.

00:29:58:08 - 00:30:00:08

In fact, they could be racist. Whatever.

00:30:00:08 - 00:30:12:20

Also, in this world you have, like, Sam Altman with the Orb, where they want to scan your retina, they want to know in detail for the rest of your life who you are, so that in detail for the rest of your life, they can recognize you everywhere you go

00:30:12:20 - 00:30:16:06

and potentially give you access to things or cut off access to things.

00:30:16:06 - 00:30:17:23

There's companies like Clear,

00:30:17:23 - 00:30:23:06

which is relevant to what I'm working on. They make it very easy now with your government ID to get on airplanes.

00:30:23:06 - 00:30:30:00

And they want their ultimate strategy is they want to have this sort of repository of everyone's government ID on Earth,

00:30:30:00 - 00:30:35:19

and they want to make it very easy for you to go places in the physical world where they can map it back to say,

00:30:35:19 - 00:30:38:00

this is Scott trying to enter this building.

00:30:38:00 - 00:30:43:10

I know, because I've scanned his face and I know it maps to this government ID that I know that I own.

00:30:43:10 - 00:30:46:07

And I'm giving Scott access to that building. And now I'm

00:30:46:07 - 00:30:51:06

measuring on Scott's audience profile. Hey, in addition to going to the airport this often, he also goes

00:30:51:06 - 00:30:51:21

to places like that,

00:30:51:21 - 00:30:54:16

and they're going to sell that data and they're not being shy about it.

00:30:54:16 - 00:31:03:20

That's kind of the scary world I see where unethically sourced data, models that aren't fair, and people who want to track everything about you and sell it and exploit you,

00:31:03:20 - 00:31:05:18

in the interest of convenience,

00:31:05:18 - 00:31:09:22

and there's more to it that can feel even more evil if you sort of double click on it.

00:31:09:22 - 00:31:11:07

The world I'm trying to create.

00:31:11:07 - 00:31:16:23

So the solutions I'm working on, we recently got into the identity category through a relationship with a big tech company.

00:31:16:23 - 00:31:23:18

They validated us in some really interesting ways. So number one is we have an unmatched data set for testing and training models.

00:31:23:18 - 00:31:28:16

It comes out of our ad testing business. It's the largest in-the-wild collection of faces

00:31:28:16 - 00:31:29:22

in the world.

00:31:29:22 - 00:31:34:15

And it covers more than 6 million identities from more than 93 countries. But most importantly,

00:31:34:15 - 00:31:39:15

we have GDPR compliant consent, giving us the rights to test and train models on those faces.

00:31:39:15 - 00:31:44:19

That data provenance alone, it's ethically sourced. We're allowed to use it. No one else has that.

00:31:44:19 - 00:31:53:09

And certain companies like this big tech company, they are not allowed to put things into production unless it has that data provenance, because they're so highly regulated internally

00:31:53:09 - 00:31:54:20

and by the world.

00:31:54:21 - 00:31:58:18

And they're not allowed to just, oh, put it into production, figure it out.

00:31:58:18 - 00:31:59:16

They can't do that.

00:31:59:16 - 00:32:02:21

The second order is the racism of these models.

00:32:02:21 - 00:32:10:13

The challenge is most of these models for faces are built using publicly available data sets. And those index largely on lighter skin tones.

00:32:10:13 - 00:32:12:00

And they're also way too easy.

00:32:12:00 - 00:32:17:17

So everyone's model looks great and everyone's model looks fair, because most of the skin tones, most of the faces, are white.

00:32:17:17 - 00:32:19:22

What we've got, our data set,

00:32:19:22 - 00:32:27:09

completely disrupts that. And you don't just have to trust us.

00:32:27:09 - 00:32:32:20

was they guinea-pigged us on their own new responsible AI demographic fairness testing protocol.

00:32:32:21 - 00:32:37:15

And in that test, we showed fairness across darker skin tones that's never been publicly seen before.

00:32:37:15 - 00:32:42:09

So unmatched ethically sourced data and unmatched, demographic fairness.

00:32:42:09 - 00:32:50:01

And then from there, it's the spirit of, cool, so what are you doing with this? What problems are you trying to solve? If you go back to that dystopian

00:32:50:01 - 00:32:52:19

if you imagine the clouds and like the lightning over here

00:32:52:19 - 00:32:58:18

and like very scary world where they want to know everything about you, they want to track you everywhere you go, and they're probably going to try to sell it.

00:32:58:18 - 00:33:03:10

In my world over here, I'm using these identity capabilities at hyperscale.

00:33:03:10 - 00:33:08:03

I want to solve for you can't lie about who you are. You can't be a bot,

00:33:08:03 - 00:33:11:05

and you can't masquerade on multiple accounts committing fraud.

00:33:11:05 - 00:33:12:22

But I don't care who you are.

00:33:12:22 - 00:33:14:10

I don't care where you've been.

00:33:14:10 - 00:33:16:20

I'm not trying to sell anything about you. I'm just saying it.

00:33:16:23 - 00:33:20:05

It's not easy to commit fraud anymore. Can we agree on that?

00:33:20:05 - 00:33:23:01

You're just not allowed to lie. But you don't have to tell me who you are.

00:33:23:01 - 00:33:27:11

That's the kind of world I'm working on creating. So it could be. For example,

00:33:27:11 - 00:33:35:05

we're working on really interesting opportunities in different industries where you might buy something and then go use what you buy in person.

00:33:35:05 - 00:33:40:18

I can validate when you're trying to buy it that you're not a bot, that you're unique, you're not on multiple accounts,

00:33:40:18 - 00:33:45:12

and that you generally are who you say you are. You're not masquerading as somebody that you're not.

00:33:45:12 - 00:33:51:04

And then at the point of you using whatever you bought, like let's say it's a product you bought, I can validate.

00:33:51:04 - 00:33:57:14

Yep. It's that same person who bought it. And this is all anonymously using just face embeddings generated by my AI models.

00:33:57:14 - 00:34:03:18

And again, I don't have to know. Oh, this is Scott Jones. He lives in Chapel Hill, North Carolina. He's trying to buy this or trying to log in.

00:34:03:18 - 00:34:06:03

And now I need to know, hey, Scott, show me your government ID.

00:34:06:04 - 00:34:08:01

Cool, you're still Scott Jones. It's just

00:34:08:01 - 00:34:16:12

face math records: they're not a bot, they're unique, yep, they're generally who they say they are, and yeah, it's still, anonymously, that same person. Cool, you're entitled to do this.

00:34:16:12 - 00:34:22:09

I don't have to know everything about you to do that. And there's, like, 90% of the world's use cases, I would argue where that's relevant.

00:34:22:09 - 00:34:24:17

And you don't need this heavyweight option where it's

00:34:24:17 - 00:34:27:12

really expensive and extremely invasive.
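
For a sense of the mechanics being described, here is a minimal sketch of anonymous re-verification with face embeddings; the vectors, threshold, and matching rule are placeholders, not the production system discussed here.

```python
# Minimal sketch of anonymous re-verification with face embeddings.
# The vectors and threshold are placeholders; a real system would get
# embeddings from a face model and tune the threshold on labeled data.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

MATCH_THRESHOLD = 0.8  # placeholder value

embedding_at_purchase = [0.12, 0.98, -0.33, 0.41]    # stored instead of any PII
embedding_at_redemption = [0.10, 0.95, -0.30, 0.44]  # captured when the item is used

if cosine_similarity(embedding_at_purchase, embedding_at_redemption) >= MATCH_THRESHOLD:
    print("same person, entitled to proceed")
else:
    print("verification failed")
```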

00:34:27:12 - 00:34:30:19

So I'm exploring all that white space where

00:34:30:19 - 00:34:35:20

I think there's very compelling opportunity with this responsible AI story to kind of change the world.

00:34:35:20 - 00:34:38:12

You don't have to have this surveillance state

00:34:38:12 - 00:34:44:00

everywhere you go. You can anonymously go around and be held accountable to not commit fraud, but still

00:34:44:00 - 00:34:44:21

maintain,

00:34:44:21 - 00:34:46:08

this really powerful term

00:34:46:08 - 00:34:49:00

I just learned, your digital sovereignty.

00:34:49:00 - 00:34:53:23

Like, you still own who you are in this digital world. You don't have to give it all up. And I'm trying to create that.

00:34:53:23 - 00:35:00:02

So what I'm getting is that responsible AI is about ethically sourced data sets

00:35:00:02 - 00:35:00:21

that are

00:35:00:21 - 00:35:02:12

diversified enough

00:35:02:12 - 00:35:04:08

to cover the most

00:35:04:08 - 00:35:06:15

real life use cases.

00:35:06:15 - 00:35:08:12

So no one is left out.

00:35:08:12 - 00:35:09:18

No one is,

00:35:09:18 - 00:35:12:17

discriminated against in any process.

00:35:12:17 - 00:35:14:23

Also, responsible AI is about

00:35:14:23 - 00:35:17:15

just at the point of transactions,

00:35:17:15 - 00:35:24:10

validating certain use cases to make sure that this is not a fraudulent transaction. The person who says

00:35:24:10 - 00:35:26:21

they are behind it is the right person.

00:35:26:21 - 00:35:28:16

There is no lie. There is no,

00:35:28:16 - 00:35:30:02

you know, wrong action

00:35:30:02 - 00:35:32:19

at that point when that is confirmed,

00:35:32:19 - 00:35:37:16

without accessing the identity of the person, without accessing the personal information

00:35:37:16 - 00:35:39:07

through secure measures,

00:35:39:07 - 00:35:39:21

then

00:35:39:21 - 00:35:41:19

the job of responsible AI is done.

00:35:41:19 - 00:35:43:16

The person can go ahead and

00:35:43:16 - 00:35:44:22

do the transaction

00:35:44:22 - 00:35:45:18

without

00:35:45:18 - 00:35:46:18

giving up

00:35:46:18 - 00:35:47:03

the

00:35:47:03 - 00:35:49:04

the valuable asset, which is their own

00:35:49:04 - 00:35:50:08

personal information.

00:35:50:08 - 00:35:56:13

Yeah that resonates. And I think some of what you said does map to large language models. So the first like the fairness aspect,

00:35:56:13 - 00:36:00:13

we have already learned over the past couple of years, like models trained in India

00:36:00:13 - 00:36:01:14

might reflect

00:36:01:14 - 00:36:10:04

or sorry, models trained outside of India, used in India, might not reflect the right cultural sensibilities and might say something that offends somebody or doesn't reflect

00:36:10:04 - 00:36:12:04

the idiosyncrasies of that culture,

00:36:12:04 - 00:36:13:03

and vice versa.

00:36:13:03 - 00:36:16:19

So this idea, like what I create over here, might not fit over here.

00:36:16:19 - 00:36:26:23

And those sorts of insights have been revealed over time, where it's like the nuance of who's testing and training becomes kind of the spirit of what the model does.

00:36:26:23 - 00:36:33:16

And it might violate the expectations of people from other cultures or other parts of the world or other sorts of sensibilities.

00:36:33:18 - 00:36:35:20

Another aspect, I would say, that

00:36:35:20 - 00:36:41:19

would abstract to large language models as well as computer vision is, like, do no harm. The output shouldn't hurt people.

00:36:41:19 - 00:36:44:15

Which is something else we're seeing. So you can say,

00:36:44:15 - 00:36:53:23

a chatbot, for example, that is made with good intentions but still will have a conversation with a kid that leads them to want to kill themselves.

00:36:54:01 - 00:36:54:21

It did harm.

00:36:54:21 - 00:37:00:20

So that's not responsible AI, I would argue. It doesn't have the appropriate guardrails to

00:37:00:20 - 00:37:04:16

redirect from a potentially detrimental outcome,

00:37:04:16 - 00:37:07:11

just because it thinks it's doing what it's supposed to do.

00:37:07:11 - 00:37:07:23

Right.

00:37:07:23 - 00:37:10:05

And then so one thing that I'm really

00:37:10:05 - 00:37:15:06

interested in sort of, like, showcasing to the audience, and I don't know if you can show it,

00:37:15:06 - 00:37:16:04

is that

00:37:16:04 - 00:37:18:02

the UX of

00:37:18:02 - 00:37:19:00

basically.

00:37:19:00 - 00:37:25:01

So basically using vision technology and a lot of machine learning, I don't know how you go about this,

00:37:25:01 - 00:37:30:16

to basically validate the identity of the person, to allow them to do the transaction.

00:37:30:16 - 00:37:38:05

The UX of this is fascinating, and I've seen it a bit. So I don't know if you are at the point that you can show it to the public.

00:37:38:07 - 00:37:42:13

Yeah. I've shown it before and I'm happy to do so now.

00:37:42:13 - 00:37:42:23

So you see

00:37:42:23 - 00:37:44:17

a screen that says fill out a survey.

00:37:44:17 - 00:37:47:23

Cool. So the genesis of this was

00:37:47:23 - 00:37:52:21

I joined Real Eyes two and a half years ago and we have a relationship with a big tech company

00:37:52:21 - 00:37:54:17

where we help them build avatars.

00:37:54:19 - 00:38:03:23

And through that relationship we learned that they were looking for a face verification model. So shortly after I joined, I helped lead us through their RFI, where we essentially went 0 to 1

00:38:03:23 - 00:38:06:23

in a couple of months on a new face verification model,

00:38:06:23 - 00:38:08:20

and we ended up winning their RFI.

00:38:08:20 - 00:38:10:07

So it's kind of a pinch-yourself,

00:38:10:07 - 00:38:12:00

"wow, is this really happening?" moment.

00:38:12:00 - 00:38:13:09

And we learned a lot from that.

00:38:13:09 - 00:38:22:10

And part of it was all the learnings of the responsible AI positioning I just told you about, where part of the reason why we won was our data provenance and our demographic fairness and all these other things.

00:38:22:10 - 00:38:27:11

So those all became part of my sort of understanding of the unique positioning we have.

00:38:27:13 - 00:38:31:13

And at that point, March of 2023, when we won this RFI,

00:38:31:13 - 00:38:37:10

I'm leading a platform business from 0 to 1 at Real Eyes, where it's a plug-and-play library of capabilities.

00:38:37:10 - 00:38:43:13

We started with attention and emotion, and leading-edge, proprietary models for that.

00:38:43:13 - 00:38:48:11

And now with this win, now my platform has Lego blocks, if you will, for solving

00:38:48:11 - 00:38:50:05

problems around identity as well.

00:38:50:07 - 00:38:54:14

And we were wondering, well, this is cool. What other problems can we solve with this?

00:38:54:14 - 00:38:59:21

We've been running an ad testing business at Real Eyes for more than a decade, and the premise of it is

00:38:59:21 - 00:39:03:09

we participate in the online market research ecosystem.

00:39:03:09 - 00:39:08:19

We acquire very discrete audience segments. So if you've ever worked in ad tech, it's very similar.

00:39:08:19 - 00:39:12:06

It could be, for example, like women with credit scores of this level

00:39:12:06 - 00:39:18:13

who drive this type of car, buy these types of products, and shop at Walmart, as a very extreme example,

00:39:18:13 - 00:39:24:12

we would acquire that audience through the online market research ecosystem, bring them into an environment hosted by us,

00:39:24:12 - 00:39:28:02

expose them to channel specific variations in ads,

00:39:28:02 - 00:39:33:16

and then deliver insights as to what version of the ad is the most performant with that audience by channel.

00:39:33:16 - 00:39:35:01

So you can imagine, we say

00:39:35:01 - 00:39:43:14

on TikTok, you should run this version of the ad, it's got this hero, it's got this color car, this call to action. But on YouTube, you should run this very particular other version.

00:39:43:14 - 00:39:51:05

That business we've been running has given us a front-row seat to the steady rise of fraud online. So if you go back to that kind of vision I'm trying to paint,

00:39:51:05 - 00:39:56:09

I just want to create a world where you can't be a bot, you can't operate in click farms.

00:39:56:09 - 00:40:05:03

You can't lie about who you are. Those are the problems that are extremely bad on the internet at large right now, obviously. But in market research, it's insane.

00:40:05:03 - 00:40:10:02

There's so many bad actors making so much money that on the buy side, you can actually assume

00:40:10:02 - 00:40:16:01

on average, 30 to 40% of what you get is going to be bad and you don't know what's what until you get it.

00:40:16:03 - 00:40:19:08

And there's a lack of incentives for the suppliers to solve the problem.

00:40:19:08 - 00:40:22:13

Many of them simply wouldn't have a business if they got rid of the fraud.

00:40:22:13 - 00:40:30:03

And there's a lack of tools that actually work. As we mentioned before, the best thing you can do right now is very invasive. I can pay a dollar

00:40:30:03 - 00:40:33:16

to see your government ID, which is super expensive too,

00:40:33:16 - 00:40:37:11

to make sure you are a citizen of a very particular place.

00:40:37:11 - 00:40:41:02

And that your face, right now in a selfie, is that same person, right?

00:40:41:02 - 00:40:46:11

That doesn't fit this world where you're paying someone, like, $0.50 maybe to do a survey, if even that.

00:40:46:11 - 00:40:49:07

So, the question was, what can you do,

00:40:49:07 - 00:40:55:15

to solve this? And we put together those identity building blocks to address it in a very disruptive way.

00:40:55:15 - 00:40:58:13

And that's what I'm about to show you. So for the purposes of the demo,

00:40:58:13 - 00:41:07:03

imagine I am a panelist in an online market research platform. They know my interests, my work history, my demographics, etc.

00:41:07:03 - 00:41:12:18

I've been invited to do a survey named after Riverside, the platform we're using on this podcast.

00:41:12:18 - 00:41:14:22

I say, great, I'd like to do that survey.

00:41:15:00 - 00:41:17:16

So this demo environment will take a moment to warm up.

00:41:17:16 - 00:41:24:15

Once it does, it will present me with the GDPR compliant consent framework that we've invented for this use case. So,

00:41:24:15 - 00:41:29:22

to your point, as a UX researcher, this is pretty interesting stuff. No one has done quite

00:41:29:22 - 00:41:32:01

anything like this before on the internet.

00:41:32:01 - 00:41:36:10

So we made some interesting choices as we evolved what this presentation would look like,

00:41:36:10 - 00:41:40:10

where we landed is to position it as a humanity check.

00:41:40:10 - 00:41:47:00

Conceptually, it's like the next evolution of a Captcha, but far simpler. I just need to make sure you're a human before I let you in here.

00:41:47:00 - 00:41:51:00

But don't worry, you don't have to solve any puzzles. Just briefly turn your camera on.

00:41:51:00 - 00:41:59:01

In the spirit of GDPR and California's CCPA and other international laws, you get access to a privacy policy

00:41:59:01 - 00:42:01:21

and a path to update your consent with a unique code.

00:42:01:21 - 00:42:02:17

If you choose,

00:42:02:17 - 00:42:06:18

I say yes, I want to do this. So now we're getting camera access through the browser.

00:42:06:18 - 00:42:13:15

I give that access. We're now capturing the image off my camera, sending it to an endpoint, hosting my identity building blocks.

00:42:13:15 - 00:42:21:17

The image is immediately deleted. So important to note. Again, going back to that anonymous AI positioning, we are not storing images.

00:42:21:17 - 00:42:24:04

We intentionally built an anonymous system.
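
As a hedged sketch of the capture step just described: the browser side might look roughly like the following, where the endpoint URL, payload shape, and the idea that the server discards the image after deriving an embedding are assumptions drawn from the flow described here, not a real public API.

```typescript
// Hedged sketch only: endpoint and payload are illustrative assumptions.
async function runHumanityCheck(): Promise<void> {
  // Ask the browser for camera access (the consent moment described above).
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });

  // Play the stream into an off-screen <video> element so we can grab a frame.
  const video = document.createElement("video");
  video.srcObject = stream;
  await video.play();

  // Draw a single frame onto a canvas.
  const canvas = document.createElement("canvas");
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  canvas.getContext("2d")?.drawImage(video, 0, 0);

  // Stop the camera immediately; only one frame is needed.
  stream.getTracks().forEach((track) => track.stop());

  // Encode the frame and send it to a hypothetical verification endpoint.
  // Per the design described in the episode, the server would derive a face
  // embedding and delete the image rather than storing it.
  const frame: Blob = await new Promise((resolve) =>
    canvas.toBlob((blob) => resolve(blob as Blob), "image/jpeg")
  );
  await fetch("https://example.com/humanity-check", {
    method: "POST",
    headers: { "Content-Type": "image/jpeg" },
    body: frame,
  });
}
```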

00:42:24:04 - 00:42:31:23

We're using the image to generate a face embedding. You can think about that as a mathematical representation of the image, bespoke to my model.

00:42:31:23 - 00:42:39:00

The application at that endpoint created this collection I just named, and it's validated my face embedding is unique. I haven't done this survey yet.

00:42:39:00 - 00:42:43:23

So for the purposes of this market, this presents us with three layers of validation, anonymously.

00:42:43:23 - 00:42:45:09

The first is personhood.

00:42:45:09 - 00:42:52:20

Extremely likely I'm a human and not a bot. Very few bots today can get past what I've just shown you, but I've got a roadmap for when they can.

00:42:52:20 - 00:42:55:04

And I'm unique. I haven't done this before,

00:42:55:04 - 00:43:02:15

but as is often the case in this market, those first two conditions are true. I'm a human and I'm unique, but I'm still lying about who I am.

00:43:02:15 - 00:43:05:03

So we also provide demographic validation.

00:43:05:03 - 00:43:08:00

Imagine my profile said I'm a 20 year old woman.

00:43:08:00 - 00:43:11:09

The customer using this would know in real time before I get in.

00:43:11:09 - 00:43:19:16

Very likely a human. They haven't done this before, but they're lying. They're not a 20 year old woman, and you could decide, depending on what you're trying to solve for what you want to do with me dynamically.
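
To make those three anonymous layers concrete, here is a minimal, hypothetical sketch of how a customer might act on the result; the field names and decision rules are illustrative assumptions, not the actual Real Eyes API.

```typescript
// Hypothetical result shape for the three validation layers described above.
interface HumanityCheckResult {
  isLikelyHuman: boolean;               // personhood: a live human, not a bot
  isUniqueInCollection: boolean;        // uniqueness: this face hasn't been seen before
  matchesDeclaredDemographics: boolean; // e.g. the claimed "20-year-old woman"
}

// One way a survey platform could act on the result before letting someone in.
function gateRespondent(result: HumanityCheckResult): "admit" | "reroute" | "reject" {
  if (!result.isLikelyHuman) return "reject";               // bots never get in
  if (!result.isUniqueInCollection) return "reroute";       // already did this survey
  if (!result.matchesDeclaredDemographics) return "reject"; // lying about who they are
  return "admit";
}
```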

00:43:19:17 - 00:43:23:23

So if you imagine I went through that gate, got into the survey, now I'm back outside.

00:43:23:23 - 00:43:33:22

Let's pretend I'm a bad actor. I'm a professional in a click farm. So I'm operating all day long, on multiple accounts, pretending to be different people so I can make money.

00:43:33:22 - 00:43:37:01

I've now logged into another account, pretending to be someone else

00:43:37:01 - 00:43:39:13

That other account's been invited to do the same

00:43:39:13 - 00:43:40:17

Riverside survey.

00:43:40:17 - 00:43:42:15

So I say, yes, I want to do it.

00:43:42:15 - 00:43:53:13

And again, I'm presented with that consent form. And as the fraudster I say, great, I'm excited, thinking I'm about to get paid twice to do the same job. So again, an image is captured and sent to that endpoint,

00:43:53:13 - 00:44:01:05

my face embedding is compared to existing faces in the collection. And we're able to say, with extremely high confidence, hey, that's a high-confidence match.

00:44:01:06 - 00:44:06:00

Don't waste your money. Don't pollute your sample. They've already done that survey. Send them somewhere else.
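
That repeat-fraudster catch is, at its core, a similarity comparison between face embeddings. A minimal sketch of the idea, assuming plain numeric vectors and an illustrative threshold (a real system would tune the threshold against its own model's false-accept and false-reject rates):

```typescript
// Cosine similarity between two face embeddings of equal length.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Flag a new embedding as a repeat participant if it is too close to any
// embedding already in the survey's collection. The 0.9 threshold is an
// illustrative assumption, not the model's real operating point.
function isRepeatParticipant(newEmbedding: number[], collection: number[][]): boolean {
  return collection.some(
    (existing) => cosineSimilarity(newEmbedding, existing) > 0.9
  );
}
```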

00:44:06:00 - 00:44:08:02

Does that make sense?

00:44:08:02 - 00:44:09:04

That's fascinating.

00:44:09:04 - 00:44:12:18

So I've got customers in this market using this for,

00:44:12:18 - 00:44:16:13

this sort of gate, as you see, protecting an endpoint,

00:44:16:13 - 00:44:21:11

but also for onboarding and authentication. This has become

00:44:21:11 - 00:44:28:20

a new lightweight alternative for onboarding and authentication, where at the moment of onboarding, I can validate this is a real human.

00:44:28:20 - 00:44:38:02

They're not already in my population. They're not already on another account. They're not trying to get multiple accounts to commit fraud. And they generally are who they say they are. So, for example, dating apps:

00:44:38:02 - 00:44:40:17

you wouldn't be able to catfish. You wouldn't be able to say, oh, I'm a

00:44:40:17 - 00:44:42:07

I'm a 30 year old woman.

00:44:42:07 - 00:44:43:04

No, you're not,

00:44:43:04 - 00:44:44:00

you're not that.

00:44:44:00 - 00:44:47:14

So you can correct who you are or you're not allowed on my platform.

00:44:47:14 - 00:44:49:18

It opens up all these opportunities.

00:44:49:18 - 00:44:55:00

And the other angle of this, why we're different, is the economics and the scaling.

00:44:55:00 - 00:44:57:17

The opening price for this is $0.10 per check,

00:44:57:17 - 00:45:04:18

which is very disruptive. The existing category of ID verification vendors, where you use a government ID, tends to be a dollar or more,

00:45:04:18 - 00:45:06:13

tends to take multiple minutes

00:45:06:13 - 00:45:08:14

as you just saw, ours takes seconds.

00:45:08:14 - 00:45:12:01

I can use any available webcam, and it costs pennies.

00:45:12:01 - 00:45:15:12

And at volume commitments it comes down to like fractions of a penny.

00:45:15:12 - 00:45:20:12

So I'll just say like very disruptive. The hardest part is just getting people to know that this is possible.

00:45:20:12 - 00:45:24:04

It's not even close. Look, I mean, your competitors are

00:45:24:04 - 00:45:24:12

there

00:45:24:12 - 00:45:27:01

what they are charging and what you're charging.

00:45:27:07 - 00:45:28:12

It's not even comparable.

00:45:28:12 - 00:45:30:15

It's like you're 100 x,

00:45:30:15 - 00:45:31:03

like,

00:45:31:03 - 00:45:32:00

the

00:45:32:00 - 00:45:33:11

like better and cheaper

00:45:33:11 - 00:45:34:13

competitors.

00:45:34:13 - 00:45:37:22

Yeah, it all adds up to a very compelling story, like

00:45:37:22 - 00:45:41:02

AI that isn't racist. It's ethically sourced,

00:45:41:02 - 00:45:44:08

it's extremely scalable. And even for hyperscale,

00:45:44:08 - 00:45:48:21

we can be very flexible. Like that big tech company is running us on prem.

00:45:48:21 - 00:45:50:09

We can run on the client side.

00:45:50:09 - 00:45:56:13

We can. It's the most technically sophisticated company I've ever worked at. So we can, like, be like water and furnish integrations

00:45:56:13 - 00:46:05:19

just about anywhere you can imagine. So it's a very exciting time to have all of that encapsulated in a single offering that is extremely differentiated.

00:46:05:19 - 00:46:10:08

Last question. And I want to accelerate, then segue into the ending.

00:46:10:08 - 00:46:23:01

The potential of this technology that you showcased is endless. You basically could touch every industry when it comes to verification of identity and so on. How do you go about prioritizing what to build next? It's a,

00:46:23:01 - 00:46:27:16

a fascinating journey there. So as I mentioned, I'm kind of this super IC persona.

00:46:27:16 - 00:46:31:14

So I'm doing all the things. In the market research space,

00:46:31:14 - 00:46:33:17

we have some advisors who are helping us.

00:46:33:17 - 00:46:34:16

They are

00:46:34:16 - 00:46:37:09

very well known in the industry, and thought leaders.

00:46:37:09 - 00:46:46:13

In the other industries I'm working on, where I'm trying to get in, I have the co-founders, who have some ideas and insights, and generally I have these sorts of advisory boards

00:46:46:13 - 00:46:52:01

where I am sharing what I'm learning in real time, very much on an entrepreneurial journey.

00:46:52:03 - 00:46:56:16

And it's a constant sort of balancing act of taking signals where I see traction,

00:46:56:16 - 00:47:03:23

and trying to take those stories to new verticals where I have a theory about it. So, like, right now, for example, there's this,

00:47:03:23 - 00:47:06:19

EU digital wallet initiative

00:47:06:19 - 00:47:10:04

that launched; legislation for it started two years ago.

00:47:10:04 - 00:47:19:19

And the aim of it is that by December of 2027, EU businesses in very particular categories, like transportation, for example,

00:47:19:19 - 00:47:20:19

and health care

00:47:20:19 - 00:47:23:16

will need to be able to handle,

00:47:23:16 - 00:47:31:06

verifications of people provided through a new EU digital wallet, which will essentially be like a mobile app,

00:47:31:06 - 00:47:33:09

where it can tie to their government ID

00:47:33:09 - 00:47:33:22

and through a

00:47:33:22 - 00:47:37:00

single point a user can say, this app,

00:47:37:00 - 00:47:43:23

administered by the EU, has validated that I'm Scott Jones and I live in this particular place in the UK,

00:47:43:23 - 00:47:48:15

and I can take this anywhere I go and say, yep, this app proves I am who I say I am.

00:47:48:15 - 00:47:54:22

They have a first version of that that is predicated on using heavyweight government ID checks.

00:47:54:22 - 00:48:06:20

I'm now trying to hack my way into this situation where I have a theory like, hey, I can provide connective tissue in a way you guys haven't even considered yet. I can provide anonymous verification of personhood,

00:48:06:20 - 00:48:08:18

demographics and uniqueness

00:48:08:18 - 00:48:11:09

where the user doesn't have to expose their government ID.

00:48:11:09 - 00:48:13:08

So in this case, it's very much

00:48:13:08 - 00:48:14:19

emblematic of how I'm doing it.

00:48:14:19 - 00:48:17:11

It starts with a hypothesis, some initial research,

00:48:17:11 - 00:48:19:06

like, hey, I think there's a market there.

00:48:19:06 - 00:48:25:19

Finding the ICP of people who are involved in this that I think could be valuable to me.

00:48:25:19 - 00:48:34:08

And then it just becomes what I call, like guerrilla networking, where I don't have warm introductions. I'm trying to, like, hack my way into these places to talk to people.

00:48:34:09 - 00:48:41:07

So it's like using very compelling copy in a LinkedIn message. And in this case, yesterday I hit up like 50 people

00:48:41:07 - 00:48:49:12

and many of them responded. They liked my message, and one of them took a call with me immediately, at 2 a.m. in Korea, because he's selling products for this problem,

00:48:49:12 - 00:48:51:01

and he said, your pitch caught my eye.

00:48:51:01 - 00:48:52:13

I haven't heard of anything like that.

00:48:52:13 - 00:49:00:07

So that's kind of the way: it's like being very intentional. I see an opportunity. Can I research that? Can I validate there's something there?

00:49:00:07 - 00:49:07:12

Just kind of the entrepreneur's journey, and then saying, like, what more? Here's, like, a gated test: can I learn enough to say I should keep going,

00:49:07:12 - 00:49:10:02

and then continually triaging those. In this market,

00:49:10:02 - 00:49:12:16

Yeah, it looks like there's something there, but they're not ready for it yet.

00:49:12:16 - 00:49:22:18

This one's really hot. I'm going to double down on that. That's fascinating. And I truly hope that your solution, Real Eyes, becomes the standard for verification, because

00:49:22:18 - 00:49:28:12

the dystopian future that you have painted is very scary. And I hope we never land on that future.

00:49:28:12 - 00:49:30:10

Yeah, it freaks me out too.

00:49:30:10 - 00:49:33:18

I have 11-year-old kids, twins. So

00:49:33:18 - 00:49:42:05

I'm definitely looking to create a world that is sunnier than that. Thanks a lot, Scott. I think you are truly an admirable person, a fascinating person, and one of the most

00:49:42:05 - 00:49:44:01

fascinating podcasts I've ever recorded.

00:49:44:01 - 00:49:47:19

Wow. You are very kind. Not going to let that go to my head.

00:49:47:19 - 00:49:50:12

My wife keeps me very humble here. She just tell me the.

00:49:50:12 - 00:49:51:02

Yeah.

00:49:51:02 - 00:49:51:15

Thanks.

00:49:51:15 - 00:49:52:17

Thank you for having me.

00:49:52:17 - 00:49:59:11

You.