
UX for AI
Hosted by Behrad Mirafshar, CEO of Bonanza Studios, Germany’s Premier
Product Innovation Studio, UX for AI is the podcast that explores the intersection of cutting-edge artificial intelligence and pioneering user experiences. Each episode features candid conversations with the trailblazers shaping AI’s application layer—professionals building novel interfaces, interactions, and breakthroughs that are transforming our digital world.
We’re here for CEOs and executives seeking to reimagine business models and create breakthrough experiences, product leaders wanting to stay ahead of AI-driven product innovation, and UX designers at the forefront of shaping impactful, human-centered AI solutions. Dive into real-world case studies, uncover design best practices, and learn how to marry innovative engineering with inspired design to make AI truly accessible—and transformative—for everyone. Tune in and join us on the journey to the future of AI-driven experiences!
EP. 91 - AI-Led Automation Will Define Tomorrow’s Companies w/ Lennard Kooy
In this episode, we sit down with Lennard Kooy, founder of Lleverage, to explore how AI-driven automation is evolving beyond technical barriers. From his days leading marketing tech at ITG to building a next-gen automation platform that starts with just a prompt, Lennard shares his vision for a frictionless future where AI meets real business needs. We talk about UX patterns, agent-first interfaces, and why the next great SaaS products won’t look like SaaS at all.
You can find Lennard here: https://www.linkedin.com/in/lennardkooy/
Interested in joining the podcast? DM Behrad on LinkedIn:
https://www.linkedin.com/in/behradmirafshar/
This podcast is made by Bonanza Studios, Germany’s Premier Digital Design Studio:
https://www.bonanza-studios.com/
1
00:00:00,000 --> 00:00:03,000
Welcome to UX for AI.
2
00:00:03,000 --> 00:00:08,880
Lennard, I really appreciate you coming on the podcast. I know you're busy and that's
3
00:00:08,880 --> 00:00:14,120
the whole purpose of this podcast, to bring busy people like you on the pod and give them
4
00:00:14,120 --> 00:00:21,480
30 to 45 minutes and basically give us a download of how they see this AI thing moving forward.
5
00:00:21,480 --> 00:00:26,440
We've actually been down the rabbit hole of your posts, reading your posts, very
6
00:00:26,440 --> 00:00:31,680
interesting insights, especially on DeepSeek and what's happening there, which potentially
7
00:00:31,680 --> 00:00:37,800
we can touch upon as well. But I think where we can start the conversation is with
8
00:00:37,800 --> 00:00:42,880
your journey to founding Lleverage. I think that's something that's interesting because you were
9
00:00:42,880 --> 00:00:49,520
the group's tech CEO of ITG. And then at some point last year, you realized that, okay,
10
00:00:49,520 --> 00:00:54,720
I want to move out. I want to go after this. So what were you seeing? What was your read
11
00:00:54,720 --> 00:00:59,480
of the market at that point that you realized that, okay, there is an opportunity. I want
12
00:00:59,480 --> 00:01:00,480
to run with it.
13
00:01:00,480 --> 00:01:07,600
Yeah. So I used to be CEO of a group of marketing technology platforms, one of which I founded
14
00:01:07,600 --> 00:01:12,960
and sold to the group, where I became sort of the tech CEO. So we had two CEOs, but I won't
15
00:01:12,960 --> 00:01:18,240
bother you with it. And they were all marketing technology platforms. And yeah, in this
16
00:01:18,240 --> 00:01:24,440
marketing space, we help people or companies create ads effectively, right? What is an ad?
17
00:01:24,440 --> 00:01:30,640
Pixels and words. What is AI very good at? Pixels and words. So when generative AI
18
00:01:30,640 --> 00:01:35,440
sort of came about, we obviously wanted to bring that into our products, but it did feel a bit
19
00:01:35,440 --> 00:01:42,200
like I was on the sidelines, right? So someone else was inventing how this would work and
20
00:01:42,200 --> 00:01:46,800
what it was. And then I was trying to adopt it and then put it into the products that
21
00:01:46,800 --> 00:01:51,020
we have. But it didn't feel like I was in the midst of it. It was more like, hey, you're...
22
00:01:51,020 --> 00:01:53,360
You're making someone else rich.
23
00:01:53,360 --> 00:01:58,980
Basically. And that feeling just stayed with me. So from the early days when, let's
24
00:01:58,980 --> 00:02:05,600
say late 2022, when this sort of came about, into early 2023, I started,
25
00:02:05,600 --> 00:02:09,880
yeah, getting that feeling. And at some point I just thought, hey, I need to, I need to
26
00:02:09,880 --> 00:02:15,920
stop what I'm doing here and I need to get, let's say active in where it's actually happening.
27
00:02:15,920 --> 00:02:21,280
It also coincided with, let's say, complicated earn-out constructs that ran out from a previous
28
00:02:21,280 --> 00:02:26,680
deal. So the timing was just right. So then decided, hey, let's start over. Let's start
29
00:02:26,680 --> 00:02:27,680
in this place.
30
00:02:27,680 --> 00:02:33,840
So obviously like your experience running a marketing consultancy agency or whatever
31
00:02:33,840 --> 00:02:38,020
the case may be, it really helps you with sort of like understanding what you could
32
00:02:38,020 --> 00:02:45,280
do with AI, right? So from your marketing background to Lleverage, which is an automation
33
00:02:45,280 --> 00:02:52,000
tool. So you basically skipped the pixels and words that AI is really good at and arrived
34
00:02:52,000 --> 00:02:58,400
at automation. So maybe walk me through that journey, how did you sort of arrive
35
00:02:58,400 --> 00:03:01,360
at automation as the way to go with AI?
36
00:03:01,360 --> 00:03:07,040
Yeah. So maybe to clarify, the platforms that we had in our portfolio were a marketing automation
37
00:03:07,040 --> 00:03:13,340
platform and a creative automation platform. So you had automation, but in marketing, right?
38
00:03:13,340 --> 00:03:18,260
So let's say, so to give you an example, we had big B2C brands like Heineken, they used
39
00:03:18,260 --> 00:03:23,600
our platform to create all of their ads, but sort of they made one and then our system
40
00:03:23,600 --> 00:03:27,700
automated all the versions that they need, translations, size adaptation. So it was an
41
00:03:27,700 --> 00:03:34,000
automation play. So I already knew that space, how this works from a workflow, a
42
00:03:34,000 --> 00:03:39,560
UI angle, et cetera, et cetera, but it was in a specific sector. And so the leap towards
43
00:03:39,560 --> 00:03:44,680
building an automation product in AI wasn't necessarily the biggest leap. It was more
44
00:03:44,680 --> 00:03:49,080
like it is a very different product, right? Because with the products that we have, we
45
00:03:49,080 --> 00:03:55,320
targeted marketing departments exclusively, sort of digital creatives. And now
46
00:03:55,320 --> 00:04:00,160
we are going after much more like, hey, can we help automate business processes? And they
47
00:04:00,160 --> 00:04:05,680
can be in the broader sense of the word, right? They can be, I don't know, I have like a claim
48
00:04:05,680 --> 00:04:11,340
processing for an insurance company, or I need to screen contracts or I need, so they're
49
00:04:11,340 --> 00:04:16,380
not necessarily in the marketing realm. It can be very broad, but the concept of automation
50
00:04:16,380 --> 00:04:21,340
and building products in it isn't that new to me. It's just like the type of processes
51
00:04:21,340 --> 00:04:27,640
and type of users we go after is very different. And we actually started out as more like
52
00:04:27,640 --> 00:04:34,200
a tech platform. So the initial idea that we had seven months ago was, hey, we have all
53
00:04:34,200 --> 00:04:39,460
these developers in our previous organizations. They struggle quite a bit with building generative
54
00:04:39,460 --> 00:04:44,360
AI solutions that actually work. That was mainly due to the fact that they just didn't
55
00:04:44,360 --> 00:04:49,580
understand all the constructs that you need to make AI work, like really at that scale.
56
00:04:49,580 --> 00:04:55,040
And let's say working in the last mile, you need stuff like vector databases and embeddings
57
00:04:55,040 --> 00:04:59,620
and fine tuning, et cetera. And they struggled a lot with that. And so initially we said,
58
00:04:59,620 --> 00:05:05,080
okay, can we build a product that sort of abstracts away all those complex constructs for developers
59
00:05:05,080 --> 00:05:14,400
and turns a normal developer into an AI developer? We pivoted about two months ago towards
60
00:05:14,400 --> 00:05:19,440
more internal process automation, because we just felt that there was a much bigger
61
00:05:19,440 --> 00:05:25,360
market for it. Because initially we're just selling to tech departments of tech companies,
62
00:05:25,360 --> 00:05:30,400
which is sort of a sub market. And now it's more like, hey, every type of company that
63
00:05:30,400 --> 00:05:36,720
has a process that is repetitive and currently done manually, can we build a platform that
64
00:05:36,720 --> 00:05:40,560
sort of underpins that to automate it? So it's a bit of a different way, but it was a bit
65
00:05:40,560 --> 00:05:41,560
of an evolution.
66
00:05:41,560 --> 00:05:47,660
I think the land and expand of your new pivot, I think, has much more potential than just
67
00:05:47,660 --> 00:05:53,080
like selling it to tech departments. But also there is going to be a difficulty in the sense
68
00:05:53,080 --> 00:05:58,520
of finding the right people in your organization to talk to. Because now essentially you can
69
00:05:58,520 --> 00:06:03,640
touch marketing, you can touch customer service, product, tech. Everything is up for grabs.
70
00:06:03,640 --> 00:06:09,480
So I think it's going to most likely introduce some complication when it comes to sales.
71
00:06:09,480 --> 00:06:17,320
So how do you go about it? Of course, you know what they say, a mediocre
72
00:06:17,320 --> 00:06:21,620
product with great marketing beats a good product with
73
00:06:21,620 --> 00:06:27,160
bad marketing every day of the week. So now you've pivoted towards this,
74
00:06:27,160 --> 00:06:31,440
which I think has very, very interesting potential. And I love the demo of Lleverage.
75
00:06:31,440 --> 00:06:36,040
And I would love to talk about it as well. So have you thought about how you want to,
76
00:06:36,040 --> 00:06:37,480
like, basically make sales?
77
00:06:37,480 --> 00:06:42,440
Yeah. So funnily enough, this is more like a traditional sales problem, especially the
78
00:06:42,440 --> 00:06:48,280
market that we're going after, right? So we say, okay, mid-market non-tech companies, those
79
00:06:48,280 --> 00:06:53,640
are sort of our sweet spot because they don't necessarily have the people already in place
80
00:06:53,640 --> 00:06:58,560
to help them leverage the constructs of AI. So we're going to help them do that and automate
81
00:06:58,560 --> 00:07:03,340
sort of core processes for them or help them to automate it themselves. But those people
82
00:07:03,340 --> 00:07:10,400
aren't necessarily the people that go online, try to get a free trial. So you need to go
83
00:07:10,400 --> 00:07:15,600
to them. And that is more like, I call it a traditional sales problem, but interestingly
84
00:07:15,600 --> 00:07:20,520
enough, all the people that work with us on Lleverage, they have that background
85
00:07:20,520 --> 00:07:26,580
of more enterprise sales. So how do I scale an organization that gets to the buying persona,
86
00:07:26,580 --> 00:07:32,000
gets let's say a relationship with them and convinces them that we're the right partner
87
00:07:32,000 --> 00:07:37,300
to do this with, which is something that we've done in the past. So it's very different
88
00:07:37,300 --> 00:07:42,840
than creating a product and hoping you go viral. And then it's very much almost traditional
89
00:07:42,840 --> 00:07:48,880
enterprise sales, SaaS sales. And then there is how can you make that as efficient as possible?
90
00:07:48,880 --> 00:07:54,560
So we used to live in a world where you needed SDRs and AEs and solution engineers, and there's
91
00:07:54,560 --> 00:08:00,640
people walking around to do sort of rev ops, et cetera. I think we're moving towards a
92
00:08:00,640 --> 00:08:09,040
situation where the personal contact, like you and I are having, that an account
93
00:08:09,040 --> 00:08:13,600
executive has with the client is very important because you need to build that trust and you
94
00:08:13,600 --> 00:08:17,720
build that trust with humans. But everything around it, you can automate to an extent, like
95
00:08:17,720 --> 00:08:22,720
the role of an SDR, like getting to that personal outbound, that's something you can fully automate.
96
00:08:22,720 --> 00:08:27,160
And when someone gets through the gate, actually delivering them the product. And we
97
00:08:27,160 --> 00:08:31,680
can talk about that from a UX perspective, because that's very important and interesting,
98
00:08:31,680 --> 00:08:36,040
is you can automate that to a large extent as well through agents, like the interaction
99
00:08:36,040 --> 00:08:40,000
that we currently have with our clients, hey, tell us how we go about this. That is something
100
00:08:40,000 --> 00:08:46,800
you can automate if you think about it very well. So you'll still have that traditional
101
00:08:46,800 --> 00:08:51,800
sales problem of getting to the person and then getting that person to trust you, but
102
00:08:51,800 --> 00:08:56,740
getting to it and then afterwards, let's say getting the solution to the customer, that's
103
00:08:56,740 --> 00:09:01,680
something you can automate. The bit in between, the convincing, that's just a manual play.
104
00:09:01,680 --> 00:09:08,720
So let me summarize it. Your thesis here, which I'm really aligned with, is that automation,
105
00:09:08,720 --> 00:09:13,040
because there's a negative connotation with automation
106
00:09:13,040 --> 00:09:18,600
when it comes to AI, especially in bigger enterprises, because for very good reason,
107
00:09:18,600 --> 00:09:22,280
people do not want to like fire people. You don't want to like say, okay, you are not
108
00:09:22,280 --> 00:09:27,840
going to have a salary from next month with us, right? But that's not the point of automation.
109
00:09:27,840 --> 00:09:34,320
Because the point of automation as you basically profess is that let me get the busy work out
110
00:09:34,320 --> 00:09:42,880
of the way, so your AE, whose core role is to nurture human relationships,
111
00:09:42,880 --> 00:09:48,240
can just focus on that part. Yeah, yeah. So I think internally, we say,
112
00:09:48,240 --> 00:09:54,640
probably in three or four years time, we'll live in a world where human interaction, creativity,
113
00:09:54,640 --> 00:10:00,240
and strategy are sort of the core ingredients of your day to day job, right? Currently,
114
00:10:00,240 --> 00:10:05,360
that's not the case, right? So there's a lot of, let's say, a mundane repetitive work,
115
00:10:05,360 --> 00:10:09,880
even if you're not in data entry or something, the amount of time you spend in following
116
00:10:09,880 --> 00:10:15,120
up an email, putting stuff in a CRM, like editing this podcast, et cetera, that will
117
00:10:15,120 --> 00:10:20,780
probably diminish significantly over the coming three years, significantly more than it has
118
00:10:20,780 --> 00:10:25,660
done in the last 20 years, which will change how people work, right? So there is a lot
119
00:10:25,660 --> 00:10:30,800
more emphasis on human interaction because you can't really automate that real creativity.
120
00:10:30,800 --> 00:10:36,220
So I'm not talking about, Hey, can I just write a blog post about a certain topic that
121
00:10:36,220 --> 00:10:41,760
is quite, yeah, like just information, but there's no creative angle. So real creativity
122
00:10:41,760 --> 00:10:49,000
and, and strategy, right? So those three components are probably going to go from maybe 20, 30%
123
00:10:49,000 --> 00:10:57,180
in every day job to maybe 80%. Now, that will mean that let's say we as humans will just
124
00:10:57,180 --> 00:11:01,920
work differently. That doesn't, I don't see a future where we won't work because that's
125
00:11:01,920 --> 00:11:07,160
just in our DNA. We'll do different things. There might be a small period
126
00:11:07,160 --> 00:11:13,160
of time where we see AI causing layoffs and there might be a bit of a bigger portion of
127
00:11:13,160 --> 00:11:17,860
humanity not having a job, but I don't see that as an existential threat,
128
00:11:17,860 --> 00:11:21,860
but we'll solve it through different constructs. That's what, you know, humanity
129
00:11:21,860 --> 00:11:26,660
does. And we'll find different, higher-quality things to do for those people,
130
00:11:26,660 --> 00:11:30,620
if you want to see it like that. Yeah, I want to be careful not to go
131
00:11:30,620 --> 00:11:35,120
down that rabbit hole of what's going to happen with AI, because I'm generally a very
132
00:11:35,120 --> 00:11:40,140
positive and optimistic person. And, you know, maybe just my two cents on that
133
00:11:40,140 --> 00:11:44,940
before we jump into Lleverage and the UX of it, which I think is brilliant, is that
134
00:11:44,940 --> 00:11:50,160
I think whenever there is a technology that unlocks massive productivity, there is going
135
00:11:50,160 --> 00:11:56,680
to be a new wave of jobs that we could not see before. And you can go all the way
136
00:11:56,680 --> 00:12:04,080
back and study every technological revolution. And you can see, for example, with factories,
137
00:12:04,080 --> 00:12:12,380
there was a new wave of jobs and occupations: cafes next to the factory, nurseries next to
138
00:12:12,380 --> 00:12:16,680
the factories. These are all jobs that were created because there were factories
139
00:12:16,680 --> 00:12:21,880
now instead of farms. So I think it's just very short-sighted, and I don't want
140
00:12:21,880 --> 00:12:26,420
to engage in such conversation ever again, because I've been talking about this with
141
00:12:26,420 --> 00:12:31,700
a lot of folks for the past year. I'm like, okay, my position is this...
142
00:12:31,700 --> 00:13:01,660
Yeah, I agree. There might be some friction periods, right. But in the end, humanity won, right. That is the most realistic scenario. So, I am a big fan of n8n. And I think in your space, everyone knows n8n and the likes of n8n. I think the pattern, the node-based pattern of creating automations, is, so to speak, the archetype in your space, right? My frustration,
143
00:13:01,660 --> 00:13:31,580
with n8n is, I think, what you're trying to address. You need to know at least a sufficient amount about the tech side of things to be able to use n8n. The brilliance of the demo on your website is, you start the automation with a prompt. Tell me what you want to do. And I think that's brilliant. That's actually addressing my pain point with n8n, because I'm a busy guy, I only have one to two hours of time
144
00:13:31,580 --> 00:14:01,500
to spend to get something done. Otherwise, I have to go talk to my accountant to find this, to decline it, to do this and that. I need that prompt. Yeah. So that is basically, and there's a lot of ground for us to cover, let's be clear, but that is what we're trying to get towards. When I look at an n8n or a Zapier or anything, it still requires the person using it to put in quite a lot of effort to automate the process, but also to understand what they need to automate that process. And
145
00:14:01,500 --> 00:14:30,860
that one is maybe even, let's say, more critical. You go in there and you have to understand what the building blocks can do for you, what your process looks like, and how those building blocks layer on top, let's say, or solve a part of that process. I think what generative AI unlocks is a very different paradigm, where you tell the system what you want to get out of it. And it will help you do the heavy lifting of finding out the constructs that you need to automate the process to get there, through just intelligent interaction with you as a user.
146
00:14:31,000 --> 00:14:45,880
So that could be a situation where you just say, hey, this is what I want to automate. And then there's sort of an agent going back and forth with you. Okay, can you tell me maybe where those documents live? Okay, can you help me authenticate it? So there's a much more,
147
00:14:46,360 --> 00:15:00,900
let's say, frictionless process of getting to the end result, rather than requiring the user, like you say, to do all the heavy lifting of: hey, I need this Lego block for this, and this for this, or this might not work, I have to go back again.
148
00:15:00,980 --> 00:15:19,280
So what we are very intrigued about at Lleverage is: can we build a UX, can we build a product, which starts with the end result and then tries to help you as actively as it can to get to that end result with as little, let's say, information as it needs from you.
149
00:15:19,520 --> 00:15:30,980
You need to have some context and information, but building up that context of what you're trying to automate, let's say in the background, will make it a lot easier for you to get to the end point that you need.
150
00:15:31,280 --> 00:15:44,360
That might be as simple as: hey, let's start with, what do you want to automate today? And that already does 50% of the work. And then: I need this from you, I need this from you. Okay, and then there we are. What about this output? I don't like this. Okay, why don't we change that? Right.
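To make that prompt-first pattern concrete, here is a minimal, hypothetical sketch: collect the user's goal, ask only for the context still missing, then draft workflow steps. The names (REQUIRED_CONTEXT, Session, draft_workflow) are invented for illustration, not Lleverage's actual API.

```python
# Hypothetical sketch of the prompt-first pattern: start from a goal,
# ask only for the missing context, then draft the workflow.
from dataclasses import dataclass, field

REQUIRED_CONTEXT = ["goal", "data_source", "output_channel"]

@dataclass
class Session:
    context: dict = field(default_factory=dict)

    def missing(self) -> list[str]:
        return [k for k in REQUIRED_CONTEXT if k not in self.context]

def draft_workflow(context: dict) -> list[str]:
    # Stand-in for the real generation step: map the collected
    # context onto ordered workflow steps.
    return [
        f"trigger: watch {context['data_source']}",
        f"action: {context['goal']}",
        f"notify: send result to {context['output_channel']}",
    ]

session = Session()
session.context["goal"] = "summarize new contracts"  # the initial prompt
canned_answers = {"data_source": "OneDrive /contracts",
                  "output_channel": "Slack #legal"}
for key in session.missing():          # the agent asks only for the gaps
    session.context[key] = canned_answers[key]
print("\n".join(draft_workflow(session.context)))
```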
151
00:15:44,360 --> 00:16:01,040
So that's a very different paradigm than, yeah, this old-school SaaS paradigm, where I have these boxes that I need to connect and they do a certain thing. I mean, don't get me wrong, they're beautiful products, but I think we can live in a world that is different.
152
00:16:01,140 --> 00:16:19,940
I think so. So here's the thing. As I was playing around with n8n, I had my ChatGPT open, I had my DeepSeek open, because some of DeepSeek's responses are much more hands-on than ChatGPT's. And I was basically trying to make sense of the flow I wanted to create.
153
00:16:20,080 --> 00:16:37,960
Then I would try to replicate it in n8n. So as I was doing this, Lennard, I asked myself, why can't I do that in n8n itself? So the LLM behind it knows that I am trying to create this automation and these are the steps.
154
00:16:38,120 --> 00:16:52,960
So it already starts building those steps in the background, and for whatever, you know, integration or data or document or service construct or prompt it needs, it asks me: okay, now for this stage, I need this from you.
155
00:16:53,120 --> 00:16:59,920
Exactly that. So you're pretty much describing our product vision. Oh, wow. Okay. That's interesting.
156
00:16:59,980 --> 00:17:12,900
Yeah. So you'll get an agent, sort of the interaction that you're currently having with an AI, to build your workflow in the context of where you need to build it, right? So that is the key ingredient, because that can also make it a lot more efficient.
157
00:17:13,460 --> 00:17:25,080
Because currently you still need to interpret what you're seeing on the screen of n8n, then context-switch to ChatGPT, then feed it in. It has imperfect information. So there's a lot of friction in that process still.
158
00:17:25,300 --> 00:17:45,020
But if you have that within the context of your automation, and it already knows all the context and can therefore also give you a lot more accurate results, it will be an even better experience than just bringing ChatGPT into n8n, right? Because basically the GPT you're talking to has as perfect information as it can.
159
00:17:45,120 --> 00:17:48,620
Right. So it will give you way better answers or suggestions what to do.
160
00:17:48,700 --> 00:18:10,420
So I guess, how do you go about, to me, it's a very ambitious vision, especially having worked with this automation tool. How do you go about limiting the responses that the LLM offers me as I am trying to make sense of this automation, to make sure that those recommendations can be implemented in the platform?
161
00:18:10,560 --> 00:18:20,100
So it doesn't go off on a tangent so much that you cannot translate those recommendations or steps into a workflow in Lleverage.
162
00:18:20,300 --> 00:18:38,600
Yeah. So the key here is giving the agent you're interacting with, let's say, guardrails, right? So it says, hey, this is what our platform can do. That's the starting point, right? You have these tools, you have these capabilities, you have these workflows, you have these templates.
163
00:18:38,600 --> 00:18:53,000
And this is what you can advise, right? And then you start feeding it the context of your specific situation. And what it's effectively doing in the background is: how can I match what you're trying to do with what I can offer?
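A minimal sketch of that guardrail idea, assuming an invented catalog of blocks and a validation step; the real platform's capabilities and prompt will differ.

```python
# Sketch of guardrails: the agent is told up front which building
# blocks exist, and every suggested plan is validated against that
# catalog before it reaches the user. Catalog contents are invented.
PLATFORM_CATALOG = {
    "read_email", "read_onedrive", "extract_fields",
    "llm_classify", "update_crm", "send_slack",
}

SYSTEM_PROMPT = (
    "You design automations. You may ONLY use these blocks: "
    + ", ".join(sorted(PLATFORM_CATALOG))
    + ". If the user's goal needs anything else, say so instead of inventing a block."
)

def validate_plan(steps: list[str]) -> list[str]:
    # Reject any step the platform cannot actually execute.
    unknown = [s for s in steps if s not in PLATFORM_CATALOG]
    if unknown:
        raise ValueError(f"Plan uses blocks the platform lacks: {unknown}")
    return steps

# A plan the model might return for "screen incoming contracts":
plan = ["read_email", "extract_fields", "llm_classify", "send_slack"]
print(validate_plan(plan))  # passes: every step exists in the catalog
```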
164
00:18:54,560 --> 00:19:24,160
And that is: you always have to have the starting point of what it can automate. Because otherwise the difficulty is that the agent might go off down a certain path, but if that then can't be done in the platform, that's a bit useless. There's flexibility in here, right? So you can always solve stuff with sort of custom code blocks, etc. But I honestly want to try not to get there, because there's a lot of value in you as a user understanding the flow when it's done for you. Right.
165
00:19:24,160 --> 00:19:48,680
So, can I follow what is happening here? The moment in time I start introducing, let's say, custom Python blocks, you're off, right? We're gone, you and I, then it's unreadable. You don't know what's happening in that block. So what we want to get to is: hey, can I have it generate the blocks in a descriptive way, and also make it understandable and accessible to the user even after it's done?
166
00:19:48,760 --> 00:20:18,720
Because a lot of these processes aren't what we call a one-shot approach, right? You make a version of it, and then it has to have sort of a back and forth with you still, right? So we are thinking about it in such a way that probably for the first couple of runs, you add a human-in-the-loop node, which sends you a Slack message and says, hey, this is the output I've gotten to. What do you think of it? Do you want to improve it? Can I go left? Can I go right? You give some instruction and we'll try to
167
00:20:18,840 --> 00:20:48,700
incorporate that in the workflow again, to make it better. And at some point you're at a point of: hey, every time it gives me an output, I'm happy with it, I can just let it run. But this is an iterative process, right? Partially from a technical constraint, but also from a human constraint, right? The idea that you, from the get-go, have a perfect view of how it would work and what kind of outputs you're going to get and need, that is not like reality. Sometimes that's the case.
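A rough sketch of such a human-in-the-loop node, with a stubbed-out review call standing in for the real Slack integration; the workflow and canned reply are invented for illustration.

```python
# Sketch of a human-in-the-loop node: the first runs pause, post the
# draft output for review, and fold feedback back into the workflow
# until the reviewer approves and it can run unattended.
def post_for_review(output: str) -> str:
    # Stand-in for a real Slack integration, not an actual client call.
    print(f"[slack] Draft output: {output!r}. Reply with feedback or 'approve'.")
    return "approve"  # canned reply so the sketch runs end to end

def run_workflow(instructions: list[str]) -> str:
    return "Processed claim #123 using rules: " + "; ".join(instructions)

instructions = ["classify claim type", "flag missing documents"]
for _ in range(5):                       # a few supervised runs
    output = run_workflow(instructions)
    feedback = post_for_review(output)
    if feedback == "approve":
        break                            # graduate to unattended runs
    instructions.append(feedback)        # otherwise iterate the workflow
```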
168
00:20:48,700 --> 00:21:18,360
Sometimes it is. So there is an iterative approach to it. I love what you're saying. If I want to add to it, I think one of the advantages, because, you know, the likes of n8n have their audience, people that have a computer engineering background. It's very clear to me. But especially for the mid-market, up-market folks that want to use Lleverage, they are heads of sales, right? And it's being advertised to them that, hey, Lleverage can help you
169
00:21:18,360 --> 00:21:47,560
create any automation with ease. I think the advantage for an automation platform like Lleverage to become successful in this segment, which is actually very difficult to succeed in, and n8n has an easier job because it can basically target all the agencies and, you know, solo developers and whatever the case may be, is: get me to the first result as soon as possible. Because, to your point, and I totally agree with it, the real
170
00:21:48,260 --> 00:22:02,160
work of fine-tuning a workflow comes after setting it up, because you have to run it. You have to see the output. You have to ask your team to look into the output. You need to understand what's working, what's not working.
171
00:22:02,180 --> 00:22:17,040
Whether the output is desirable according to your ideas, and how you can iterate from there. I think getting to the first output, getting to the first run, that is the bottleneck. And if you can solve that, I think you're going to hit a home run.
172
00:22:17,100 --> 00:22:46,040
Yeah, yeah. So that's what we're trying to build, basically: the aha moment as quickly as possible, right? Then someone is already invested, and then they'll spend some time to get it to the point that, let's say, the last mile also works. But it's actually like you said: if we can get to a point where the person that has the problem has the constructs to actually automate that problem and see results, where the feedback loop, as you call it, is very quick, then I think you have something that can really scale.
173
00:22:46,040 --> 00:22:50,320
Like I said, there's still a ton of work to do on our end, but that is what we're trying to build towards.
174
00:22:50,360 --> 00:23:15,580
Another area that I wanted to get your take on: there are going to be different people using Lleverage with different technical proficiency, right? What are your thoughts on having customized user experiences depending on technical proficiency? For example, if I am a head of sales, I probably shouldn't be seeing
175
00:23:15,580 --> 00:23:17,860
90% of what a tech leader should be seeing.
176
00:23:18,640 --> 00:23:19,580
That's a very good question.
177
00:23:19,860 --> 00:23:26,580
Currently we have a sort of one size fits all and I could argue that it's still too technical for a lot of the, let's say the audience.
178
00:23:26,580 --> 00:23:40,460
What we want to get to is: can we adapt whatever we show to the context that we know of that person? So, let's say we know the company you work for, we know your role.
179
00:23:40,460 --> 00:23:47,040
Can we then change the constructs that we offer based on that information? And whenever you start building stuff, we'll know more about you.
180
00:23:47,060 --> 00:23:52,640
Right? So we'll know if you added a box and clicked it away; then it might not be for you.
181
00:23:52,640 --> 00:23:57,960
Right? And if we know the workflow you're trying to, or the process you're trying to automate, that's more context that we know.
182
00:23:57,980 --> 00:24:11,400
So we build up that context in the background and then we want to use that to make your life easier. Right? So if you're a sales leader, you probably just want to see CRM integrations and you don't want to see an HTTP request, yeah. Right.
183
00:24:11,780 --> 00:24:16,400
But if you're, if you're an automation expert, then you might actually want to see some of those constructs.
184
00:24:16,520 --> 00:24:20,160
So it is really trying to, especially with a horizontal product.
185
00:24:20,320 --> 00:24:31,440
I think in a couple of years time, people will sort of expect this behavior from horizontal products, right? It will, it has to become sort of a vertical feeling for them while it is a horizontal product.
186
00:24:31,440 --> 00:24:35,920
So you want the constructs to be offered to you that you can comprehend and that are useful for you.
187
00:24:36,340 --> 00:24:39,100
And because comprehension is one, but usefulness is the other.
188
00:24:39,100 --> 00:24:39,340
Right.
189
00:24:39,360 --> 00:24:48,940
If I'm going to offer you, I don't know, an integration to Datadog or whatever, even if you understand what it could do, it's not relevant for you.
190
00:24:49,120 --> 00:24:54,760
Right. So we just don't show it, right? So there's comprehension and there's usefulness.
191
00:24:54,800 --> 00:25:03,480
So you have to take those two things into account and then try to mold the UI in such a way that it's as relevant to you as we can make it.
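A small sketch of filtering a node palette on those two axes, comprehension and usefulness; the role tags and node metadata here are invented for illustration.

```python
# Sketch: show a node only if it is useful for the user's role and
# comprehensible given their technical level. Data is made up.
NODES = {
    "hubspot_crm":   {"tags": {"sales"},                "technical": False},
    "http_request":  {"tags": {"engineering"},          "technical": True},
    "datadog":       {"tags": {"engineering"},          "technical": True},
    "slack_message": {"tags": {"sales", "engineering"}, "technical": False},
}

def palette_for(role_tags: set[str], is_technical: bool) -> list[str]:
    return [
        name for name, meta in NODES.items()
        if meta["tags"] & role_tags                      # useful for this role
        and (is_technical or not meta["technical"])      # comprehensible
    ]

print(palette_for({"sales"}, is_technical=False))
# -> ['hubspot_crm', 'slack_message']: no raw HTTP nodes for a sales leader
```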
192
00:25:04,140 --> 00:25:06,220
That's, that's, that's very interesting.
193
00:25:06,220 --> 00:25:08,460
Comprehension is different from use.
194
00:25:08,900 --> 00:25:12,140
I need to think about it, but I think you were spot on.
204
00:25:29,000 --> 00:25:38,100
Comprehension is different from use. And AI, because of, you know, its DNA,
205
00:25:38,140 --> 00:25:40,900
has general applications.
206
00:25:41,100 --> 00:25:44,020
Like it basically can touch every use case.
207
00:25:44,040 --> 00:25:51,360
So you want it to be general, but to have a vertical feel and use to it, which I think is very spot on.
208
00:25:51,720 --> 00:25:51,960
Yeah.
209
00:25:51,960 --> 00:25:55,120
And I think we'll probably be one of the first to do this.
210
00:25:55,140 --> 00:26:00,180
I think in all honesty, in a, in a couple of years time, this will be expected behavior.
211
00:26:01,600 --> 00:26:01,880
Yeah.
212
00:26:02,380 --> 00:26:10,320
Even within vertical products, the user will expect that the UI adapts to their level, to their role.
213
00:26:10,440 --> 00:26:12,640
And currently we hardly do that.
214
00:26:12,820 --> 00:26:17,220
There are not a lot of products that incorporate this into their UX.
215
00:26:17,220 --> 00:26:17,420
Right.
216
00:26:17,420 --> 00:26:17,740
Sure.
217
00:26:17,740 --> 00:26:17,900
Sure.
218
00:26:17,900 --> 00:26:18,980
There are some user roles.
219
00:26:18,980 --> 00:26:22,200
You might show something or not show something, but I think it's very limited.
220
00:26:22,280 --> 00:26:26,780
Well, I think in a couple of years time, this is just how people will
221
00:26:26,860 --> 00:26:28,460
expect to interact with software.
222
00:26:28,540 --> 00:26:29,740
Yeah, that's fascinating.
223
00:26:29,780 --> 00:26:35,660
Another core principle of UX for AI, which I think is very relevant, is being proactive.
224
00:26:35,980 --> 00:26:37,400
AI can become proactive.
225
00:26:37,780 --> 00:26:42,880
So imagine a scenario where you're fully integrated into one enterprise's operations.
226
00:26:43,220 --> 00:26:44,140
You know what's happening.
227
00:26:44,260 --> 00:26:48,600
Like every bit of data comes through you, or to a great extent.
228
00:26:48,900 --> 00:26:53,880
Do you think you can get to a point that you can proactively suggest, look,
229
00:26:54,320 --> 00:27:01,840
based on your operations, we've identified three other potential operational
230
00:27:03,880 --> 00:27:05,880
touch points that we can automate.
231
00:27:06,000 --> 00:27:09,100
Here's a suggestion, here's the automation recipe you can use.
232
00:27:09,420 --> 00:27:10,200
Do you think it's possible?
233
00:27:10,440 --> 00:27:12,080
Definitely, I think you can get there.
234
00:27:12,100 --> 00:27:17,080
I think the usefulness will, will increase over time as you build more context of the
235
00:27:17,080 --> 00:27:20,720
organization, but yeah, definitely those are things that you can think of.
236
00:27:20,720 --> 00:27:26,520
I mean, if you already know as an AI, as a platform, that you guys are using
237
00:27:26,520 --> 00:27:28,280
OneDrive to store all your files.
238
00:27:28,280 --> 00:27:31,560
So you use this platform for this, that, et cetera.
239
00:27:31,800 --> 00:27:37,180
And you have the system interacting with pieces of those, let's say infrastructure
240
00:27:37,180 --> 00:27:42,100
that you're currently using, it will scan and form an opinion on inefficiencies that are
241
00:27:42,100 --> 00:27:42,700
in there, right?
242
00:27:42,700 --> 00:27:47,100
So then you can try to translate that into: hey, here's a workflow that
243
00:27:47,740 --> 00:27:50,480
does this data classification job automatically for you.
244
00:27:50,480 --> 00:27:56,020
For example, I think that will become more valuable over time, the more context or the
245
00:27:56,020 --> 00:28:00,420
more workflows or agents are already in there because then you can ask AI to connect
246
00:28:00,420 --> 00:28:00,940
the dots.
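A toy sketch of that suggestion loop: mine the usage events the platform already sees for repeated manual routines, and surface them as suggestions that wait for human approval. The events and the threshold here are made up for illustration.

```python
# Sketch of proactive suggestions: detect repetitive manual patterns
# and propose automations, never auto-deploying without approval.
from collections import Counter

events = [
    ("alice", "download OneDrive file"), ("alice", "rename file"),
    ("alice", "upload to CRM"), ("bob", "download OneDrive file"),
    ("bob", "rename file"), ("bob", "upload to CRM"),
] * 10  # the same manual routine, over and over

pattern_counts = Counter(action for _, action in events)
suggestions = [
    f"Suggest automating '{action}' (seen {n} times this week)"
    for action, n in pattern_counts.items() if n >= 15
]
for s in suggestions:
    print(s + " -> awaiting human approval")  # suggest, never auto-run
```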
247
00:28:01,080 --> 00:28:03,600
There's also, so yes, this will be possible.
248
00:28:03,660 --> 00:28:05,980
There's always also the human element to it.
249
00:28:06,000 --> 00:28:06,260
Right.
250
00:28:06,260 --> 00:28:12,620
So you also need people to actively agree with that specific thing.
251
00:28:12,620 --> 00:28:13,940
You can't just start doing it.
252
00:28:13,980 --> 00:28:14,200
Right.
253
00:28:14,200 --> 00:28:20,080
So you always have to be very careful in using it as a suggestion
254
00:28:20,160 --> 00:28:23,560
rather than sort of, Hey, I automate this away as well.
255
00:28:23,560 --> 00:28:27,860
Some people might like that, but I think human nature is also that they like control to
256
00:28:27,860 --> 00:28:28,300
an extent.
257
00:28:29,000 --> 00:28:32,480
So there's almost, there's also a UI construct to it.
258
00:28:32,480 --> 00:28:38,120
How do I offer this in a way that it doesn't feel intrusive, but it feels like actually
259
00:28:38,340 --> 00:28:39,500
someone trying to help.
260
00:28:40,220 --> 00:28:45,440
So I think there's also, yeah, you have to consider that as well, but I definitely think
261
00:28:45,620 --> 00:28:51,100
it will be able to, if it has like a solid context of your organization, it will
262
00:28:51,100 --> 00:28:53,900
definitely be able to suggest automations as well.
263
00:28:53,900 --> 00:28:54,060
Yeah.
264
00:28:54,060 --> 00:28:58,940
I mean, the cool thing about what you're building, imagine that an enterprise is
265
00:28:58,940 --> 00:29:02,280
going to be using Lleverage for most of their operations.
266
00:29:02,560 --> 00:29:10,040
Then you can automatically, in the background, create a service blueprint of that
267
00:29:10,060 --> 00:29:11,860
enterprise and their entire operation.
268
00:29:12,180 --> 00:29:18,460
So you can map exactly what kind of automation workflows are being used for this
269
00:29:18,460 --> 00:29:22,220
enterprise to deliver their value, the promise to their users.
270
00:29:22,540 --> 00:29:22,900
Yeah.
271
00:29:23,120 --> 00:29:25,180
I think that's a very cool thing to do.
272
00:29:25,340 --> 00:29:25,640
Yeah.
273
00:29:25,940 --> 00:29:26,300
Yeah.
274
00:29:26,660 --> 00:29:27,060
You can.
275
00:29:27,260 --> 00:29:29,140
We're not there yet, but you can get there, right?
276
00:29:29,140 --> 00:29:35,060
So I think the beauty of AI is that it has infinite memory in a sense, right?
277
00:29:35,060 --> 00:29:39,380
You and I, we have only so much GPU for what we can process.
278
00:29:39,380 --> 00:29:43,800
And if you are, let's say, working on one flow, one particular task
279
00:29:43,800 --> 00:29:50,080
that is sort of taking up a lot of your memory, what AI can do is, it has
280
00:29:50,140 --> 00:29:52,960
that too, but it can scale that to infinity.
281
00:29:54,340 --> 00:30:01,020
So yeah, you get to a situation where it is like your operational efficiency guy at
282
00:30:01,020 --> 00:30:04,380
some point, knowing everything at all times, right?
283
00:30:04,420 --> 00:30:07,300
And it can also execute instantly, right?
284
00:30:07,300 --> 00:30:11,660
If you get to that point, then, then the possibilities are almost endless.
285
00:30:11,700 --> 00:30:14,100
I think it will be a staged approach to get there.
286
00:30:14,180 --> 00:30:15,480
Yeah, I think so too.
287
00:30:15,700 --> 00:30:21,700
And I haven't really, to be honest, seen good examples of applications out there;
288
00:30:21,860 --> 00:30:28,260
even ChatGPT itself, they claim to have memory, but I don't see how it's serving me.
289
00:30:28,340 --> 00:30:29,340
I haven't seen that.
290
00:30:29,980 --> 00:30:34,860
It's a good promise, but I haven't seen good application of leveraging memory to
291
00:30:35,020 --> 00:30:38,100
provide more customized responses over time.
292
00:30:38,340 --> 00:30:43,120
I think we're not there yet from a technology standpoint, for one,
293
00:30:43,120 --> 00:30:46,340
and we're not there yet from an application standpoint.
294
00:30:46,340 --> 00:30:51,060
I think we have to get people first to be convinced that they can, let's say, automate
295
00:30:51,100 --> 00:30:53,140
one process that they can comprehend.
296
00:30:53,660 --> 00:30:57,700
And I think a lot of the platforms, a lot of organizations are still in the phase,
297
00:30:57,700 --> 00:30:59,500
Hey, can this actually do one thing
298
00:30:59,500 --> 00:31:00,320
well? Right.
299
00:31:00,900 --> 00:31:06,380
In order for you to trust it to do sort of suggestions and multiple things at once,
300
00:31:06,380 --> 00:31:10,900
rather, you need to have confidence that it at least does one process and can do
301
00:31:10,900 --> 00:31:13,220
that, let's say at the level that you require it.
302
00:31:13,220 --> 00:31:17,020
So, and I think we're still as a society, we're still in that phase, right?
303
00:31:17,020 --> 00:31:24,580
Can AI, let's say, solve particular problems before I can let it solve more
304
00:31:24,740 --> 00:31:26,900
interconnected problems or types of...
305
00:31:27,540 --> 00:31:33,540
I want to be cautious about your lunchtime, but I really need to ask you this
306
00:31:33,540 --> 00:31:41,140
question. There is a trend towards moving to and using smaller-sized models rather than
307
00:31:41,140 --> 00:31:42,100
the big ones.
308
00:31:42,260 --> 00:31:49,180
I think, especially for app delivery efficiency, I feel like those models are
309
00:31:49,180 --> 00:31:54,100
much better than these general-purpose models that are really big.
310
00:31:54,100 --> 00:31:54,860
What's your take on that?
311
00:31:55,460 --> 00:31:58,740
I think in the end, it will be more about speed than cost.
312
00:31:58,780 --> 00:32:02,180
If you had asked me that question like a year ago,
313
00:32:02,260 --> 00:32:07,700
I would have largely agreed, but that was also mostly because, yeah, if you use a smaller model,
314
00:32:07,700 --> 00:32:08,420
it's cheaper.
315
00:32:08,420 --> 00:32:12,180
So yeah, if you have just a specific task to carry out, it's just way more cost
316
00:32:12,180 --> 00:32:17,340
efficient. What you've seen in the last 12 months is that the speed at which the
317
00:32:17,340 --> 00:32:22,260
cost of these large models is decreasing is very rapid.
318
00:32:22,620 --> 00:32:27,220
Now, why I'm saying this is that, yes, maybe a small model is then more, let's
319
00:32:27,220 --> 00:32:29,460
say cost efficient or a bit faster.
320
00:32:29,460 --> 00:32:34,900
But if those, if the general model is so, let's say fast and so cheap that you
321
00:32:34,900 --> 00:32:36,780
don't really have to care, right?
322
00:32:36,820 --> 00:32:41,380
In comparison to the use case that you're solving, the cost is negligible, whether
323
00:32:41,380 --> 00:32:46,060
you use a small fine-tuned model or, let's say, the bigger one.
324
00:32:46,220 --> 00:32:47,100
It doesn't really matter.
325
00:32:47,340 --> 00:32:47,660
Right.
326
00:32:48,300 --> 00:32:51,860
And the beauty of let's say using the bigger one is that you don't have to
327
00:32:51,860 --> 00:32:55,340
think about, hey, is this small model actually good at this particular task?
328
00:32:55,340 --> 00:33:00,380
So it takes away a lot of your cognitive load because you need to decide to use a
329
00:33:00,380 --> 00:33:02,740
smaller model that is a, hey, can it do this?
330
00:33:02,740 --> 00:33:03,740
Oh yeah, I have to test it.
331
00:33:04,020 --> 00:33:08,940
So what I'm saying is that yes, if you have a very repetitive high-volume task, it's
332
00:33:08,940 --> 00:33:10,260
probably worth looking into.
333
00:33:10,380 --> 00:33:17,100
I think we'll start living in a, in a world where the user isn't bothered by that.
334
00:33:17,100 --> 00:33:17,460
Right.
335
00:33:17,940 --> 00:33:18,180
Yeah.
336
00:33:18,220 --> 00:33:23,300
Like Sam Altman also posted, yeah, we realized that, yeah, we made it quite hard
337
00:33:23,300 --> 00:33:27,020
for users to understand what the difference is between o3-mini and
338
00:33:27,020 --> 00:33:32,260
4o and o1-mini, and we're going to move to a place where you just ask the query
339
00:33:32,260 --> 00:33:36,780
and in the background, we'll decide whatever happens. Which is sort of: they're
340
00:33:36,780 --> 00:33:40,380
moving towards, hey, we'll have one model to rule them all, and we'll just make
341
00:33:40,380 --> 00:33:44,380
sure that we pick the most efficient avenue among the
342
00:33:44,420 --> 00:33:45,660
possibilities that we have.
343
00:33:46,300 --> 00:33:50,020
And I think that is also something you want as a user.
344
00:33:50,300 --> 00:33:51,700
You want simplicity, right?
345
00:33:51,700 --> 00:33:53,540
You just want, hey, here's my task.
346
00:33:53,580 --> 00:33:57,460
Give me the most intelligent, cost efficient, let's say solution for it.
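A hedged sketch of what such routing could look like, with invented model names, keywords, and thresholds; real providers' routing logic will differ.

```python
# Sketch of model routing: send high-volume, simple work to a small
# model and everything else to a large one, invisibly to the user.
def route(task: str, expected_calls_per_day: int) -> str:
    simple = any(k in task.lower() for k in ("classify", "extract", "translate"))
    if simple and expected_calls_per_day > 10_000:
        return "small-finetuned-model"   # cost and speed matter at this volume
    return "large-general-model"         # otherwise, don't make the user care

print(route("classify support tickets", 50_000))  # -> small-finetuned-model
print(route("draft a contract summary", 20))      # -> large-general-model
```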
347
00:33:57,620 --> 00:34:03,100
So I think fine-tuned smaller models are a temporary thing.
348
00:34:03,260 --> 00:34:03,420
Yeah.
349
00:34:03,460 --> 00:34:07,660
Basically there will be auto-routing to the appropriate model based on the
350
00:34:07,820 --> 00:34:12,500
inquiry, and then it's basically a bit like the difference between Linux and Mac.
351
00:34:12,820 --> 00:34:16,820
I mean, in Linux you have to sort out everything yourself; in Mac, you get it like in a
352
00:34:16,820 --> 00:34:18,300
package, it's safe and secure.
353
00:34:18,300 --> 00:34:19,220
And it works.
354
00:34:19,300 --> 00:34:19,580
Yeah.
355
00:34:19,580 --> 00:34:23,180
And so there's, there's intelligent routing, but there's also just if these
356
00:34:23,180 --> 00:34:27,020
models get better and better and cheaper and cheaper, then at some point you don't
357
00:34:27,020 --> 00:34:27,740
really care.
358
00:34:27,900 --> 00:34:33,180
Like if you're now paying one buck, let's say $1 for a particular task, then it
359
00:34:33,180 --> 00:34:37,860
matters. But if in the end it is one cent, and maybe using a fine-tuned
360
00:34:37,860 --> 00:34:42,740
model is half a cent, but for you the value is a hundred,
361
00:34:42,780 --> 00:34:45,780
then whether it's one cent or half a cent, you don't really care, right?
362
00:34:45,780 --> 00:34:46,940
It's negligible anyway.
363
00:34:47,100 --> 00:34:50,380
So what I'm saying is, as long as those large mainstream models keep
364
00:34:50,380 --> 00:34:54,300
progressing and costs keep going down, at some point you can't be bothered to
365
00:34:54,300 --> 00:34:55,740
use a fine-tuned small model.
366
00:34:55,780 --> 00:34:59,580
And Lennard, this has been great. I mean, I could keep talking to you; I think we're sort of
367
00:34:59,580 --> 00:35:03,780
on the same wavelength when it comes to, especially, the UX side, but I
368
00:35:03,780 --> 00:35:05,980
want to be, you know, cautious about time.
369
00:35:05,980 --> 00:35:07,980
And like you have a lot on your plate.
370
00:35:08,300 --> 00:35:11,020
Any last words to wrap our conversation?
371
00:35:11,900 --> 00:35:12,900
How do you see yourself?
372
00:35:12,900 --> 00:35:16,900
How do you see Lleverage in the next year, especially toward the end of 2025?
373
00:35:16,980 --> 00:35:21,260
Yeah, I think like this is just, I haven't, I've never been in such a time.
374
00:35:21,340 --> 00:35:22,340
Let's put it like this, right?
375
00:35:22,340 --> 00:35:24,460
There's just so much going on there.
376
00:35:24,460 --> 00:35:26,020
It's just overwhelming.
377
00:35:26,220 --> 00:35:26,860
Yeah, it's overwhelming.
378
00:35:26,940 --> 00:35:28,420
Well, yeah, to an extent.
379
00:35:28,420 --> 00:35:30,620
And the world around us is changing so fast.
380
00:35:30,740 --> 00:35:35,020
I think what was true six months ago isn't true anymore, right?
381
00:35:35,020 --> 00:35:38,540
Everyone initially thought, hey, OpenAI might win the race.
382
00:35:38,540 --> 00:35:42,180
Then Google came up, they sort of beat them in models quite quickly.
383
00:35:42,180 --> 00:35:46,660
Then some, some Chinese guys came up with something that was almost as good.
384
00:35:46,660 --> 00:35:51,180
Like it's just, and then like all the constructs around the infrastructure
385
00:35:51,180 --> 00:35:53,260
are moving, startups coming up.
386
00:35:53,260 --> 00:35:55,940
I think what I would say to people
387
00:35:55,940 --> 00:36:01,060
listening is: agility is more important than anything else at this
388
00:36:01,060 --> 00:36:04,900
point and way more important than it was 10 years ago, right?
389
00:36:04,900 --> 00:36:09,100
So can you keep up with what happens in the market and can you, let's say,
390
00:36:09,100 --> 00:36:12,380
translate that in such a way that it benefits you.
391
00:36:12,540 --> 00:36:16,900
And yeah, that goes for UI/UX as well, right?
392
00:36:16,900 --> 00:36:19,940
So now we had this conversation, and what we discussed sounds
393
00:36:19,940 --> 00:36:21,620
like a beautiful idea, right?
394
00:36:21,780 --> 00:36:25,860
But maybe in six months time, the paradigm has shifted again and
395
00:36:25,860 --> 00:36:27,340
we have to redo this conversation.
396
00:36:27,380 --> 00:36:30,220
So I think agility matters more than anything.
397
00:36:30,300 --> 00:36:32,700
I cannot, I think it's so spot on.
398
00:36:32,700 --> 00:36:39,180
I think we have evangelized being agile in the startup scene for
399
00:36:39,220 --> 00:36:45,540
quite a while, for many years, but this is, this is a new level of agility
400
00:36:45,540 --> 00:36:48,780
that I'm trying to accustom myself to.
401
00:36:49,140 --> 00:36:53,860
Literally every day. Like, DeepSeek was introduced two, three weeks ago,
402
00:36:53,860 --> 00:36:57,100
four weeks ago, and completely changed the equation.
403
00:36:57,220 --> 00:37:05,700
And I think folks that thrive in this fast-paced environment,
404
00:37:06,340 --> 00:37:09,820
they are going to reap the benefits of their, you know, work.
405
00:37:09,860 --> 00:37:14,980
And what I will also say, and this is something that I'm struggling
406
00:37:14,980 --> 00:37:20,060
with, is that, okay, you need to be on top of all the news, and literally every
407
00:37:20,060 --> 00:37:23,700
day you have to basically go through your X, LinkedIn, whatever the case, see
408
00:37:23,700 --> 00:37:27,940
what's happening, but at the end of the day, those who will win are the ones
409
00:37:27,940 --> 00:37:30,380
that work hardest, so you need to balance.
410
00:37:30,860 --> 00:37:32,740
It's also making smart choices, right?
411
00:37:32,740 --> 00:37:36,180
So there's working hard, but it's also making the right choices.
412
00:37:36,180 --> 00:37:41,900
And sometimes the better choice is made by not acting instantly.
413
00:37:41,980 --> 00:37:46,020
I mean, working hard is one, but you also have to be smart about it, right?
414
00:37:46,020 --> 00:37:50,820
So, and the one can sort of offset the other, but if you do both, make the right
415
00:37:50,820 --> 00:37:53,700
choices and work hard, then obviously you have the best bet, right?
416
00:37:53,700 --> 00:37:55,260
But there are two angles.
417
00:37:55,460 --> 00:37:56,140
Thanks, Lennard.
418
00:37:56,220 --> 00:37:56,820
Appreciate it.
419
00:37:57,420 --> 00:37:57,980
All right, man.
420
00:37:58,260 --> 00:38:01,020
If there's anything I can help with or do, let me know.
421
00:38:01,060 --> 00:38:07,140
And let me know when you have it cut up and posted to your, to your Spotify channel.
422
00:38:07,260 --> 00:38:08,620
Yep, I will tell you.
423
00:38:09,460 --> 00:38:11,740
Thank you for listening to UX for AI.
424
00:38:12,180 --> 00:38:16,260
Join us next week for more insightful conversations about the impact of
425
00:38:16,260 --> 00:38:21,300
artificial intelligence in development, design and user experience.
426
00:38:21,300 --> 00:38:23,880
(upbeat music)