The FODcast

Prompts, Products, and Profits: The AI Revolution Nobody Saw Coming with Sander Berlinski

Tim Roedel and James Hodges Season 7 Episode 13

What’s really driving the AI revolution in digital commerce? According to Sander Berlinski, Director of Innovation at valantic, the answer isn’t chasing futuristic hype, but focusing on what’s working right now.

In this FODcast episode, Sander breaks down how AI is already transforming retail, from automating compliance checks to reshaping creative roles, and why data foundations, not shiny tools, should be every retailer’s first priority.

We explore how his team built an EU Accessibility Act checker that plugs straight into e-commerce systems, turning compliance risks into improvement tickets, and why this kind of focused solution shows AI at its best.

We also cover:

  • The roles being most disrupted by AI, from customer service to front-end design
  • How platforms like Shopify are democratising creative content through natural language prompts
  • Why unglamorous data architecture is the biggest barrier to meaningful AI adoption
  • The compliance risks of employees experimenting with generative AI tools like ChatGPT
  • Practical advice for mid-market retailers balancing quick wins with long-term preparation
  • A glimpse into the near future of hyper-personalised product pages and physical-digital integration

If you’re in retail, digital commerce, or innovation, this is a grounded conversation about separating substance from hype and preparing your business for an AI-enhanced future.

Simply Commerce is the leading supplier of talent into digital commerce across technology, digital marketing, product, sales, and leadership.

Find out more about our approach and our services within digital commerce recruitment here: https://simply-commerce.co.uk/




SPEAKER_00:

Hello and welcome to season seven of the FODcast, the podcast focused on the future of digital commerce, hosted by Simply Commerce. Season seven promises to continue to bring you some of the industry's brightest minds from across the globe as we unpick the sector and where it's heading. From war stories to strategy, and technology deep dives to future trends, we cover it all as we continue our journey to build one of the most popular podcasts in commerce. Before we start, if you enjoy our content, please do hit the subscribe button on whatever platform you're listening on, and like and share on socials.

SPEAKER_01:

Welcome to another edition of the FODcast. Today I'm really pleased to be joined by Sander Berlinski. Sander is the Director of Innovation and Strategic Growth at valantic. Sander, it'd be good to hand over to you for a bit of a better intro on who you are and your background.

SPEAKER_02:

Yes, thanks. Just like you said, I'm Director of Innovation at valantic. valantic is a global SI, a global solutions integrator; we're now approximately four and a half thousand people scattered throughout Europe. My responsibility is innovation, which means that day to day I spend most of my time implementing AI solutions, not only for our customers but also for valantic ourselves. The strategic growth part you mentioned goes beyond innovation: if, for instance, a certain technology or a new version of Salesforce or some Google tool comes out, it's my responsibility to scale those solutions into a mature business line.

SPEAKER_01:

So you'll be fully on top of all new AI evolutions then.

SPEAKER_02:

I try to be. I have to watch all the trends, see through the clutter, and work out what works and what doesn't at this moment in time, which is actually quite a challenge. The whole world of AI, as you know, is full of buzz, full of hype, full of hysteria even, maybe. I try to see through the clutter and see what's useful and what's not.

SPEAKER_01:

Well, this is it, and actually this is one of the reasons I was really interested in having a conversation with you, because AI is a talking point in every podcast, client call, or meeting I'm in, but there's so much out there that it's really hard to discern what's real and what's actually adding value. So that's why I think the audience will be really interested to hear your take. And I want to start by asking: in one sentence, what would be the single AI use case that you think already pays for itself at valantic?

SPEAKER_02:

Well, we've built an automatic EU Accessibility Act checker. Based on AI, you can check your whole customer-facing e-commerce platform or website for any issues, and it doesn't only check that: it's also integrated with whatever system you have, so you can directly see those issues, and they will be taken into Jira, or whatever your tool is, to create tickets there with improvements on UI, on technology, on whatever is needed.
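For illustration only, here is a minimal sketch of that kind of pipeline (not valantic's actual tool): a hypothetical scan_accessibility() helper stands in for whatever accessibility engine is used, and each finding becomes an improvement ticket via the Jira Cloud REST API. The JIRA_* settings, project key, and URLs are placeholder assumptions.

    # Hypothetical sketch: scan pages for accessibility issues, file one Jira
    # ticket per finding. scan_accessibility() and all settings are placeholders.
    import requests

    JIRA_URL = "https://example.atlassian.net"    # placeholder instance
    JIRA_AUTH = ("bot@example.com", "api-token")  # placeholder credentials
    PROJECT_KEY = "WEB"                           # placeholder project

    def scan_accessibility(page_url: str) -> list:
        """Placeholder scanner: would return findings such as
        {'rule': 'img-alt', 'severity': 'high', 'selector': 'img.hero'}."""
        return []  # e.g. run an accessibility engine headlessly and parse its report

    def create_ticket(page_url: str, finding: dict) -> None:
        """Create one improvement ticket per finding via the Jira Cloud REST API."""
        payload = {
            "fields": {
                "project": {"key": PROJECT_KEY},
                "summary": f"[A11y] {finding['rule']} on {page_url}",
                "description": f"Severity: {finding['severity']}\nElement: {finding['selector']}",
                "issuetype": {"name": "Task"},
            }
        }
        requests.post(f"{JIRA_URL}/rest/api/2/issue", json=payload, auth=JIRA_AUTH, timeout=30)

    for url in ["https://shop.example.com/", "https://shop.example.com/checkout"]:
        for finding in scan_accessibility(url):
            create_ticket(url, finding)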

SPEAKER_01:

And that's a valantic-built tool?

SPEAKER_02:

Yeah, it's a valantic-built tool, yes, it is.

SPEAKER_01:

Nice, that sounds impressive. And from your perspective, in terms of digital commerce, where will the kind of magic of AI that we talk about be felt first, do you think?

SPEAKER_02:

I think there are multiple places. All the roles and all the tasks that are relatively easy, so to say, that are repetitive, that are not really complex to do. So you will feel it in customer service if you are a retailer or a brand. You will feel it a little bit in front-end development, where you can see that AI can already replace quite a lot. And you will feel it in design, so the things that are not too complex, that are repetitive, and where you can automate a lot.

SPEAKER_01:

And are you seeing that already, the front-end dev piece in particular? Are you already seeing an uptake in the demand for those types of projects?

SPEAKER_02:

Yeah, yeah, it's insane. I think most of our revenue right now goes into that design and front-end work, both on the development and the design side, and not only the work of our data engineers, because you suddenly need data engineers to create front-end AI solutions, but also what we see at our partners. One nice example to mention: I've been to Shopify in Toronto, and they presented their new innovations on AI. Within their system, within the Shopify system, you can literally say, hey, I want a banner with floating shoes, because I sell shoes, and I want those floating shoes to be actual products in my e-commerce platform, and I want the customer to be able to click on those floating shoes as a sort of game. By typing that in, large-language-model style, it actually works: it transforms it into something that fits your house style, fits your brand, and actually works. So that's pretty amazing, and it's already here. It's already available to customers, yes.

SPEAKER_01:

Okay, that must change things. That changes the dynamics completely, doesn't it?

SPEAKER_02:

In terms of the types of people you need in teams, yeah, indeed, especially there, because the products that retailers need to serve their customers are not that different from two years ago. But the way companies like us deliver them to retailers and brands, that is totally different. And the people that retailers need to make these things work are totally different too. So the composition of your workforce is changing really rapidly, while the output, so e-commerce platform design, A/B testing, data analytics, stays the same.

SPEAKER_01:

Okay. And let's talk a bit about you, so the audience gets a bit more understanding of your background. You started as a digital commerce strategist, and now you're responsible for a unit of around 1,200 people within the AI part of the business. So what pulled you into AI full-time?

SPEAKER_02:

Yeah, I think I've always been fascinated by new technology and, maybe like a lot of little boys, by gadgets and things that work and sometimes don't. Actually, before I was a digital strategist, I worked at the University of Tilburg as a teacher and researcher on digital consumer behavior. That's the crossroads of technology and new marketing channels, but also the way people interact with those technologies, and that's what I find fascinating. Fast forward ten years: if you look at AI, it's still about how people interact with a technology, because AI, and especially generative AI and large language models, makes that interaction with technology suddenly human-like, and that's what I find really fascinating.

SPEAKER_01:

Yeah, I agree. And our research team mentioned to me that you've apparently called yourself the marketing joker. Do you care to explain why?

SPEAKER_02:

I'm not sure if that's the right translation into English because, as you've probably heard, English is not my native language. But I am interested in a broad range of topics. If you look at a company like valantic, we have so many really good hyper-specialists. We have data scientists, we have UI consultants who specialize in A/B testing on checkouts. So we are a house of hyper-specialists in certain markets, segments, and technologies. But I'm one of the odd ones here, because my interests go all over the place, and that's quite convenient if you are responsible for innovation. I have an interest in many different topics, from technology to people to business models to marketing. If you want to ask questions that go really in depth, then I'm probably not the right person, because we have a lot of specialists who know a lot about each specific topic. And going to your comment about why I'm called the marketing, or strategy, or sales joker: if something special comes up, a special request from customers or from end users, and people don't know where it needs to sit or land in our organization, then it will probably end up in my inbox. Hey, we don't know what to do with this situation, can you please help?

SPEAKER_01:

Yeah, that makes more sense. I think the translation doesn't quite work, but I understand now.

SPEAKER_02:

Yeah, yeah, yeah, yeah, yeah.

SPEAKER_01:

So let's look at the LLM piece and launching. If you're launching a new commerce brand, an LLM is table stakes, right? It's a fundamental. What would you build differently now as a startup? I mean, compared with even two years ago, there's been a lot of change.

SPEAKER_02:

Yeah, yeah, a lot of change. What we see is that most of the work in making these LLMs work, wherever you deploy them, is around making sure that your data works. There's a lot of work in consolidating data, cleaning data, and making sure that these LLMs can talk to your specific data. If you are starting with a new brand or a new company, I would try to build that data foundation from the ground up really well. Spend the time once to make sure that all your solutions can communicate with each other easily, that data is gathered in a single place in a single way, and then you can deploy whatever data model or large language model on top of that. It makes it a lot easier than with legacy companies.

SPEAKER_01:

So when we're talking data from a retailer perspective, are you talking about a CDP and understanding the buying habits of the consumer, that kind of thing?

SPEAKER_02:

Yes and no, because a CDP is indeed sometimes a place where you can store customer data, but I'm not talking only about customer data. If you really want LLMs to work well, you want customer data, you want product data, maybe you want marketing data. And the CDP is not always the right place to store all those kinds of data. So you need a place like a data lake or a data warehouse, or however you want to call it, where you can connect all these different data points on the consumers, on the products, on the marketing channels, on whatever, into one single source that you can deploy your LLMs on.
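As an illustration of that single source, here is a minimal sketch (not a specific valantic architecture) of consolidating customer, product, and marketing data into one table that downstream LLM or analytics tooling can read. The file names and columns are hypothetical.

    # Hypothetical consolidation step: join customer, order, product, and
    # marketing data into one table and persist it as the single source.
    import pandas as pd

    customers = pd.read_csv("crm_customers.csv")      # customer_id, email, segment
    orders    = pd.read_csv("shop_orders.csv")        # order_id, customer_id, sku, revenue
    products  = pd.read_csv("pim_products.csv")       # sku, category, title
    campaigns = pd.read_csv("marketing_touches.csv")  # customer_id, channel, last_touch

    # Join everything onto a single order-line grain.
    unified = (orders
               .merge(customers, on="customer_id", how="left")
               .merge(products, on="sku", how="left")
               .merge(campaigns, on="customer_id", how="left"))

    # Persist the single source that LLM-based tools (search, agents, analytics) read from.
    unified.to_parquet("data_lake/unified_commerce.parquet", index=False)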

SPEAKER_01:

And that goes back to what you were saying before about where the demand is for resources: data engineers, data architects, that kind of thing.

SPEAKER_02:

Yes, there is a lot of change happening. A lot of these roles were always important, but most of the time they were the ones sitting in the IT department. Now data engineers are needed not only in IT, but also in the more client-facing teams: in customer service, in marketing. So we see those capabilities, data engineering and even data science, moving more and more towards the teams that are customer-facing instead of sitting only in the IT department.

SPEAKER_01:

And I'd be interested to get your opinion on this, actually. This is not commerce-specific, but I listened to a podcast about Elon Musk and Grok, and the next iteration is going to be trained on synthetic data only.

SPEAKER_02:

Yes.

SPEAKER_01:

I struggle to get my head around that.

SPEAKER_02:

Yeah. Oh, I am fascinated by the topic. When I worked at the university, we were already dreaming of this situation, because if you're doing research, and I will translate it to commerce in a minute, the more data you have, the better you can see if something works or not. But it's really time-consuming and really difficult to get a data set with, for instance, real users. What we sometimes see is that if you are, for instance, a manufacturer and you sell certain parts that are expensive, say for trucks, you don't have hundreds of thousands of people visiting your e-commerce platform, you have a couple of hundred. Then it's really difficult to do A/B tests on conversion because the sample size is too small. But with synthetic data, you can send synthetic consumers to your platform and run an A/B test statistically as if they were real consumers. Then the question is: do these synthetic data sets look a lot like what you would get with real consumers? And now we see that companies like Elon Musk's, but also research companies, are doing both: they run an A/B test with real customers and they run it with their synthetic data sets, and it's scary how closely the results from those synthetic data sets resemble the real human interactions.
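To make the mechanics concrete, here is a toy sketch of the comparison described above; the numbers are invented for illustration. The same two-variant conversion test is run on a small real-traffic sample and on a much larger synthetic-user sample, and each is checked with a standard chi-square test.

    # Toy example: significance testing on real vs. synthetic A/B samples.
    from scipy.stats import chi2_contingency

    def ab_test(conv_a, n_a, conv_b, n_b):
        """Chi-square test on a 2x2 table of conversions vs. non-conversions."""
        table = [[conv_a, n_a - conv_a],
                 [conv_b, n_b - conv_b]]
        chi2, p, _, _ = chi2_contingency(table)
        return p, conv_a / n_a, conv_b / n_b

    # Small real sample: a few hundred visitors, hard to reach significance.
    p_real, a_real, b_real = ab_test(conv_a=18, n_a=300, conv_b=27, n_b=300)

    # Synthetic sample: as many simulated visitors as you like.
    p_syn, a_syn, b_syn = ab_test(conv_a=590, n_a=10_000, conv_b=880, n_b=10_000)

    print(f"real      A={a_real:.1%}  B={b_real:.1%}  p={p_real:.3f}")
    print(f"synthetic A={a_syn:.1%}  B={b_syn:.1%}  p={p_syn:.3f}")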

SPEAKER_01:

Interesting. So that will give the retailers an advantage whilst they still try to build up their own data sets.

SPEAKER_02:

Yeah, so the dream is that you have unlimited access to data. Right now, if you're a retailer with a decent amount of traffic and you want to A/B test your checkout, you can maybe run one, two, three, or four tests a month; with synthetic data you could run four a day if you want, because your data set is endless.

SPEAKER_01:

Yeah, unlimited.

SPEAKER_02:

Nice, unlimited, yeah.

SPEAKER_01:

Yeah, that helps. That helps. I do find it very interesting. Okay, so you're getting a client on board. What's the first workshop that you run with these clients?

SPEAKER_02:

Especially if we talk about AI, then of course we first talk a little bit about strategy, but what we want to do as soon as possible is look at their processes and see where the waste is. Because AI, as I said at the start, is really good at automating, replacing, or helping with tasks that are repetitive, that are manual, and that are relatively, well, not too complex, I would say. So we look for those spots: how are your processes in marketing or e-commerce set up, and where do we see a lot of manual work or a lot of repetition? And then we look for use cases to solve that.

SPEAKER_01:

Okay. And although compliance is generally quite boring, it's something I think you probably need to address, right? What's the biggest compliance pothole that you're seeing right now, the one you really have to be careful of and help your customers avoid?

SPEAKER_02:

Well, there are many, but we see a lot around data sharing. These new technologies are really easy for individuals to use. Everybody's using ChatGPT, for instance, and we see so many customers who think, oh, I need to analyze my purchase or returns data, and they just download a CSV file and upload it to ChatGPT, not anonymized at all, and that's a problem. So being in control of what individuals do, while simultaneously allowing individuals to experiment with these tools and make their own work better: there are so many compliance issues involved there.

SPEAKER_01:

So let's just play that out. If you've got a retailer with a huge list of all its customers, it's got some data around their buying habits, and it wants to analyze that data, so it runs it through ChatGPT. Does that data then become available to anyone else?

SPEAKER_02:

No, no, no, that's not the case, but it is prohibited by law to share and use privacy-sensitive data with whatever external system or external server without any safeguards. So it's not that if you put names and addresses into ChatGPT, and I then search, hey, what is Tim's name and address, I will find your address. In theory they are well-protected systems, but by law you cannot just put any privacy-sensitive data into large language models.

SPEAKER_01:

And does that data then need to be protected as well? Is that data set still protected if there's a data breach?

SPEAKER_02:

Yes, indeed. If there's a breach, then you have a big problem. So anonymizing those data sets is needed, and that's where it gets a little bit tricky, because if you are an individual marketer or someone working in an e-commerce department or team, you are using a large language model to speed things up. You want to do a quick analysis, not first anonymize everything and go through all these steps in between. So building a system that makes anonymizing your customer data easier allows individuals to speed up their processes and work better, faster, and more efficiently and effectively with large language models.
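For illustration, here is a minimal sketch of that kind of pre-processing step, under the assumption of a returns export with hypothetical column names; it is not a compliance guarantee, just the shape of the idea: pseudonymize direct identifiers and drop free-text fields before anything is shared with an external model.

    # Hypothetical sketch: strip/pseudonymize PII in an export before it leaves
    # your environment for an external LLM. Column names are assumptions.
    import hashlib
    import pandas as pd

    DIRECT_IDENTIFIERS = ["email", "name", "phone", "street_address"]
    FREE_TEXT = ["customer_comment"]   # free text can contain anything, so drop it

    def pseudonymize(value, salt: str = "rotate-me-per-export") -> str:
        """One-way hash so rows stay linkable for analysis but aren't readable."""
        return hashlib.sha256((salt + str(value)).encode()).hexdigest()[:12]

    df = pd.read_csv("returns_export.csv")
    df = df.drop(columns=[c for c in FREE_TEXT if c in df.columns])
    for col in DIRECT_IDENTIFIERS:
        if col in df.columns:
            df[col] = df[col].map(pseudonymize)

    # Only this reduced file would then be shared with the external model.
    df.to_csv("returns_export_anonymized.csv", index=False)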

SPEAKER_01:

And are there any clients where your advice is for them to build their own LLM or is it simply too complicated?

SPEAKER_02:

It's really complicated. What we do advise is to make variants of existing models, but if you want to build your own models, it's expensive, it's complex, and there are a lot of rules and regulations around that, especially in Europe. So if you want to build your own LLM from scratch, you have to be a really, really, really big company. For instance, on a local scale in the Netherlands, where I'm from, there's a company trying to build its own version of ChatGPT tailored towards government, so a really specific, narrow use case. I think they've now spent around 14 million euros, one-four million euros, just for the prototype. That gives you a bit of a sense of the amount of money you need to create an LLM from scratch.

SPEAKER_01:

It's substantial, clearly. And just one last question on the LLMs: clients will want to understand the total cost of ownership. If we're all using the same five or six LLMs developed by big companies, how do you manage that piece? Because surely that's very difficult to keep control of, as the outgoings are huge. We've already seen the cost of ChatGPT: version five, the proper top-end model, is $200 a month. That's going to continue, right?

SPEAKER_02:

Yeah, yeah, it is. Keeping not only compliance but also cost under control is really difficult, because a lot of people are experimenting with LLMs, and if they are using the API, even though the tokens you're using from OpenAI aren't that expensive, it adds up quite quickly. It especially adds up if you're building an agent and your agent isn't well built and just keeps calling the API over and over and over again. The way we solve this is with orchestration tooling that not only allows certain people to build components and spend tokens, but can also monitor the usage per LLM, per use case, per agent, per everything. And luckily there are not only custom solutions but also relatively standardized ones. The one we implement quite often at customers is from IBM, and it's called IBM watsonx Orchestrate. It's a long name, but watsonx Orchestrate does what its name says: it orchestrates all these LLM components, especially tailored towards agents, and gives you a really good dashboard, but it also controls roles and rights, who is allowed to do what, who is allowed to spend, and you can set budget limits, things like that.
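As a toy sketch of the cost-control idea only (orchestration suites like watsonx Orchestrate do far more than this), the snippet below tracks token spend per agent or use case and fails fast once a budget is exhausted. call_llm() is a placeholder for whatever provider client you actually use.

    # Toy budget guard: per-agent token accounting with hard limits.
    from collections import defaultdict

    class TokenBudget:
        def __init__(self, limits: dict):
            self.limits = limits              # max tokens per agent / use case
            self.spent = defaultdict(int)

        def charge(self, agent: str, tokens: int) -> None:
            if self.spent[agent] + tokens > self.limits.get(agent, 0):
                raise RuntimeError(f"Token budget exceeded for '{agent}'")
            self.spent[agent] += tokens

    def call_llm(prompt: str):
        """Placeholder: return (answer, tokens_used) from your LLM provider."""
        return "stub answer", len(prompt.split()) * 2

    budget = TokenBudget({"returns-agent": 50_000, "copy-agent": 20_000})

    def run_agent(agent: str, prompt: str) -> str:
        answer, tokens = call_llm(prompt)
        budget.charge(agent, tokens)          # raises on runaway loops
        return answer

    print(run_agent("returns-agent", "Summarize last week's return reasons."))
    print(dict(budget.spent))                 # per-agent usage for a dashboard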

SPEAKER_01:

Okay. We did kind of touch on this, and we're just going to move on to how this is impacting teams and talent. We talked about front-end devs, and we talked about data and data architecture. It'd be interesting to know what roles you're hiring for now, or think you may need to, that simply didn't exist.

SPEAKER_02:

Yeah, good question. I'm not sure there will be roles that didn't exist; we'll probably rebrand them a little bit. So if you were, for instance, an SEO consultant in online marketing, you're now a GEO consultant, or whatever consultant is needed. That means you're doing essentially the same work: making sure that your customers are being found within ChatGPT or within Perplexity or whatever solution. So it's a slightly different skill set, but not an entirely different role. We do see, just like I explained earlier, that we need way more people in engineering, data engineering, data science, and all the other data roles that are needed today, and maybe a little fewer people in front-end design, for instance, or copywriting, or other repetitive but relatively uncomplex work.

SPEAKER_01:

And you said GEO rather than SEO. What does the G stand for?

SPEAKER_02:

Generative AI.

SPEAKER_01:

Right, okay.

SPEAKER_02:

Yeah, there are different names circling around, and I think there isn't a single name yet, but we at least have called it GEO. What we help our customers with here is not only making sure that when you're being Googled, the AI Overview within Google is showing you as a brand or as a retailer, but also that you are easier to find in Perplexity, Mistral, Grok, whatever large language model is around.

SPEAKER_01:

Okay, and we talked previously about reskilling people; turning marketeers into prompt engineers was one of the things we discussed. Are there any lessons you've learned from having to go through that process? It's probably been quite a quick transition.

SPEAKER_02:

Yeah, it's a continuous learning loop. We've spent more time on learning and development than ever, and a lot of that is just the basics of AI, because if you were educated and trained as a marketeer a couple of years ago, you're not necessarily trained in the tasks that are needed for the marketeer of today. So we're training a lot in the basics like prompting, but also in how these large language models work. We provide a lot of hackathon-style training where we say, hey, here is a problem. For instance, let's take the example of the designer again: you are a designer and you need to design not five ads but thousands, for 20 different target groups, within 24 hours. Good luck, and it needs to be on brand, etc. To be able to do that as a designer, where do you start? Before these kinds of hackathons we provide learning about prompting and about how these image-generating large language models work, and during the hackathon we support them with people who do have the experience, to give them this real learning-by-doing approach. And once we have the feeling that those people, designers in this example, but it can be any role, have the right skills, then we deploy them to our customers.

SPEAKER_01:

Yeah, makes sense. So we understand the teams and the change in the dynamic of the teams. What about buy versus build: do you think that pendulum is going to swing once Microsoft and SAP start to release their toolkits? I mean, they're not just going to sit and see what happens, right?

SPEAKER_02:

No, no. I touched on before that creating these solutions is expensive. We talked about LLMs specifically, but making any good AI solution is quite expensive. So the ones who are moving fastest are the big boys: SAP, Microsoft, Salesforce, Adobe, Shopify, you name them, the ones who have the budget. So even though I think it's still necessary to train all your workers and to look at which tools can support your use cases right now, if you wait a little bit, then, if I look into my crystal ball, in two or three years from now the big vendors will have a majority of the solutions that you need as a retailer today in their standard offerings or in add-ons.

SPEAKER_01:

Okay, so if that's the assumption, how should mid-market retailers prepare for that, in your opinion?

SPEAKER_02:

Yeah, good question, because the mid-market is always in a bit of a difficult squeeze: the really large vendors are investing a lot in these solutions, but the mid-market isn't necessarily able to get the best and biggest contracts with Microsoft, SAP, Salesforce, etc. What the mid-market can do today is at least make sure they look, again, at where their processes are manual, and where they can speed up and generate more revenue. First, experiment yourself with the current solutions and tools that exist, and in the meanwhile play the waiting game a little bit, but only a little bit, because you need to invest in those use cases today. Don't wait for the Microsofts and SAPs, because your competitors will outpace you otherwise.

SPEAKER_01:

When you say play the waiting game, do you mean that as a mid-market retailer you wait to see what the adoption and the success of the products is among some of the big brands?

SPEAKER_02:

Yeah, yeah, indeed. Because what we see now is that it's a storm. We're on a boat in a storm and we don't see land yet. But slowly but surely we see these use cases, on agents, on back-end dev support, emerging through the mist. If you are a mid-market retailer and you don't have the budget to make a lot of mistakes, then you don't want to invest too much in experiments where you don't know what value will come out of them. Let Microsoft, SAP, and Google do that experiment for you. That's way better, way more efficient. And the good thing about this moment is that all these big vendors are really in a race for market share. So what we can expect is that as soon as something is possible, they will probably try to ship it. So even if you had the budget to build a lot of these solutions yourself, Microsoft or whatever company will probably be faster than you would be, and probably a little bit cheaper as well.

SPEAKER_01:

Okay. So if you are in that mid-market segment, wait and see what actually gets traction and what's working.

SPEAKER_02:

Yes, and 'wait' is only one half of it; the other half is 'see'. You really have to make sure that you read, that you spend a lot of time following what all these companies are doing, but also that you try to test their solutions, because unfortunately a lot of companies say they have the best, the biggest, the best-working AI tool on the planet, and if you actually work with them, you will see that a lot of things are not as mature as they claim to be. So try to create opportunities for yourself to test the use cases and test the tools without having to invest fully in them.

SPEAKER_01:

And if we fast forward, say, three years to 2028, in your view, what does the product page look like?

SPEAKER_02:

The product page, I think, will be highly personalized, and highly personalized can mean different levels. Three years from now, I don't think we will have a product page that is fully tailored to your needs in every aspect. But what I do think is that since AI allows us to create scalable digital assets, so scalable images, text, etc., it also finally allows us to do hyper-personalization well. Because hyper-personalization, or personalization in general, has been a promise for years, but there was always the same problem: we have a CDP that gives us all these segments and a product page that can be tailored specifically to your needs, but we don't have the content to feed all those personalizations with. Now, with the help of AI, we can finally make digital assets at scale, like images, videos, product information, even dynamic pricing. So all those components that have been promised for a couple of years by a lot of people who call it personalization or hyper-personalization, they will all be infused into the product page, I think.

SPEAKER_01:

And if we expand on that a little bit: I've been talking to clients about what we've branded 'phygital', which is a combination of physical and digital, right?

SPEAKER_02:

Yeah, yeah.

SPEAKER_01:

And one of the things I'm trying to understand is how to make that seamless experience. If you walk into a store, ideally you're recognized: they understand who you are, your buying habits, why you might be there, age, gender, all those things. But no one wants to keep downloading apps on their phone for every store, right? If you look at the electric charging points in the UK, everyone had a different app, and they realized that some people were just not doing it. So how are you going to create that seamless experience when people walk into a store without hitting that particular problem?

SPEAKER_02:

Interestingly, it will always be a little bit of a problem, because you are waiting for the consumer to give you the information to be recognized; there are no systems that do it automatically, so there must be a reason for the consumer to say, hey, it's me, and I'm the same person you've seen online, or vice versa. On the lowest, easiest level, you can do things like using email addresses to send out invoices in store. Then you can connect what's been bought in store to what's happening online. That's the easy stuff. What we're now trying to do with some customers, experimenting together with a partner of ours who is really specialized in physical store environments and making them digital, is really narrowing the gap between online and offline. One of the use cases, for instance, is shopping for trousers. I'm not sure how often you do that, but I think it's not the most fun thing to do. Online you see the trousers, but you don't see the fabric, the fit, etc. So you want to do it in store, but in store it's always busy, and the employees are either not available or too available. So what we now have is a prototype: a place where you can type in your email address and see which products you have liked in your digital environment. For a product you've liked, you can go to a shelf, a light will blink, and you can see where the product is located in your size and the color you've selected. Then you can go to the dressing room, because the dressing room is also connected in this prototype; it's actually built, it's real, you can really test it. So you walk with the trousers to the dressing room, and then you think, well, maybe it's not the color I want. In the dressing room you see your trousers on a screen, push a button, and say, I want it in black. One of the people in the store gets a message and brings you the black one. Then you say, well, I need a size bigger; the bigger size isn't available, but within that screen in the fitting room you can have it delivered to your home within minutes. This elevated experience and all the systems around it, that's what I see as phygital. And with the solutions of today and the ability to connect online with offline, yeah, this will be the future, I think.

SPEAKER_01:

Okay, so there's a long way to go, then, for the physical retailers right now. Yeah. And there are a lot of decisions to make for the online-only retailers right now as well.

SPEAKER_02:

Yes, indeed, for the online-only ones as well, because if you're online only, your business model is a little less complex, of course, but you also have less to offer your consumer, depending on what kind of business you're in and what kind of products you have. In general, we see that companies who have both a physical channel and a digital channel perform better, and also that pure players who at a certain point say, hey, we're going to open our flagship store, see that it really helps with the growth of their brand, but also with the performance of their online channel.

SPEAKER_01:

Yeah, definitely. Gymshark is a good, obvious example of that.

SPEAKER_02:

Yeah, they just opened a store in Amsterdam, I think last month, and I'm sure it will definitely lead to better performance online as well.

SPEAKER_01:

Yeah, agreed. And one of the problems that retailers face is checkout drop-off. So do you think AI agents are going to help solve this problem?

SPEAKER_02:

Yes, because agents will know you as a customer. If you're using ChatGPT, and that's not an agent, of course, or whatever model, you sometimes have the feeling that this chatbot already knows you. That means it also knows how you want to buy. Do you want to compare the price two or three more times, or do you just say, hey, I want this product, you have access to my bank account, just make sure these trousers are sent to my home? So talking to an agent, literally talking or typing or whatever form of communication it will be, will allow for more tailored checkout experiences, and therefore, I think, for less drop-off, or more conversion, however you want to call it.

SPEAKER_01:

Interesting. It's a problem we hear our clients talk about fairly regularly. What, in your view, are the top roadmap must-haves for the next 12 months? I appreciate the pace of change is very fast, but what's your view on that?

SPEAKER_02:

Yeah, the pace is really fast. Top roadmap must-have: well, maybe a boring answer, but start with your data, because everything gets easier after that. At the top of the roadmap, look at all the systems you have, look at where you are storing your data, and make sure you are capable of centralizing the data and then feeding it to whatever system you want. I think most retailers, the big ones but also the small ones, are lacking capabilities there, even though they often think they have it right.

SPEAKER_01:

Okay, so it's data, data, data; really focus on that.

SPEAKER_02:

Yeah, it's a boring answer, I know, but it is the most important answer. And, well, I think we're both in marketing, and as marketers, well, this podcast is about AI, so we like shiny stuff. We like all the sparkles, but we can only provide those sparkles once the basis is right. It's really fun to talk about all these AI solutions, but they only become valuable if the foundation of your data, and your processes for collecting, storing, and sending data out to other systems, is right. Then you create value.

SPEAKER_01:

And do you think that focus on data applies to software vendors as well as retailers and brands? Or is it a different dynamic?

SPEAKER_02:

Yeah, I think there is a slightly different dynamic, and I'm not sure it really depends on whether you're a brand or a retailer. I think it has more to do with the complexity of your company and the complexity of your market. If you are a pure player with a brand that has a single product, and you don't have a manufacturing process because it's just buying and selling, then it's a different story than if you are an insurance company with a lot of rules and regulations. So I think it's mainly about the complexity of your company and the complexity of your market.

SPEAKER_01:

Okay. And as agencies or systems integrators, where can you still add margin once the platforms bake in AI?

SPEAKER_02:

Yeah, this is a big struggle, of course. If you are automating things or scaling things with AI, the first thing you notice is that our business model, which is based on day rates, doesn't work anymore. At first we thought, okay, what are we going to do? Because we want to invest in AI solutions, not only internally to make our processes efficient, but also externally. But it's not fair if we have sold something and said it will cost you, I don't know, ten days to build, and then we use AI and it only costs us three days and we don't tell the customer; that's not a fair way of doing business together. So you have to communicate honestly that it took three days. But luckily, what we see is that companies don't say, hey, you asked for ten days and you're now only doing three, give me my money back. No, they say, okay, nice, we only spent three days, so we have seven days left to do even more, because the work is endless. So at first we were a bit afraid for our business lines, but now we see that almost every client wants to do more instead of cutting budgets.

SPEAKER_01:

Yeah, that makes sense. And inference is the new cloud bill. What numbers are you seeing in the work you're doing?

SPEAKER_02:

Yeah, I think that, especially on AI, those bills are increasing, of course. Costs are getting higher and higher, both on traditional cloud solutions and on using AI models, but the use of these clouds and of AI models in general is not too expensive yet. The hyperscalers make sure it's still attractive. But as demand keeps going up, you can imagine that prices will sooner or later go up too, especially if you have a vendor lock-in; if you are locked in with AWS, for instance, then you have some risk of those costs rising.

SPEAKER_01:

And what are you saying to clients about that? What levers are available to keep the cost at a sensible level?

SPEAKER_02:

Yeah, we always make a business case up front, so we always try to settle that at the start, because once you have started, the levers you have are limited. Of course, if you're building agents or using the OpenAI APIs, you can make sure they work efficiently. But the first step is: okay, what is my budget, what do I want to do, really simple again, and how can I make sure, from the design up front, that the business case works and I don't get surprised by costs afterwards? Because unfortunately we see a lot of retailers being lured in by vendors with solutions. They get discounted rates on cloud, discounted rates on other license fees, and after a few years all those licenses get increased. They cannot get rid of their current systems, they cannot get rid of the tools they're using for their processes, they cannot get rid of their cloud environment, and suddenly their whole margin has evaporated. Calculating up front with the real numbers is the most important part.

SPEAKER_01:

But that's not an unusual model, is it? We've seen it over the last couple of decades. You get a client bought into a product at a low price point, and then once they're hooked and it's embedded across their business, the price slowly increases, right?

SPEAKER_02:

Yeah, indeed. And I think that's one of the reasons why open source, well, it's still around, but it used to be used a lot more. Now it's more software-as-a-service type platforms, and there the risk is higher of getting into a situation where the cost structure just rises to a level where you cannot afford your build anymore. So open source is one of the directions and solutions for that, but the counter-movement from the vendors is that they're all moving more and more away from open-source systems.

SPEAKER_01:

But presumably, as we saw with SaaS models, there will be more and more competitors? Well, actually, we've talked about the cost of building an LLM. Is that less likely because of how complicated and costly it is?

SPEAKER_02:

No, I don't think so, because in the LLM world too there are just a few players fighting for dominance, and in whatever market where parties are fighting for dominance, some will win and some will lose. If you can only name, well, probably five relevant LLM providers, they will become expensive, especially because the cost of running an LLM is also insanely high. I'm probably, just like you, reading these stories, not only about how much it costs to run the servers for these LLMs, but also to hire the right engineers. We're talking about sums that are normally paid between football clubs; now engineers and scientists are being bought from one company by another, and those costs need to come from somewhere, not only from individual consumers, but also from the companies that want to build solutions with AI-infused tooling or features.

SPEAKER_01:

Yeah, absolutely. What's the most overhyped AI buzzword, in your view, at the moment?

SPEAKER_02:

Agentic AI, AI agents. You have automations, where you automate a process making use of an LLM, and you have agents, which are really assistants that use different large language models to give you the best answer for the task you want done. Just automating a process with a little bit of AI in it is not an agent.
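As a toy contrast to illustrate the distinction Sander draws here, the sketch below shows a fixed "automation" pipeline with one LLM step next to a minimal "agent" loop in which the model chooses the next tool. call_llm() is a placeholder for any model API; the point is the control flow, not the provider.

    # Toy contrast: automation (fixed pipeline) vs. agent (model-driven loop).

    def call_llm(prompt: str) -> str:
        """Placeholder for a real model call; pretend it returns a chosen action."""
        return "summarize"

    # Automation: a fixed pipeline that happens to use an LLM for one step.
    def automation(ticket_text: str) -> str:
        summary = call_llm(f"Summarize this ticket: {ticket_text}")
        return f"Logged summary: {summary}"

    # Agent: the model decides, step by step, which tool to use until it's done.
    TOOLS = {
        "summarize": lambda t: f"summary of: {t}",
        "refund":    lambda t: "refund issued",
    }

    def agent(ticket_text: str, max_steps: int = 3) -> str:
        context = ticket_text
        for _ in range(max_steps):
            action = call_llm(f"Given: {context}\nPick one of {list(TOOLS)} or say 'done'.")
            if action not in TOOLS:
                break
            context = TOOLS[action](context)
        return context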

SPEAKER_01:

Right, okay. But they're being passed off as agents. Yeah.

SPEAKER_02:

Yeah, because, well, first, everything that you label AI makes a lot of money, and everything that you label agentic AI makes even more money, because marketers like shiny stuff and they will buy it.

SPEAKER_01:

Yeah. What would you say is the most underrated metric that e-commerce leaders should be tracking?

SPEAKER_02:

Time to value. We're in a time of experimenting with new technologies, and that's really great. But I see use cases being pursued for weeks, months, sometimes even years, with a lot of time and money going into them, without actually measuring the time from the first experiment, especially in the world of AI, to actual value, and without making sure that you get better at that or have a cut-off point. So time to value is the most underrated metric regarding AI.

SPEAKER_01:

Interesting. And I imagine you do quite a bit of research in your own time to stay as on top of AI as possible. Is there a particular non-tech book which has shaped your view of AI, particularly on use cases and where it might take us?

SPEAKER_02:

There are a lot of books that have helped me along the way, but I think the book I most often go back to is actually a Dutch one. I'm not sure if there's an English version, but ChatGPT can probably help you there; the title translates as 'Strategy = Execution'. I think that also has a little bit to do with time to value. People always want to make things perfect, and their own perfectionism stands in the way of true innovation and true value. This is a book with really simple models and frameworks, including models that were already made in the eighties by strategists, innovation scientists, etc., but it works really well. So if I get stuck somewhere, I grab that book and it helps me get further.

SPEAKER_01:

Well, thank you very much for taking the time to join us on the podcast. I personally have learned a lot, and I'm pretty sure the audience will as well. We really appreciate your time, Sander. Thank you.

SPEAKER_02:

Yeah, thank you, and you're welcome. See you soon.