MetaDAMA - Data Management in the Nordics

4#8 - Shuang Wu - Service Platform: From Analytics to AI-Driven Success (Eng)

Shuang Wu, Lead Data Engineer at Mesta. Season 4, Episode 8


Duration: 41:11

«We want to make data actionable.»

Join us for an engaging conversation with Shuang Wu, Mesta's lead data engineer. We delve into the concept of platforms and explore how they empower autonomous delivery teams, making data-driven decisions a central part of their strategy.

Shuang discusses the intricate process of evolving from a mere data platform to a comprehensive service platform, especially within organizations that aren't IT-centric. Her insights emphasize a lean, agile approach to prioritize use cases, focusing on quick iterations and prototypes that foster self-service and data democratization. We explore the potential shift towards a decentralized data structure where domain teams leverage data more effectively, driving operational changes and tangible business value in their pursuit of efficiency and impact.

My key learnings:

  • It’s not just about gaining insights, but also about harmonizing and understanding data in context.
  • Find your SMEs and involve them closely - you need inside knowledge about the data, paired with engineering capabilities.
  • Over time the SMEs and the central data team share experiences and knowledge. This creates a productive ground for working together. 
  • The more understanding business users gain on data, the more they want to build themselves.
  • Central team delivers core data assets in a robust and stable manner. Business teams can build on that.

The Data

  • You can integrate and combine internal data with external sources (like weather data, or road network data) to create valuable insights.
  • Utilizing external data can save you effort, since it is often structured and API-ready.
  • Don't over-engineer solutions - find out what your user requirements are and provide data that matches those requirements, not more.
  • Use an agile approach to prioritize use cases together with your business users.
  • Ensure you have a clear picture of potential value, but also investment and cost.
  • Work in short iterations, to provide value quickly and constantly.
  • Understand your platform's constraints and limitations, also related to quality.
  • Find your WHY! Why am I doing the work and what does that mean when it comes to prioritization?
  • What is the value, impact and effort needed?
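The value/impact/effort prioritization described above can be sketched as a simple scoring matrix. The weights, scales, and example use cases below are illustrative assumptions, not Mesta's actual model:

```python
# Hypothetical use-case prioritization: higher value and impact raise
# priority, higher effort lowers it. All scores (1-5) are invented.
use_cases = [
    {"name": "winter production overview", "value": 5, "impact": 4, "effort": 2},
    {"name": "real-time sensor ingestion", "value": 3, "impact": 3, "effort": 5},
    {"name": "salt dosage analysis",       "value": 4, "impact": 4, "effort": 3},
]

def score(uc):
    """Simple two-against-one ratio; any monotone formula would do."""
    return uc["value"] * uc["impact"] / uc["effort"]

for uc in sorted(use_cases, key=score, reverse=True):
    print(f"{uc['name']}: {score(uc):.1f}")
```

With these invented scores, the "low-hanging fruit" (high value, low effort) sorts to the top, which is exactly the prioritization behaviour described in the episode.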


Service Platform:

  • Is about offering self-service functionality.
  • Due to the size of Mesta it made sense to take ownership for many data products centrally, closely aligned with the platform.
  • Build it as a foundation that can give rise to different digitalization initiatives.
  • If you want to make data actionable, it needs to be discoverable first.
  • A modular approach to the data platform allows you to scale up required functionality when needed, but also to scale to zero if not.
  • Verify requirements as early as you can.


Working with business use cases

  • Visibility and discoverability of data stays a top priority.
  • Make data and AI literacy use-case-based, hands-on programs.
  • You need to understand constraints when selecting and working with a business use case.
  • Start with a time-bound requirements-analysis process that also analyses constraints within the data.
  • Once data is gathered and available on the platform, business case validity is much easier to verify.
  • Gather the most relevant data first, and then see how you can utilize it further once it is structured accordingly.
  • Quite often ideas originate in the business, and then the central data team is validating if the data can support the use case.


From Data to Service Platform

Speaker 1

This is MetaDAMA, a holistic view on data management in the Nordics. Welcome, my name is Winfried, and thanks for joining me for this episode of MetaDAMA. Our vision is to promote data management as a profession in the Nordics and show the competencies that we have, and that is the reason I invite Nordic experts in data and information management for a talk. Welcome back to MetaDAMA, and today we have a really interesting guest. We have Shuang Wu with us, and Shuang works at Mesta, which is a quite interesting company. When we talk about a lot of applications of AI and data, it oftentimes becomes a bit abstract, right? So by talking with a company like Mesta, which does road service and construction, hands-on work that we see in everyday life, and especially now when winter comes along and snow falls, these are the people that keep the roads open, that keep things going, and they use data and AI to do so. That's why I think it's really interesting to have the chance to talk with Shuang about it, and we're going to approach the topic from a bit of a different perspective. We're going to talk about, as we titled it, from data to service platform. So what is really the fundamental element of building a data and AI service in an organization? And that is the platform.

Speaker 1

Now, I have to say, the term platform has been used in a lot of different ways, and there is no really clear definition of what a platform entails, so that makes it a bit tough. I traditionally kept really close to Evan Bottcher's definition. He wrote it in a post on Martin Fowler's blog, and I think the post is called "What I Talk About When I Talk About Platforms". He talks about the digital platform on a fairly general level, but he said: a digital platform is a foundation of self-service APIs, tools, services, knowledge and support which are arranged as a compelling internal product. That means that autonomous delivery teams can make use of the platform to deliver product features at a high pace with reduced coordination. And when I hear the words autonomous delivery teams, high pace, reduced coordination, I immediately think of Data Mesh, distributed teams working in a distributed landscape, and we're going to talk a bit about that as well today. But before that, let's talk a bit about Shuang, who is my guest today. Welcome.

Speaker 2

Hi! It gives me pleasure to be here to share my experience and perspectives with you, and this is a great start. But I'll just introduce myself first. I'm Shuang Wu. Currently I'm working as a lead data engineer in Mesta, and I have about 11 years of experience in the data and AI field. I have a software and data engineering background, and I also worked more as a data scientist or machine learning engineer for a period. Now I'm leading the development of the data platform in Mesta, and we're also in the phase of expanding the data platform to a bigger scope, for example a service platform that's not limited to data analytics or insight generation, but also covers automation, applications and AI opportunities as well. So, yes, this is my current situation and my background. Very excited to be here.

Speaker 1

Fantastic to have you, and that's one thing that I found particularly interesting when you introduced yourself and talked about your background. You have quite a varied background in data, which is really interesting, because you can see the problems, issues and challenges we have from a lot of different perspectives, and I think this is very valuable, not just for the work we're doing on a daily basis, but especially for the podcast and the topic we have chosen. But before we dive deeper into the topic: what do you do when you don't do data? What are you doing in your free time?

Speaker 2

Yes, of course I like to relax, and we just moved to a house last year, so that is something new to me. I need to learn how to tend the garden, so I can do some work there. Also, for example, good food, maybe eating out or cooking, and singing and dancing sometimes as well, to make me feel relaxed.

Speaker 1

That is fantastic and congratulations on the house. Thank you. So when you started out working in data in those various roles, what would you say sparked your interest for the topic?

Speaker 2

Yeah, my background: originally I studied software engineering. At that time there was not so much data analytics in school, as far as I remember, and now there's a lot more AI and data. I can be quite broad, I can work on various things. So I actually started more in software development. But when I studied, I really liked numbers and mathematics: how can I analyze numbers to figure things out and get answers? I like to be logical, to not make decisions just by guessing, but to use the actual numbers to make decisions. Then I thought, oh, data is a perfect field for me, combining engineering and mathematics, and also close to the business, because I really want to solve real cases. So basically I found this is a perfect field for me after the first year or two in my career.

Speaker 1

And I also like the combination of I want to create value on the one side, but also I want to measure what value we actually are creating, and I think that those two go really well together, so I like that. Let's talk a bit about Mesta as a company, and it's not a new company. I think it's been around for like 20 years at least.

Speaker 2

So, to be honest, I joined Mesta about three years ago, but I think it was part of the public road administration in Norway, and then it became a private entity 20 or 21 years ago, something like that, because we had our 20-year anniversary last year. It's one of the leading providers in road maintenance and construction in Norway. I cannot say the biggest, but one of the leading. We perform construction and maintenance tasks for Norwegian infrastructure: roads, tunnels, also rails, bridges, docks and some airports.

Speaker 1

So it's quite complex, really, what you are doing, and it is paramount for public transport and infrastructure in Norway to work, especially since we are living in a country where we have a lot of weather, right?

Speaker 2

Yeah, I think for a road maintenance provider, Norway is probably one of the most challenging countries in this category in the world. And yes, you touched upon it: Mesta's mission is to get people going forward safely every day on the road. It's complex as well, especially in winter times. So there's a lot of potential in that regard. We need the data, we need facts, to be able to optimize our jobs and to help our employees, our customers, and everyone in society to work better and travel better.

Speaker 1

So how do you use data and AI on that journey and to get the gateway to our main topic, the platform. Where do you see the platform is important?

Speaker 2

Yes. If we're talking about the platform in Mesta, it was actually not so long ago that we started this digitalization journey. Slightly before I started, I think it was in 2020, we got a mandate from the top leadership that we cannot just make decisions by guessing. We need to analyze what data we have and also harmonize it. We did have data in various applications, but there was no single place where you could actually get access to it in an understandable, analyzable format. So in 2020 there was an MVP: how can we make use of our data across different systems? We used six weeks to showcase that, yes, we can get all the data into Azure, make it available, and make some good dashboards in Power BI. So we could harmonize data already. After that, we built teams, of course, with external competence.

Speaker 2

In Mesta, I think there was no traditional development team as a concept. It's more like there were IT operations; there was not much of, for example, a data or software development environment. So we used a consultancy to help us. They did a great job, of course, and then they hired me and some other colleagues into the centralized team to drive it forward, and so we managed to keep going, get more data into the platform, harmonize it and make good data models.

Speaker 2

Also, we have recruited, in a way, a virtual team in the organization: they are the domain experts. They know the data, but they don't necessarily understand how data engineering works. But they would love to learn, for example, how to draw insights from a data model in Power BI. So they are of great value for us, because we can combine technology, data and business. In that sense, until now we have gathered the most important data assets into our data platform and created insights from almost all domains, and we get 250 to 300 weekly users for the core reports we created. So I think it has been a great success within a short time, and with not that many people: we have a centralized data engineering team plus virtual members from the business.

Speaker 1

Very interesting. I think there's a lot we can talk about. The first thing that popped out to me is combining data. I would imagine that a company like Mesta has a lot of internal data that you are collecting yourself, but there's also a lot of third-party data, like weather data, that you are using and utilizing. How do you combine that in a good way?

Speaker 2

Yeah, weather data is definitely important, and we also have data from external sources. For example, there is the National Road Databank, NVDB, which has quite a lot of data for the whole road network. For that data it's important to have a good data platform and pipelines in place, so we can actually utilize and reuse them. I think that's why we have platforms: you don't have to create things again and again from scratch, you can reuse quite a lot and do some customized transformation, tuning the parameters. So we have good quality data from external sources, and we have already made good data models internally. We try to adopt the star schema, and we have good quality data with unique keys, so if you add another data source, it can more or less join automatically without you needing to do extra work.
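The point about consistent keys can be illustrated with a toy join: if the internal data model and an external source share a conformed key, new sources line up without an extra mapping step. The key name and all fields below are invented for illustration, not Mesta's actual schema:

```python
# Internal production rows and external weather rows share a conformed
# key ("road_id"); with consistent keys the join is mechanical.
production = [
    {"road_id": "EV6-001", "km_plowed": 42.0},
    {"road_id": "EV6-002", "km_plowed": 17.5},
]
weather = {
    "EV6-001": {"temp_c": -4.2, "snow_cm": 12},
    "EV6-002": {"temp_c": -1.0, "snow_cm": 3},
}

# Merge each production row with the weather record for the same key.
joined = [{**row, **weather[row["road_id"]]} for row in production]
for row in joined:
    print(row)
```

Without a shared key, each new source would need its own mapping table first; that is the "extra stuff" the star schema with unique keys avoids.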

Speaker 2

But of course, as I mentioned, external data is actually not the most complex, because quite often it has a good data model already if APIs are available. Sometimes the applications we use internally are a bit more complex, because that's our own domain and it's not super standardized. The sensor data from the vehicles, for example, can be quite complex as well. We really spend a lot of effort harmonizing the huge amounts of data from there.

Speaker 1

Yeah, I think IoT data is still a challenge for many organizations, especially when you're using IoT data not just to build a historical database, but also to get real-time information or get information as quickly as possible near real-time. When you build up your data platform and maybe this is a bit of a broad question, but there are a lot of things you can optimize for or towards. What was your intent in building the platform itself and how did you optimize for the different use cases?

Data Ecosystem for Actionable Insights

Speaker 2

Yeah, great question. I think it's really important to connect technology and data from the source with the business, with business cases and business owners, and connect it to the strategy or the vision. So we don't create things just for the sake of a super platform collecting data super quickly; we ask why. For example, maybe we can collect data in near real time, but if the user only uses it every now and then, or even just once a day, what's the why behind all that effort? And there's also the cost, right? When the platform runs every minute, there's cost.

Speaker 2

Also, Mesta is not an IT company, so we have a lot of other things to fund. We cannot put all the resources into data development alone; we really have a small team, and we need to use the resources smartly. So we use a lean, agile approach. We prioritize use cases with stakeholders based on potential value, potential impact and also effort. That's probably a very normal matrix, a two- or three-factor matrix, and we prioritize accordingly: choose the most valuable that probably doesn't take that much effort, the low-hanging fruit, and of course we can climb up a bit and start with bigger value and a bit more complexity. And we always focus on short, quick iterations.

Speaker 2

We can make a prototype or MVP first, but whether this data model is good enough, or this visualization is good enough, we need to verify with the users and business owners as quickly as possible, and then we can decide how to create the pipelines. That also decides our constraints and limitations: what update frequency we need, how robust it needs to be, and the data quality constraints, what kind of tests we need to run before we push the data. So we try to focus on solving the problem. At the same time, we also set internal platform-specific goals. It could be reducing the number of errors to only 1% or something, or how much time we spend debugging, or it could be time to market. Those internal metrics help us develop quicker next time and make the platform more robust. So we always need to talk to each other, from both the development side and the business side.

Speaker 1

I like that. There's a lot of really valuable stuff right here, starting with find your why. Why are you doing it? That is really the background for any work you're doing. You need to understand why you're doing it, and then you need to prioritize accordingly, and I like your three points of value, impact and effort. They really resonate. There was one thing that we talked about a bit in the beginning: you call it not a data platform but rather a service platform, and there is a difference, I think, in the development. When I think of a service platform, the first thought that pops into my head is self-service. How can we build a foundation so everyone in the business can use data, democratize data, make it more freely usable and accessible? I don't know if that's the thought behind the service platform, but maybe you can tell us a bit more about that.

Speaker 2

Yeah, we are also trying to figure out ourselves what a service platform entails. What are the boundaries, what does it include or not include, what are the mission and goals? We're on the way to figuring that out, and we have a new partner now, so we're trying to define those things together. But I think there is one key difference between a data platform and a service platform, at least in Mesta; the same word can mean different things in different companies, or even on different occasions.

Speaker 2

Originally we focused on data analytics and data insights: make data available so people can, for example, make dashboards themselves, while we maintain the core reports, because Mesta is not so big in data. Sometimes it's really difficult to find a real data mesh way where you find data owners and report builders across the organization, so we took a lot of ownership ourselves. The data platform in Mesta actually includes both the infrastructure and the self-service portal, plus the products and services we have delivered from the centralized team. So it's a bit vague sometimes: does the platform include the services and products on top or not? Currently I feel we include them, since we have quite a centralized team, and we may shift to something a bit more decentralized in the future. If Mesta gets more responsibility and competence, they can build multiple domain teams with data competence.

Speaker 2

I think then a service platform will make that easier. For example, we can build APIs and make data available, or we can build a good service product or data catalog, so people can explore the data themselves and build products really easily. I think that's the dream scenario; we don't have that yet. So on the transition, we probably need to build a centralized service platform first. We still have everything, or at least the possibility to, for example, make data or insights available. But if no one needs it yet, if we don't have those domain teams yet, maybe we can design the architecture, and then, when the use case comes, we know we can just enable it.

Speaker 2

The other difference, I think, is that we can include a lot of automation, master data integration, maybe build applications on top, and I'm not so sure whether a data platform should include that, or already does. That's vague, of course. But maybe in the future it will be a digital platform, because in my head all the digitalization, no matter whether it's data, insights, automation or integration, is part of the big digitalization platform. So probably that's what we're targeting.

Speaker 2

So we have the data foundation already, and that's really important, and from there we can build, for example, master data integration. We have the master data objects and just need to build the integration layer on top, between different systems, and then we have good-quality, consistent, valid data across systems. Or we can automate processes easily: from the data we can create alerts or rules, and then we also have the possibility to send API calls or trigger other things. So it's like a huge portfolio of digitalization opportunities and pipelines coming together in a big ecosystem. I feel that if we have reports but people don't use them, or they don't actually cause any change in work or tangible value, it's probably still not that valuable. So that's what we're targeting: we want to make data actionable.

Speaker 1

I think that's important, right? You want to make data actionable and you want tangible results. And there's something in what you said about the data ecosystem and the possibilities of building on top of a foundation. I think this is valuable, because it's something we have gained through the whole discussion in recent years about the modern data stack: the flexibility, right? You get applications that are tailored to a specialized need, that you can use when you need them, but you can also scale to zero if you don't. And I think that gives flexibility, especially for small and mid-size companies that don't have the money or intent to build up an entire platform from scratch. You can plug and play, really, and I think that's definitely an advantage.

Speaker 1

At the same time, I see there is a bit of a retraction from that modern data stack because it gets confusing; there are too many players on the market and too many options out there.

Speaker 1

It's hard to set up your data ecosystem if you don't have a team with the internal knowledge base and people that actually understand how to set up that ecosystem. So for a company that doesn't have that internal team, or the centralized team that you're talking about, it's hard to make the right choices. And with that said, we talked a bit about data mesh. This is something that is a bit confused sometimes: there is a federation of responsibility to domains, and there's also a federation of governance, but the data platform itself is really domain-agnostic, right? This is a foundation that should serve across domains, that could even serve multiple purposes. And I think the way you have set it up was starting with that foundation, making it possible for teams to emerge at a later stage once their competency or maturity gets to a certain level. I think that's really smart. So the big question here is: when you provide that foundation, how does the centralized team, your team, work towards the different domains?

Speaker 2

Yes, great summary of the transition. It's never so straightforward from the beginning, I think, but we have tried to create this virtual team. We have the core members, the centralized team members, both internal and external; that's more like a data engineering and analytics team with knowledge in, for example, Azure or Power BI. Plus the business owners, and if we have use cases, we will try to define who is the business owner, who is the data owner, who is the product owner, and who are the core users. We involved some of them already in this virtual team, and some are brought in based on the case. We try to involve them together from the beginning, understand their problems, have some discussions, and then design.

Speaker 2

We have learned it's probably smarter to not just start building. First, figure out a bit of the why, and also prioritize deliveries. You probably cannot start everything at once, but you can ask the owners to prioritize what's most important, and we can evaluate whether that's possible and also make a prototype or some drawings, a draft: is this the ideal solution? Maybe a few rounds of that. Once we know what the ideal solution is, it's easier to design the data pipeline and data model. Otherwise, you probably spend a lot of time designing data models and pipelines that don't fit.

Speaker 2

So we try to have this mindset of verifying as early as we can. And after a while, this centralized team also gets to know the data more, we understand the business more and more, and the domain experts also understand the data a bit more, how we work and what the opportunities are. Not everyone knows that we have already gathered so much data; we should make the data catalog or product catalog better, so that everyone knows which data we already have. But after a few rounds they understand more, and next time they want to build things themselves.

Speaker 2

I think ideally we don't have to build all the products ourselves. It's like data mesh: we deliver the core data assets in a stable, robust, good-quality way at the right time. We take ownership up to a certain level, and then the business should take over and build whatever they want. This boundary will probably overlap a bit in the beginning, and then we see how to tune it. I think we are on the way to defining this now: we can think of this service platform team as an internal provider, where we define what we deliver, what the SLA is, and, if they have orders, how we should handle them and who should pay for it. So yeah, we will try to define the procedures there.

Speaker 1

Very interesting. What I really like is that you talked a bit about gaining competencies and building real data and business literacy, and that you chose a really hands-on approach to it: make it use-case-based, work through a use case, and build that competency around it. I like that. So there's one thing that came to mind when you're working with the use cases. You know what the user wants and expects from you, but it's more or less up to you to see certain constraints, let's say data availability constraints, or constraints on keeping the integrity of the data, that are really data management problems, and you wouldn't expect the user to know about them. How do you work with those constraints in the process of verifying the use case?

Utilizing Data for Efficiency and Impact

Speaker 2

Yeah, I think that's a bit like product development. Not everything is clear from the beginning, but we try to do a proof of concept first, usually time-bound, could be a couple of days or a week, and just verify: do we have the data, is it of good quality, do we have the technology, and is the performance going to be good? I think we do have the competence to verify that in a short time, and if a similar use case comes next time, we can probably answer directly, because we have done it once already. With the platform it's much easier, because all the data is already available in a good format. Then you can do some SQL, some Databricks notebooks, which makes it much easier to verify, and then we need to show it to the user in a form they understand.

Speaker 2

So not raw SQL; it could be a dashboard, could be some PowerPoints, or we simply say: we don't have the data for this. So we do a proof of concept, verify, and then communicate. It could also be discovering what's available: originally they look for A, but maybe we figure out that we have B, which could be even better or a good alternative. So sometimes it's a top-down and bottom-up approach, trying to find the sweet spot between what's available and what's valuable.

Speaker 1

So I think we should talk a bit about some concrete use cases so it gets a bit more tangible for people. I read the book Invisible Women years back, which is really a book about data bias. The author, Caroline Criado Perez, talks a lot in the book about different use cases where you can see that the world is really designed for men, and there's a bias against women. She had one example that I found really interesting, and there's a connection to Mesta, it's coming: she talked about snow-clearing as a gender-biased activity, and I never thought about it that way.

Speaker 1

But there was an experiment she talks about in Sweden, where they started clearing sidewalks before they cleared the roads, and that gave an enormous increase in pedestrian safety. I think up to almost 80% of pedestrian injuries occur during winter when there's snow, and most of those injured pedestrians are women. So by clearing the pathways before clearing the roads, they really decreased the number of people that got injured, that fell, that broke a leg, and with that, roads got safer. It really had an impact on everybody's life: using data to see what we can change, what we can tweak in the way we do certain processes to get a better result, more safety, for example. And since we are talking about snow-clearing and Mesta, I think you probably have a couple of good examples of practical AI and data initiatives in Mesta that are supported by your service platform.

Speaker 2

Yes, of course. I think production, and winter production especially, is a really big part of the Norwegian road network and of course also of Mesta, both in terms of how many people work on it, how much investment goes into it, and how much cost and income it generates. More than 50% of the data we process comes from the production system. So first we just try to gather data, maybe before we know exactly what a good use case or action could be. We analyze what APIs they have and gather the most important production data first in a star schema format, using a lakehouse architecture with, for example, bronze, silver and gold layers. We have, say, the silver layer available, and then, based on the use cases, we make gold tables where we aggregate or filter differently. One example could be the following.
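As a rough illustration of the bronze/silver/gold layering described here, a minimal sketch in Python. All table names, fields and values are hypothetical, not Mesta's actual schema:

```python
# Illustrative sketch of bronze/silver/gold layering.
# All record shapes and values are invented for illustration.

# "Bronze": raw records as they might arrive from a production API.
bronze_orders = [
    {"order_id": "1", "status": "OPEN ", "km": "12.5"},
    {"order_id": "2", "status": "approved", "km": "8.0"},
    {"order_id": "3", "status": "Approved", "km": "bad"},
]

def to_silver(raw):
    """'Silver': cleaned, typed records; unparseable rows are dropped."""
    silver = []
    for row in raw:
        try:
            silver.append({
                "order_id": row["order_id"],
                "status": row["status"].strip().lower(),
                "km": float(row["km"]),
            })
        except ValueError:
            continue  # skip rows with malformed numbers
    return silver

def to_gold(silver):
    """'Gold': a use-case-specific aggregate, e.g. open vs. approved orders."""
    gold = {"open": 0, "approved": 0, "total_km": 0.0}
    for row in silver:
        if row["status"] in gold:
            gold[row["status"]] += 1
        gold["total_km"] += row["km"]
    return gold

print(to_gold(to_silver(bronze_orders)))
# → {'open': 1, 'approved': 1, 'total_km': 20.5}
```

The point is the separation of concerns: the silver step only cleans and types, while each gold table encodes one use case's aggregation.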

Speaker 2

Anyway, we need to have overviews. Usually people simply like to have some overviews, and that's actually not so easy in the source system, because the system is designed for actual production, not for aggregating things. So, simply: how many orders do we have? How many have been approved, how many are still open? If we minimize the number of open orders, or trips as we call them, by approving them, we can probably improve the money flow. So a simple KPI can help us get better cash flow or efficiency. Also, with better data, we can see for a trip in winter what the actual production amount is: how many kilometers of plowing we did, how much salt we used, and what the parameters like the dosage on the vehicle were, that type of very detailed data. If we aggregate it, we can also see: do we maybe use too much salt? Is this necessary? We can also track stops during a trip. Maybe some vehicles stopped too much, because we're not only using our own employees but also quite a lot of subcontractors. We pay them according to an agreement; for example, we only pay for when they are plowing, not when they're stopped, or only if they stop for under 30 minutes. But if we have the detailed views and set some business rules, then we can see whether a trip is a good trip. Maybe some trips we shouldn't approve, because the vehicle had stopped for an hour, but people had already approved them. So we can find potential lost money or excess cost. And also, if the setup means we use too much salt, that's not good for the environment, and it's not good for Mesta either.
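The stop-time business rule for subcontractor trips could be sketched roughly like this. The 30-minute threshold comes from the example above; the function names, event format and trip data are hypothetical illustrations:

```python
from datetime import datetime, timedelta

# Hypothetical rule: a continuous stop longer than 30 minutes makes a trip
# questionable to approve for payment.
MAX_STOP = timedelta(minutes=30)

def longest_stop(events):
    """events: time-ordered list of (timestamp, is_moving) tuples.
    Returns the longest continuous stopped interval."""
    longest = timedelta(0)
    stop_start = None
    for ts, moving in events:
        if not moving and stop_start is None:
            stop_start = ts           # stop begins
        elif moving and stop_start is not None:
            longest = max(longest, ts - stop_start)  # stop ends
            stop_start = None
    return longest

def should_flag(events):
    """Flag a trip for manual review instead of automatic approval."""
    return longest_stop(events) > MAX_STOP

trip = [
    (datetime(2024, 1, 5, 6, 0), True),
    (datetime(2024, 1, 5, 6, 40), False),  # vehicle stops
    (datetime(2024, 1, 5, 7, 45), True),   # resumes after 65 minutes
]
print(should_flag(trip))  # → True: the 65-minute stop exceeds the rule
```

In practice such a rule would run over GPS/telematics events in the gold layer, flagging trips before approval rather than after.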

Speaker 2

So if we can analyze the data and figure out what is not necessary, how can we optimize things and make them more effective? It's not just about Mesta's profitability, but also, I think, about the environment, because everything we do has an environmental impact. If we drive more efficiently, we will definitely also have lower emissions, for example. There are also some examples around driving habits. We have a vehicle management system that scores how people drive, because we track braking, speed and so on.

Speaker 2

So we gather that data, and we can see: are we improving ourselves? And what could the correlation be between this driving score and, potentially, profitability or environmental impact? So I think there's a lot of potential there, and we haven't really fulfilled everything yet. A lot of the potential comes from the business first: quite often they have ideas, and we try to figure out whether we can fulfill them with the data we have. And I think there's also a lot of potential in simply discovering what data we have, building some prototypes, and trying to figure out what decisions we could make.
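The driving-score correlation question could start with something as simple as Pearson's r. A hypothetical sketch with invented numbers (fuel use standing in as a proxy for cost and emissions):

```python
import math

# Invented illustration: do better driving scores go with lower fuel use?
scores = [62, 70, 75, 81, 88, 93]            # driving score per driver
fuel_l_per_100km = [42, 40, 37, 35, 33, 30]  # fuel consumption per driver

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(scores, fuel_l_per_100km)
print(round(r, 2))  # strongly negative here: higher score, lower fuel use
```

Of course, a real analysis would need far more data and controls for route, weather and vehicle; this only shows the shape of the first question.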

Speaker 1

This is really interesting, and there is a certain complexity to it that I really enjoy. It ranges from efficiency questions to route planning questions to safety questions. It's really complex, and I think it's a really interesting application of data and AI. So, for the last two minutes of our discussion: first of all, thank you so much for participating. Secondly, do you have any key takeaway or call to action?

Efficient Infrastructure Design for Organizations

Speaker 2

Yeah, thanks a lot. It's really nice to share our experience and learnings. I think there's no silver bullet for everyone, but focus on value generation and try to have good, fast iterations end-to-end, so you realize value and make sure it's what the customers or users want. Try to build something desirable with the least effort: if you just want a drive of under an hour, you don't have to build a rocket. So make it fit for purpose: build infrastructure and pipelines fit for the purpose, with just the right amount of effort and design. That can probably be advice for both big and small organizations. Sometimes big organizations have the privilege of a lot of resources, but I guess they could achieve more if they focused a bit more on value creation.

Speaker 1

Fantastic. Thank you so much.

Speaker 2

Thank you. Bye.