
App Performance Café
Café with Nolan from Twitter, on the psychology around performance
Today we're lucky to have Nolan O'Brien from Twitter with us, sharing his view on mobile app performance. Starting with why we should care about performance and the psychology behind user perception, the conversation moves from managing global user bases with a wide diversity of expectations to measuring performance and handling user frustration.
Nolan's Twitter: https://twitter.com/NolanOBrien
App Performance Café: http://performancecafe.codavel.com/
Codavel: https://www.codavel.com/
Hi, everyone. I'm [inaudible] Garcia and, as an engineer, I'm in love with two things: mobile app performance and coffee. That's why I decided to start the App Performance Café, where I bring the most knowledgeable individuals in performance to have a relaxed conversation. Today we'll have Nolan O'Brien from Twitter. We will start by discussing why we should care about mobile app performance, then go into the psychology behind user experience, then into ways to handle user frustration and how to measure all of this. I hope you enjoy it, and don't forget to follow us on the usual podcast platforms and visit performancecafe.codavel.com. Welcome to the App Performance Café. My name is Chloe and I'm very excited to announce our guest today, Nolan O'Brien from Twitter. Nolan, thank you so much
Speaker 2: much for taking the time to join us and share a little bit of your views on mobile app performance. Can you tell us a little bit about who you are, what you do, and your experience around this topic?
Speaker 3: Yeah, sure. Briefly, I'm a software developer at Twitter, as you mentioned. I've been doing software development for 15 years, and pretty much my entire career has been focused on performance, specifically around networking and on native applications. So, you know, back on Windows and Linux before there were phones, and now on phones, iOS and Android devices.
Speaker 2: Interesting. I'm very excited because, in our previous call, I noticed that you're all over mobile app performance; you touch every point of it. So I would like to start by asking: why should we care about mobile app performance, from your perspective?
Speaker 3: Yeah, I think there's a lot of ways to answer that. The easiest answer is that we should care about mobile app performance because we care about users. I think one of the first things to do is to disambiguate performance as a term. We all know on this show what it means, but whenever I bring up the topic of performance at work, among colleagues, or even outside, there's a lot of confusion about what we mean: ad or revenue performance, engagement performance, whether users are interacting or not. What we're actually talking about are the things that are barriers to a user being able to use your app. And that's why it's important. You can have the best app in the world, and I like to use the analogy that your app is like a party: the better your app's features are, the better that party is, and the more people are going to want to stay because of how compelling the features you built are. However, if you don't have application performance, people can't get in the door, and if they can't get in the door, you're just keeping people out. So get them through the door; have performance get out of the way completely so that they can just enjoy what your application provides.
Speaker 2: So it's both an enabler and a non-blocker, would you say?
Speaker 3: Yeah.
Speaker 2: How would you describe that from the user perspective? What does the user feel, let's say?
Speaker 3: The user feels a lot of things when using the app. A good example of the different areas users can experience with performance is what Google has put forward with their next billion users. They break performance down into several categories. There's connectivity: pretty much all apps run over the internet now, so the speed over the internet. There's the device, recognizing that people all over the world have different devices at different levels; you and I probably have the most recent Android or iPhone, but the majority of the world doesn't. Then there's data cost: data is not free, and for a lot of the world it's quite expensive. Then there's the battery itself: these are mobile devices, so keeping the device up and running is really important. And last, the overall experience, which encompasses all of those, but in the glued-together user experience you provide to the user, and how you can smooth out the edges where you have difficulties with performance getting in the way.
Speaker 2: We mentioned user diversity. From one perspective, we could say it's all about networks and devices. Or, for example, is it about different use cases, different perspectives from users in terms of the tolerance they have to get in the door, to get to the right content in the application?
Speaker 3: That's absolutely a good point, in that every user is different, and catering to an average is a very common mistake in how you build your apps. If you build everything around the average user you already have, you're going to miss a lot of the nuance in where people's thresholds are for everything. Some people don't mind their battery being drained, so they can handle that. And some people don't actually mind slow connectivity; they have built up a resilience to that. So being adaptive to the user's use case is a difficult challenge in an application.
Speaker 2: So how do you handle such diversity: network diversity, user diversity, and user-expectation diversity as well? How do you deal with it?
Speaker 3: I think diving into how we handle it is kind of complicated. It starts with recognizing up front that there is diversity, and then building some principles on how you start addressing it. Measuring a lot is really important, and when you measure, be very rigorous about the numbers you're using. For instance, understand that you have user cohorts all over the world with different behaviors based on what they care about. There are regions, I'm not going to name regions, but there are regions that are very tolerant of slow connectivity but very intolerant of data usage, and incorporating that into how you build things is really important. But that's the nuance of how you're actually going to start tackling it. Taking a step back, I think one of the most important parts, before even tackling performance, is the psychology of what performance is to people, and understanding that you are interacting with a human with your app. Basing your foundations on the psychology of performance, focusing on speed as an entry point and on the experience users have, can really help ground the way you design your apps and move forward with where you need to optimize.
Speaker 2: Wow. Can you share a little bit more about your view on that psychology from the user perspective? I tend to look at performance as speed and that's it; at least coming from networking, it's my instinct to look at performance that way. I'm rather curious to learn a little bit more about how you look at that psychology from the user perspective with regards to performance.
Speaker 3: Yeah, so I think that's a good grounding. When you're a networking engineer, like I am, you feel you can always make things faster, so shave off every millisecond everywhere you possibly can, right? But there's a point where there's an actual return on investment, an ROI, for where you make improvements, and there are places where making improvements has no value. So getting the user's perspective on where the latencies are hurting them and where they're not is really important. From the user's perspective, there's a psychology behind what speed and latency feel like. In 1968 there was a documentary that went through the psychology of powers of ten: as things increase by a power of ten, they move to the next bucket in a user's mind. In the 1970s that was expanded from physical objects into time, with the understanding that powers of ten translate directly into temporal effects as well. And in the 1980s and nineties there was a thorough amount of investigation into the speeds of human interaction with computers, human-computer interaction, which was a very full field of research for a long time. The principle is, effectively, that as the powers of ten of latency go up, the mental model of how something is performing changes. What's important is to identify which bucket the task at hand is supposed to fit into for the user, and to make sure you fit into that bucket. Just to go through the buckets that have been established for powers of ten when it comes to speed: anything under a hundred milliseconds, a tenth of a second, is considered interactive. Now, a tenth of a second per frame is a really poor frame rate, but we're not talking about that; we're saying that as a user interacts with something directly, if it takes less than a hundred milliseconds, it's interactive. Then something with around one second of latency or duration is responsive: it's taking time, but the user doesn't get stuck; it just takes time, and it's a responsive task, not boring. Then you get into the harder areas, which is around ten seconds: something is working, it takes time, and the user needs to know it's working. I'll talk a little more about the threshold between responsive and working, but you can tell the user has transitioned to "this isn't just something that's going to happen, I have to wait for it." And the last one is something that's a minute or longer, which is when you've got a series of tasks or a job to do, and that combination or workflow can take that amount of time. To break down some examples: interactive would be tapping on your navigation and the navigation transitioning. Responsive would be something like a quick network call completing in one second or less, texts being sent or received. Ten seconds is someone working: you're posting something, and it has enough data in it that you know it takes time, so maybe posting an individual item to the internet. And then there's the concept of a whole minute, where it's the entire workflow of sending something: you're posting something or whatnot, you're composing what you want to say, you're editing it,
you're downloading an image that you want to attach, attaching it, uploading the whole thing, and seeing it through to completion. That fits in a minute. Where we can really drive a reduction in users getting frustrated is in seeing which bucket an instance of a user experience falls into, whether that matches the bucket it is supposed to be in, and setting a threshold to make sure you get under it, without over-optimizing to drive it to zero, because you can't drive everything to zero. A great example is the things going past ten seconds that need to get under ten seconds, and the things around ten seconds, or above one second, that need to come down as much as possible. When a user sees something that has taken more than a second but has not yet taken ten seconds, there's a serious psychological effect I call dreadlock: they transition from anticipation, "this isn't responsive anymore, but when is it going to finish," to effectively dread, "why is this not done, why is this still happening." Getting either under that dreadlock, or salvaging the user experience because it's going to take long and reinforcing to the user that it will take time, is really important. So as a principle, trying to get things that are supposed to be responsive under three seconds in the majority case, and under one second in the target case, is really important. And for things that can't reach that, you really have to build out the fundamentals of communicating progress to the user visually.
Speaker 2: What would you say about the gray area between one second and ten seconds? Is it a step function, in the sense that there is a point after which the user really gets frustrated, is it linear, or is it more gradual?
Speaker 3: For roughly the first three seconds the user is still expecting something responsive, and it just wasn't fast enough; after that they snap to the frustration point of "this is not a responsive thing anymore, it's now work." If you're not providing context to the user, particularly showing progress of where you are in the process of delivering the task at hand, they have immediately entered frustration, and it's a growing level of frustration. Like you said, every user is different, so whether that growth is linear, hyperbolic, or logarithmic really depends on the person. But it's fairly well understood that once you leave the expectation of it being responsive, and go into the understanding that what was supposed to be responsive is now work, frustration grows over time, and it will not stop until you either save the user by finishing or provide them relief by making clear that completion is going to take time and how long it will take.
Speaker 2: How do you handle that? I'm thinking about cases where you know that, for the majority of users, you will not be able to fulfill, say, one second or three seconds. How can you handle that frustration, how do you manage it? Do you focus on optimizing the delivery of every piece of content in that specific action, or can you do something else?
Speaker 3: There are a number of routes. The important thing is that, at the point where you are no longer a responsive task, you are providing the user progress.
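To make those powers-of-ten buckets concrete, here is a minimal Swift sketch. The thresholds and bucket names follow the conversation; the type and the per-bucket UI policy are illustrative assumptions, not Twitter's actual rules.

```swift
import Foundation

/// Perception buckets from the powers-of-ten model described above.
/// Thresholds follow the conversation: ~100 ms, ~1 s, ~10 s, ~1 min.
enum PerceivedLatency {
    case interactive   // < 0.1 s: feels instant, no feedback needed
    case responsive    // < 1 s: takes time, but the user is never stuck
    case working       // < 10 s: the user must know work is happening
    case job           // 10 s and up: a workflow; let it run non-blocking

    init(seconds: TimeInterval) {
        switch seconds {
        case ..<0.1:  self = .interactive
        case ..<1.0:  self = .responsive
        case ..<10.0: self = .working
        default:      self = .job
        }
    }

    /// Illustrative UI policy per bucket (an assumption, not a Twitter rule).
    var feedback: String {
        switch self {
        case .interactive: return "none"
        case .responsive:  return "lightweight activity indicator"
        case .working:     return "determinate progress with context"
        case .job:         return "non-blocking progress, allow backgrounding"
        }
    }
}

// Example: decide what to show for an action expected to take ~2.5 s.
let expected = PerceivedLatency(seconds: 2.5)
print(expected.feedback) // "determinate progress with context"
```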
Speaker 3: And this leads from the psychology of powers of ten into the psychology of progress: maximizing the user experience within that transition into the ten-second bucket of work. So let's think about it for a second. A task takes time, right? Looking back in reverse, you can say, "oh, that task took five seconds," and think of it as a linear progression of time over five seconds. So if a user were to enter into something that was going to take five seconds, and you could provide a linear progress bar, or some sort of progress indicator, that starts from nothing and progresses to completion in a linear fashion, that would be an accurate reflection to the user, and that's a good thing. But achieving linear progress is very difficult. When you use your apps on iOS or Android, a very common thing people notice these days is finish-slow progress: it starts at a regular cadence and then slows down more and more toward the end, until it's just painfully slow. Or it's very, very fast at the beginning and then you're stuck at the end, the opposite of what the user really wants. There have been psychological experiments showing how much worse that experience is. If you take the linear experience that's ten seconds, and then provide the exact same ten-second duration for the task but progress it in a slowing-down fashion, users who got the slowed-down version will overwhelmingly say it was a worse experience, even though the real elapsed time was the same. That's because of expectations. The great thing about linear progress is that the expectation is set and then met perfectly. With slowing down, the expectation is set by the velocity up front, and the human brain is very good at latching onto velocity right away, but when that velocity erodes, every bit of erosion in expectations breaks trust and hurts the user experience. So that's a bad pattern. Then there's the reverse: if you set up the opposite situation, where you start slow and the progress aggressively gets faster and faster until it completes, and that is also exactly ten seconds like the linear progress, it will be perceived by the user as a better experience. Because, once again, each revision in the user's mental model of the velocity of the progress keeps getting better, so you're breaking the expectation, but in a positive manner, and the sum of those positive reinforcements leads to a better result despite an identical amount of time across the three different progress mechanisms. Finish fast, if you can do it, is the most compelling. It's very difficult to accomplish, though, because artificially building something that speeds up progress would require a really concrete understanding of what the temporal impact of each step will be, so you can't really fake it. So what you have is this problem space of an undetermined amount of time that something will take, but where you do know the steps of the job and roughly what each will cost, whether it's the latency of downloading and then the latency of decoding, or encoding something first and then uploading it.
And you don't really know what those will be, but you can show the progress along the way faithfully, if not perfectly accurately, to give the user a good experience. And this is where you start getting into some clever solutions: continuing to present progress, and figuring out how to relay the same concept of progress to the user in a way that is much more satisfying. So the progress bar is still there; I'll move on from progress bars in a second, but there are a few ideas here. With indeterministic problems, this is why you see a lot of degrading progress bars: we think it will take five seconds, so it's linear progress toward about five seconds, but then the time elapses and it isn't completed, so you have to degrade the progress. Well, we didn't meet the goal, so we'll slow it down a little; we still didn't meet it, so we'll slow it down a little more. The problem with that degradation is that your completion lands at the exact same point your degradation finally reaches its target. If instead you front-load it, saying we know this part has an indeterminate factor, you degrade early but reserve a substantial amount of the progress for the end. Say it's a ten-second task and you degrade, but you only get to 60% by the time the ten seconds elapse, and then you take the last third of a second to one second to quickly ramp up to completion. You can actually take eleven seconds with that progress bar, and users will feel better about it than a linear progress bar that takes ten seconds, because the finish-fast experience really shapes the user's perception coming out of it. So basically the point is that you can erode the user experience as you degrade, but recovering the experience can counter that, and the faster you recover it, the faster you recoup that loss of expectation and exceed it, making the finish a good experience. We call this the principle of finish fast.
Speaker 2: So I was thinking, can you give us an example? I'm thinking about use cases where you can apply that rationale, say the installation of an app.
Speaker 3: Right, with Android or iPhone, there's the latency of downloading the application, and then there's the latency of installing the application. It's a lot of work, and the latency of the download alone is a lot now, unfortunately, because the app is such a large thing, yet it's fit into a very small visual package, particularly on iOS, where it's just a small circle. Android has a better experience because, I believe, you can pull down and see the full progress bar. It's kind of silly, but that line being long actually helps the user experience; something small and tight really reinforces that you're not making progress even when you are, because you can't display it granularly enough. So there's already a problem in the psychology of showing the progress. And then, within that, you really want to change the expectations, so that instead of your bytes per second, or the exact number of bytes downloaded versus bytes still to download, mapping one-to-one onto the progress,
and the installation, extracting everything and putting it into place, also mapping one-to-one, you take a step back and recognize there are those two steps, and that mapping them straight through, say 50/50, doesn't guarantee the user ends up feeling satisfied at completion. A better experience might be to front-load both parts: say the first 25 to 30% of the bar is downloading, the next, larger portion, 40 to 50%, is installing, and the last 20 to 30% is actually nothing, reserved. You show your best estimate of progress as it happens, front-loading the cost, so that completion still has a significant amount of the progress bar, or circle in this case, left, and then you finish fast. The user gets the satisfaction of "oh, it didn't take as long as I expected," because from how slowly that circle was filling they were expecting it to take five minutes, but in the end it really ramped up and completed, and they're grateful. That's a concrete example of how you can take advantage of this. I think a short, tight progress indicator is fine for short tasks, but once it gets longer and you're going across minutes, tiny UI chrome is the wrong tool, and that's when you really want a place the user can look at to see how the task is going. For instance, in the Twitter app you can continue navigating the app as you're posting your tweet, and we show the progress so you can keep track of it, but you're not stuck, you're not locked by that progress bar. Depending on the use case, progress can be done in many different ways.
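A rough Swift sketch of the front-loading idea described above for a two-phase install (download, then install): the early phases are deliberately over-weighted on the bar, a slice is reserved, and the bar only enters the reserved region in a quick final ramp once the work actually completes. The phase weights and type names are assumptions for illustration, not an actual App Store or Play Store algorithm.

```swift
import Foundation

/// Maps raw phase progress to displayed progress, reserving the tail of the
/// bar so completion can always "finish fast".
struct FrontLoadedProgress {
    // Displayed share of the bar per phase; deliberately over-weights the
    // early phases and leaves `reserved` untouched until the work is done.
    let downloadShare: Double = 0.45
    let installShare: Double = 0.35
    let reserved: Double = 0.20   // only filled during the final fast ramp

    /// `download` and `install` are each 0.0...1.0 estimates of raw progress.
    func displayed(download: Double, install: Double, finished: Bool) -> Double {
        let base = downloadShare * min(download, 1.0)
                 + installShare * min(install, 1.0)
        // Until the task really completes, the bar never enters the reserved
        // region, so it can only speed up later, never stall at 99%.
        return finished ? 1.0 : min(base, downloadShare + installShare)
    }
}

let progress = FrontLoadedProgress()
print(progress.displayed(download: 1.0, install: 0.5, finished: false)) // 0.625
print(progress.displayed(download: 1.0, install: 1.0, finished: false)) // 0.8
print(progress.displayed(download: 1.0, install: 1.0, finished: true))  // 1.0
```

In practice the UI would animate from the current displayed value to 1.0 over a fraction of a second when `finished` flips, which is the finish-fast moment described above.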
Speaker 2: So I was thinking about how that relates to delivering an application where you don't deliver the entire content at the same time, but you start by fulfilling the critical content as soon as possible and then bring in the extra content. How do those two sides correlate?
Speaker 3: Yeah, I'm glad you asked that. With a progress bar, you basically have nothing you can show until the whole thing is done; whether it's blocking you or not, you don't have the thing until it's completed. Progressive loading is the concept that you can bring in the content as it loads. Starting simply, I think everyone is familiar with video, where you have an amount of time you have to buffer, but then you're loading it in progressively as you continue to watch. And it's a wonderful experience, right? Maybe the quality starts a little poor, but it ramps up over time. You notice it on Netflix or Amazon Prime: it starts at pretty decent quality, and over thirty seconds or so it gets to be pretty high, because you have this progressive stream coming in and you're just experiencing it. You're no longer waiting; you don't have to wait for the full buffer to fill, because you got into it sooner, at lower quality, but sooner, and you're unblocked. Similar things can be applied to any kind of content. If you have a page you're trying to load, and it has all kinds of media, all kinds of text, all kinds of ads in it, you can progressively load that as well: take the most critical parts that you want to surface to the user and get those up, then asynchronously bring in the other pieces as you need them later on. And the final thing, which I think we've done a really great job of at Twitter, is our progressive loading of images, which a lot of companies do. The concept is that you get a low-quality image that progressively improves in quality, so when you have high latency, what used to take, say, thirty seconds on a 2G network now takes three to six seconds until you can actually see what's happening. And this is great. You're going through a feed of posts, you see a post from someone you know, and it's coming in, but you can't see what it is at all. If you have to wait the full thirty seconds to know what that image is, it's very disappointing, especially if it's not even an image you wanted anyway. But if within three to six seconds you get a good idea of what it is before it's fully sharp, like, oh, it's my friend's baby, I'll stay and watch and see other babies, or, oh, my friend is showing off their new car, I do want to see that, or it's a picture of their lunch, you know what I mean, you can keep going, you don't have to wait. You can make an informed decision as a user early, and that's empowering: empowering to start consuming what's there, mentally gauging what you're consuming and whether to disengage or not, whereas when you're locked, you don't have that. So progressive loading is an amazing feature, not just for images but for any content you can load piecewise instead of as a whole. The same applies to the posts themselves: you don't load the entire page of posts, you load the top and keep loading the next ones while you're reading, so you can go top to bottom while the rest is still loading. And this really helps if you experience it on something like a 2G connection.
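The progressive loading idea above can be sketched in a few lines of Swift: render the text of a post as soon as it arrives, then a cheap preview image, then the full-resolution media, upgrading in place. The `Post` model, the preview/full URL split, and the view model shape are hypothetical stand-ins, not Twitter's actual code.

```swift
import UIKit
import Combine

/// Hypothetical post model; stand-in for whatever your feed uses.
struct Post {
    let text: String
    let previewImageURL: URL?   // small, low-fidelity variant
    let fullImageURL: URL?      // full-resolution media
}

func fetchImage(_ url: URL) async throws -> UIImage {
    let (data, _) = try await URLSession.shared.data(from: url)
    guard let image = UIImage(data: data) else { throw URLError(.cannotDecodeContentData) }
    return image
}

@MainActor
final class PostViewModel: ObservableObject {
    @Published var text = ""
    @Published var image: UIImage?   // upgrades in place as fidelity improves

    /// Render critical content first, then progressively improve the media.
    func load(_ post: Post) async {
        // 1. Text unblocks the user immediately.
        text = post.text

        // 2. A cheap preview gives something recognizable within seconds.
        if let previewURL = post.previewImageURL,
           let preview = try? await fetchImage(previewURL) {
            image = preview
        }

        // 3. Full fidelity replaces the preview whenever it arrives.
        if let fullURL = post.fullImageURL,
           let full = try? await fetchImage(fullURL) {
            image = full
        }
    }
}
```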
Speaker 3: I think a lot of people are like me: I have a very fast wifi connection at home and the newest iPhone, so I never see this experience. But if I force myself into a 2G scenario, it's really compelling to be able to interact with something immediately, immediately as in three to six seconds, versus thirty to sixty seconds. It's a game changer.
Speaker 2: Absolutely. Are there cases, so, are you basically saying that we decrease the quality of the content, or at least of part of the information, to improve that usability from the user's perspective?
Speaker 3: To reflect and rephrase what you said: I don't think anyone decreases quality. What happens is that you start with nothing present, and you progressively increase quality and fidelity over time. So I don't like the comparison of "the final product is this quality, but you're only showing this degraded version." The comparison is nothing on the page versus something, and you're moving in the right trajectory.
Speaker 2: Are there cases where you see that it's more important to deliver rich content, heavy content from a media perspective, versus trying to optimize by going to other types of content, like text or similar?
Speaker 3: I don't know if I quite get the scenario you're putting up... oh, I see. So that's more of an adaptive experience. Instead of progressive loading, what you have there is the concept of adaptive experiences. If you're trying to load content and you have a plethora of content to select between, being smart about the content you send to the user, so they can consume as much as possible, is valuable. If I just shove video after video after video at a user in São Paulo, Brazil, that's not going to be very compelling when everything is frozen, and when the video does load it's low quality, and they can't get to the next piece of content. Whereas if you diversify: for people on high bandwidth, maybe video is the experience you can give them, maybe they're in the United States with a great connection. But if they're far enough out, they need a mixture, and you can start mixing images in with the video, or text in with the images. Maybe conditions are so bad that you serve mostly text, but the text is usable, and you have images along the way. And there are even better options. One of the issues with being adaptive and trying to serve users what you think the best experience will be is that you can't predict every single user, so some users you need to give control. This is what a lot of companies do with data saver: the user says, listen, I need content, but I'm going to cap the content I get; you can give me low-fidelity media, which is usually the heaviest stuff, and I will choose when to go from the low-quality media to the real quality media on my own. And that can be really empowering, because then you don't have to worry so much about getting the mix perfect in a way that may frustrate the user; you give them the control. And I personally believe that you can never go wrong by giving the user control of their own experience, versus trying to cater to the average of, you know, millions of users.
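A small sketch of the data-saver idea just described: the user's explicit choice wins, and otherwise an adaptive guess based on a rough throughput estimate picks the media variant to request. The settings key, the 750 kbps cutoff, and the variant URLs are illustrative assumptions, not any product's real values.

```swift
import Foundation

enum MediaFidelity { case low, high }

/// Hypothetical variants exposed by the backend for a single piece of media.
struct MediaVariants {
    let low: URL    // small, heavily compressed
    let high: URL   // full quality
}

struct MediaPolicy {
    /// User-controlled data saver; stored under an illustrative key.
    var dataSaverEnabled: Bool {
        UserDefaults.standard.bool(forKey: "settings.dataSaver")
    }

    /// Rough, app-measured throughput estimate in kilobits per second
    /// (however the app chooses to derive it; an assumption here).
    var estimatedKbps: Double = 400

    func fidelity() -> MediaFidelity {
        // The explicit user choice always wins; otherwise fall back to an
        // adaptive guess based on the measured connection.
        if dataSaverEnabled { return .low }
        return estimatedKbps < 750 ? .low : .high
    }

    func url(for media: MediaVariants) -> URL {
        fidelity() == .low ? media.low : media.high
    }
}
```

The user can still tap through to the full-quality variant on demand, which is the control being described.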
Speaker 2: So, looking at this, the challenge that I see is: how do you measure this? How do you not only measure, but actually test what you're doing with respect to tackling these issues?
Speaker 3: This is really where probably 90% of the work is: measuring and understanding the problem space. You have to measure and understand users, you have to measure and understand the technical challenges, and you have to measure and understand the changes you want to make. I can't give you specifics, but there are steps you can take. First, identify the areas where you feel there's a user experience problem, and try to measure from a user's perspective. So not looking at every request, like, hey, these requests are fast and these requests are slow; users don't care which requests are faster, they care that their experience is slow. Try to take larger, holistic measurements of the whole experience, whether it's an action like putting together a composition and sending it off in an email or a post, or something on the consumption side, navigating somewhere and wanting to consume the content. If you can capture users in those larger, end-to-end actions and measure them, you can see whether they're meeting the expectations you want for them, which is: fast. Then you can start considering: okay, we've recognized that this particular part of the user experience is not meeting users' needs. Match it up with qualitative information, surveying users and seeing where they find deficiencies, like, hey, I'm in this region and doing this particular thing is too slow for me. And then, from there, measure, measure, measure to fully understand what you're going to do to improve it. Don't just say, it's a ten-second experience here and we need it to be three seconds. Say: where is every millisecond of that duration going? Is it going into the DNS lookup, and we have to optimize the DNS? Is it going into the TCP handshake? Is it going into the compression or the transcoding side, or just the transfer of the data, and we need to compress better? Or is it something in between, and we have a bug somewhere? Really get to the point where every single millisecond is attributed for the thing you're targeting, so that you can start breaking it apart and solving it piece by piece.
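For the "where is every millisecond going" breakdown, Foundation's URLSessionTaskMetrics already splits each request into the phases mentioned above (DNS lookup, connection handshake, request, response). A minimal sketch of logging those phases per request follows; how the numbers get aggregated into user-level actions and shipped to an analytics pipeline is left out.

```swift
import Foundation

final class MetricsLogger: NSObject, URLSessionTaskDelegate {
    func urlSession(_ session: URLSession,
                    task: URLSessionTask,
                    didFinishCollecting metrics: URLSessionTaskMetrics) {
        for t in metrics.transactionMetrics {
            // Milliseconds between two optional timestamps, 0 if missing.
            func ms(_ start: Date?, _ end: Date?) -> Double {
                guard let s = start, let e = end else { return 0 }
                return e.timeIntervalSince(s) * 1000
            }
            print("DNS:      \(ms(t.domainLookupStartDate, t.domainLookupEndDate)) ms")
            print("Connect:  \(ms(t.connectStartDate, t.connectEndDate)) ms") // includes TLS
            print("Request:  \(ms(t.requestStartDate, t.requestEndDate)) ms")
            print("Response: \(ms(t.responseStartDate, t.responseEndDate)) ms")
        }
        print("Total task duration: \(metrics.taskInterval.duration * 1000) ms")
    }
}

// Attach the delegate so every request reports where its time went.
let session = URLSession(configuration: .default,
                         delegate: MetricsLogger(),
                         delegateQueue: nil)
```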
Speaker 2: From listening to that, I'm thinking this is heavy lifting, basically. Is it highly customized for every single problem you face within the app, or do you try to generalize it? Is this the next big challenge with respect to understanding performance from a user perspective?
Speaker 3: There are two sides to it, and I think one comes before the other. That rigorous, customized measurement is, I feel, where you end up once you've finished the easy stuff; that's the hard stuff, and that's where most of your time is spent. But before you even get there, there is first-principles building. So get in slow: put yourself on a slow network, build it, run it, and experience it as a normal user would. Not a user on wifi, not a user on the brand new iPhone 11 Pro, but a user on an iPhone 6, or a user that has a 2G connection and for whom data costs, you know, 20 bucks a month they just can't afford. Measure that locally and see where the large, low-hanging fruit is. To move forward, get qualitative data, whether internally at the company or externally from users, and see where the feedback says you need to start focusing. But once you pass that point of low-hanging fruit, then you really do have to come up with a rigorous measurement system. It doesn't have to be bespoke for everything; conceptually, you have the idea of an action you want to measure for the user at a high level, and you measure that for many, many actions. Then, for the ones you care about, you take them on their own and see if you can make those individual ones better where they need to be improved. Some things take ten seconds and need to take ten seconds, but some things take three seconds that should take one second; really find those things. And then, if you really want to solve those problems, it can get complicated, especially since, as you'll know, the global internet infrastructure is really complicated, and it's a miracle it even works. The fact that I'm talking to you right now is magic, because I understand exactly the BGP routing and the HTTP/2 connections this is going through, and it's freaking miraculous that it isn't taken down every single day. Dealing with that miracle of the internet, you really have to understand every single second in that action so you can start attacking it. And some things can be shared: a lot of actions will share a lot of the same pieces. Maybe you have a really big problem with the network communication to your own data centers, and you say, listen, talking to my data centers from around the world is too slow, I have to put something faster in between the user and the data center, just as a general example. There's so much you can do; you have to understand where the problem is first, and sometimes solving one thing will have a cascading effect that solves many things.
Speaker 2: Cool, interesting. Well, thank you so much; I believe this perspective was very enlightening. Would you like to leave us with a very short summary of your recommendations to everyone working on mobile app performance?
Speaker 3: I would say thanks for even listening. I'm just one guy working in this immense field. Don't be overwhelmed, because performance is really, really big and it's not possible to tackle everything simultaneously. So pick an area, start with some first principles and fundamentals, target what you want to see, and make those improvements.
I really do like considering the user perspective first, getting into the psychology of what the effect is, and trying not to bias myself toward "well, it's this amount of time and it should be that amount of time," because that's not necessarily the goal. The goal is to understand what the user needs, and sometimes it's another area you need to focus on instead. It's putting the user ahead of the engineer. One trap that's very easy to fall into is: listen, I've got this algorithm on my backend service and it's causing 50 milliseconds of latency between my backend services, and I can cut that down to 25 milliseconds, that's a 50% reduction, right? I have done this over and over again: over-optimizing a thing that does not matter. That 50 milliseconds to 25 milliseconds, a 50% improvement, doesn't amount to squat when you have multiple seconds between that content and the user. So it's not easy. Sometimes you get stuck in the pit of your perspective as an engineer, and you have to take a step back to reassess. That's really it: focus, do lots of things, and not everything will work, but that's okay, because if you care about the user, you'll do the right thing.
Speaker 1: Thank you so much, Nolan. And thank you all for listening, and see you next week. I hope you have enjoyed the conversation; Nolan had so much more to share with us. You can follow Nolan on Twitter, his handle is @NolanOBrien, and please follow us on the usual podcast platforms as well as visit performancecafe.codavel.com. See you next week.