DevXPod

Measuring and improving developer productivity w/ Abi Noda from GetDX

• Chris & Pauline • Season 2 • Episode 4

We're your hosts (Christian Weichel and Pauline Narvas) 👋

In today's episode, we're joined by Abi Noda, CEO of GetDX, to talk about how we measure developer productivity.

Note: there are issues with the audio from Chris' side! Hopefully it doesn't sound too bad; feel free to follow along using the transcript.



Abi:

What I thought was the most interesting finding from this study was that when they asked developers how they think their managers define their team's productivity, they said tasks completed as well. So there's actually a misperception by developers of how their managers define and perceive productivity, and I think that gets at the root of the challenge of trying to define developer productivity in a single way.

Chris:

Welcome! You're listening to the DevX podcast, the show where product and engineering leaders share their thoughts and insights on developer experience. I'm Christian Weichel, joined by my co-host Pauline Narvas.

Pauline:

Welcome to the podcast, Abi. Thank you for joining us today. We're really excited to have you on board. How are you doing?

Abi:

I'm doing great. Thanks so much for having me. I'm really excited to be here.

Pauline:

Yeah, I'm really excited to learn a bit about you, about your story, and everything about developer experience. So for those who don't know much about you and what you do, can you give us a brief introduction to yourself?

Abi:

Sure. Right now, I'm the founder and CEO of a company called DX. We're a SaaS product that helps technology organizations measure and improve developer experience, so of course I'm really big on the topic of developer experience. My personal background is as an engineer; I've been an engineer for over 10 years, so developer experience is personally relevant as well. I've spent the last five to seven years in this space, trying to understand and find ways to measure developer productivity and help organizations improve developer productivity and experience. I'm really excited about what your company does and to talk more about this today.

Pauline:

So you mentioned developer productivity. For those who are listening to this for the first time and don't really know what that means, could you define what developer productivity means?

Abi:

Yeah, well, that's a really funny question, because I think everyone's trying to figure out what developer productivity means. A couple of years ago, while I worked at GitHub, I gave a talk called 'My Elusive Quest to Measure Developer Productivity', and I think the key word there is elusive. Everyone's been trying to figure this out, and a lot of people have opinions. So I can share a little bit about what we know from academic research that's come out recently about developer productivity.

One thing to say is that there definitely isn't one shared, standard definition or meaning of developer productivity, and I think that's a good thing. Recently, one of my colleagues, Margaret-Anne Storey, a researcher and professor, published a paper that was a study on how developers and managers define productivity, and it was really interesting. First, they did a series of interviews with developers on how they defined productivity, and the prevailing definition revolved around their activity; it was, 'I feel productive when I complete a lot of tasks.' Then they studied how engineering managers defined productivity, and the prevailing definition for managers was actually performance and quality of outcomes. What I thought was the most interesting finding from this study was that when they asked developers how they think their managers define their team's productivity, they said tasks completed as well. So there's actually a misperception by developers of how their managers define and perceive productivity, and I think that gets at the root of the challenge of trying to define developer productivity in a single way.

Another recent paper was called SPACE; it was published by some of the same researchers at GitHub, Microsoft, and the University of Victoria. SPACE is a framework for understanding developer productivity, and it broke productivity down into five dimensions. I'm not gonna go over all of them, but they include things like satisfaction, collaboration, and being in the flow state. I think this research was a big deal because it was really the first time people came out and suggested that you should be paying attention to how developers feel as a measure of developer productivity, not just what work developers have completed.

When you look at all the research that's going on and all the discussion around developer productivity, one commonality is that most researchers and many leaders believe we need to move beyond this narrow managerial view of productivity that has revolved around how many tickets or pull requests you've completed, or assembly-line metrics like lead time. Software development is knowledge work, not factory work, and so I think the way we think about developer productivity really needs to evolve to reflect that.

Chris:

I liked it a lot when you said that managers measure things as outcomes, because so often that's mistaken for output. Just because I completed a PR doesn't mean I actually affected an outcome, you know, it could have been a zero-sum game. And it's an interesting challenge how to focus teams on outcomes rather than 'these are your tasks, these are the PRs we aim to complete this week.' Trying to embed this in the larger scope: developer experience, like you hinted at, is about the way developers feel, while developers' self-perception of productivity is about the tasks they complete, the really operational things. How does developer productivity sit within that space of developer experience, and which one is larger? Like, which one is the superset of which?

Abi:

Yeah, that's a great question. If I were drawing a flow chart, developer productivity would be the output, so I would draw a line pointing from developer experience to developer productivity. We were just talking about what developer productivity is; an equally important question is what actually affects developer productivity. I think the answer is that developer experience is what drives developer productivity, and there's a lot of research going on now that proves this, but I also don't think you need a PhD to grok or understand it. Think about the typical day of a developer: there are so many points of confusion, delay, and frustration. You could be working in code that's difficult to understand, spending time waiting for tests and builds, having to go through difficult steps to deploy, or, as you guys know, setting up a local development environment. These types of tools and processes become bottlenecks, and when there's friction they negatively impact the productivity of developers. These are just some examples, but right now our team is really focused on using data to identify which of these aspects of developer experience have the greatest impact on developer productivity and retention, because we know developer experience isn't just about how quickly or easily people can get things done. It in turn also affects how much developers enjoy their jobs and whether they look for a new job opportunity.

Pauline:

Following on from that, do you think that we should measure developer productivity? And if so, how?

Abi:

Yeah, of course. Developer productivity is such an important thing in business, right? People are saying all companies today are software companies and software is eating the world. There are all these fun soundbites, but it's really true. All companies are investing enormous amounts of money into their software engineers and software development, and when you're investing so much money into something, you need to understand how that investment is paying off, or how specific investments are paying off. So I think measuring developer productivity is absolutely a critical problem; it's something all organizations should be doing.

How to do it is, like the definition of developer productivity, a very elusive problem. Historically, the predominant way of measuring productivity has been that managerial view we talked about earlier, which focuses on the outputs and activities of developers, and anyone who's done this knows it's a both flawed and limited view. I always give this analogy to people, even people outside of tech: imagine you ran a company that produced really high-end, high-quality, original paintings. You had a bunch of painters, and you wanted to measure your productivity as a painting organization. What would you measure? Imagine you started measuring how many paintings you produce, or how many brush strokes your painters make per day. What would happen? Would that matter? Does the number of brush strokes affect how good a painting is? No. Is the number of paintings important? Yes. But if you focus on that, what about the quality? Imagine that instead of measuring brush strokes and the number of paintings, you went to your painters and asked them, 'Hey, what are the things that are slowing you down? What are the things that are affecting you in your work? What are the things that would help you produce better paintings faster?'

I think that's the same shift in mindset our industry needs to go through and is going through. What we believe, and what we're seeing with many organizations, is that rather than try to measure the number of widgets developers are creating, we should focus on measuring the environment they work in, in other words, the developer experience. By measuring the tools and processes, the daily points of friction of developers, and by improving these, we ultimately improve productivity as well as engagement and things like retention. That'd be my thesis and observation on how we measure productivity today and how we need to shift towards measuring developer experience instead.

Chris:

Where have you seen DORA metrics applied, and to what effect?

Abi:

Yeah. So first I wanna say that I work with Nicole Forsgren. I worked with her at GitHub, and she's joined the DX team as well. Her work on Accelerate and DORA is phenomenal, groundbreaking work for the industry. One of the reasons is that it took a bunch of concepts, metrics, and methodologies that were really just opinions and applied science to them: statistics and validated survey methodologies. There's a lot our industry has gained from that, and a lot for us to learn from and reapply as we go forward.

The DORA metrics specifically, lead time, deployment frequency, et cetera, are really interesting because since the publication of that book they've taken on a life of their own and become just everywhere. Everyone's trying to adopt the DORA metrics, there are tons of vendors selling DORA metrics, people blogging about them, talking about how they've transformed their organizations. It's interesting to consider that the book Accelerate was not written with the intention of advocating for a measurement methodology within organizations. Accelerate was really about observing high-performing organizations and looking for patterns, certain types of metrics and practices that could be linked to their performance.

I have a lot of personal experience with the DORA metrics. In fact, while Nicole and I both worked at GitHub, we stood up the DORA metrics ourselves. GitHub was going through a time of transition: new leadership, a merger with a bunch of engineering organizations from Microsoft, and, as a result of years of accumulated tech debt and ways of working, we knew that we needed to be better, we needed to ship faster to our customers. So as part of the effort of transforming and accelerating engineering at GitHub, one of the things we did was stand up the DORA metrics. I was personally tasked with an OKR of taking the DORA metrics and using them across the organization to actually drive change and improvement; my OKR was literally 'accelerate GitHub engineering', and it was really interesting.

First of all, one of the things people often run into with DORA metrics, especially within larger organizations, is that it's very difficult to even measure them. Things like lead time and even deployment frequency don't really have standard definitions that can be easily applied across very disparate sets of technologies: on-prem software versus mobile software, web software and tools. It's very difficult to measure something like lead time across one team using Jira and another using GitHub, or teams working in different ways. At GitHub, we were only able to get lead time for one monolithic code base that represented probably less than half of our engineering organization, and we also knew that the measurement we produced was flawed; it wasn't actually true to the real definition of lead time from the book that Nicole had created. But nonetheless, we wanted to use these metrics to improve.

Once we had the metrics, my job was to improve them, and this is where it got really difficult. I knew what our lead time was, and compared to the benchmarks it was pretty good. It's not like our lead time was bad, but of course we wanted to continuously improve, so to speak, and I didn't know how to improve it. We had theories, we had opinions, people had thoughts on all the things that could be improved, but it was like, where do we actually focus?
And because our organization was so large, where do the individual departments or parts of the organization and specific teams focus? So what I did was go around the organization at GitHub and start talking to VPs, directors, developers, and EMs, and just ask them, 'Hey, Keith, our SVP of engineering, wants to improve lead time. What would help your team improve lead time?' And what I heard was fascinating: it was different from the hypotheses we had as a leadership group, which were mostly outer-loop things that were definitely broken: our builds were taking a long time, it was difficult to deploy. Those are important things, but when I talked to the local teams, there were so many things that were nuanced and specific to how those teams were working. Things like, 'Yeah, we don't have a good product management process, so we have a ton of churn in our requirements; of course our lead time is poor.' Or, 'We work on these big projects, so yeah, lead time is poor because we work on huge units of work, which take the longest time to deploy.'

What I personally experienced with the DORA metrics was that they were a really good way to start a conversation about the overall performance of an organization, but the DORA metrics in and of themselves weren't useful as measures to actually drive improvement. They didn't tell you the why; they told you what was happening, not why things were slow or what needed to be done to improve. And the answers to the why, again, came down to developer experience. It came down to understanding the day-to-day work of these different teams and understanding what their biggest points of friction were.

Since that time, I've spoken to a lot of other leaders across the industry. There's a recent paper published in IT Revolution's journal (they're the publisher of Accelerate) called 'How to Misuse and Abuse DORA Metrics', which describes a very similar experience: an organization that went through trying to adopt DORA metrics and saw a lot of negative effects from them, like the metrics being used as an incentive, people gaming them, and the metrics losing the meaning of what they're actually trying to promote.

I think the DORA metrics are really useful for any organization that's trying to get a snapshot of how they're doing, especially if they don't know; a lot of organizations, especially smaller ones, have a pretty good idea, but if you wanna start a conversation about where you sit overall compared to the industry, they're a great way to benchmark your organization. If you're looking for a way to drive 'continuous improvement', to enable local teams to eliminate their bottlenecks and have a data-driven process to continuously improve, I don't think the DORA metrics alone will get you there.
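To make concrete what these two DORA metrics actually compute, and why definitions diverge across teams, here is a minimal sketch of one common interpretation. The field names, the sample data, and especially the choice of "start" event (PR opened versus first commit versus ticket created) are assumptions for illustration; that choice is exactly where teams' definitions differ.

```python
from datetime import datetime, timedelta
from statistics import median

# Hypothetical change records; real data would come from your VCS and
# deploy pipeline. Whether "start" means PR opened, first commit, or
# ticket created is a choice -- different choices yield different
# "lead times", which is one reason cross-team comparisons are hard.
changes = [
    {"pr_opened": datetime(2022, 6, 1, 9),  "deployed": datetime(2022, 6, 2, 15)},
    {"pr_opened": datetime(2022, 6, 3, 10), "deployed": datetime(2022, 6, 3, 18)},
    {"pr_opened": datetime(2022, 6, 6, 11), "deployed": datetime(2022, 6, 10, 9)},
]

# Lead time: start event to production, per change.
lead_times = [c["deployed"] - c["pr_opened"] for c in changes]
print("median lead time:", median(lead_times))

# Deployment frequency: deploys per day over the observed window.
window_days = (max(c["deployed"] for c in changes)
               - min(c["pr_opened"] for c in changes)) / timedelta(days=1)
print("deploys per day:", round(len(changes) / window_days, 2))
```

Note that the numbers say nothing about why the third change took four days; that is the "why" Abi describes having to gather by talking to teams.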

Chris:

Yeah, what makes the DORA metrics so attractive from a management perspective is that they boil down to a simple set of numbers. You mentioned it's very tempting to introduce DORA metrics, look at those numbers, and go, 'Alright, we need to, I dunno, halve our lead time. This is the goal for next year or next quarter, whatever your planning cycle is; let's execute on that.' How have you seen that pan out?

Abi:

I heard a really good quote in a recent interview. I was speaking to the head of developer experience and productivity at IBM, and he said that they used the DORA metrics, and when he would talk to teams and developers, he heard this wonderful quote. Developers said, 'Look, I love the book. I love Accelerate. And I get the vision for these types of metrics, but it's not the world we live in.' I think that's such a powerful quote, because when you look at an engineering organization from that factory or assembly-line, top-down view, lead time makes a lot of sense. You wanna be cranking stuff out faster, right? Turning stuff around faster. But if you're a developer on a team, sometimes things take a little longer: you work on a bigger task, it takes a little longer; you work on a smaller task, it takes a little less time. What we know is that probably the biggest thing that affects lead time is the size of the task itself: the larger the task, the longer the lead time. So you could argue it's almost not even an engineering metric in practice. It's also often specific to a very narrow set of tools; how quickly you can deploy a change is really a reflection of a specific set of tooling. The ultimate goal of metrics like lead time is to improve productivity, as leaders would say, but when you talk to developers about what actually affects their productivity, it's so disconnected from lead time. Lead time just depends on the context of each team, what they're working on, and the tools they use. I think that's the biggest problem with DORA metrics: they don't take into account the context and nuance of each developer and each team. They try to apply a universal, manufacturing-esque standard. Lean comes from manufacturing, and DORA is based on lean principles, which is very different from knowledge work, right?

Chris:

Yeah, that's the appeal of The Phoenix Project. You read it and think, 'Yeah, this makes sense. I can reapply this work-center analogy, see where delays happen, you've got processes that interact; I could totally apply this.' And then you go and try, and you realize, 'No, it's not a factory floor where a CNC machine is standing still. It's a bunch of people trying to interact and figure out what they're even trying to do. It's a very different problem.'

Abi:

Yeah. I recently watched this documentary on Netflix, I think it was called American Factory. Amazing documentary! My wife and I watched it just for fun, but it was about a factory, and funny enough, they use metrics like lead time. It was so interesting to see, because in factory work you're literally doing the same thing repeatedly, moving object A from here to there over and over, so it makes sense that you can break down that process and measure each component in these very manufacturing-esque, assembly-line-like ways. But it just underscored the problem that software development is so not like that. It's not 'every 30 seconds I type and then I look at my screen.' It's not a repetitive process, it's a creative process. So I highly recommend that documentary. And funny enough, in that documentary the workers actually rebel, they go on strike, because even in manufacturing, the leaders were so focused on those metrics while the workers were saying, 'Look at our working environment. It's not safe. We don't have the proper equipment.' They couldn't do their jobs because of their experience. It was a really illuminating documentary to watch.

Chris:

Awesome. One of the appealing things about quantitative data is you can set goals: 'we're gonna improve X.' And it makes you feel like you get an overview that's abstracted from the detail. It's a model, and every model is flawed, some are useful. Have you seen the qualitative and the quantitative come together in a way that fits an OKR world while at the same time being able to effect change?

Abi:

That's a great question. I mentioned earlier that the SPACE framework was one of the first credible bodies of research to come in and say we need to be measuring and focusing on how developers feel in addition to what they're doing. I'm gonna reframe your question a little bit: rather than qualitative and quantitative, let's talk about subjective versus objective measures. A subjective measure would be how someone feels, or their opinion. If you ask someone, 'How healthy do you think you are?', that's subjective; you would get their sentiment around their health. If you were to ask someone, 'How frequently do you work out?', that would be objective, a fact. I think you need both. That was a key point in the SPACE framework's thesis: you need a mix of both subjective and objective. They call it 'perceptual', but they mean subjective measures.

Let me give you an example of why you need both. Imagine you have a company where you only hire grads straight out of college; they've never worked anywhere else, and you ask them, 'Hey, how do you feel about our development environment?' They would probably say it's good. They've never seen anything else; they have no point of comparison. Subjective opinions are colored by our experiences, our context, our expectations. So while knowing how your developers feel about something is really important, it's not the whole picture: they could feel good about something that's not good, or on the contrary, feel bad about something that is good. That's where the objective measures are so important, because they give you this other perspective on how the thing is actually working. You said your development environment is good, but it takes 40 minutes to set up each time, and you have to do that sometimes multiple times a day; that is not good based on benchmarks against other organizations, or on what we know is possible.

On the other hand, if you only have objective measures, that's also a problem. Take lead time: you might say, 'Oh, your lead time is slow,' and that seems like a problem, but then you might go and talk to the team, and they would say it's actually not a problem, because they work on large tasks and that's just part of how they work; to deliver work to customers, they need to work on these larger, complex things, and so their lead time is higher. Another example is review turnaround. I've talked to so many teams about this, because it was a big focus of my previous company; we measured review turnaround time. One team might turn around code reviews in three hours, another might turn them around in two days. Objectively you could say one is better than the other, but how important is that? Sometimes I would talk to the teams that took two days to turn around code reviews, and they would say, 'Oh no, we all work on parallel tasks, we don't get blocked, so it's not a big deal.' Whereas I actually talked to some teams that turned around code reviews in three hours and were really frustrated, because they would immediately get blocked.
So I guess the point here is, again, you need both: the objective understanding of how a system or process is working, and whether you would interpret that to be good or not, along with how developers feel about their experience, whether these different areas, their tools and processes, are actually affecting them, and whether they perceive them to be good, bad, or frustrating.
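As a sketch of the subjective-plus-objective pairing Abi describes, the snippet below puts a team's measured review turnaround next to its surveyed satisfaction. All team names, thresholds, and survey scales here are invented for illustration; the interesting cases are the off-diagonal ones, objectively fast but subjectively frustrating, or objectively slow but causing no pain.

```python
from datetime import timedelta

# Hypothetical per-team data: an objective measure (median review
# turnaround, computed from PR events) paired with a subjective one
# (survey satisfaction with code review, on a 1-5 scale).
teams = {
    "team-a": {"turnaround": timedelta(hours=3), "satisfaction": 2.1},
    "team-b": {"turnaround": timedelta(days=2),  "satisfaction": 4.3},
}

FAST = timedelta(hours=8)  # invented threshold
HAPPY = 3.5                # invented threshold

for name, t in teams.items():
    fast, happy = t["turnaround"] <= FAST, t["satisfaction"] >= HAPPY
    if fast and not happy:
        note = "fast reviews but frustrated, e.g. devs are blocked while they wait"
    elif not fast and happy:
        note = "slow reviews but no pain, e.g. devs work on parallel tasks"
    elif fast and happy:
        note = "healthy on both measures"
    else:
        note = "slow and unhappy: a clear place to dig in"
    print(f"{name}: {note}")
```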

Chris:

Yeah, I can totally relate to that. For the longest time, for three quarters in a row actually, we had key results around 'reduce our review time to X amount of hours.' For one, there were strongly diminishing returns in how far down we could push that number, even at P95, and that number also meant completely different things to different teams. If I have a one-line change and it takes three days until I get a review, that's too long, that's frustrating. For a massive, huge change it's different, and it depends on how the teams work. We've given up on this goal, and we're still not completely happy about how long PR reviews take; there's still frustration around it.
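For readers unfamiliar with the P95 Chris mentions: it's the 95th percentile, the value 95% of reviews come in under. A quick sketch with invented sample data shows why it's hard to push down: it's dominated by a few outliers.

```python
import numpy as np

# Hypothetical review turnaround times, in hours.
turnaround_hours = np.array([1, 2, 2, 3, 4, 6, 8, 12, 24, 72])

print("P50:", np.percentile(turnaround_hours, 50), "hours")  # the typical review
print("P95:", np.percentile(turnaround_hours, 95), "hours")  # the tail
# The P95 is dominated by the single 72-hour outlier, so the same target
# number can mean very different things for a one-line fix vs. a huge change.
```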

Abi:

Right.

Chris:

How should we be phrasing our goal in order to actually drive positive change?

Abi:

So I think you touched on it perfectly: there's a limit to what the objective metrics can tell you, especially because, like you said, the data can be irregular, right? Some things take longer, some things take less time. And you mentioned there's still frustration, so I would argue that's a clear signal you have: the frustration of developers. The way organizations typically operationalize that is satisfaction with code review turnaround, or satisfaction with the code review process. Again, one of the dimensions of the SPACE framework is satisfaction; in fact, I think it lists satisfaction with code review as one of its example metrics. So measuring developer satisfaction with the code review process could potentially be a more direct measure of what your goal is, in addition to keeping an eye on the objective side as well, to guard against a learned-helplessness phenomenon, so to speak, where if code review just continues to be bad, developers might start feeling okay with it. They might get used to it, and therefore satisfaction with code review will go up even though it's not getting any better. So you need both, but it sounds like in your situation, focusing on the satisfaction dimension could be interesting.

Chris:

That immediately evokes the picture of that meme: the room in flames, the dog sitting in the middle, 'This is fine.' And then you give them a thermometer and ask them, 'Is this a good temperature?'

Abi:

Yeah. So it's tricky. You really need both sides of the perspective there.

Chris:

Love it.

Pauline:

I think that goes for everything. It's never one or the other; it's always a balance of both, isn't it? Just to take a step back, I'm curious: could you delve a bit deeper into what getdx.com is?

Abi:

GetDX, or DX, which stands for developer experience, of course. Everything we've been talking about is what we're really trying to solve. We know that measurement in software engineering is still so elusive and frustrating for leaders and organizations, and we also know, as I mentioned, that there needs to be a greater focus on developer experience; we believe developer experience is the key to developer productivity. So DX is a SaaS solution for technology leaders and DevEx teams that gives them a standardized set of both subjective and objective metrics they can measure and distribute to their teams. What we're trying to do is build on the work of DORA and SPACE to create a new framework, a standardized approach to measuring and improving developer experience and productivity internally within an organization.

Chris:

What effects have you seen in companies and organizations applying DX, compared to, say, more traditional, purely 'we take the time between when your PR was opened and merged' kinds of metrics?

Abi:

Yeah. I think what we're seeing is that our approach with DX is much more holistic. Like you mentioned, historically engineering organizations have focused on a limited set of metrics because that's what they had available, and then they used those metrics as a proxy for everything. PR turnaround time became productivity, even though we know productivity is so much more than that. And as you mentioned, there are such diminishing returns to some of those types of metrics when you narrowly focus on them. With DX, we have a holistic set of measures focused on the top areas of developer experience, based on the research we've done and the data we're continuing to gather. The outcomes we drive for customers boil down to two: increasing developer retention, and increasing efficiency, which we measure as time wasted. We wanna eliminate time wasted, time lost in the software development life cycle, and we wanna make sure that as a result of greater efficiency, but also aspects of culture and delight, developers are happier in their jobs and their work, and therefore stay at companies longer.

With our product, I think what we see is that we give organizations data that really drives action, and I think that's the biggest difference between what we've done with DX and my prior experience using DORA metrics, or my prior company Pull Panda, which focused a lot on pull request metrics: there were very limited things you could actually do with that information. It was a real stretch to try to incorporate them into OKRs. With DX, we're seeing our metrics used by DevEx teams as North Star metrics, by executives as all-hands talking points and goals for the organization, and by local teams on a quarterly or faster cadence, similar to retrospectives: something they can continually measure, discuss, take action on, and strive to improve. To me, one of the biggest indicators of how successful or valuable a measurement or metric is, is the extent to which it can actually inform action and decision-making, and we're seeing that happen with DX. There's still a lot of work to do, but that's been really exciting.

Pauline:

Just more broadly speaking then, where do you see developer experience evolving? What does the world look like maybe two to five years from now, as of having this conversation?

Abi:

Yeah, I think we're at such an exciting time for developer experience. I feel like two years ago, developer experience was a foreign term; you would hear it here and there, and in the last two years it's become a really big deal. What I mean by that: I saw RedMonk publish something on developer experience at the beginning of the year, and for the first time you're seeing much more mainstream discussion around developer experience. You're seeing a growing number of dedicated developer experience teams. There weren't dedicated developer experience teams four years ago; I don't think that was a thing. We had DevOps teams, we started to have developer productivity teams, but we didn't have developer experience teams. Now you're seeing leading companies everywhere standing up developer experience teams, realizing their importance, and continuing to increase their investments.

In two to five years, I think developer experience will be the thing that technology leaders and business leaders are focused on. I think dedicated developer experience teams will become standard, and their size and the investment in them will continue to increase. I think frontline managers will look at developer experience as a core part of their responsibility, whereas today managers are very focused on delivery and not so much the health of the team. Team health is a term people use, but it's abstract. Because leadership will care, managers will care, and there will be DevEx teams evangelizing, I think local teams will spend much more time discussing and improving developer experience, whereas today we all know things like technical debt, not to mention development environments and documentation, just get neglected until they become so bad. A story from GitHub, I think you guys know this: at GitHub we paused feature work for a quarter, this was two years ago, to focus on developer experience, because it had gotten so bad. There was so much accumulated debt that we had to stop everything and just work on fixing things.

That's the world I see in two to five years, and it's very similar to Accelerate. Accelerate was written, in the context of software development history, as a scientific validation of continuous delivery, which was an emerging practice at the time. It's become an afterthought for organizations like ours, but in the enterprise, continuous delivery is still something people are very focused on. Developer experience is on that early curve right now. Companies like Stripe and Figma, all these leading Silicon Valley companies that know engineering is their core competitive advantage, are investing heavily in developer experience. For the rest of the industry, developer experience is still a new thing they're trying to understand, figure out, and evangelize. As time goes on, the ROI of developer experience will become much clearer, which is something we're working on through our research, the investments will increase, and it will become something all leaders and all developers, to their benefit, will be able to invest much more heavily and deliberately in.

Chris:

Love it. To think about it from the individual perspective in that world: imagine today you joined a company and they told you, 'Yeah, no, we don't do CI/CD, you know, Joe does that every Friday.' You'd go, 'What is wrong with you people?' Now imagine, in a not-too-distant future, you join a company and they go, 'No, no, we don't do developer experience, not really, come on, that's what we pay you for.' You'd go, 'What's wrong with you people? This is not the day and age we wanna live in.'

Abi:

Exactly. Yep. And when you think about it, maybe it's hard to believe, but 15 years ago, source control or version control was the same kind of thing. I was using Subversion, and a lot of companies used nothing. So yeah, exactly. I think it's an exciting time for developers, and for leaders, to have a new perspective on how to increase the performance of their engineering organizations, and an exciting time for vendors like us who are trying to help organizations achieve this.

Pauline:

Absolutely. We're actually closing off the podcast already, so what we usually do to end an episode is ask our guests to tell us about one thing they'd like to shout out: something they've learned or seen, or are just enjoying right now. It can be a resource, a learning, someone who impacted you, or something non-tech-related. So do you wanna give us one thing you'd like to share?

Abi:

I've been spending a lot of time reading academic research papers on developer productivity and developer experience, and there are so many individuals I could shout out, but I'll give a collective shout-out to all the researchers whose work isn't always very visible to practitioners in the industry. There's an extraordinary amount of research happening to understand developer productivity, what affects it, and how the industry needs to approach it. I think this body of research is going to have a bit of a renaissance soon, and I'm just really grateful for all the researchers who are doing the really tedious, difficult work of peer-reviewed research in these areas. I've benefited and learned so much from it, and I'm excited to hopefully have more practitioners and leaders learn from this research as well.

Chris:

What you just said, the lack of feedback from industry towards research, is what made me end an academic career and a human-computer interaction PhD: you do a lot of interesting work, and it's hard work as well, and so little of it seems to impact the real world. So I'm very happy and grateful to hear that's not always the case. My shoutout this week is charm.sh, which is a set of tools and libraries for raising the CLI to the next application platform. They build a lot of cool libraries and software to make that happen, and if you're building any kind of CLI application, as you should, I'd recommend you have a look.

Pauline:

For me it's a podcast called Glowing In Tech, which is actually powered by Coding Black Females, who do a lot of work on improving diversity in tech. This podcast in particular is run by two software engineers, Amber and Jesse, who I'm quite close with, and they talk to black women in tech about very technical topics, but also things like how to set better boundaries at work and how not to make tech your whole identity. I find it very interesting. They have episodes every Tuesday, and I think it's just a really good and fun podcast to listen to, especially for people looking to advance their career in technology. That's my recommendation. And Abi, I think that's the end of our episode. That was absolutely fantastic, lots of things to think about and digest. If people want to find out more about you and GetDX, where can they find you?

Abi:

They can connect with me on LinkedIn or Twitter. I also have a newsletter where I share analyses of research papers on productivity, if you want to get into some engineering research without reading journal papers.

Pauline:

Awesome. We'll put some links in the show notes, but thank you so much again, Abi, for joining us; we really enjoyed this episode, and we'll talk to you again soon. Thank you for listening to this episode of DevXPod! Want to continue the conversation about developer experience? Head over to our community Discord server at gitpod.io/chat; we have a dedicated channel there for us to talk all about DevX. To make sure you don't miss the next episode, follow us on your favorite podcast platform or subscribe to our newsletter, DevX Digest.

Chris:

You can also find out more about Gitpod at gitpod.io. Start a workspace and tell us about your developer experience. See you in the next episode!
