Agile Tips
Unlocking Agile Wisdom: Insights from Decades of Experience. Scott Bain is a 44+ year veteran of systems development.
#83-Measuring Agile Progress
How can we best measure the progress of an agile project? Getting this right is critical, and in some cases might be somewhat counter-intuitive. This episode is all about that.
Measuring Agile Progress
We normally think of measurement as a benchmark we use to evaluate the effectiveness of a team's progress. In agile, however, we use measurement as a way to record reality, allowing us to make better predictions going forward. Measurement in this sense is about managing expectations rather than accountability.
In general, the idea is to be empirical rather than prescriptive. We assume that the people who work for us are dedicated individuals trying their best, and as a result we simply need to know what that best looks like.
For these purposes here are the metrics that agile recommends:
Velocity. We measure the team's velocity or speed by recording the number of work units they accomplish, on average, in a given period of time. Those work units could be user stories, or story points, or any planning artifact that we have settled on as an organization. The goal is not to set a target, but rather to record a trend so that realistic plans can be made based on the team's past performance. Each team should be measured separately, because the velocity of a given team will be relative to the nature of its work, its personnel, and a variety of related factors.
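As a minimal sketch of the idea, the trend can be recorded as a simple average of completed work units per sprint (the sprint numbers and backlog size below are made up for illustration):

```python
import math

# Hypothetical history: story points completed in each of the last five sprints.
sprint_points = [21, 18, 24, 19, 23]

# Velocity is the observed average pace -- a recorded trend, not a target.
velocity = sum(sprint_points) / len(sprint_points)

# Use the trend to make a realistic plan: how many sprints would a
# 105-point backlog take at this team's demonstrated pace?
backlog_points = 105
sprints_needed = math.ceil(backlog_points / velocity)
```

Because velocity is relative to each team's work and personnel, the forecast only makes sense when the history and the backlog belong to the same team.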
Burn down. We track the actual progress of the team, showing the work completed versus the work remaining over time. Ideally this is done through some form of visualization, but my preference is to use acceptance tests as the indicator of progress. Acceptance tests are generated at the beginning of a Sprint (or similar unit of work) and placed on a backlog that the team can pull from. These tests all start out failing and are gradually converted into passing tests as the team completes each task. There are many tools that will automatically produce a predictive curve from these numbers, allowing management to set realistic expectations about when the product will be complete.
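A sketch of the underlying arithmetic, using invented daily test results: remaining work each day is simply the count of acceptance tests that are still failing.

```python
# Hypothetical acceptance-test results per day: True = passing.
# All ten tests start out failing, then pass as tasks are completed.
daily_results = [
    [False] * 10,                # day 1: every acceptance test failing
    [True] * 3 + [False] * 7,    # day 2: three tests converted to passing
    [True] * 6 + [False] * 4,    # day 3
    [True] * 9 + [False] * 1,    # day 4
]

# The burn-down series: remaining work = still-failing tests per day.
burn_down = [sum(1 for passed in day if not passed) for day in daily_results]
```

A charting or forecasting tool would fit a curve to this series to predict the completion date; the list itself is the raw data those tools consume.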
Cumulative flow. Whether the team is using an Extreme Programming set of swim lanes, a Kanban board, or any similar artifact, there will be stages of progress that can be tracked: for example, prioritized, in progress, ready for testing, tested, complete. What we are looking for is consistency in the way work moves from stage to stage across all the work units that have been prioritized, so that we can spot bottlenecks or critical impediments to the flow of value from requirement to product. Here again this is not about accountability but about radiating the information needed to solve problems that are impeding the team.
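A minimal sketch of spotting a bottleneck from a board snapshot, with invented work items and the example stages above: a pile-up in one stage is the signal worth investigating.

```python
from collections import Counter

# Hypothetical board snapshot: each work item tagged with its current stage.
items = [
    ("login",   "complete"),
    ("search",  "tested"),
    ("export",  "ready for testing"),
    ("report",  "ready for testing"),
    ("billing", "ready for testing"),
    ("audit",   "in progress"),
    ("sync",    "prioritized"),
]

stages = ["prioritized", "in progress", "ready for testing", "tested", "complete"]
counts = Counter(stage for _, stage in items)

# The stage where work is piling up -- here, testing capacity is the impediment.
bottleneck = max(stages, key=lambda s: counts[s])
```

In a real cumulative flow diagram these counts are taken every day, so a widening band in one stage shows the bottleneck developing over time.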
Lead and cycle times. Lead time is the amount of time from when a request is made to when it is delivered. Cycle time measures how long it takes the team to complete a unit of work once it has started working on it. Both are directly measurable in hours or days and can be used to set customer expectations going forward, increasing customer satisfaction and aiding in marketing and sales for the future.
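The two measurements differ only in their starting point, which a short sketch with made-up timestamps makes concrete:

```python
from datetime import datetime

# Hypothetical timestamps for a single work item.
requested = datetime(2024, 3, 1)    # customer made the request
started   = datetime(2024, 3, 5)    # team began work on it
delivered = datetime(2024, 3, 12)   # item shipped

# Lead time: request to delivery -- what the customer experiences.
lead_time = (delivered - requested).days

# Cycle time: start of work to delivery -- what the team controls.
cycle_time = (delivered - started).days
```

The gap between the two (here, four days) is queue time: how long requests wait before anyone starts on them, which is itself often worth tracking.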
These are all metrics that can be gathered during the work. There are others that can be useful once the product is delivered.
For example, of the features that we prioritize and develop, how many of them actually get used by customers? If we find that a large number of features are not being used, or are used only rarely, then we have to ask why those features were prioritized in the first place. This allows us to improve our decision-making process going forward.
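A sketch of that check, with invented feature names and usage counts; the threshold for "effectively unused" is an assumption each organization would set for itself.

```python
# Hypothetical usage counts per released feature over one quarter.
feature_usage = {
    "export":        1200,
    "dark_mode":      950,
    "bulk_edit":        3,
    "legacy_import":    0,
}

THRESHOLD = 10  # below this, we treat a feature as effectively unused

unused = [name for name, uses in feature_usage.items() if uses < THRESHOLD]
unused_ratio = len(unused) / len(feature_usage)
```

A high ratio here is not a verdict on the team but a prompt to revisit how features were prioritized.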
We can also reflect on the number of defects in the system that are reported by customers over a given period of time. Each of these defects represents a missing test in our test-driven process, because we never release a system with failing tests. Once a defect is reported the primary focus of the team is to determine the test they missed, so they can write that test, watch it fail, and then do the work to make it pass. This ensures that the right work is done, but also that the defect will never be reintroduced into the system because the test is retained for the future.
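The defect-to-missing-test discipline can be sketched as follows, using an invented defect (a discount wrongly applied to shipping): write the test that was missed, watch it fail against the buggy code, then fix the code and retain the test.

```python
# Hypothetical reported defect: the discount was being applied to the
# shipping cost as well as the items. The fixed version:
def order_total(items_cost: float, shipping: float, discount: float) -> float:
    # Discount applies only to the items, never to shipping.
    return items_cost * (1 - discount) + shipping

# The test we "missed": written first, seen to fail against the old code,
# and kept forever so this defect can never be reintroduced.
def test_discount_excludes_shipping():
    assert order_total(100.0, 10.0, 0.2) == 90.0

test_discount_excludes_shipping()
```

The buggy version would have returned 88.0 (the 20% discount wrongly shaving shipping too), so this test fails against it and passes against the fix, which is exactly the sequence the process calls for.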
Finally, at the end of a given project, we can compare the expected length of time it took to complete the work to the actual time it took, so that we can adjust our expectations for the next project we engage in.
All of these metrics reflect the fact that agile is about responding to the truth rather than issuing demands. They are also a reflection of where actual value lies, as opposed to traditional metrics such as number of lines of code generated or number of bugs repaired.
You get what you measure, as they say.
Next week I will examine how agile can be applied to other domains, such as marketing and sales. See you then!