Thursday, February 22, 2007

5 Essentials for Software Development Metrics

I ran across a whitepaper today by 6th Sense Analytics that summarizes the essentials of a software development metrics program. The paper doesn't specify the metrics to collect, just the properties they should all have. An overview is provided below. You can find the full whitepaper here.

From my perspective, for any metric to be useful, it needs to help the Project Manager make decisions. As this whitepaper states, all metrics should be actionable. If it's not actionable, then it's not useful.

Introduction
Today's approach to metrics favors automated collection over manual processes, and in-process adjustments over post-mortem analysis. Information should be made available to all stakeholders throughout the lifecycle.

To be effective, metrics must be properly planned, managed and acted upon. What is measured, how it’s collected and how it’s interpreted are the difference between brilliant insights and blind alleys on the path to metrics-driven software development.

The key is to ensure metrics are meaningful, up-to-date, unobtrusive, empirical and actionable.

#1 - MEANINGFUL
Emerging activity-based software development metrics solutions focus on a simple and fundamental unit of measure: Active Time. Understanding the actual time being applied to development activities across a portfolio of systems and projects provides an important level of insight that enables organizations to understand project velocity, relative investment, sequencing of activities, and opportunities and risks. It also provides a uniform basis for comparison across projects and teams.
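The whitepaper doesn't spell out how Active Time is calculated, but to make the idea concrete, here is a minimal sketch in Python. The event format, project names and the 15-minute idle threshold are my own assumptions, not part of the paper; it simply rolls timestamped activity events up into active time per project so projects can be compared on the same basis.

```python
# Hypothetical sketch: derive "Active Time" from timestamped activity events
# (file saves, builds, check-ins). Gaps longer than an idle threshold are
# assumed to be breaks and are not counted as active work.
from collections import defaultdict
from datetime import datetime, timedelta

IDLE_THRESHOLD = timedelta(minutes=15)  # assumed cutoff for "still working"

def active_time(timestamps):
    """Sum the gaps between consecutive events that fall under the idle threshold."""
    timestamps = sorted(timestamps)
    total = timedelta()
    for prev, curr in zip(timestamps, timestamps[1:]):
        gap = curr - prev
        if gap <= IDLE_THRESHOLD:
            total += gap
    return total

def active_time_by_project(activity_log):
    """activity_log: list of (project, timestamp) tuples from automated capture."""
    by_project = defaultdict(list)
    for project, ts in activity_log:
        by_project[project].append(ts)
    return {project: active_time(events) for project, events in by_project.items()}

if __name__ == "__main__":
    log = [
        ("billing", datetime(2007, 2, 22, 9, 0)),
        ("billing", datetime(2007, 2, 22, 9, 10)),
        ("billing", datetime(2007, 2, 22, 11, 0)),   # long gap: not counted
        ("reporting", datetime(2007, 2, 22, 9, 5)),
        ("reporting", datetime(2007, 2, 22, 9, 20)),
    ]
    for project, time_spent in active_time_by_project(log).items():
        print(project, time_spent)
```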

Select metrics that will enable you to steer your projects in a meaningful way.

#2 - UP-TO-DATE
It is important to look for metrics that can be captured automatically during the execution of the process itself. Activity-based metrics solutions capture this data automatically as development work happens, ensuring the metrics are consistently based on up-to-date data.
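The paper doesn't prescribe a capture mechanism, but to illustrate how automatic, unobtrusive collection might work, here is a hypothetical sketch of a small script that a version-control post-commit hook could call to log an activity event with no manual reporting. The log location and fields are my own assumptions.

```python
# Hypothetical sketch: append one activity event per commit, with no extra
# steps for the developer. A post-commit hook could invoke record_event().
import csv
import os
from datetime import datetime, timezone

LOG_PATH = os.path.expanduser("~/.activity_log.csv")  # assumed location

def record_event(project, activity="commit"):
    """Append one timestamped activity event. Timestamps are recorded in UTC
    so an active hour means the same thing in every office."""
    with open(LOG_PATH, "a", newline="") as f:
        csv.writer(f).writerow([
            project,
            activity,
            datetime.now(timezone.utc).isoformat(),
        ])

if __name__ == "__main__":
    record_event("billing")
```

Recording in a single time zone also speaks to the consistency point under #4 below.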

#3 - UNOBTRUSIVE
The process of collecting data for your metrics program should be seamless and unobtrusive, not imposing new processes or asking developers to spend time collecting or reporting on data.


#4 - EMPIRICAL
Activity-based metrics solutions capture data as software development processes are executed, avoiding the manual collection and after-the-fact reporting that compromise the integrity and accuracy of the data. Additionally, the use of the Active Time metric ensures data consistency; an Active Hour is the same in Boston, Bangalore, Mumbai and Beijing.

#5 - ACTIONABLE
It is critical that the metrics you gather inform specific decisions during the course of your software development projects. Avoid information that is nice to know, but doesn’t help you make decisions or solve problems.

The litmus test for any metric is the question, “What decision or decisions does this metric inform?” Select your metrics based on a clear understanding of how actionable they are, and be sure each is tied to a question you genuinely need to answer to affect the outcome of your software development projects.

It is also critically important to ensure that you are able to act on and react to metrics during the course of a project.

Finally, be sure that metrics programs are inclusive and that data is available to all stakeholders within the software development lifecycle. Data that is widely available is empowering.

1 comment:

James Peckham said...

I completely agree that metrics should have all of the characteristics you mentioned. I would also add that, most of the time, metrics should be treated as indicators of something, not as 'facts' for the purpose of making decisions.

I've seen too many cases where a metric, taken out of context or on its own, was used to make a decision before consulting the people doing the work.

So I prefer that metrics serve only as indicators of current work status, and that a discussion with the people doing the work be my confirmation of what's actually going on.

Derby and Larsen say in "Agile Retrospectives" to put the data in front of the team and then find out what they believe was going on to make that data look the way it does. Then try letting the team make the decision... if they can.