Customer report cards

In federal contracting, there are established mechanisms for rating contractor performance. There are even databases that contain official records of how well a contractor did on a particular contract.

A performance rating after the fact may help agency managers avoid awarding contracts to poor performers in the future, but it does little to help a contractor understand how its performance is viewed at the time it really matters: while there's still time to fix problems.

Software developers have a reputation for being late in delivering debugged software, and software projects have an unfortunate tendency to overrun schedules and budgets.

Those of us who create software development methodologies have responded to this problem, in part, by introducing iterations into the development process. By having a number of discrete, repetitive development efforts rather than one big "waterfall" of development, the hope is that each smaller effort will be easier to manage.

The end of an iteration is a great time to get feedback from the customer. Indeed, any time there is a significant delivery is a good time for feedback — if not gathered by the technical team, then certainly by the marketing folks.

Asking for feedback is hard. Engineers don't really want to hear bad news, and customers often like the individual software engineers and might avoid being brutally frank about their shortcomings. The problem, of course, is that without meaningful feedback, the opportunity to make corrections in mid-project is lost.

In the QuickStep methodology I developed, we use a customer report card to try to organize this feedback in a meaningful way and to make the process a little less threatening.

A customer report card has specific categories of feedback that are tailored to the project at hand. For example, the categories might include scheduling: How did we do overall in meeting deadlines? How hard were we perceived to be trying to make them? Another might be quality: How many defects did we generate during development? How effective were we at removing defects once they were identified? And how many made it out the door into the finished product?

It's natural to rate the software's functionality. I also suggest adding an item for team interaction and communication, because those issues can be critical to success.
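To make the idea concrete, the categories above could be captured in a simple structure that a team fills in at each milestone. This is only an illustrative sketch, not part of the QuickStep methodology itself; the class name, the 1-to-5 grading scale, and the example project are all hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class ReportCard:
    """A hypothetical customer report card: per-project categories,
    each graded on an assumed 1 (poor) to 5 (excellent) scale."""
    project: str
    grades: dict = field(default_factory=dict)
    comments: dict = field(default_factory=dict)

    def grade(self, category: str, score: int, comment: str = "") -> None:
        # Record the customer's grade for one category, with an
        # optional free-text comment explaining the score.
        if not 1 <= score <= 5:
            raise ValueError("score must be between 1 and 5")
        self.grades[category] = score
        if comment:
            self.comments[category] = comment

    def average(self) -> float:
        # Overall grade across all categories rated so far.
        return sum(self.grades.values()) / len(self.grades)

# Example: filled in at the end of an iteration.
card = ReportCard("Payroll modernization, iteration 3")
card.grade("Scheduling", 3, "Two milestones slipped a week each")
card.grade("Quality", 4)
card.grade("Functionality", 4)
card.grade("Team interaction and communication", 5)
```

Even something this simple forces the conversation to cover every category, rather than only the ones the customer happens to bring up.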

The first time you try this, it might be good to do it in a group meeting. It might also be a good idea to have somebody from the contractor side facilitate the meeting, or even propose the grades. Once the ice is broken, a customer report card could become a standard part of a milestone review.

Feedback is something those of us in software development say we want, but all too often it doesn't happen. Think about a customer report card as a way of giving developers constructive criticism — and helping them do better on the next project.

Bragg is an independent consultant and systems architect with extensive experience in the federal market. He welcomes your questions and topic suggestions at tbragg@acm.org.
