How to approach post-award management of agile contracts

Steve Kelman talks with U.S. Citizenship and Immigration Services CIO Mark Schwartz about the practical matters of past performance evaluations and ongoing dialogue with contractors.


In a recent blog post called "Contracting for agile," I discussed one of the three management challenges for agile contracting outlined by Dan Chenok and Joiwind Ronen -- namely, the fear that some principles of agile cannot be reconciled with existing procurement regulations. I argued that good practice suggests, and the procurement regulations allow, issuing a solicitation for an agile contract, or a task order under an umbrella IDIQ contract, without specifying detailed requirements at the beginning -- since doing so would violate the whole idea of agile. The government should give only a very general description of the work, but be specific about the process the government will use to develop and refine requirements during agile sprints.

At the end of that blog, I wrote that "my own view is that a great exchange for less specificity upfront is greater attention and rigor in the post-award evaluation of deliverables that contractors deliver under sprints." (I will use the generic term "sprint" in this blog, though not all agile processes involve sprints. All do, however, involve delivery of capabilities in very small increments.)

I recently had a 40-minute conversation with Mark Schwartz, the dynamic CIO at the Department of Homeland Security's U.S. Citizenship and Immigration Services, and it was 40 minutes exceptionally well spent. Schwartz provided me with lots of enlightenment on two important and practical issues. His advice, in my view, should be read carefully by everyone in the federal community -- government and contractor -- who is working on agile.

We discussed two issues -- post-award management of agile contracts/task orders and past performance evaluation in an agile task order environment.

Schwartz strongly believes that agile contracting enables better post-award monitoring of contractor performance, for two reasons. One is that the government gets the results of a contractor's efforts very frequently, not after a long period of performance against a complex requirement. The second, related reason is the use of automated tools that signal when distinct parts of the work on a sprint are done, and transparently communicate this to the government. (Agile task orders include definitions of what it means to be "done," which require, among other things, that the software has been tested.) The tools also transparently communicate other information about various aspects of the process, such as what defects have been uncovered.
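To make that concrete, here is a minimal, purely illustrative sketch -- not anything USCIS actually runs -- of how a "definition of done" can be wired into automated tooling: a sprint work item counts as done only if its automated tests pass, and the tool reports status and open defects in a machine-readable form the government side can see directly. The story identifiers, test file paths and the choice of pytest as the test runner are all assumptions for illustration.

import json
import subprocess
from dataclasses import dataclass, asdict

@dataclass
class StoryStatus:
    story_id: str          # hypothetical identifier for a sprint work item
    tests_passed: bool     # "done" requires that the automated tests pass
    defects_open: int      # defects surfaced by the tooling, reported transparently

def evaluate_story(story_id: str, test_path: str, defects_open: int) -> StoryStatus:
    """Run the story's automated tests and record whether it meets the definition of done."""
    # pytest is assumed as the test runner here; any automated test harness would do.
    result = subprocess.run(["pytest", test_path, "-q"], capture_output=True, text=True)
    return StoryStatus(story_id=story_id,
                       tests_passed=(result.returncode == 0),
                       defects_open=defects_open)

if __name__ == "__main__":
    # Hypothetical sprint items; in practice these would come from the team's tracking tool.
    statuses = [
        evaluate_story("STORY-101", "tests/test_case_intake.py", defects_open=0),
        evaluate_story("STORY-102", "tests/test_status_lookup.py", defects_open=2),
    ]
    # Emit a machine-readable report the government could consume directly.
    print(json.dumps([asdict(s) for s in statuses], indent=2))

The point of the sketch is simply that "done" is a binary, test-backed fact the government can observe, rather than a contractor's estimate.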

Why couldn't the same automated tools be used for traditional waterfall requirements, I asked, with similar signaling of when at least part of the work, if not the whole big requirement, was done? The answer, Schwartz explained, is that no piece of software -- even part of a waterfall requirement -- can be considered "done" until it has been tested. And in traditional waterfall contracting, only limited testing can be done before the whole requirement has been completed, many moons later.

So in waterfall development, contractors submit reports estimating how much of the requirement has been completed, but these reports are subjective and not as transparent. Sprints are either done or not done -- including testing -- while in waterfall, contractors can often claim to be "done," only to discover that much work remains because the software fails some of its testing.

It might seem that, since a contractor produces work products for government review and approval far more often under an agile approach than under waterfall, the total workload for the post-award management of an agile task order would be greater per contracting dollar the government spends than for traditional waterfall contracting. However, waterfall contracting involves extensive contract management of requirements that end up never being used, along with the management of rework required as lengthy requirements adapt to reality along the way. The TechFAR also notes there is a learning curve here; as agencies get more experience in managing agile sprints, they will learn how to become more efficient (and effective) at it. This should be a topic for cross-agency conversations about agile contract management lessons learned.

Even if the agile approach hypothetically did take more resources, I am convinced that the post-award management is likely to be more effective. And I have no objection to a shift in the procurement system from resources spent, often in a minimally productive manner, on the intricacies of source selection towards more productive post-award management. As regular readers know, this general theme has been a recent hobby horse of mine.

In terms of evaluating past performance on agile task orders under an IDIQ contract, USCIS has experimented with making multiple awards for individual task orders under its umbrella contract, typically to four vendors, often divided into two teams of two vendors each. What Schwartz wants to be able to do is to reallocate the division of work among these teams, on a monthly basis, based on their performance the previous month.

When I spoke to Schwartz a few months ago about past performance evaluation on agile sprints, he said he wanted to do frequent, informal reports, seen more as feedback for improvement than as "evaluation," and was concerned that he would need to go through the more detailed requirements demanded by the CPARS system, including the contractor's right to a higher-level review of a rating the government gave. In response to our earlier conversation, I had determined by consulting the FAR that such evaluations were indeed required for individual task orders over the simplified acquisition threshold, a modest $150,000. I mentioned this to Schwartz at the time, sorry to be the bearer of bad news.

However, based on my new conversation, I realize there is not a problem. An individual task order for agile development awarded by USCIS typically runs for a period of six months, with three option renewals, for a typical total of two years for one task order. Under an individual task order, a contractor getting a normal amount of work would do 25 or so sprints in a year, 50 in two years, none of which is individually subject to FAR requirements for formal past performance evaluation, because they are all part of the same task order. At the end of the first year (with many sprints behind it), the FAR requires a formal past performance evaluation to be done (these task orders typically have a value of $2.5 million to $3 million a year). Meanwhile, the customer is free to do frequent informal past performance evaluations.

That is exactly what USCIS is doing. Typically, USCIS does these evaluations once a month. The agency's practice is to discuss each evaluation with the contractor, including an informal opportunity for the contractor to give a different version of events, if any. USCIS wants the contractors to know what the agency thinks, so they can use this fast feedback to improve. These evaluations then serve as a basis for quickly rejiggering the volume of work among contract holders. It sounds a lot like how a commercial company would handle the same process, which to me is cause for praise.

(As an aside for future discussion, given the push for smaller, modular projects where functionality can be delivered more rapidly, I wonder if it is time to revisit the idea of a streamlined past performance reporting process for smaller-dollar acquisitions. If evaluations for acquisitions involving $1 million or less in annual spending were simpler, and included relief from the higher-level review process, agencies could more easily share information with each other through the past performance information retrieval system. I favor this more generally, but maybe this is a target of opportunity.)

Schwartz's thoughts were really helpful for me, and I hope for the agile community in government. I like to think that all my blog posts make a contribution to improving government, but, thanks to Mark's insights, I am especially proud of this one. I believe that applying, or adapting, his approaches to the situation of each agency using agile will make a huge difference for getting this new way of doing business right in the mainstream of government IT acquisition. Spread the word!