Why training programs often fail

Recently, a telling photo made the rounds of our community. It showed a successful-looking executive using a BlackBerry to keep on top of business. He was standing next to a beautiful 80-foot yacht named "Change Order," and behind the yacht was a dinghy, appropriately named "Original Contract."

If nothing else, this photo reflects what's wrong with the psyche in the world of federal acquisition. We just don't seem to be able to get what we need because we insist on defining what we think we want. Time and time again experience has proven that detailed specifications are at best correct only for a relatively short time and at worst way off the mark.

Nowhere is this more evident than in user training during the deployment of major systems. Training requirements in solicitations usually focus on classroom or computer-based instruction rather than on outcomes that meet end-user goals. The attitude is, "Training? Who cares about training?" Training is just a means to an end, and that end is how people perform once a system is deployed. Folks in that business call it "human performance." In our business, we conduct acceptance testing by having the developers who know all the ins and outs of the system confirm the adequacy of their own work, or by using third-party testers who do much the same thing.

Actually, training, in its classic context, probably is the least important part of any major deployment. Much more important is whether the user community accepts, understands and uses the new solution effectively.

We all acknowledge that training efforts usually fail for several reasons: 1) students arrive unprepared to be trained, 2) they are pulled out of already pressure-filled jobs, so they don't want to be there, and 3) six months to a year will pass before they see the final system, with all the changes made after the training, in operation.

Generally, the answer has been to have the systems integrator that is developing the solution teach the customer how to use it. Foxes guarding the henhouse? In most cases, the integrator would seem motivated to deliver less-effective training in order to generate more billable labor hours, right? Unfortunately, the measures of success for the training usually have nothing to do with the success of the deployment; they track attendance at classes or grades on exercises.

User acceptance, competency and efficiency are the real issues, but those are hard to measure. So the easy thing is to stand up help desks that respond to users' ongoing problems with the system, thereby adding another layer of cost to an already inefficient process (and again, an integrator with little incentive to deliver effective training stands to gain more help desk hours).

As in most other areas of performance-based acquisition, you need clearly defined goals supported by mission-focused metrics; change management driven by community-of-interest practitioners; context-sensitive self-help aids built into the applications themselves; and evaluation mechanisms that assess how well the new system is supporting the practitioners.

As usual, the solution to this dilemma rests with relevant, well-defined outcomes rather than statements of work that merely codify how things are done today, which ensures we make the same mistakes we always have, only more quickly.

A successful "Human Performance" program that is tied to the achievement of business goals should have specific outcomes like shortened time to competency, reduced cost of learning, self-paced learning opportunities and facilitated change management.

We should be measuring the results of training, not the training process. We should redefine training to include state-of-the-art "just-in-time" software that sees what a less-than-proficient user is doing and helps that user develop more proficiency. We should be using the best technology available rather than remaining stuck with training methods that time and experience have shown don't work when they are needed most.
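The "just-in-time" assistance described above can be pictured as a small monitor inside an application that watches for repeated failed attempts at a task and surfaces a tip only when the user is struggling. The sketch below is purely illustrative; the task names, threshold and tip text are hypothetical and not drawn from any particular product.

```python
from collections import Counter


class JustInTimeHelper:
    """Illustrative sketch of a context-sensitive self-help aid.

    Tracks consecutive failed attempts per task and offers a tip
    once a user crosses a struggle threshold. All names and values
    here are assumptions for the sake of the example.
    """

    def __init__(self, tips, threshold=3):
        self.tips = tips            # task name -> help text
        self.threshold = threshold  # failures before a tip appears
        self.failures = Counter()   # task name -> consecutive failures

    def record(self, task, succeeded):
        """Record one attempt; return a tip string or None."""
        if succeeded:
            self.failures[task] = 0  # success resets the struggle count
            return None
        self.failures[task] += 1
        if self.failures[task] >= self.threshold and task in self.tips:
            self.failures[task] = 0  # reset so the tip isn't shown on every action
            return self.tips[task]
        return None


# Hypothetical usage: a user fumbles the same task three times in a row.
helper = JustInTimeHelper(
    {"submit_invoice": "Tip: attach the PDF before clicking Submit."}
)
helper.record("submit_invoice", succeeded=False)  # first failure: no tip yet
helper.record("submit_invoice", succeeded=False)  # second failure: still quiet
tip = helper.record("submit_invoice", succeeded=False)  # third failure triggers the tip
```

The point of the design is the one the column argues for: the aid measures observable user performance in context, rather than counting who sat through a class.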

Private-sector companies and federal agencies always look to training as a cost element that can be reduced. Frankly, that is a good idea. Training should not be a large cost component. The key element should be measurable results of the deployment that focus on user competency and reduced costs of learning, irrespective of the cost of training. If we can focus on the "Original Contract," we might just avoid the need for that super-expensive "Change Order."

Robert J. Guerra is a partner at Guerra Kiviat and Thomas Hogan is president at Hogan Government Solutions.
