The United States could learn from the Canadian government's performance evaluation process for IT hardware, writes Advanced Micro Devices' Rick Indyke.
Rick Indyke is manager of public-sector business development at Advanced Micro Devices.
Federal procurement reformers are under the gun to deliver cost savings. For example, Federal CIO Vivek Kundra and others at the Office of Management and Budget are making big changes in the way government buys and manages IT. The innovative cloud-first policy and the 25-point plan for reforming government IT, along with the focus on project oversight, are already transforming government services and have the potential to deliver considerable cost savings.
In fact, the work already appears to be paying off and has generated nearly $3 billion in savings, according to Kundra’s recent testimony before Congress. However, a report released by the Government Accountability Office found that although investment oversight and management have improved, there's more work to be done.
Cost efficiency in IT is always important, but in these tight fiscal times, it is vital. Yet for all the attention reformers have paid to improving procurement oversight, they have not focused on effective ways to measure IT performance and value before agencies buy. Evaluating value before a purchase is how most taxpayers shop, and failing to do so leaves money on the table.
Reformers don’t have to look far to find additional cost savings. In fact, the United States might consider taking a page from the Canadian government’s performance evaluation process for IT hardware.
Canada’s rigorous annual qualification process for IT hardware, managed by Public Works and Government Services Canada (PWGSC), defines standards through the National Master Standing Offer (NMSO). The NMSO process and policy are similar to U.S. procurement policy, but they also wisely include a reliable performance evaluation to improve decision-making and spending.
PWGSC uses the Canadian branch of an independent testing facility, the National Software Testing Laboratory of Philadelphia, to conduct performance evaluations with actual applications and real-world workloads. PWGSC defines a specification that puts competing vendors’ products on an equivalent footing, and the evaluations are performed using the same operating systems, utilities and software applications that government agencies use. The test workloads are designed to represent a range of usage scenarios. The program also gauges the initial user experience and other factors that affect the hardware’s usability and performance.
Testing for desktop and laptop PCs includes system performance, stress, compatibility, features and usability, battery performance for laptops, and other measures. Servers undergo a variety of performance tests designed for specific types of applications and workloads. My company, Advanced Micro Devices, had its Phenom II desktop and mobile processors and Opteron 6100 Series processor evaluated in that process, and they all qualified for the NMSO.
Each year, new processors from competing vendors undergo evaluation, which allows government IT managers to confidently choose the systems that best suit their needs at the best price. That’s what fair and open competition is all about.
The Canadian approach promotes choice by providing a reliable measure of system performance, and it eliminates the sort of biased performance measures that can be found in some commercial benchmarks.
So far, U.S. officials appear to be headed in the right direction with many of their reforms. As they proceed, they should consider alternative methodologies that can deliver cost savings without sacrificing quality.