Valuing cybersecurity outcomes instead of oversight
- By David Wennergren
- Jun 19, 2015
David Wennergren is senior vice president of technology at the Professional Services Council.
Every day, new technologies and applications offer opportunities to change how we work, live and play. This frenetic pace is rivaled only by the ever-increasing number and sophistication of the cybersecurity threats we face.
We are eager to embrace the future: the Internet of Things, nanotechnology and everything from Fitbits to bring-your-own-device programs. We want to be always connected, from any device, from anywhere. Yet with each new capability we embrace, new threats and vulnerabilities are introduced.
We must re-evaluate our cybersecurity efforts to ensure that we can quickly exploit new technologies to deliver more effective mission results. Today, nowhere are speed and agility more crucial than in our cybersecurity policies and practices.
Progress has been made. Information assurance professionals used to have to beg for attention and plead that security not be an afterthought. We should applaud federal successes in identity management, public-key infrastructure, Trusted Internet Connections, common security controls, Joint Regional Security Stacks, data-at-rest encryption and continuous monitoring.
But despite heightened awareness and attention, many organizations are not operating at a fast enough pace to make use of important new technologies and proven best practices. And as is often the case, the impediment that most stands in our way is not the adoption of new technology but the acceptance of new thinking.
Through the use of the Common Access Card, the Defense Department significantly improved information and physical security, not to mention enabling electronic solutions to replace labor-intensive, paper-based processes. Yet a number of civilian agencies still use their personal identity verification cards as little more than flash passes.
A number of cybersecurity threat vectors, not to mention barriers to information sharing caused by multiple disparate networks, could be successfully addressed through the combination of strong identity management, attribute-based access control and security at the data level.
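The idea behind attribute-based access control is that an access decision is computed from attributes of the requester and the data itself, rather than from which network someone happens to sit on. A minimal sketch of that decision logic follows; the attribute names (clearance, agency, role) are illustrative assumptions, and real federal deployments rely on standards such as NIST SP 800-162 and XACML rather than hand-rolled checks like this.

```python
# Minimal attribute-based access control (ABAC) sketch.
# Attribute names here are hypothetical, chosen only to illustrate
# combining identity attributes with data-level security labels.

def is_authorized(subject: dict, resource: dict, action: str) -> bool:
    """Grant access only when the subject's attributes satisfy the resource's policy."""
    # Clearance must meet or exceed the data's sensitivity level.
    if subject["clearance"] < resource["sensitivity"]:
        return False
    # The subject's agency must appear on the resource's sharing list.
    if subject["agency"] not in resource["shared_with"]:
        return False
    # The requested action must be permitted for the subject's role.
    return action in resource["allowed_actions"].get(subject["role"], set())

analyst = {"clearance": 3, "agency": "DOD", "role": "analyst"}
report = {
    "sensitivity": 2,
    "shared_with": {"DOD", "DHS"},
    "allowed_actions": {"analyst": {"read"}, "admin": {"read", "write"}},
}

print(is_authorized(analyst, report, "read"))   # True: all attribute checks pass
print(is_authorized(analyst, report, "write"))  # False: analysts may only read
```

Because the policy travels with the data's security labels instead of the network perimeter, the same check works across disparate networks, which is what makes this approach attractive for the information-sharing barriers described above.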
Another conundrum we face is the difference between oversight and outcome. Continuous monitoring provides far more value than a point-in-time focus on certification and accreditation. And although we have long touted the value of reciprocity and the goal of “certify once, use many,” the adoption of cloud computing in the federal government provides a great example of a promising technology solution that is lagging in implementation.
It was great to see the recent press release from the Defense Information Systems Agency that highlighted 23 commercial cloud service offerings that had been granted provisional authorizations. Yet those proven offerings still require a DOD organization to conduct the assessment that would lead to an authority to operate — all for solutions that will not handle sensitive information and that have previously been granted a FedRAMP agency ATO or provisional authorization.
Those examples share two classic change management issues: the desire for personal control and a lack of trust. The processes we institute to address issues of trust and control must not take the place of what matters most: measurable outcomes that ensure mission results.
A world where we rally around a common goal of secure information sharing will be one where our security efforts help ensure the rapid adoption of new technologies and the ability to get the right information to the right person. Some laws, such as the Federal Information Security Management Act, must be changed, and new laws addressing liability and information sharing must be enacted.
But perhaps even more important than changing laws is changing attitudes to stay ahead of the threats we face and deliver the results we need.