IT Insecurity: Aggressive use of security solutions

To avoid massive data breaches in the future, the government must address its cumbersome acquisition process and misguided IT security practices.

Richard Spires

In my previous two columns, I described the three primary root causes that have led to the massive data breaches and compromises of core mission IT systems at multiple federal agencies, and I provided recommendations for addressing the first cause: a lack of IT management best practices. The remaining two root causes, which are the focus of this column, are misguided IT security practices and a slow and cumbersome acquisition process.

Regarding misguided IT security practices, to the government’s credit, there has been a fairly aggressive shift in thinking from the traditional Federal Information Security Management Act reporting approach to continuous monitoring of IT systems and the overall IT environment. I was also pleased to see that Congress passed much-needed reform in the FISMA Modernization Act of 2014, and I hope Congress will work closely with the executive branch to ensure that implementation delivers enhanced security.

Nevertheless, when I look at the current cross-agency priority (CAP) goals for cybersecurity, I believe the government is still lagging behind IT security best practices. For example, the CAP goals typically treat targets short of 100 percent as success, such as 95 percent for automated asset management or 75 percent for strong authentication.

Higher numbers are certainly better than lower ones in those metrics, but we are dealing with adversaries who are advanced and persistent — and who will almost certainly find the holes and exploit them. It is simply a matter of time.

Likewise, the Einstein system can aid agencies in detecting threats, and the promise of Einstein 3 Accelerated is the proactive blocking of malicious traffic. However, Einstein is only helpful if the traffic is actually going through the system. Many agencies have Internet connections that are not monitored by Einstein, and I posit that this is another example of poor IT management.

The government has invested hundreds of millions of dollars in the Einstein program, yet agencies continue to posture and delay implementation. In effect, these approaches have led the federal government to establish a virtual Maginot Line as its key IT security strategy.

Based on the current situation and what I see evolving in the cybersecurity industry, I recommend rethinking how we measure success, with a focus along three lines:

1. Enhance automated protection. There is without a doubt a continuing need to pursue cybersecurity tools that prevent intrusions and, perhaps even more important, detect them quickly when they do occur. The Einstein program identifies and protects against known "signatures" or characteristics of malicious activity, thereby preventing those intrusions; the sketch below illustrates the basic idea and its limits. However, more advanced protective capabilities are required to prevent intrusions the government is not yet aware of, thereby further reducing the government's attack surface.

With enhanced automated protection, network defenders could focus on detecting and remediating only the most sophisticated and potentially dangerous attacks, rather than trying to decide which of today's seemingly endless alerts to pursue.

The cybersecurity industry has made great strides in those areas in the past few years, and the government should be using the most advanced tools for prevention and detection that take advantage of threat intelligence from users all over the world.
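To make the limitation of signature-based protection concrete, here is a minimal, purely illustrative sketch in Python. The byte patterns and names are invented for this example and bear no relation to Einstein's actual indicators; the point is simply that anything not already on the list passes through unflagged.

```python
# Toy illustration of signature-based detection. The patterns below are
# invented for this example; real systems use far richer indicators.
KNOWN_SIGNATURES = {
    b"cmd.exe /c powershell -enc": "encoded-powershell-loader",
    b"../../etc/passwd":           "path-traversal-probe",
}

def match_signatures(payload: bytes) -> list[str]:
    """Return the names of any known signatures found in a traffic payload."""
    return [name for pattern, name in KNOWN_SIGNATURES.items() if pattern in payload]

# A known attack is flagged...
print(match_signatures(b"GET /../../etc/passwd HTTP/1.1"))        # ['path-traversal-probe']
# ...but a novel technique matches nothing, which is the gap that more
# advanced, behavior- and intelligence-driven protection must close.
print(match_signatures(b"GET /previously-unseen-payload HTTP/1.1"))  # []
```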

2. Fully establish and monitor trust. Even with the most advanced prevention tools, the government needs to assume that sophisticated adversaries will gain access. So alternative approaches are needed — particularly ones that rely on creating more trust in online interactions.

The root of all trust is verified identity, and in the online world, multifactor authentication is the key to establishing it; a simple illustration of one common second-factor mechanism appears at the end of this section. Many newly available technologies enable multifactor authentication for both internal (government) and external users, and some of those solutions can integrate with antiquated systems. However, the government needs to step back and rethink how it can rapidly make the use of multifactor identity authentication ubiquitous.

Even though the root of trust is identity, there is more to the equation. In the physical world, I trust other people because I have high confidence they will act in a manner that I expect. Some of the most damaging data breaches have come from individuals who were properly authenticated and authorized to use systems and access data, but their behavior was not in keeping with what was expected. This is commonly called the insider threat problem.

New technologies and capabilities can bring additional context, such as audit logs and behavioral analysis systems, to bear on assessing someone's trustworthiness on an ongoing basis. Those additional factors, beyond the ones used to verify identity, are essential to fully establishing and monitoring trust.
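What might ubiquitous multifactor authentication look like in practice? Below is a minimal, purely illustrative sketch of a time-based one-time password check of the kind used by many authenticator apps as a second factor (RFC 6238). It is a simplification, not a blueprint: real deployments also need secure secret provisioning, PIV and derived-credential integration, rate limiting and account recovery.

```python
# Minimal RFC 6238 (TOTP) check, the mechanism behind many second-factor
# authenticator apps. Illustrative only: secret provisioning, rate limiting
# and clock-drift handling are deliberately omitted.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Derive the current one-time code from a shared base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = struct.pack(">Q", int(time.time()) // interval)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

def verify_second_factor(secret_b32: str, submitted_code: str) -> bool:
    """A password alone is not enough; the user must also present this code."""
    return hmac.compare_digest(totp(secret_b32), submitted_code)
```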

3. Focus on protecting the most sensitive information. The government needs to target additional protection at an agency's most sensitive information, whether it takes the form of datasets or documents. Tools and products exist that enable agencies to protect that information independently of the potentially insecure environment in which it resides; the sketch below shows the basic idea.

Agencies should focus on their most valuable information. I recognize that the antiquated systems in which some of that information resides impose limitations, but by concentrating efforts there, the government could ensure that only trusted parties have access to its most sensitive data. That would go a long way toward thwarting further major and damaging data breaches.
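As a purely illustrative example of data-centric protection, the sketch below encrypts a record so that the file itself is useless without a key that would be released only to authenticated, trusted parties. It assumes the widely used third-party `cryptography` package; key management, the genuinely hard part, is out of scope here.

```python
# Data-centric protection in miniature: the sensitive record is encrypted, so
# stealing the file from a compromised system yields nothing without the key.
# Assumes the third-party `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice, held by a key service or HSM
protector = Fernet(key)

token = protector.encrypt(b"Most sensitive agency record")  # safe to store anywhere
record = protector.decrypt(token)                           # readable only with the key
```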

That brings me to the other remaining root cause: the slow and cumbersome acquisition process. It is difficult to implement state-of-the-art cybersecurity solutions if you have no way to rapidly evaluate them and then purchase or license them. The Continuous Diagnostics and Mitigation (CDM) program and Einstein could potentially serve as governmentwide vehicles for that process, but it has taken significant time to put them in place.

I recommend an approach that enables individual agencies to rapidly bring in solutions and try them in a test-bed environment. After thorough testing and based on what works best, agencies should be able to roll security solutions into production. That approach would ideally encompass both traditional cybersecurity vendors and new vendors that have little or no government experience; the latter are an incredible source of technical innovation.

The government is not getting the best solutions through the existing acquisition process. Therefore, I recommend that the Office of Federal Procurement Policy work with the General Services Administration and the Department of Homeland Security to put a more streamlined CDM program in place — one that would enable rapid addition of new capabilities as they become available in the commercial market.

The data breaches at the Office of Personnel Management are terrible for the government and for the millions of us who could be negatively affected in the future. Viewed through the right lens, however, the episode could be the impetus for much-needed and sustained change. And given the need to implement the Federal IT Acquisition Reform Act, the current administration has a golden opportunity to set the correct foundation for success. It is critical to make enough progress in the next 18 months to ensure that leadership commitment to FITARA, FISMA modernization and other needed changes in IT security is sustained into the next administration and Congress.