Ground rules for improving federal cybersecurity
- By Dave McClure
- Jun 09, 2015
Big-data analytics are gaining attention in the cyber world, and there is widespread recognition that government agencies must retreat from the current cut-and-paste approach to collecting threat information. Instead, there is real value in automating continuous monitoring and devoting more attention to critical analyses.
That shift has given rise to the application of predictive and behavioral analytics to all enterprise and external data in an effort to better evaluate threat potential, thereby increasing the likelihood of detecting attacks before they occur and gathering useful threat and vulnerability intelligence.
However, for many organizations, preventing all intrusions is a daunting if not impossible task. In fact, most testing suggests it is safer to assume a breach has already occurred, often without any near-real-time detection. Today, data security — at rest, in transit, in use — takes precedence over a systems mentality. Data sharing, driven by distributed computing environments and accelerated by the continuing explosion of end-user devices and capabilities introduced by the Internet of Things, has created a very challenging cyber environment.
As a result, organizations must address some basics that form the foundation of future cyber protection success:
1. With the push toward enterprise solutions, data governance needs critical attention. Without it, cybersecurity is handicapped at the outset. At a minimum, agencies must categorize data into master, shared and single-use bins. That is essential for building basic business and workflow processes that control proper access, usage, protection and accountability. Business process rules can help identify critical assets and whether those assets are being used in ways that could create damaging vulnerabilities.
2. We must focus on data and security architecture and engineering designs that make it difficult to gain access to key assets and that limit damage when cyber breaches are successful. As noted, protecting data can be complicated by moving apps to the end user and operating in an Internet of Things world. Data segmentation practices are paramount in a world in which vulnerabilities are a fact of life. Agencies must address a fundamental question: When an attacker gets inside a perimeter, what will that entry allow them to do?
3. Given the unprecedented rise in advanced persistent threats by internal and external actors, agencies should incorporate a “current compromise” assessment approach into core security measures. A good way to think about this is akin to using a hunt versus peck approach to vulnerability scanning and penetration testing. Compromise assessments use egress pattern matching and other techniques that can further isolate and identify the source of a compromise. In essence, you assess entry from an attacker’s perspective and use the same likely attack vectors.
Often a reverse, “inside-out” approach is deployed. Administrators say: “Here are my critical assets; let’s try to discover all the ways an outsider could get to them and then plug the dike.” However, it is more useful to search for business-critical assets and data that adversaries would seek during an actual breach. That method is stealthier than standard penetration testing.
Tools are available that evade and bypass normal system security protections. By emphasizing significant business and operational impacts, the approach is useful for drawing executive management’s attention to prioritized security solutions. And it is a bell-ringer for those who doubt whether security vulnerabilities really exist and put their operations at risk.
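One element of compromise assessment mentioned above, egress pattern matching, can be illustrated with a small sketch. The log format, thresholds and IP addresses below are purely hypothetical (they do not come from this article); the idea is that a destination contacted at suspiciously regular intervals can indicate malware "phoning home."

```python
from statistics import mean, pstdev

def flag_beacons(events, min_connections=5, max_jitter=0.15):
    """Flag destinations whose outbound connections arrive at near-constant
    intervals, a common beaconing signature. `events` is a list of
    (timestamp_seconds, destination) tuples taken from egress logs.
    Thresholds are illustrative, not operational guidance."""
    by_dest = {}
    for ts, dest in events:
        by_dest.setdefault(dest, []).append(ts)

    suspicious = []
    for dest, times in by_dest.items():
        times.sort()
        if len(times) < min_connections:
            continue  # too few connections to judge regularity
        intervals = [b - a for a, b in zip(times, times[1:])]
        avg = mean(intervals)
        # Low relative jitter means the connections are metronome-like.
        if avg > 0 and pstdev(intervals) / avg < max_jitter:
            suspicious.append(dest)
    return suspicious

# A host calling out every ~60 seconds stands out against sporadic traffic.
logs = [(t * 60.0, "203.0.113.9") for t in range(10)] + \
       [(5.0, "198.51.100.2"), (300.0, "198.51.100.2"), (310.0, "198.51.100.2")]
print(flag_beacons(logs))  # ['203.0.113.9']
```

Real compromise assessments draw on far richer telemetry (DNS, proxy and NetFlow data), but even this toy version shows why regularity, not volume, is the tell.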
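The data categorization described in point 1 can also be made concrete. A minimal sketch follows; the role names and policy table are hypothetical placeholders for whatever an agency's data governance process produces, not anything prescribed here.

```python
from enum import Enum

class DataCategory(Enum):
    MASTER = "master"          # authoritative records; tightly controlled writes
    SHARED = "shared"          # exchanged across systems; audited access
    SINGLE_USE = "single_use"  # transient, purpose-limited data

# Hypothetical policy table mapping each category to the roles allowed
# to read or write it. Real policies would come from governance rules.
POLICY = {
    DataCategory.MASTER:     {"read": {"steward", "admin"},
                              "write": {"steward"}},
    DataCategory.SHARED:     {"read": {"steward", "admin", "analyst"},
                              "write": {"steward", "admin"}},
    DataCategory.SINGLE_USE: {"read": {"steward", "admin", "analyst", "app"},
                              "write": {"app"}},
}

def is_allowed(role: str, action: str, category: DataCategory) -> bool:
    """Return True if the role may perform the action on this category."""
    return role in POLICY[category].get(action, set())

print(is_allowed("analyst", "write", DataCategory.MASTER))   # False
print(is_allowed("app", "write", DataCategory.SINGLE_USE))   # True
```

Once categories and rules exist as data rather than tribal knowledge, the accountability and usage checks the article calls for become queries against the policy table instead of manual reviews.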
Keeping a focus on these fundamentals can help strengthen an agency's overall security program and posture.
Dave McClure is chief strategist at Veris Group.