Rethinking TIC: 3 pitfalls to avoid
Unless we take a new approach, the Department of Homeland Security's plans to modernize the Trusted Internet Connections (TIC) program could repeat the same missteps that plagued the previous effort: namely, a lack of flexibility, long implementation times, and little or no integration with existing cloud security initiatives such as the Federal Risk and Authorization Management Program (FedRAMP) and the Continuous Diagnostics and Mitigation (CDM) program.
Cloud is already a major part of most federal agencies’ computing reality. Recent reports reveal that agencies now use 228 different cloud service providers, with two-thirds of those being Software-as-a-Service companies.
Yet TIC 3.0 may not be substantially different from TIC 2.2, which forced federal IT projects to move at the speed of government. It relied on customized, one-of-a-kind deployments that took years to implement and were typically obsolete before they went live.
Instead of reinventing a deficient wheel, TIC 3.0 should help federal agencies take advantage of existing cloud technologies that are already approved through FedRAMP and other cybersecurity programs. These programs speed up deployments and make the government more agile and more secure.
Let’s consider TIC 2.2’s shortcomings and how to avoid them going forward.
Lack of Flexibility
The existing TIC program has serious flaws that stem from the government's focus on securing appliances and networks at the perimeter. Determined to eliminate, or at least seriously reduce, back-door vulnerabilities, the government devised standards for monitoring and security across all departments and agencies.
However, TIC 2.2 did not properly address the government’s increasing reliance on the cloud and mobile computing. At the same time, it ignored the shift from network-based computing to application- and user-based computing in the cloud.
At present, TICs protect networks through a hub-and-spoke design that backhauls application and data traffic over dedicated wide area networks (WANs) to centralized gateways. This approach is costly, inflexible, and largely irrelevant in the world of cloud and mobile computing, and it unnecessarily increases the distance between users and their destinations.
Long Implementation Times
The government typically builds custom IT solutions when agencies need to use public infrastructures such as the internet, the cloud, and mobile applications. At a minimum, the traditional process takes two years — from defining the requirements to requesting and evaluating the bids to awarding a contract and actually deploying a solution.
In addition, this pursuit of custom solutions is self-defeating: it amounts to building workarounds to perfectly good commercially available offerings. Creating such workarounds is not only costly and time-consuming but ultimately doomed to failure, as cloud and mobile computing spawn innovations every few weeks.
Another Achilles heel of custom IT projects is that they simply cannot integrate smoothly or easily with existing technologies — many of them available as commercial cloud services — that are used by countless government departments and agencies.
The core question is: why would smart people in the federal government even try to create a custom cloud, when a commercial cloud service provider delivers cost-effective, secure, and scalable solutions through its platform?
TIC has outlived its usefulness. Instead, the federal government should:
Embrace the flexibility and the scalability of publicly available technologies from approved vendors. Trying to create expensive, custom alternatives that become obsolete within months is a waste of taxpayer money.
Shift its security focus from the network perimeter to applications and users. Protecting the perimeter is old hat and irrelevant in the cloud and mobile computing age.
Focus on extending the implementation of technologies, practices, and processes already approved by FedRAMP and CDM. Failure to do so could lead to institutional uncertainty, costly trial-and-error scenarios, and longer implementation times.
Ken Ammon is chief strategy officer for OPAQ Networks.