Some core building blocks of federal IT are primed for radical improvement.
Agency infrastructure is in for a serious overhaul. New technologies, architectures and frameworks have begun to assert themselves, leaving no core IT component untouched.
Storage, once a fairly stable sector from an innovation perspective, will see an explosion of flash-based devices and the arrival of technologies that will help agencies put data in the best possible storage tier.
Software-defined networks could shake up the networking status quo as the technology extends virtualization beyond servers and storage. In mobile app development, more rigorous and methodical approaches will become commonplace. And a new security framework for defending critical infrastructure will begin to take shape this year.
Although the full impact of those developments might not be felt for a few years, the leading edge is already moving into view. Read on for an overview of the key developments.
Storage tiering

IT managers face more storage options than ever, and traditional disk storage now operates alongside a variety of flash-based devices.
With the wealth of storage choices, the issue now becomes how to direct data to the optimal device from a cost and performance perspective. For example, low-cost, high-density disks such as Serial Advanced Technology Attachment (SATA) provide lower performance than other media but offer an efficient platform for storing data that is retrieved less frequently. More expensive flash storage — as deployed in solid-state drive (SSD) technology — offers higher performance for mission-critical applications with high input/output demands, but it is too expensive for archival purposes.
Against this backdrop, automated tiering technology has begun to emerge as a way to route data to the appropriate storage level. Some large IT shops have already adopted tiering, and its use will only grow in 2014 as more customers adopt flash storage and seek to use SSDs more efficiently. Automated tiering can help government agencies incorporate flash into storage arrays, where the new technology can co-exist with more familiar disk drives.
Jim Damoulakis, chief technology officer at GlassHouse Technologies, an IT infrastructure consulting firm with a specialization in storage, said automated tiering helps customers introduce solid-state storage in a way that is less obtrusive.
"It gives you the ability to leverage a relatively small amount of SSD, which is still significantly more costly than spinning disk," he said. "You can inject a little [solid state] and get significant performance gains from that."
Damoulakis said tiering can be deployed to establish three layers of storage: a fast SSD layer, a medium layer of Serial Attached SCSI (SAS) drives, and a slower, near-line layer of SATA drives. The automated tiering technology samples input/output activity to determine where data should reside, and tiering products let organizations control when sampling occurs; for example, it can run continually or only during peak work hours. Customers can also determine when data reallocation among tiers takes place.
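The sampling-and-placement logic Damoulakis describes can be sketched in a few lines. This is a minimal illustration of the idea, not any vendor's algorithm; the tier names, I/O thresholds, and block identifiers are hypothetical.

```python
from collections import Counter

# Hypothetical three-tier hierarchy: each tier is paired with the minimum
# I/O count (per sampling window) that qualifies data for that tier.
TIERS = [
    ("SSD", 100),   # hot data: 100+ I/Os in the window
    ("SAS", 10),    # warm data
    ("SATA", 0),    # cold, near-line data
]

def sample_io(trace):
    """Count I/O operations per block over one sampling window."""
    return Counter(trace)

def place(io_counts):
    """Assign each block to the fastest tier whose threshold it meets."""
    placement = {}
    for block, count in io_counts.items():
        for tier, threshold in TIERS:
            if count >= threshold:
                placement[block] = tier
                break
    return placement

# One sampling window of simulated block accesses.
trace = ["blk1"] * 150 + ["blk2"] * 40 + ["blk3"] * 3
print(place(sample_io(trace)))
# blk1 lands on SSD, blk2 on SAS, blk3 on SATA
```

Real products layer scheduling on top of this, as the article notes: administrators choose when the sampling window runs and when the resulting reallocation actually moves data.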
Vendors that offer tiering include EMC, whose Fully Automated Storage Tiering for Virtual Pools (FAST VP) is entering the federal acquisition pipeline. Rich Campbell, chief technologist at EMC’s federal division, said the company’s VNX storage line now incorporates tiering.
"It is a part of every bid and proposal today," Campbell said of automated tiering.
He added that he believes the technology will dovetail with the adoption of flash storage and the emergence of software-defined data centers in the next 12 to 18 months. In the software-defined data center context, tiering will let IT managers provision high-performance, middle-tier and lower-tier storage according to an application’s needs, he said.
"It’s made for the software-defined data center," he added.
Furthermore, the data allocation and tiering issue will become more complex as new storage media surface. David Hung-Chang Du, Qwest Chair Professor of Computer Science and Engineering at the University of Minnesota, pointed to two near-term developments that will populate an expanded memory and storage hierarchy: shingled write disks, which increase storage density by writing data to the platter in overlapping tracks, and non-volatile RAM technologies such as phase-change memory.
Du is working on a research project funded by the National Science Foundation that investigates effective data placement in this new storage landscape.
"There are a lot more things you have to look at," Du said. "I think this is only...the beginning."
Software-defined networking

Software-defined networking (SDN) promises to transform the way organizations manage networks. With a traditional network, administrators manually configure stand-alone pieces of networking equipment. SDN, on the other hand, lets administrators centrally program networking devices such as switches through a software layer. In effect, SDN brings virtualization to the network by following established virtualization trends at the server, storage and desktop levels.
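The central-control model behind SDN can be illustrated with a toy sketch: one controller holds the policy and programs every switch's flow table, rather than each device being configured by hand. The class names, rule format and "match/action" fields below are simplified stand-ins, not a real protocol such as OpenFlow.

```python
class Switch:
    """A network switch whose forwarding behavior is set by the controller."""
    def __init__(self, name):
        self.name = name
        self.flow_table = []   # rules installed centrally, not by hand

    def install(self, rule):
        self.flow_table.append(rule)

class Controller:
    """Central software layer that programs every switch in one place."""
    def __init__(self, switches):
        self.switches = switches

    def push_policy(self, match, action):
        rule = {"match": match, "action": action}
        for sw in self.switches:
            sw.install(rule)

# One policy change propagates to the whole network at once.
switches = [Switch("edge1"), Switch("edge2")]
ctrl = Controller(switches)
ctrl.push_policy(match={"dst_port": 80}, action="forward:web_vlan")
print(switches[0].flow_table)
```

The contrast with the traditional approach is the point: in the manual model, each `Switch` would be configured separately, with all the drift and inconsistency that implies.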
Kelly Herrell, vice president and general manager of software networking at Brocade, said virtualized computing and the cloud are driving SDN.
"When the compute application model changes, then the network changes to adapt to it," he said.
Government agencies might be getting an early taste of SDN, even though mainstream adoption is probably a couple of years away. Internet2, a university-led advanced technology community, has launched a nationwide 100 gigabit network that uses SDN. About 70 government agencies participate in Internet2, and notable early network users include the NSF-funded Extreme Science and Engineering Discovery Environment (XSEDE).
XSEDE is a project that links 17 supercomputers and taps the Internet2 network as the backbone connecting its high-performance computing centers. The organization had been using 10 gigabit Ethernet before the migration.
"They are now on a 100 gigabit backbone instead of a 10 gigabit network," said Rob Vietzke, vice president of network services at Internet2. "The SDN piece is under it. They get the future opportunity...to use the SDN substrate."
The shape of SDN applications, meanwhile, should come into sharper focus later this year. Internet2 and industry partners recently announced an Innovative Application Awards program to encourage the creation of open-source apps that use the Internet2 network’s SDN capabilities. The competition emphasizes apps that improve large file transfers, said Eric Boyd, senior director of strategic projects at Internet2.
Boyd said Internet2’s plan is to work with a select group of application developers during the summer and have software ready by fall.
"We can start to really demonstrate a core base of applications that you can point to [as] the types of applications that will use SDN going forward," he said.
Sudhir Verma, chief services officer at IT solutions provider Force 3, said widespread adoption of SDN won’t happen overnight, but the technology will eventually follow other data center trends in the federal space. "Federal folks...aren’t jumping on this bandwagon right away," he said. "But as data center consolidation, cloud and virtualization...gain momentum, SDN is not that far behind."
SDN adoption will probably proceed incrementally rather than all at once, industry executives say. Herrell said network functions virtualization represents one of the first waves of SDN. With NFV, IT shops can run a particular network function as a virtual machine rather than as a physical device.
"It is a very easy way to begin moving into the SDN arena because you don’t have to re-architect everything," Herrell said.
Joe Brown, president of virtualization provider Accelera Solutions, said SDN adoption is still at an early stage, but he anticipates more activity in 2014, particularly among organizations seeking to boost the sophistication of their existing virtualization deployments. In that vein, SDN provides a step on the path toward the software-defined data center, he added.
Mobile tech development
Mobile app development activities tend to be dispersed throughout an organization, creating the potential for redundancy, inconsistent look and feel, and uneven quality. The coming months, however, will likely see greater coordination of mobile software creation.
Some agencies have already taken steps in that direction. The Department of Veterans Affairs is establishing a governing board that will serve as a single, VA-wide source of guidance on mobile development. The board is being set up primarily at VA’s Veterans Health Administration, with the participation of the Veterans Benefits Administration and the National Cemetery Administration.
Dave Peters, an assistant deputy CIO at VA, said the board functions informally for now but is on track for formal approval in the next month or so.
He said it will serve as a single point of intake for projects and will review proposals for potential projects. In addition, it will define mobile development policies that do not fall within the purview of other parts of the organization, such as information security.
"Anyone can submit an idea, and the intent is for the board to decide whether the idea is a good one and whether or not development should proceed," Peters said.
In 2014, the board might take on additional tasks. For instance, Peters said it would be ideal for the board to establish a consistent look and feel across the organization's apps.
Tim Hoechst, chief technology officer at Agilex, an IT solutions provider with an enterprise mobility focus, said more agencies are recognizing the need for greater formality in mobile development.
"The development of the app is as distributed as the running of apps in a mobile environment," he said. "The IT department is saying, 'How do we do this in an orchestrated way?'"
Hoechst said agencies can take a number of steps to provide that orchestration. Those measures include consistent access to enterprise data via secure services, app-building templates to ensure consistent aesthetics and usability, and automated tools for activities such as unit and integration testing. Development approaches such as agile and DevOps can also boost an organization’s mobile app maturity.
Critical infrastructure security

The next seven months will see the unfolding of a cybersecurity framework that identifies security standards and guidelines spanning a range of critical infrastructure sectors, such as energy and transportation. The National Institute of Standards and Technology (NIST) is spearheading the effort.
The framework’s details will emerge out of a series of workshops in the next few months. A preliminary framework is slated for publication in October, with the final product expected by February 2014. The project and timeline stem from Executive Order 13636, which sets policies for protecting critical infrastructure from cyberattacks.
"It is a Herculean effort NIST is engaging in here," said Rick Comeau, executive director of the Security Benchmarks division at the Center for Internet Security.
Comeau cited the framework’s need to account for differences in how critical infrastructure sectors operate while also addressing the regulatory compliance demands each sector faces.
"We are asking for a framework that represents a great diversity of critical infrastructure sectors," said NIST Director Patrick Gallagher at a cybersecurity framework workshop in May.
He added that companies ranging from small utilities to large multinationals will need to implement the framework, and he hopes the government will also take full advantage of it.
NIST plans to incorporate existing security standards into the framework rather than invent new ones. Comeau said the SANS Institute’s 20 Critical Security Controls for Effective Cyber Defense could help shape NIST’s initiative. He also cited the IEC 62443 standard for industrial automation and control systems as potentially playing a role in the framework.
Rahul Kashyap, chief security architect at Bromium, a developer of software solutions for endpoint security, said the scope of critical infrastructure extends beyond government and industry to anyone who uses an Internet-connected device to access sensitive information.
"It’s been proven many a time that adversaries have infiltrated into sensitive environments by breaching human trust and attacking underlying vulnerabilities in the applications people use," Kashyap said.