The Federal Data Strategy demands the right infrastructure
- By Nick Psaki
- Aug 02, 2019
When the long-awaited Federal Data Strategy was released earlier this spring, it was encouraging to see guidance that reaffirmed putting data at the heart of decision-making. The strategy aims to help agencies deliver on the promise of data in the 21st century, and the practices it outlines reflect the importance of aligning data stewardship with data usage and of building a culture that values data and promotes public use. It creates a framework for governing, managing and protecting data, and for promoting efficient and appropriate data use. By making data strategy a priority, the federal government will be able to better leverage data assets and more strategically execute data management improvements.
The strategy emphasizes the importance of putting data first and providing transparency and equitable use at all levels. The federal government holds petabytes upon petabytes of data, with more created every moment. And the potential uses of that data extend well beyond federal agencies themselves: non-federal users, from academics to businesses to private citizens, also stand to benefit greatly.
However, for agencies to implement this data-first strategy, the right infrastructure must be in place -- and legacy data service platforms were not built for today's goals. Today's federal IT infrastructure wasn't built from scratch to implement a cohesive, cross-government data strategy. It was built over time, usually program by program, and as a result it is fragmented and not very agile. To meet the strategy's goals, agencies must focus on creating a data-centric infrastructure, one with five key attributes:
- It should deliver fast, shared data. Applications built today simply cannot live in the world of "slow" -- i.e., spinning disks or, worse yet, tapes. Modern data should live on flash, and the storage should be built from day one to be shared, because tomorrow's applications will require that.
- It should be on-demand and automated. Federal data service architectures should put this model at their core: standardized, on-demand consumption and automated delivery, which accelerate innovation and reduce costs.
- It should be exceptionally reliable and secure. This is a must for any infrastructure that houses and moves sensitive and protected data.
- It should be hybrid by design. Tomorrow's storage should allow easy movement of data volumes to and from the cloud, simplifying application and data migration while also enabling hybrid use cases for application development, deployment and protection.
- And finally, it should be constantly evolving and improving. Users expect the cloud to continuously get better, without downtime, delivering more value every year for the same or lower cost. Government IT leaders must architect for this constant improvement so they can upgrade data services infrastructure without ever taking users offline.
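The on-demand and hybrid attributes above can be illustrated with a short sketch. Everything here is hypothetical: `StorageService`, `provision` and `migrate_to_cloud` are toy stand-ins invented for illustration, not any real vendor or agency API. The point is the shape of the model -- one standardized request for every program, so delivery can be automated rather than handled ticket by ticket, and placement (flash vs. cloud) can change without applications having to change how they address the data.

```python
# Illustrative sketch only: StorageService and its methods are hypothetical,
# not a real product API.
from dataclasses import dataclass


@dataclass
class Volume:
    name: str
    size_gb: int
    tier: str = "flash"   # fast by default: data lives on flash
    shared: bool = True   # built from day one to be shared


class StorageService:
    """Toy stand-in for an automated, self-service data catalog."""

    def __init__(self):
        self.volumes = {}

    def provision(self, name, size_gb, tier="flash"):
        # Standardized request: the same call shape for every program,
        # which is what makes automated delivery possible.
        vol = Volume(name=name, size_gb=size_gb, tier=tier)
        self.volumes[name] = vol
        return vol

    def migrate_to_cloud(self, name):
        # Hybrid by design: change where the volume lives without
        # changing how applications refer to it.
        vol = self.volumes[name]
        vol.tier = "cloud"
        return vol


svc = StorageService()
vol = svc.provision("census-analytics", size_gb=500)  # on-demand
svc.migrate_to_cloud("census-analytics")              # hybrid movement
print(vol.name, vol.tier)  # census-analytics cloud
```

In a real architecture the catalog would sit behind an API gateway and enforce policy (quotas, encryption, classification) on every request, but the contract -- standardized request in, automated delivery out -- is the same.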
The guidance laid out in the strategy is sweeping and aggressive, but achievable. Progress at many agencies is well underway -- and this strategy only further confirms what many already knew. Data needs to be at the heart of decision-making, and the Office of Management and Budget is taking the necessary steps to push agencies down the right path. It's now up to those agencies to plan and build accordingly.
Nick Psaki is principal system engineer with Pure Storage.