Business decision logic is a type of data, and it's time agencies started treating it that way.
Tradition is good, but efficiency is better. In an annual report released in April, the Government Accountability Office examined fragmentation, overlap and duplication among government programs and identified 440 actions that agencies and Congress could take to improve operational efficiency and effectiveness.
Similarly, the Office of Management and Budget's Digital Government Strategy strives to improve IT efficiency and effectiveness for the American people. A key tenet of the strategy is ensuring that data and content are accurate, available and secure. The strategy further emphasizes the need to treat all content as data.
Experience shows that a major cause of IT inefficiency is the continual rebuilding of hard-coded, decision-based systems. Business decision logic is a type of data, but unlike traditional data elements that are stored and managed in databases, it is typically hard-coded into software.
Modifying software to reflect changes in business decision logic is costly, cumbersome and slow. Yet hard-coded software systems dominate government IT.
Those systems are largely developed by third parties under large, complex and risky contracts with lengthy software development life cycles. And until recently, hard-coding decision logic was the only option.
Moreover, compartmentalized agencies have traditionally lacked the incentive to coordinate system investments enterprisewide. As a result, government systems are often overlapping, fragmented or duplicative.
The trend toward standardization
A contrasting approach exists that would reduce operating costs, speed responses and improve accuracy while empowering internal analysts and experts. Government agencies would rely on those internal decision-makers to centrally govern decision logic, with minimal technology labor. The need to continually rebuild hard-coded, decision-based systems would diminish. This promising alternative is known as decision modeling.
An interim step on the way to true decision modeling implementation might be rules engines, which could resolve some technical challenges by doing away with the hard-coding paradigm. However, rules engines without decision models would do little to reduce the developer costs associated with continual software rebuilds. Moreover, decision models would not replace rules engines because the two are complementary. In fact, decision models are easy to automate in today's rules engines, so those models increase the value of rules engines. (A decision model created by a business analyst can flow directly into rules engine code through a new, agile life cycle with minimal IT intervention.)
Two decision modeling frameworks exist. The Decision Model, invented by Barbara von Halle and Larry Goldberg in 2011, has been successfully adopted by insurance and banking firms, and continues to spread throughout the financial industry.
The Decision Model and Notation standard, published by Object Management Group in 2014, enables organizations to access and share centralized business decisions using a common tabular format. A short list of vendors on the group's DMN committee includes IBM, Oracle and FICO. Von Halle and Goldberg were also key contributors to the specification.
Both models are suitable for government, and both exemplify the trend toward standardization of decision management.
The benefits of decision modeling
Decision modeling extracts complex business logic from software systems and allows internal business experts to manage the logic in a central repository. Business decision tables are two-dimensional and organized into simple conditional statements that result in a single conclusion. The tables are managed in a structured repository and are intuitive to maintain as the underlying policies and regulations change.
Most important, the logic in decision models is expressed in business-friendly (not technical) terms that are defined by business people and linked behind the scenes by technical people to actual data sources. That approach has proven invaluable. It means decision models are a truly technology-agnostic, business-aware deliverable. It means the same decision model can operate against more than one data source without any changes. And it means a data source can be replaced with a new one without making any changes to existing decision models.
In short, decision models are independent of data sources and independent of target technology. They are purely business driven and deploy anywhere and to many places, if need be.
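The decision-table idea described above can be sketched as plain data plus a small evaluator. The following is a minimal, hypothetical illustration: the eligibility rules, income thresholds and household sizes are invented for the example and do not come from any agency's actual policy or from the DMN standard's syntax. The key point it demonstrates is that when policy changes, analysts edit the table (data), not the code.

```python
# Hypothetical decision table: each row pairs simple conditions with a
# single conclusion, mirroring the two-dimensional tables described above.
# Rows are evaluated in order; the first row whose conditions all hold wins.
ELIGIBILITY_TABLE = [
    # ({condition terms}, conclusion)
    ({"max_income": 25000, "min_household": 1}, "full"),
    ({"max_income": 40000, "min_household": 3}, "partial"),
]

def evaluate(table, case):
    """Return the conclusion of the first matching row, in business terms."""
    for conditions, conclusion in table:
        if (case["income"] <= conditions["max_income"]
                and case["household"] >= conditions["min_household"]):
            return conclusion
    return "ineligible"  # default conclusion when no row matches

print(evaluate(ELIGIBILITY_TABLE, {"income": 20000, "household": 2}))  # full
```

Because the table is data rather than code, a policy change (say, raising an income threshold) is a one-line edit to the table by a business analyst rather than a software rebuild.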
Rob Lux, Freddie Mac's executive vice president and CIO, wrote in a 2013 blog post that, by using a decision model, it took Freddie Mac "only 17 days to write, test and deploy the 100-plus rule changes comprising Hurricane Sandy disaster relief policies for the systems lenders use to sell and service Freddie Mac mortgages. This is about 90 percent less time than it took to operationalize policy changes following disasters like Hurricane Katrina or the 2012 New England floods."
Among other things, decision modeling:
- Allows decision logic to become a managed asset, like other forms of data.
- Strengthens stewardship over decisions by internal business analysts.
- Speeds responses to continually changing policies and regulations.
- Frees up otherwise fixed program costs.
- Shortens software development cycles and yields far fewer errors.
- Supplants monolithic systems.
- Reduces the complexity and number of IT contracts and the dependence on third-party labor.
To realize the potential of decision modeling, the federal government could establish a governmentwide pilot project that would entail modeling a subset of business decision logic pertaining to a topic area subject to federal regulation, such as telecommunications, patents, acquisitions, environmental issues or taxes.
Then the government could model the chosen set of regulations in simple decision tables, in accordance with the Decision Model or DMN, and load the connected decision tables containing the regulations into a web API tool to make them centrally available and systematically accessible. The Digital Government Strategy encourages the use of web APIs to make "data assets freely available for use within agencies, between agencies, in the private sector or by citizens."
Such an architecture would allow internal business analysts to update the regulations in real time as changes occur. Upon successful adoption of the new decision modeling paradigm, legacy hard-coded systems could eventually be phased out.
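As a rough sketch of how such an API layer might work, the following keeps decision tables in a central store behind a JSON request handler that a web framework's POST endpoint would call. All of the names (DECISION_TABLES, benefit_tier, handle_request) and the rules themselves are illustrative assumptions, not drawn from the Digital Government Strategy, DMN or any real system.

```python
import json

# Central, governed store of decision tables (a module-level dict stands in
# for a real repository). Analysts update rows; callers submit cases.
DECISION_TABLES = {
    "benefit_tier": [
        # ((max income, min household size), conclusion) -- first match wins
        ((25000, 1), "full"),
        ((40000, 3), "partial"),
    ],
}

def decide(table_name, case):
    """Evaluate a case against the named table; first matching row wins."""
    for (max_income, min_household), tier in DECISION_TABLES[table_name]:
        if case["income"] <= max_income and case["household"] >= min_household:
            return tier
    return "ineligible"

def handle_request(body):
    """What a web framework's POST handler would invoke: JSON in, JSON out."""
    req = json.loads(body)
    return json.dumps({"decision": decide(req["table"], req["case"])})

def publish_rule_change(table_name, new_rows):
    """Analysts swap in updated rows in real time; no rebuild or redeploy."""
    DECISION_TABLES[table_name] = new_rows
```

The design choice worth noting is the separation of concerns: the API and evaluator are stable code, while the tables are data that business stewards can change without touching the software release cycle.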