McCall: New look at an old technology
Demanding information needs call for more automated business processing
- By Daniel McCall
- May 22, 2006
Government information technology supports citizen needs. Corporate IT supports business needs. Government is in the business of service while corporations are in the business of profit. Yet even the best-case examples of application processing from both worlds share similar inefficiencies, including latency and poor visibility into processes and outcomes. As information systems have become more complex, they have grown less reliable because of bottlenecks and stand-alone business functions.
Such systems work, but they are slow and prone to errors — mainly because they require some manual intervention. Whenever a business process includes manual steps, errors and latency follow, and poor visibility into business processing makes it difficult to prevent bottlenecks and system failures.
But emerging trends such as business process management and service-oriented architecture fuel expectations that real-time information at an enterprise level is a realistic goal.
Not all processes need to yield real-time information. Most business functions still require only batch or near-real-time processing. Organizations that optimize IT by reducing manual handoffs and centralizing enterprisewide batch processing gain immediate business value and add flexibility to their IT infrastructures.
Federal agencies have many independent functions and thousands of business applications. They support IT backbones that must instantly respond to workload surges without buckling, especially during unpredictable events, such as natural or man-made disasters.
Human IT operators are good, but they’re unable to react quickly enough to such surges, and they can introduce errors, particularly in high-pressure situations. Only an automated IT infrastructure can adjust processing priorities.
Batch process schedules have traditionally used date- and time-based triggers to launch jobs. In an agencywide or global environment, those triggers are inadequate for critical business processes. Many agencies and businesses need the ability to adjust to dynamic events. Existing processes typically rely on custom scripting for integration and automation.
Although any functionality can theoretically be scripted into a process, hard-coded solutions support only static operations and offer minimal visibility into the execution of those processes.
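The contrast between time-based triggers and event-based triggers can be made concrete with a small sketch. The code below is purely illustrative — the class and event names are hypothetical, not any vendor's API — but it shows the idea: rather than waiting for a fixed nightly window, a job registered against a business event launches the moment that event is observed, and a completion log preserves visibility into what ran and why.

```python
import queue

class EventDrivenScheduler:
    """Launches registered jobs as soon as their triggering event arrives,
    instead of at a fixed date/time. Illustrative sketch only."""

    def __init__(self):
        self._jobs = {}          # event name -> job callable
        self._events = queue.Queue()
        self.completed = []      # (event, result) log for process visibility

    def register(self, event_name, job):
        self._jobs[event_name] = job

    def publish(self, event_name, payload=None):
        # Could be called by a file-watcher, message listener, etc.
        self._events.put((event_name, payload))

    def run_pending(self):
        """Drain queued events, running each matching job immediately."""
        while not self._events.empty():
            event_name, payload = self._events.get()
            job = self._jobs.get(event_name)
            if job:
                self.completed.append((event_name, job(payload)))

def load_claims_batch(payload):
    # Hypothetical business job: process an incoming claims file.
    return f"processed {payload}"

scheduler = EventDrivenScheduler()
scheduler.register("claims_file_arrived", load_claims_batch)

# The job starts when the event is published, not at a 2 a.m. window.
scheduler.publish("claims_file_arrived", "claims_2006_05_22.dat")
scheduler.run_pending()
```

A time-based scheduler would instead compare the clock against a cron-style expression; the event-driven version reacts to workload as it actually arrives, which is the responsiveness the column argues for.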
A central view of application processing is important for efficient IT operations in government and corporate environments. Because the majority of business processes touch many applications and platforms, the “fire and forget” method creates islands of automation that function without integration or coordination. The result is errors and potential system downtime, process latency, and expensive integration scripting.
Achieving an agile IT infrastructure is no easy task. Automating business processes and integrating disparate platforms from a central location is a step in the right direction. We live in an unpredictable world, but unpredictable business application processing doesn’t need to be part of it. Scalable, flexible infrastructure tools that monitor and manage systemwide processes offer significant gains in IT reliability and service levels, while providing the agility needed to manage increasing business complexity and respond to random events.
Less latency and downtime in government IT means better service for those depending on the systems.
McCall is vice president and chief operating officer at AppWorx, which makes business process automation software.