Not a model situation

Building an enterprise architecture (EA) is a difficult process at the best of times, but of all the thorny issues that have to be tackled, one of the prickliest is how to select the software tools that will be used to model the architecture.

A small agency building an EA may not have any problems, because one set of compatible EA tools can be used to model the organization's entire architecture. This ensures a homogeneous picture of all of the agency computer systems, datasets and business processes — the core ingredients of an EA.

A larger agency composed of a number of different organizations could be in a tougher situation, because each of these organizations will likely have its own modeling tools. These could range from standard relational databases to specialized object-oriented EA tools sold by a number of companies.

It's the job of the enterprise architect to pull all of these partial views of the enterprise into a single, coherent view that can help decision-makers find the best lineup of information technology systems to support the agency's business mission, the whole reason for doing an EA. But that's when headaches begin, given the inherent conflicts between incompatible tools and the inability of one tool to accept models created by another.

Standardization Struggles

The federal CIO Council's "A Practical Guide to Federal Enterprise Architecture" says tool standardization is a "recommended best practice" for ensuring cost effectiveness and the interoperability of models. Most enterprise architects would agree with this idea, but it's often not possible.

"Where there are specific incompatibility problems, tool standardization can help," said Jim Wilson, a software architect who leads the EA effort at the Treasury Department's Office of Thrift Supervision. "But the sphere prescribed by a standard is important. A tool that delivers value in one EA environment may not be valuable, or worse, in another."

There is probably no one tool that works well for all organizations, he said.

Even if one tool were an option, there's no guarantee it would be used in a consistent way, according to Carl Creager, the associate chief information officer for EA at the Transportation Department, which he described as comprising "12 or 13 fairly autonomous operating administrations."

"Even if everyone is using one tool, they would still probably be using it to do their own thing," he said. If the tool is being used for a low-risk business process, "we probably won't make a big deal of standardization. But if it does get to be a big deal, then that will make an argument for standardization."

However, more than just technical considerations have to be taken into account, Creager pointed out, because politically, officials need to maintain the autonomy of the administrations.

Given the number of different tools that need to co-exist within an EA, the question then is how to accommodate the differences in what the tools produce. This comes down to adopting methods that ensure the best exchange of information and models among the tools.

For this, you need a good knowledge of the application program interfaces (APIs) of the various tools, said Abe Meilich, a certified systems architect with Lockheed Martin Mission Systems, a major government contractor. Tools that have a fairly complete set of APIs make the exchange of information a lot easier. Otherwise, you have to write code to create "bridges" between tools.

"In most cases, you try to choose tools that have open APIs and nominal export capabilities," he said. "XMI [Extensible Markup Language Metadata Interchange], for example, is becoming a standard we can use to do this."

XMI is an emerging industry standard for defining and sharing metamodels, the higher-level descriptions of the models in an EA (see box, "Pulling it all together"). If a tool is XMI-compliant, it uses XML to output information, which can then be accepted relatively easily by another tool that is also XMI-compliant.
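Because XMI is XML-based, even a tool without a built-in import for another vendor's format can consume an XMI export through a small bridge. The sketch below is a toy illustration, not any real tool's API: the element names and the simplified fragment are hypothetical, while real XMI exports use the namespaces defined in the OMG specification.

```python
# Minimal sketch of an XMI "bridge": parse a hypothetical XMI export with
# Python's standard XML library and pull out the model elements so another
# tool could import them. Element names here are simplified for illustration.
import xml.etree.ElementTree as ET

# Hypothetical fragment of an XMI export (real exports carry XMI/UML namespaces).
XMI_EXPORT = """
<XMI xmi.version="1.2">
  <XMI.content>
    <Model name="PayrollSystem">
      <Class name="Employee"/>
      <Class name="PayCheck"/>
    </Model>
  </XMI.content>
</XMI>
"""

def extract_classes(xmi_text):
    """Return the class names found in an XMI export."""
    root = ET.fromstring(xmi_text)
    return [cls.get("name") for cls in root.iter("Class")]

print(extract_classes(XMI_EXPORT))  # ['Employee', 'PayCheck']
```

The point is that the receiving side only needs a generic XML parser, not knowledge of the exporting tool's internal format.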

When to choose tools is probably as important as, if not more important than, which tools to choose. For example, organizations with a number of legacy tools already in place may be tempted to define an EA around what those tools can do, or to pick new tools first and let them shape the EA.

Dealing With Incompatibilities

What should come first, said Jan Popkin, chief executive officer of EA vendor Popkin Software, is defining or choosing a framework that will provide a checklist of what has to be covered within the EA. This then leads to a discussion about the languages and methodologies that are needed to create the models required by the framework. Only then should the actual selection of tools come into play.

In effect, some standardization of tools is enforced by the need to tie the models produced by these tools into the broader framework requirements.

Ptech Inc., another EA vendor, deals with the need to manage tool incompatibilities by stressing tool flexibility. The information and relationships that make up the metamodels are presented in Ptech's tools in the form of diagrams that can be manipulated as users are building their metamodels.

"In other words, it's not a static way of showing information," said James Cerrato, Ptech's chief product officer. "You can determine what information you need to bring out to match up between our tools and others. And then we have extensions [that] can be used to introduce new types of information and relationships, which also allows information from other tools to be incorporated" into Ptech's tools.

Other suggestions for overcoming tool incompatibilities focus on standardizing at the level of the underlying data rather than the tools themselves. If organizations were required to publish data derived from the tools in a standard form, this would eliminate the interoperability problems that now exist, because one tool would readily accept the data from another.

However, to do that, the various parts of an organization would have to produce their data in that standard form regardless of how and why they use the data. Also, the cost of modifying existing systems to support the new format might well exceed the anticipated benefits of conformance.

Steve Hunter, president and chief technology officer of EA tools vendor Agilense Inc., believes the answer is to introduce standardization at the level of the metamodel. If tool vendors had to incorporate a standard model of an architecture, enterprise architects would no longer have to waste time mapping models among different tools.

"The proprietary models that vendors include in their tools now are a big selling point and lock their customers into using their tools," Hunter said. "But [vendors] are beginning to understand that it may be to their longer-term advantage to agree on standard models, since it would offer all of them the chance for a bigger share of the pie."

The question is what those standard models should be. In its solutions, Agilense provides industry standard models from the Object Management Group Inc. and other standards organizations, and in that sense, it has made its solutions "modeling tool-neutral."

Following this approach, the government now has the chance to define its own standard models according to its own needs, Hunter said. If vendors were then required to incorporate those models into their own tools, there would no longer be a problem of interoperability.


Pulling it all together

The first step in building an enterprise architecture (EA) is to define or select a framework, which is the logical structure for classifying and organizing information about computer systems and business processes for the enterprise. The next move is to create models that represent the enterprise's operations, information, relationships and constraints.

A metamodel is an overview of the different types of models in the enterprise; it contains the organization's metadata, which is an integrated view of the entire information infrastructure. To put together an EA, the architect must combine the models into a single repository, an information system that's used to store and access architectural information and the various relationships among informational elements. It's here that problems arise if different vendors' tools produce models in formats that are incompatible with one another.
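The repository step above is where incompatibilities surface: two tools may describe the same architectural element in conflicting ways. The following toy sketch, which is not any vendor's product and uses invented element and tool names, shows one way a merge into a shared repository could record each element's sources and flag conflicts.

```python
# Toy illustration of combining models from different tools into one
# repository keyed by element name, flagging conflicts where two tools
# describe the same element differently. Names are hypothetical.
def merge_into_repository(repository, tool_name, model):
    """Merge one tool's model (element name -> properties) into the repository.

    Returns the list of element names whose properties conflict with
    what an earlier tool already recorded.
    """
    conflicts = []
    for element, properties in model.items():
        existing = repository.get(element)
        if existing is not None and existing["properties"] != properties:
            conflicts.append(element)
        repository.setdefault(element, {"properties": properties, "sources": []})
        repository[element]["sources"].append(tool_name)
    return conflicts

repo = {}
merge_into_repository(repo, "ToolA", {"Employee": {"type": "class"}})
conflicts = merge_into_repository(repo, "ToolB", {"Employee": {"type": "entity"}})
print(conflicts)  # ['Employee']
```

Real EA repositories are far richer, but the same idea applies: without a shared metamodel, every merge requires this kind of reconciliation logic.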

Robinson is a freelance journalist based in Portland, Ore. He can be reached at [email protected].

