Eye on IT advances
- By John Moore
- Jan 11, 2004
Web services
Why you should care
Think of it as a coming-of-age party for Web services, which some experts believe are integral to the future of online applications.
Look for consolidation among companies that develop Web services, along with a shakeout of rival Web services standards.
Enterprise content management
Why you should care
What good is storing terabytes of data if you can't find what you need?
Enterprise content management software, the programming that keeps tabs on stored data, is gaining momentum. Deploying such a system, however, is no small feat.
Grid computing
Why you should care
Grid-based commercial tech could sate the data demands of high-end customers.
Oracle Corp. is leading the way, shipping a grid-friendly database, and others are following.
Security
Why you should care
It's the end of the search for a magic bullet to stop cyberattacks.
Systems now offer multiple approaches to preventing intrusions. A well-stocked arsenal might include the latest in cryptography.
Wireless
Why you should care
Imagine a world in which wireless systems (handheld devices, laptops, cell phones and the like) are not islands of data but conduits of information.
One concept is the meta interface, which enables cross-talk among different modes of communication.
Robotics
Why you should care
The Industrial Revolution meets the Information Age, taking people out of the loop of dangerous and repetitive jobs.
Improvements in sensor technologies and on-board data processing capabilities are making robots more versatile.
Listening for a buzz
Here's one sign that the information technology industry is beginning to recover from the economic slump: Buzzwords are starting to flow again.
Service-oriented architectures, grid computing and enterprise content management are just a sampling of the technologies vying for attention. IT managers may also encounter information life cycle management and intrusion resilience as they scan the tech horizon.
The key task is separating the winners from the wannabes, which is all the more treacherous with budget constraints and managers' fixation on here-and-now returns. This article offers a few hints on where to look for innovation this year and highlights technologies that will reach development or commercialization milestones in 2004. We also provide a glimpse of technologies scheduled to roll out in 2005 to 2007.
Linux pumps iron
Linux spent last year pumping iron and will have more muscle this year.
The latest Linux kernel, Linux 2.6, provides a scalability boost that will let customers run the open-source operating system on larger servers. Linux 2.6 can scale to 32-processor machines and beyond, according to industry executives. That's a far cry from Linux's humble single-processor debut in 1994.
"Linux is getting better every year," said Jonathan Eunice, president and principal analyst at Illuminata Inc., a technology research firm based in Nashua, N.H. The level of scalability will let Linux's 2.6 generation address 99 percent of potential application loads, he added. The previous iteration could handle perhaps 80 percent.
The Open Source Development Laboratory, which promotes use of Linux, contends that the new kernel will open more markets in which Linux becomes an option for replacing legacy systems, noting that Linux 2.6 is ready for production use on 32-way systems.
Improved scalability will translate into greater confidence in Linux, Eunice said. "This gives customers what they always want, which is headroom," he said. "It's not so much that everyone will deploy on 16- or 32-processor Linux servers, it's that they can."
He said he expects customers to gradually entrust larger and larger loads to Linux. Scalability is especially important for federal customers who have been tapping Linux servers for scientific and technical computing.
Linux 2.6 is available for downloading, but federal IT managers will probably not feel much impact until the kernel shows up in enterprise-class Linux distributions later this year. SuSE Linux Inc., recently acquired by Novell Inc., plans to tap Linux 2.6 for its SuSE Linux Enterprise Server 9, which is scheduled to debut this spring. Linux 2.6 also will surface in Red Hat Inc.'s Enterprise Linux 4.0, which will arrive in the fall.
Web services shake out
The Web services field could be in for a shake-up in 2004.
A variety of vendors back Web services as the lingua franca of service-oriented architectures. A service-oriented architecture consists of a collection of services, essentially pieces of software code, that communicate with one another and can be coordinated to complete particular tasks.
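The service idea can be shown with a small, era-appropriate sketch. Real deployments of the time used SOAP and WSDL; the standard-library XML-RPC modules below are a simpler cousin, but they illustrate the same point: a piece of code exposed through a network interface that any client stack can invoke and coordinate. All names here are invented for illustration.

```python
# Minimal sketch of a "service": code invoked over the wire through a
# well-defined interface rather than linked into the calling program.
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

# Bind to port 0 so the OS picks a free port for this demo.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
port = server.server_address[1]
server.register_function(lambda a, b: a + b, "add")  # expose one operation

# Run the service in the background, as a real server process would.
threading.Thread(target=server.serve_forever, daemon=True).start()

# A client anywhere on the network can now invoke the service.
client = ServerProxy(f"http://127.0.0.1:{port}")
result = client.add(2, 3)
print(result)  # -> 5
server.shutdown()
```

A service-oriented architecture strings many such independently deployed services together, which is why the security and management standards discussed below matter so much.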
Although other approaches to service-oriented architectures exist, Web services have the most traction, according to industry analysts. BEA Systems Inc., IBM Corp., Microsoft Corp. and Sun Microsystems Inc. hover about the field, while numerous smaller vendors also provide wares for securing and managing Web services.
Some watchers expect the field to shrink through acquisitions and attrition. Ronald Schmelzer, senior analyst at ZapThink LLC, expects a considerable amount of consolidation in 2004. "It's going to be a make-it-or-break-it year."
The vendors that emerge on top are important to a growing number of federal agencies using Web services to integrate applications or revitalize legacy systems.
Government customers must consider another source of conflict: Web services standards. Microsoft and IBM, for example, back WS-ReliableMessaging as a protocol for ensuring transaction integrity, while Sun and Oracle Corp. support a specification based on WS-Reliability. Rival security specifications also exist.
Security and management standards are critical to the evolution of Web services. "For service-oriented architectures and Web services to get to the next level, enterprise-class issues have to be met," said Scott Opitz, senior vice president of webMethods Inc.
Undaunted, Microsoft programmers aim to make Web services a part of their next-generation operating system, code-named Longhorn. The operating system, scheduled to debut in 2006, will enter beta testing this year. A key component of Longhorn is Indigo, a Web services-based communications layer.
Indigo will make Web services transparent to some degree. "Indigo capabilities exploit Web services technology under the covers," said Neil Macehiter, a research director with Ovum Ltd., a London-based consulting firm.
Storage: Content matters
Expect software to take center stage in storage developments this year.
Within the storage software arena, some observers believe enterprise content management could experience a breakout year in 2004. Enterprise content management software provides a repository for unstructured business data, such as e-mail and word processing documents. The technology plays a role parallel to a relational database management system in the structured data world.
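The repository role described above can be sketched minimally. The class and field names below are invented for illustration, not any vendor's API; the point is simply that unstructured items are stored alongside metadata so they can be found again.

```python
# Toy sketch of the enterprise content management repository idea:
# unstructured content (e-mail, documents) ingested with metadata,
# then retrieved by querying that metadata.
class ContentRepository:
    def __init__(self):
        self._items = []

    def ingest(self, body, **metadata):
        """Store an unstructured item along with descriptive metadata."""
        self._items.append({"body": body, "meta": metadata})

    def search(self, **criteria):
        """Return items whose metadata matches every given criterion."""
        return [item for item in self._items
                if all(item["meta"].get(k) == v for k, v in criteria.items())]

repo = ContentRepository()
repo.ingest("Q3 budget attached ...", kind="email", author="smith")
repo.ingest("Draft policy memo", kind="document", author="jones")
print(len(repo.search(kind="email")))  # -> 1
```

A real product adds full-text indexing, records retention and access control on top of this basic store-and-find pattern.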
Enterprise content management could play a role in federal agencies tasked with getting a better handle on regulatory compliance data.
"We believe strongly that enterprises will select [enterprise content management] software as a repository to handle a diverse array of unstructured business data," said Jim Reimer, distinguished engineer for enterprise content management at IBM.
Reimer said enterprise content management has moved out of the early adopter phase and is headed toward more widespread adoption.
However, organizations considering enterprise content management face a lengthy deployment cycle, said Peter Gerr, research analyst with Enterprise Storage Group.
"It's like an [enterprise resource planning] implementation," he said. The size of the task may compel customers to start with tactical projects such as e-mail archiving, Gerr said.
Enterprise content management fits into the broader realm of still another buzz phrase: information life cycle management, or ILM. ILM is the notion of storing information according to its value at a given point in time. Critical data may live on the fastest, most reliable and typically most expensive disk storage platform, while less critical data may be stored on lower-cost technology.
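The notion is simple enough to sketch. The tier names and thresholds below are invented for illustration; real ILM policies weigh many more factors, such as regulatory retention rules.

```python
# Toy ILM policy: choose a storage tier from the data's current value,
# approximated here by age and a criticality flag.
def choose_tier(age_days, business_critical):
    if business_critical and age_days < 30:
        return "fast-disk"      # fastest, most reliable, most expensive
    if age_days < 365:
        return "low-cost-disk"  # cheaper, slower platform
    return "tape-archive"       # lowest cost, slowest retrieval

print(choose_tier(5, True))     # -> fast-disk
print(choose_tier(90, False))   # -> low-cost-disk
print(choose_tier(800, False))  # -> tape-archive
```

Run periodically against a storage inventory, a policy like this migrates data down the tiers as its value fades, which is the cost argument behind ILM.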
Vendors contend that ILM is more about process than product. Cameron Van Orman, a senior project manager at Storage Technology Corp., said the company is conducting infrastructure assessments so customers understand their storage environments before pursuing ILM. "We view services as the next trend in helping customers," he said.
Data plugs into the grid
Grid computing, once the province of academia, may become reality for mainstream data centers this year.
The grid concept seeks to harness the power of numerous distributed computers into a single, logical computing resource. University and government labs, joined together as the Globus Alliance, have been pursuing grids to tackle complex research problems.
But in 2004, vendors such as Oracle aim to bring grids out of the lab and into more mundane settings. Oracle's Database 10g represents the company's take on grid computing. The product will enter the mainstream this year, having only been available since December.
Oracle 10g lets customers leverage a network, or grid, of database servers. The ability to distribute work across multiple servers means that customers can add or remove computing resources as demand ebbs and flows. Oracle 10g includes a load-balancing feature to shift computing capacity throughout the database, according to Oracle.
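The ebb-and-flow idea can be illustrated with a toy pool. Oracle's actual clustering and load-balancing machinery is far more involved; all names below are invented for the sketch.

```python
# Toy sketch of the grid idea: interchangeable servers behind one logical
# resource, with nodes added or removed as demand changes and work
# balanced across whatever nodes are present.
import itertools

class ServerPool:
    def __init__(self, nodes):
        self.nodes = list(nodes)

    def add(self, node):      # capacity added as demand grows
        self.nodes.append(node)

    def remove(self, node):   # capacity reclaimed as demand ebbs
        self.nodes.remove(node)

    def dispatch(self, queries):
        """Round-robin each query to the next node in the pool."""
        rr = itertools.cycle(self.nodes)
        return [(next(rr), q) for q in queries]

pool = ServerPool(["db1", "db2"])
pool.add("db3")
assignments = pool.dispatch(["q1", "q2", "q3", "q4"])
print(assignments[0])  # -> ('db1', 'q1')
```

The appeal for customers is exactly what the article describes: the pool, not any single machine, is the unit of capacity.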
Tim Hoechst, senior vice president of technology for Oracle Government, Education and Health Care, said this incremental scalability takes advantage of the low cost of commodity servers based on Intel Corp. processors and Linux. "The idea is to use a collection of low-cost hardware," he said. He added that the U.S. Geological Survey and the Environmental Protection Agency are among the agencies interested in the grid.
Wayne Kernochan, managing vice president of Boston-based Aberdeen Group, said Oracle has been the main database supplier promoting grids. But IBM, Sun and other vendors also have launched grid-oriented products.
As vendors seek to make grid computing mainstream, Kernochan said high-end customers will find this approach most relevant. "I believe people with the largest applications requiring the highest scalability will be the people who find it most useful," he said.
Security: No magic bullet
Information security researchers will push ahead on a number of fronts this year.
RSA Laboratories, part of RSA Security Inc., is pursuing intrusion resilience, said Burt Kaliski, the lab's chief scientist and director. This approach emphasizes multiple levels of protection, acknowledging the difficulty of preventing every intrusion. It is encapsulated in RSA's Nightingale technology, which is designed to boost the security of conventional servers housing sensitive data.
Nightingale uses secret splitting to distribute sensitive data across two servers. Information needed to authenticate users, for example, could be stored on two servers. Information stored on a single server would make for a high-profile target, Kaliski said. He expects to see elements of Nightingale in RSA products as early as this year.
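Secret splitting itself is straightforward to sketch. The fragment below is a generic two-share XOR split, not RSA's implementation: either share alone is indistinguishable from random noise, and only combining both recovers the value, which is why compromising a single server yields nothing.

```python
# Two-way secret splitting: split a secret into two shares such that
# neither share alone reveals anything about the secret.
import os

def split(secret: bytes):
    share1 = os.urandom(len(secret))  # uniformly random pad -> server A
    share2 = bytes(a ^ b for a, b in zip(secret, share1))  # -> server B
    return share1, share2

def recombine(share1: bytes, share2: bytes) -> bytes:
    """XOR the shares back together to recover the secret."""
    return bytes(a ^ b for a, b in zip(share1, share2))

s1, s2 = split(b"password-hash")
print(recombine(s1, s2))  # -> b'password-hash'
```

An attacker who steals one server's share still faces every possible secret with equal likelihood, so the stored authentication data is no longer a single high-profile target.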
In cryptography, modernization will be the watchword in 2004. John Droge, vice president of business development at Rainbow Mykotronx, a subsidiary of Rainbow Technologies Inc., said the Pentagon's Cryptographic Modernization (CM) effort has been in development during the past three years. Some products supporting elements of CM emerged in 2003, but this year, he believes, fully compatible products will appear.
CM represents a break from previous cryptographic boxes that were hard-wired and couldn't be updated. CM devices, in contrast, are programmable. This flexibility lets cryptographic users install new algorithms.
Another thrust is building better test suites to assess the strength of security systems. The Energy Department's Idaho National Engineering and Environmental Laboratory has made this a priority for 2004. Specifically, the lab will develop test suites for probing host-based and network-based intrusion-detection systems. The key lies in creating suites "that emulate skilled attackers," said Jason Larsen, chief architect of the lab's Cyber Security Research and Development office.
Larsen said the test suites could be ready later this year. But he added that the suites will not be commercially available, noting that the lab plans to use them internally and license them to "three-letter intelligence agencies."
Wireless: Bridging the gaps
Developments in wireless communication target the nagging problems of interoperability and multiple-network access.
The inability of various wireless devices to communicate is of increasing concern to government agencies, particularly those involved in emergency management. First responders typically find that the radio gear used in one jurisdiction won't work with the equipment used in another, thus complicating rescue efforts.
Industry research efforts may bridge the communication gap. The Palo Alto Research Center (PARC), for example, has created Obje, which the organization terms an interoperability framework for making wireless phones, personal digital assistants and other technology work together.
To accomplish this, the Obje software architecture reduces the tangle of protocols governing device communication to a handful of meta interfaces, said Hermann Calabria, a principal in PARC's business development and licensing group.
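The meta-interface idea can be sketched as a common adapter layer. All class names below are invented, not Obje's actual API: each device keeps its native protocol, but exposes the same small generic interface to everything else, so senders never need to know the protocol tangle underneath.

```python
# Toy sketch of a meta interface: different native device protocols
# hidden behind one generic "deliver data" operation.
class WifiLaptop:
    """Native API: push_packet (hypothetical)."""
    def __init__(self): self.buffer = []
    def push_packet(self, payload): self.buffer.append(payload)

class CellPhone:
    """Native API: sms_out (hypothetical)."""
    def __init__(self): self.sent = []
    def sms_out(self, text): self.sent.append(text)

class WifiSink:
    def __init__(self, laptop): self.laptop = laptop
    def deliver(self, data): self.laptop.push_packet(data)

class SmsSink:
    def __init__(self, phone): self.phone = phone
    def deliver(self, data): self.phone.sms_out(data)

def broadcast(data, sinks):
    """Callers see only the meta interface, never native protocols."""
    for sink in sinks:
        sink.deliver(data)

laptop, phone = WifiLaptop(), CellPhone()
broadcast("road closed", [WifiSink(laptop), SmsSink(phone)])
print(laptop.buffer, phone.sent)  # -> ['road closed'] ['road closed']
```

Adding a new device type means writing one adapter, not reworking every existing sender, which is the interoperability payoff the framework promises.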
PARC is testing Obje in the home electronics field, but the technology's implications are far reaching. Calabria said first responder communication is one area that could be explored.
PARC will take steps this year toward commercializing Obje. The lab plans to license the technology and is "in the process of aligning with partners," Calabria said.
At Cisco Systems Inc., the company's interoperability tack is to take radio signals and put them in a multiservice, IP-based infrastructure, said Greg Akers, chief technology officer in Cisco's Global Government Solutions Group. The technology for doing so will enter the commercialization phase by late 2004.
Intel's research and development organization, meanwhile, is working to get a handheld device to communicate with multiple networks. Last fall, Intel officials took the wraps off their universal communicator prototype. The prototype device incorporates Global System for Mobile Communications and 802.11b wireless connectivity. This year, Intel officials plan to support additional networks on their handset.
"We see [that] the issue of multiple networks for handheld devices is going to be really important," said Bryan Peebler, Intel's business development manager for the universal communicator.
Robotics rock on
2004 could be a watershed year for the emerging field of autonomous land vehicles.
Land-based robots, for the most part, have been remotely controlled. Such robots have been around for years and represent a fairly mature technology. For example, NASA's skateboard-like Sojourner rover explored the surface of Mars in 1997.
Since then, robots using remote-control technology have been commercialized. Remotec, a subsidiary of Northrop Grumman Corp., has sold more than 700 remote-control robots for uses that include handling hazardous materials and explosives.
But robotic vehicles able to operate without human intervention are not nearly as developed as their remote counterparts. The Defense Advanced Research Projects Agency, however, seeks to push this nascent technology by sponsoring a 250-mile race for autonomous land vehicles.
More than 30 teams will participate in DARPA's Grand Challenge, scheduled for March. The teams' robotic entries must complete an off-road trek through the Mojave Desert from Barstow, Calif., to Las Vegas within 10 hours. The winning team will receive $1 million.
While DARPA eyes military uses, other applications include space exploration. Autonomous robots would overcome the distance limitations of remotely controlled devices. William Whittaker, Fredkin Research Professor at Carnegie Mellon University, said autonomous land vehicles would also be ideal for underground uses, such as mapping coal mines. "It's not just distance or range that compels autonomy," he said. Whittaker will lead Carnegie Mellon's Red Team in the DARPA race.
The Red Team hopes to expand robotic capabilities in such areas as sensors. Although today's sensor range might be 20 or 30 meters, Whittaker said his team "intends to get to 50 to 60 meters." Improvements in the range, speed and accuracy of sensors will help the robotic racers navigate rugged terrain at a pace that allows them to stay within the time limit.
The Red Team also seeks speed in data processing, because its robot will need to rapidly crunch sensor data to navigate. "We are going to a 64-bit architecture," Whittaker said, noting that a quad Itanium machine will be one of the robot's computing engines.
A look ahead: 2005 to 2007
A number of significant developments are lurking a little further over the horizon, including the scheduled 2006 debut of Microsoft's Longhorn operating system, the successor to Windows XP.
Longhorn, as currently conceived, will embed three new technologies: Avalon, a presentation subsystem for developing client applications; WinFS, a data-access layer for storing structured and unstructured data; and Indigo, a communications layer employing Web services.
Ovum's Macehiter said the three components are interconnected. Avalon and WinFS tie applications and data to the operating system. The Web services aspect of Indigo allows applications and data to interoperate with non-Microsoft systems, Macehiter said.
"Each of the components has capabilities that are useful and important, but the key value comes from the combination of utilizing all three," he said.
That said, organizations won't be exploiting those features in a shipping product until 2006. In the meantime, federal agencies may give greater consideration to Apple Computer Inc.'s recently debuted Macintosh OS X Version 10.3 operating system. Lewis Bean, business development executive at GTSI Corp., said agencies may view the Mac platform as a hedge against complete dependence on Microsoft on desktop PCs.
On the microprocessor side, Intel officials target 2007 as the timeframe for introducing their new chip-making technology. The company recently announced a new material called "high-k," which officials claim will reduce electrical current leakage in transistors. The technology will let Intel cram more transistors on increasingly powerful chips.
Charles King, research director at the Sageza Group Inc., a Mountain View, Calif., market watcher, said that Intel is not alone in pursuing the leakage problem, noting a similar effort at IBM. When the technology arrives, whether from Intel or another chip maker, the expanded transistor density and processing punch will give hardware manufacturers more room to innovate.
And on a practical note, processors that use power more efficiently will run cooler, reducing electrical consumption, air conditioning and other bottom-line data-center costs, King said. That would be a useful innovation in this era of pragmatism.
Moore is a freelance writer based in Syracuse, N.Y.