20 years of marvels and missteps

Come along for Federal Computer Week's guided information technology tour

After 20 years of reporting on the next big thing in technology, Federal Computer Week looks back on what we said with a certain amount of 20/20 hindsight. The projections experts and pundits made seem laughable as often as they appear prescient.

Things were a lot less complicated back then. In 1987, when the first issues of this publication landed on the desks of federal information technology professionals, a desktop PC powered by an Intel 286 processor, with a hard drive measured in megabytes and a monitor that might have been color, was considered state of the art. But systems of that sophistication were rare, and even late in the first decade of the PC revolution, desktop computers were often still regarded as glorified word processors. As a young naval officer, I was chided by my executive officer for using a clerical device suitable only for yeomen and storekeepers.

If you wanted real graphics, Lamont Wood wrote in Federal Computer Week 20 years ago, a workstation was the machine of choice. And if you wanted to push the envelope on a PC, you might consider IBM's new PS/2 line. IBM had just introduced VGA graphics with a resolution of 640x480 pixels, about a third of the resolution of the average laptop PC display that's shipping these days.

It is difficult to remember that some people once considered color graphics unnecessary. Bruce Cummings, then vice president of marketing at Hercules Computer Technology, one of the leading computer graphics card makers in the late 1980s, told Wood: "Frankly, there seem to be very few PC applications that really need color." Hercules made a display adapter that converted color graphics to monochrome, a product that today seems about as current as a buggy whip.

Remember Pick?
In 1987, Windows was but a twinkle in Bill Gates' eye, and the next big thing was supposed to be OS/2, then a joint Microsoft and IBM project. Scott Palmer wrote in his October 1987 feature on "Micro Operating Systems" that there were four major operating system choices: MS-DOS, Apple's Mac OS, Unix and Pick. Pick's "adherents are multiplying," Palmer said, because of the multiuser operating system's database capabilities.

The Mac had won the hearts and minds of many. Palmer reported that the Naval Weapons Center in China Lake, Calif., had more than 1,200 Macs on a local-area network connected to Digital Equipment VAX minicomputers. Fortunately for Apple, Macs did not go the way of the VAX. 

As Esther Surden reported in September 1987, DEC's VAX hegemony was already threatened by Unix minicomputers. "Digital doesn't have a real Unix," Dataquest analyst Kimball Brown told Surden. "They have to contract with others to put Unix on their machines. But IBM's AIX is a real Unix."

One reason for the focus on Unix in the late 1980s was the Defense Department's growing attraction to open systems. By 1992, the federal government, especially DOD, was an early proponent of open systems, and especially of Unix, because it was interested in portable software. Unix and Posix, an application programming interface that defined basic compatibility across operating systems, were important DOD standards for information systems, including real-time systems.

The problem with Posix was that it was a bare-bones API for software portability. It didn't address many of the issues related to real-time systems and distributed applications. DOD also had a problem with conflicting standards.

"People have been playing up Unix with a lot of marketing hype," Inder Singh, then president of Lynx Real-Time Systems, now LynuxWorks, told FCW in 1992. Singh said real-time Unix and another favorite DOD specification, the Ada programming language, were almost mutually exclusive. "If you combine Unix and Ada together, it is very difficult to get good real-time performance," he said. Singh's company now focuses on embedded Linux for real-time applications, and Ada is no longer the sacred code it once was.

The ideas behind open systems led to the World Wide Web, Web services and most of the distributed application technologies that glue together today's networked systems. Even if Posix compliance and standards such as the Common Object Request Broker Architecture failed to gain traction as the pace of change accelerated, they at least paved the way. They were efforts to solve a problem that technology was not yet ready to fix.

Browser wars
Notwithstanding competition from the pen computer and other innovations, the most revolutionary technology of the mid-1990s was the Web browser. By 1997, Web technology had become the linchpin of the government's efforts to deliver more information to the public. "Browsers are now the tools that integrate systems and platforms across organizations," Richard Kellet, then division director of emerging IT applications at the General Services Administration's Office of Governmentwide Policy, told FCW.

However, Kellet and others had interesting notions of how to improve Web browsers. Kellet said Web browsers needed "some sort of flat-file database to interact with the Web so people who access the GSA's Web page can do form searches." JavaScript and Sun Microsystems' Java programming language and runtime later solved many of the problems Kellet identified in 1997.

The browser wars between Microsoft's Internet Explorer and Netscape's Navigator reached a turning point in 1997. Microsoft vice president of development Paul Maritz told Intel executives that he planned to "cut off the air supply" of Netscape through a strategy of "embrace, extend and extinguish," according to court testimony by Intel's then-senior vice president Steve McGeady.

Netscape and Microsoft continued to heap new features into their browsers. Netscape's Navigator had an e-mail client, a Web editing tool and a threaded group-discussion tool, called Collabra, for collaborative browsing. With that tool, one user could remotely take another user to a specific Web address. The tool included push technology that delivered content for users to read offline. The browser's professional version included an IBM 3270 terminal emulator.

There were innovations, but Web developers and designers found themselves trapped between Netscape's and Microsoft's implementations of essentially the same services. As the companies competed by racing ahead to support the next version of every standard, they forced application developers to target one browser or another, or to write separate applications, one for each browser.

Microsoft eventually succeeded in driving Netscape into the ground. Sun bought the server software side of Netscape, which became Sun's iPlanet subsidiary before being subsumed into Sun's software operations. The browser lived on through its acquisition by America Online, although its open-source successor, Mozilla, outshines it in today's round of browser wars.

The richness of Web technology gave rise to software as a service (SaaS). A first wave of application service providers (ASPs) rose and faded during the dot-com boom and bust, but improved technology and bandwidth made feasible what many service providers call on-demand applications.

The main difference between ASPs and SaaS, as FCW reported this April, is that software developers are now the ones serving applications for on-demand access via the Internet. "The success of Salesforce.com has really helped legitimize the SaaS marketplace," Jeff Kaplan, a senior consultant at the Cutter Consortium, told FCW. That kind of success has attracted government agencies to SaaS.

Web 2.0
The Military Health System has been using a SaaS offering of Oracle's Federal Financials application for more than two years, Wayne Bobby, Oracle's vice president of finance and administration solutions, told FCW in April. "In those two years, they haven't had one minute of unplanned downtime," he said. One of the major attractions of SaaS is that it lets IT departments step off the merry-go-round of upgrades and bug fixes.

SaaS applications have contributed to Web 2.0, a second Internet boom driven by powerful Web clients and the social and collaborative power of a global network. The reach isn't limited to PCs. It extends to a wide range of connected devices, moving applications and connections outside the safe, secure walls of federal networks. The opportunities for collaboration on a global scale have created a world in which people turn to a peer-created encyclopedia such as Wikipedia for answers, post videos of events almost as they happen on sites such as YouTube, and contribute bug fixes and new code for an operating system such as Linux.

Security threats
The freedom and interconnectivity that Web technologies make possible have also unleashed an ever-present security threat. During the past 20 years, computer security concerns have changed and expanded as fast as technology has, or perhaps even faster. The constantly changing IT landscape has created new vulnerabilities almost as quickly as it has corrected old ones.

In 1992, FCW's Richard Danca reported on the expanding security risks posed by networks and distributed systems. Even before the Web arrived on the Internet, when many in the government still called it ARPAnet, government and commercial networks overlapped heavily. "We can't even bound the connectivity of most systems," said Patrick Gallagher, then director of the National Security Agency's National Computer Security Center. "These connections are so richly diverse," he said.

If they were diverse then, they are now nearly omnipresent. Computer security in 1992 focused on protection from viruses, good computer-use policies and firewalls for perimeter protection. But the hyperconnected digital world of today has created an unending supply of potential information security breaches and exploits from outside and within government networks.

The proliferation of wireless data services has only exacerbated the state of hyperconnectivity. When devices such as Research in Motion's BlackBerry arrived in the late 1990s, they suddenly unchained thousands of e-mail users from their desks. The devices were so addictive that they became known by some as crackberries. Many people felt the compulsion to check and respond to e-mail messages in conference rooms, away from work and in their homes.

As FCW reported in June, a GSA study of 15 agencies found they had collectively spent more than $122 million on wireless hardware and services and had 218,700 active wireless voice and data service accounts. The Navy alone had 85,000 accounts, and it spent $45.9 million for devices and services such as the BlackBerry and Palm's Treo. 

As wireless voice and data services converged, companies added more features. Apple's iPhone is the latest example of what digital convergence has wrought. Web browsers, digital cameras, and basic document-viewing and editing features have made the mobile masses more connected than ever, for better and for worse. John McManus, deputy chief information officer at the Commerce Department, called his personal digital assistant "the little demon device" in an interview with FCW earlier this year.

Some people called the PC a demon device 20 years ago, and some people still do. The more things change, the more they stay the same.

Gallagher is a freelance writer in Baltimore, Md.
