Are 64-bit PCs ready for mainstream?

Critical mass of products near, but users still not sold

The promise of 64-bit computing in desktop PCs and servers has grabbed lots of attention lately, particularly as leading players in the market step up the pace of new products built on the technology.

Federal information technology managers are intrigued by vendors' claims of systems that offer improved performance and massive memory at prices expected to be well below those for traditional 64-bit systems. Nevertheless, agencies are not rushing to buy the new 64-bit systems, often referred to as "Wintel" boxes because Microsoft Corp.'s Windows software and Intel Corp. processors dominate the market.

Although many IT managers are keeping an eye on the new technology, they say 32-bit computing at the desktop and server levels is sufficient to handle most of the applications they run, at least for now. It remains to be seen if and when that will change, particularly as a broader range of hardware and software products on the market adopts the 64-bit architecture.

Federal agencies are not entirely unfamiliar with 64-bit computing. Government research labs that run complex computational and scientific programs have, in some cases, been using 64-bit Unix machines for years.

One of the main benefits of a 64-bit processor is that it can address significantly more memory than a 32-bit processor. A 32-bit processor can address at most 4 gigabytes of memory (2^32 bytes), while a 64-bit processor can address up to about 18 billion gigabytes (2^64 bytes). With more data held in memory, a 64-bit processor can work faster because it doesn't have to swap large sets of information between memory and disk the way a 32-bit processor does.
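The arithmetic behind those figures is straightforward: the ceiling is set by how many distinct byte addresses a pointer of a given width can represent. The short C program below is a minimal sketch written for this article, not vendor sample code, that works out both limits.

    /* Address-space arithmetic: how many bytes an N-bit pointer can name. */
    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        /* A 32-bit pointer distinguishes 2^32 addresses: 4 gigabytes. */
        uint64_t limit32 = UINT64_C(1) << 32;

        /* A 64-bit pointer distinguishes 2^64 addresses. That value is one
           past the largest 64-bit integer, so hold it in a double. */
        double limit64 = 18446744073709551616.0;  /* 2^64 */

        printf("32-bit limit: %llu bytes (4 GB)\n",
               (unsigned long long)limit32);

        /* 1e18 bytes = 1 billion gigabytes, so this prints ~18.4. */
        printf("64-bit limit: %.0f bytes (about %.1f billion GB)\n",
               limit64, limit64 / 1e18);
        return 0;
    }

Run, it prints the 4-gigabyte ceiling cited for 32-bit processors and the roughly 18.4 billion-gigabyte ceiling for 64-bit processors.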

Hardware and software vendors say the more advanced processor offers substantial improvements in the performance and precision of applications and calculations. And because it can handle more files and larger files, 64-bit computing scales better than 32-bit, particularly for large-scale applications such as databases.

The number of products that support the new architecture is increasing. In July, Microsoft released a customer preview of its Windows .Net Server 2003 operating system and plans to ship the production version in the first half of 2003, said Velle Kolde, lead product manager in the Windows .Net server group. The Enterprise and Datacenter editions of .Net Server 2003 will be available in 64-bit versions for computers running Intel's 64-bit Itanium processor family.

Earlier this year, Microsoft announced it would support Intel's latest Itanium 2 processor in its Windows Advanced Server, Limited Edition 1.2. The company already supports the first-generation Itanium chip in Windows Advanced Server, Limited Edition. In 2001, Microsoft introduced its first 64-bit client operating system, Windows XP 64-Bit Edition, running on Itanium.

Kolde sees potential uses of 64-bit computing in government agencies that have large databases and data warehouses, run complex engineering and scientific analyses and computer simulations, and require applications that use large amounts of memory.

"The 32-bit systems in use today deliver tremendous price performance, but with 64-bit computing, performance is even better," Kolde said. "There are not a lot of 64-bit applications out there today, but we expect to see more. It's not going to happen overnight because this is a relatively new platform."

Intel, which released the second generation of Itanium in July, expects the appeal of 64-bit computing to be fairly focused, at least initially. "The Itanium processor addresses high-end applications and we expect to see it used at the scientific and supercomputing institutions," said Mike Graf, Intel's product line manager for the Itanium processor family.

Graf said Intel plans to have five processors in the Itanium family, with the next release due next year. Each will have increasing levels of performance and be compatible with previous generations, he said.

Intel competitor Advanced Micro Devices Inc. (AMD) is also entering the 64-bit market, with plans to begin shipping its Opteron processor worldwide in the first half of 2003. The new processor will allow a seamless transition from existing 32-bit applications because it is built on the same x86 instruction set used in current 32-bit processors, said Kevin Graf, division marketing manager for AMD server and workstation marketing.

"It doesn't require a new infrastructure," Graf said. "Customers can implement [Opteron devices] into their existing architecture and migrate at their own pace."

AMD has been giving previews of Opteron to a number of federal agencies, particularly those using "data-intensive applications," Graf said.

"Each government entity we talk to is looking at Opteron for different purposes," he said. "Some of them, such as the national test labs, need to start considering 64-bit computing soon."

The labs, in fact, are where much of the 64-bit computing is taking place today. The Energy Department's Pacific Northwest National Laboratory (PNNL) in Richland, Wash., recently began operating a supercomputer built from a cluster of 1,388 64-bit servers using the new Itanium processor. And in October, officials from Sandia National Laboratories announced that the labs would use a massively parallel processing supercomputer based on the Opteron processor.

PNNL is not yet using 64-bit on the Wintel architecture, said Scott Studham, who manages the lab's molecular sciences computing facility. "It's definitely something we'll look at in the future, but 64-bit is now limited to applications like high-end computer-aided design and visualizations that require large datasets. It's too soon to talk about using 64-bit" on Wintel, he said.

Studham said it would be tough for 64-bit Wintel machines to displace 64-bit Unix systems because the latter are so firmly entrenched. "Most 64-bit machines are running some variant of Unix because many of the scientific applications have been written for Unix," he said.

Sandia also has no immediate plans to use 64-bit computing other than for scientific and computational applications, said James Tomkins, project leader for Sandia's new supercomputer effort, Red Storm. "I'm sure there will be desktop 64-bit machines available, but it's not clear when desktop users will need 64-bit processing and move to that. I'm sure it will happen eventually, for applications like voice recognition and multimedia."

Dean Mesterharm, acting chief information officer at the Social Security Administration, said the agency is tracking 64-bit Wintel developments, but has no immediate plans to abandon 32-bit Windows machines.

"I'm not sure what the advantages are to moving to 64-bit at the desktop level," Mesterharm said. "It would depend on what type of software you're using, whether 64-bit improves the performance of the software, and what kind of [budget] you have." However, he said SSA is replacing its 32-bit servers with new 64-bit Unix servers for databases and Web applications.

The Education Department also uses 64-bit computing on Unix servers, mainly for financial applications, but has no plans to move from 32-bit to 64-bit on the Wintel platform, according to Manny Hernandez, chief of operations at the department.

"For the next few years, there's really no reason to go to 64-bit," Hernandez said. "There doesn't seem to be a good trade-off right now in terms of price and performance. The kinds of applications we're running don't need that kind of computing power."

Christopher Willard, a research vice president at IDC, said agencies, like businesses, are more likely to deploy 64-bit Wintel servers than desktop machines.

"It's doubtful anyone will be using a 64-bit PC anytime soon unless it's for something like film editing," Willard said. "If you've been using a 32-bit computer and are satisfied with the performance, it's unlikely you need a lot more power."

Violino is a freelance writer specializing in technology and business. He can be reached at bviolino@optonline.net.

***

Chomping for 64-bit

The following applications can put the memory power of a 64-bit system to good use:

* Complex engineering and scientific models and simulations.

* Digital content such as 3-D animation and graphics.

* E-government and other Web-based processes.

* Financial transactions that require high calculation speeds and analytical capabilities.

* Data warehouses and high-volume databases.

* Genomics research and other bio-simulations.

* Computer-aided design.
