Power trip

A step-by-step guide to buying the perfect high-end PC

What have you done for me lately? Chief information officers must ask that question as they consider whether the high-end desktop computer systems they purchased last year can still pull their weight. The consequences of an underpowered machine can be significant.

Across government, these workhorses power everything from mission-critical financial models to NASA research projects and military simulations, such as one the Army uses to replicate airports in Iraq and Afghanistan so pilots preparing for their first flights to those areas reduce the chances of mistakes.

High-end desktop PCs are following the same price and performance trends that make their lower-end counterparts more affordable. Nevertheless, top-notch performance still requires a significant investment, often ranging from $2,000 for economy models to at least $6,000 for more powerful systems.

In addition to cost considerations, PC buyers must also weigh the wide range of new choices for core components. “Our research shows a lot of technologies at almost every component level are at important inflection points in the high end of the market, starting with the microprocessor,” said Addison Snell, research director of high-performance computing at IDC, a market research firm.

The key to building the right system in this environment is to focus on six areas, experts say.

1. Processing muscle
Dual-core microprocessors, which combine two processor cores on a single chip, are becoming commonplace in high-end applications. Unfortunately, their cutting-edge engineering and price premiums don’t guarantee a speed boost over single-core processors.

For example, the price difference between a single Advanced Micro Devices Opteron 244 chip and a dual Opteron 244 configuration is about $200. But a top-end dual Opteron 240EE setup costs about $900 more than the single processor.

A justifiable performance gain comes only if the dual chips run software that divides large processing tasks into smaller, separate jobs, a process known as multithreading. Commercial software vendors and developers of custom government applications are still converting their software to take advantage of it.
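As a minimal sketch of that idea, the following Python fragment (an illustration, not drawn from any application in the article) splits one large summation into per-worker chunks and runs them concurrently; on a dual-core chip, the runtime can schedule those jobs on both cores:

```python
# Hypothetical sketch of "multithreading" as the article uses the term:
# one large task divided into smaller jobs that can run concurrently.
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    """One small job: sum a slice of the data."""
    return sum(chunk)

def threaded_sum(data, workers=2):
    """Divide the data into roughly one chunk per worker and combine results."""
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # map() hands each chunk to a worker thread and collects the answers
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    data = list(range(1_000_000))
    print(threaded_sum(data))  # same answer as sum(data), computed in pieces
```

Software that is not decomposed this way leaves the second core idle, which is why an unconverted application sees no benefit from a dual-core chip.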

“When it comes to dual core vs. single core, you really want to look at the software stack that you are using and then see if it will take advantage of the multicore” processors, said Tony Kros, an analyst at Gartner’s Client Platforms Group.

“An application that isn’t coded to utilize both cores might actually see a performance hindrance” with dual-core processors, he added.

Before committing to dual-core chips, organizations should consult with software vendors and developers to determine if critical applications can take advantage of multiple processors, said Philip Pokorny, director of engineering at Penguin Computing.

As software developers code more programs for multithreading in coming years, Kros said he expects to see two-, four-, and eight-core processors become commonplace on high-end desktop PCs.

A subtext of the single- vs. dual-core story is the rivalry between chip vendors Intel and AMD. Government contracts have favored Intel because acquisition agents viewed it as the chip leader.

But that perception may be changing.

“We’re seeing AMD processors in server-class machines, which is kind of a litmus test,” said Vic Berger, lead technologist and business development manager at CDW Government. “If people weren’t willing to run a processor in the data center, they were leery of putting it” on desktop PCs.

AMD’s growing popularity is partly economic. “On a dollar-for-dollar basis, you can buy a more affordable workstation that is AMD-based than one based on Intel,” said Mark Vera, a vice president at Alienware. AMD processors often deliver better performance, especially for applications that require intensive graphics computations, he added.

Apple recently added a new wrinkle to the high-end desktop PC discussion when it introduced Macintosh computers with Intel chips, a move akin to a Capulet and Montague marriage.

“I don’t think anyone really loved Apple because of the processor, and for Apple to end the religious war helps them as long as they can retain that sort of brand love that people have around Apple,” Snell said.

This could also induce Microsoft Windows stalwarts to try out Apple’s ease of use and system engineering strengths, Berger said. “That is absolutely going to breathe new life and new viability into the Apple machine,” he said. “I think you are going to see a lot of people switching over to a Mac in the government because Macs are essentially Unix boxes. The current operating system release, Mac OS X 10.4 Tiger, is based on Unix.”

For now, Apple’s new dual-core iMacs will be going through a transition period requiring users to run emulation software to help some applications bridge the gap between the previously separate platforms.

2. Bring on the memory
When performance is critical, making the right decisions about memory is second in importance only to the CPU choice.

The more data that the PC can store in memory, the fewer times processors must search slower hard drives for information needed to complete a task. So packing high-end desktop PCs with large amounts of high-speed memory keeps the machines running at top efficiency. Although conventional desktop PCs now regularly come with 256M to 512M of memory, high-end alternatives routinely pack 1G to 4G of RAM into their reserves.

Kenneth Schnell, information technology manager of Santa Maria, Calif., opted for 1G of memory in the new systems he recently purchased to run a computer-aided dispatching application in the police department.

“We find that [1G] is about the minimum you want to go with for just about anything, whether it’s for graphics or just desktop publishing,” he said. “Below a gig, you’re probably going to run out of gas.” Even so, he wishes he could have afforded machines with 2G of memory. He said he fears “we’ll need another stick of memory for these machines” in two years.

Although today’s memory prices are bargains compared with ones in the past, beefing up a desktop’s storehouse still represents a significant share of the total PC price. For example, adding 1G of memory to a high-end desktop computer costs about $150, while bumping that amount to 4G can cost $1,250 or more depending on the memory configuration.

Not all memory is created equal. Low-priced RAM deals often trade economy for performance, an unacceptable compromise for such a crucial component. Choose double data rate 2 synchronous dynamic RAM (DDR2 SDRAM), which squeezes out extra speed by double-pumping data: the memory modules send data twice during each communications cycle with the system processor. Thus, memory clocked at 100 MHz effectively performs at 200 MHz.
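To put numbers on the double-pumping idea, this short sketch works through the arithmetic for a common DDR2-800 module (the figures are typical specifications, not drawn from the article):

```python
# Double data rate: two transfers per clock cycle on a 64-bit (8-byte) bus.
# Figures below are typical of a DDR2-800 module, used here for illustration.
bus_clock_mhz = 400                                          # I/O bus clock
transfers_per_cycle = 2                                      # "double-pumped"
effective_rate_mt_s = bus_clock_mhz * transfers_per_cycle    # 800 MT/s
bus_width_bytes = 8                                          # 64-bit memory bus
peak_bandwidth_mb_s = effective_rate_mt_s * bus_width_bytes  # 6,400 MB/s

print(effective_rate_mt_s, peak_bandwidth_mb_s)  # 800 6400
```

That 6,400 MB/s peak figure is why such modules are also marketed as PC2-6400.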

3. Supercharged graphics
CPUs aren’t the only desktop components pushing their performance envelopes with dual processors and mountains of memory. Designers are adopting the same techniques to increase the speed and scalability of graphics subsystems.

For example, nVidia offers the Scalable Link Interface, which lets buyers configure their systems with two or four graphics processor units (GPUs).

“It’s a high-speed interconnect that doesn’t get bogged down by the microprocessor or other technologies on the system board,” Vera said. According to nVidia, a “quad,” or four GPU setup, can handle 2G of dedicated memory, enough for a 2,560 x 1,600 resolution, or twice the resolution typically used for office applications.

4. The face of high performance
Now that flat-panel displays have overtaken old tube monitors, purchasers are benefiting from falling LCD prices that let them buy larger screens and still stay within budget. Many PC buyers find 17-inch displays too small for some high-end applications, which require space for multiple open windows and room to graphically display large sets of data.

But because the price difference between 17-inch displays and 19-inch models is usually only $100, opting to go large is easy to justify. Even prices for spacious 21-inch displays, which cost about $450, have come down to Earth. Living large is money well spent, Berger said.

“I recommend that people spend the extra money to give people the best monitor possible,” he said. “I don’t stare at my processor all day, but I do stare at my monitor. I’m much better off with a little bit slower processor” in return for a comfortable monitor.

5. Drive time
Serial Advanced Technology Attachment (ATA) hard drives are becoming commonplace in high-end desktop PCs because of their price advantages and performance competitiveness with high-speed SCSI drives. A 300G Serial ATA drive costs about $150, while a comparable SCSI drive costs approximately $900. But for applications with exceptionally high input/output processing needs, such as complex financial or fluid dynamics programs, SCSI’s performance justifies the price premium, Pokorny said.

Two new drive technologies are also arriving — one known as Serial Attached SCSI (SAS) and the other using a new method for packing data onto drive platters. SAS marries the serial connections for data transfers used in Serial ATA drives with SCSI devices’ fast speeds and better performance for reading and writing data. A new generation of SAS controllers, the brains for storage systems, can work with both Serial ATA and SAS units.

In the coming year, as SAS drives become more plentiful, buyers will increasingly choose them for performance-critical applications, Pokorny said. Serial ATA “drives have been optimized for cases where there’s only one person doing one thing at a time,” he said. “But when multiple applications read and write multiple files, performance tends to slow down on [Serial ATA] drives. That’s not the case with SAS or SCSI drives.”

Computer buyers pay a significant premium for this performance, however. A 300G Serial ATA drive costs around $150 while an SAS drive with half that capacity can cost $715 to $1,060 depending on its speed.

6. Rest of the best
High-performance computers are like race cars — the best engines on the track will underperform if they’re bogged down by mundane ancillary components. For high-end desktop PCs, that includes the power supply.

Power supplies are especially important in high-performing desktop PCs because heat dissipation is a critical factor in keeping hot-running CPUs and GPUs healthy. Efficient power supplies waste less energy as heat, so they do less to make the electronics sweat. Computer vendors don’t typically publish the efficiency ratings of their power supply choices, but a new initiative can help buyers track down, or specify, the best designs.

The 80 Plus program, an initiative backed by some power companies and consulting firms, encourages the use of power supplies rated at 80 percent efficiency and above and maintains a list of qualifying products at www.80plus.org. In addition to reducing heat within computers, an efficient power supply can cut $30 in energy costs per machine over its lifetime, the group estimated.
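The energy math behind such estimates is straightforward. This sketch compares a 70 percent- and an 80 percent-efficient supply; the load, usage hours, and electricity price are illustrative assumptions, not the 80 Plus group’s figures:

```python
# Rough annual-cost comparison of a 70%- vs. an 80%-efficient power supply.
# Load, hours, and electricity price are illustrative assumptions.
dc_load_watts = 200        # power the components actually draw
hours_per_year = 2000      # roughly a full-time work schedule
price_per_kwh = 0.10       # assumed electricity price in dollars

def annual_cost(efficiency):
    """Yearly electricity cost for a supply at the given efficiency."""
    wall_watts = dc_load_watts / efficiency     # power pulled from the outlet
    kwh = wall_watts * hours_per_year / 1000
    return kwh * price_per_kwh

savings = annual_cost(0.70) - annual_cost(0.80)
print(round(savings, 2))   # annual savings in dollars
```

Under these assumptions the more efficient unit saves about $7 a year, so over a typical multiyear service life the difference is in the same ballpark as the group’s $30 lifetime estimate.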

Other criteria for high-end desktop PCs include plenty of ports — often eight USB 2.0 and two FireWire connectors with at least half located on the front of the box — for links to printers and other peripherals. Dual integrated 10/100/1000 (Gigabit) Ethernet network interfaces should also be standard on desktop PCs, in addition to combined DVD and CD-RW optical drives for importing and exporting data that’s not available on a network.

Finally, spring for keyboards and mice that are solidly built and designed for user comfort. “You will end up paying 5,000 times over the cost of a mouse in carpal tunnel claims if you’re too cheap to give people a decent mouse with a gel wrist pad,” Berger said.

Joch is a business and technology writer based in New England. He can be reached at ajoch@monad.net .

Standing data storage on its head

With high-end desktop PC users’ data-storage needs growing unabated, hard-drive manufacturers are searching for new and more efficient ways to cram information onto disk platters. A new design stands the magnetic bits that record data on their edges instead of laying them flat on the disk surface, thus taking up less space.

The reorientation “is a significant advancement that allows drives to increase capacities higher and higher,” said Vic Berger, lead technologist at CDW Government. Some of the first vertical or perpendicular drives arrived late last year and early this year from Toshiba and Seagate Technology. Other vendors plan similar introductions this year.

In the past, vendors boosted capacity by building hard-drive units with multiple platters whose read/write heads could find related data stored across the platters. Because perpendicular recording packs more data onto each platter, related data is more likely to fit on a single platter, which increases drive efficiency by reducing the physical area the heads must search.

“If the data is on one platter in vertical stacks, it’s going to increase your access speed,” Berger said.

— Alan Joch
