Surveying Web performance
- By Heather Harreld
- Jul 10, 2000
The stakes are getting higher on the World Wide Web. As their Web sites
evolve from places to browse to places to conduct business, agencies must
ensure those sites are online and working smoothly.
But Webmasters will find that tools for monitoring Web site performance
are evolving as well. Instead of software that merely logs the number of
visitors and page views, the latest products can simulate traffic patterns
to test beta applications, alert administrators immediately if a site has
a glitch and sometimes even automatically diagnose and fix a problem in
the infrastructure that supports the site.
Despite the various approaches, the market is focused on one goal: ensuring
that every visitor can quickly and easily traverse Web pages to access services
or complete transactions.
"The Web sites are absolutely central to the National Library of Medicine's
mission, and therefore, the technical performance of the sites is absolutely
central," said Fred Wood, special expert at NLM's Office of Health Information
Program Development. "Health information is one of those types of information
that can be very important or even critical to people. It's really important
to have a high level of performance, because you never know when a physician
or a nurse needs the information — and they need it [right away]."
Those high stakes, combined with the increasingly complex infrastructure
that supports Web applications, create a volatile world where site administrators
need performance monitoring tools to make sure they know about problems
before users do, said Chris Rogers, vice president of marketing at Holistix.
Holistix offers Web system management software designed to monitor and
manage all the individual components that support a Web site, including databases,
application servers, Web servers and network devices.
The software was designed to manage components based on function within
an application, Rogers said. For example, if one application requires a
Web server, application server and database, the software will correlate
the performance from those particular machines to give a manager an idea
of the application's overall performance.
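The rollup Rogers describes can be sketched in a few lines. This is not Holistix's actual product, just a minimal illustration of the idea: collect one latency reading per component and correlate them into a single application-level verdict, on the assumption that the application is only as fast as its slowest tier. The component names and threshold are invented.

```python
from statistics import mean

# Hypothetical latest latency readings, in milliseconds, from the
# three components backing one application (names are illustrative).
component_latency_ms = {
    "web_server": 45.0,
    "app_server": 120.0,
    "database": 310.0,
}

def application_health(latencies, threshold_ms=250.0):
    """Correlate per-component readings into one application-level
    status: flag the slowest component as the bottleneck, and mark
    the application degraded if that component exceeds the threshold."""
    bottleneck = max(latencies, key=latencies.get)
    status = "degraded" if latencies[bottleneck] > threshold_ms else "ok"
    return {
        "status": status,
        "bottleneck": bottleneck,
        "mean_latency_ms": mean(latencies.values()),
    }

report = application_health(component_latency_ms)
```

Here the database's 310 ms reading marks the whole application as degraded, which is the point of correlating by function: a manager sees one answer per application, not three disconnected machine graphs.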
The data gathered from the infrastructure components also can be used
to predict how an application may perform later under different conditions,
because "just knowing how fast you're going today is very limited information,"
Rogers said. "You don't know if tomorrow your customers are going to ask
you to go 100 miles per hour."
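The prediction Rogers alludes to can be illustrated with a deliberately simple model (vendors' actual models were proprietary and surely richer): fit a straight line to observed (load, response-time) samples and extrapolate to a load heavier than any yet seen. The sample numbers are invented.

```python
# Invented observations: (requests per second, response time in seconds).
observations = [(100, 0.20), (200, 0.35), (300, 0.50)]

def fit_line(points):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

slope, intercept = fit_line(observations)
# Extrapolate: what happens if traffic doubles to 600 requests/sec?
predicted_at_600 = slope * 600 + intercept
```

The value of even a crude fit is Rogers' point exactly: today's speedometer reading alone says nothing about whether the site can "go 100 miles per hour" tomorrow.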
While Holistix has designed its technology to monitor and manage the
infrastructure behind a Web site, eBSure Inc. focuses its products on tackling
Web site performance management from the user's perspective, said Kurt Ziegler,
chief executive officer of the company. For federal agencies, this means
looking at "what would induce a person not to use the phone or the fax...and
then satisfying them so they'll come back."
The company's technology, which began shipping earlier this summer,
is designed to track Web page download times, the time lapse between clicks,
and the order in which users access pages and how they use them.
For example, if a user stays too long on a page with a form to fill out,
an agency may want to try to change the instructions for the form or tweak
it in other ways to bump up "close out" times.
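The kind of clickstream mining described above can be sketched briefly. This is in the spirit of eBSure's approach as reported, not its actual implementation; the log format and page names are invented. Each visitor's session is a sequence of timestamped clicks, and the time spent on a page is the gap until the next click.

```python
from collections import defaultdict

# Invented clickstream for one visitor: (seconds since session start, page).
clickstream = [
    (0, "/home"),
    (5, "/forms/grant-application"),
    (245, "/forms/confirmation"),
]

def dwell_times(events):
    """Dwell time per page = the gap between each click and the next.
    (The final page has no following click, so it is not counted.)"""
    times = defaultdict(float)
    for (t0, page), (t1, _) in zip(events, events[1:]):
        times[page] += t1 - t0
    return dict(times)

dwell = dwell_times(clickstream)
```

A report showing visitors averaging four minutes on the grant-application form is exactly the signal that might prompt an agency to rewrite the form's instructions and shorten its "close out" time.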
"It's a way of looking at where the payback will be," Ziegler said.
"Is your Web site being effective? If everyone's coming in on a slow-speed
line, you'd do better to get rid of the eye candy than having the user wait
for some silly icon to paint on the screen."
While many companies in the Web site performance management arena offer
stand-alone products, firms that have traditionally focused on enterprise
management are including Web site monitoring and management in their tools.
BMC Software Inc.'s Patrol product is designed to provide a central point
of control for all applications, local-area networks, wide-area networks
and communications devices throughout the enterprise.
Patrol is designed to manage the performance and availability of all
layers of the applications stack, including the hardware layer, operating
system and storage devices, said Dean Mericka, regional sales manager for
BMC's federal business.
"There are so many moving parts to a customer's environment," Mericka
said. "Customers aren't getting rid of technology, they are just adding
on to it. To manage service delivery, you can't do component-based management."
Some of the newest players in the market are beginning to offer tools
focused solely on automatically correcting potential performance problems
before systems go down. Within six weeks, Peakstone Corp. plans to ship
a product designed to automatically control the application infrastructure
of a Web site.
P.M. Ravi, founder and CEO of the company, describes the technology
as similar to the autopilot system on an airplane. In fact, the product
is based on predictive modeling technology, which has been used in spacecraft,
in nuclear power plants and on airplanes.
Peakstone is able to analyze application traffic and predict how much
capacity is going to be needed to deliver certain services, Ravi said. For
example, if a server goes down, the software will automatically redirect
traffic, schedule and prioritize traffic, and reallocate capacity.
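The redirect-and-reallocate step can be shown with a toy model (again, not Peakstone's product, which relied on predictive modeling well beyond this): each server carries a share of the traffic, and when one fails its share is redistributed so the survivors absorb the load. The server names and weights are invented.

```python
# Invented traffic shares across a pool of three Web servers.
pool = {"web1": 0.4, "web2": 0.4, "web3": 0.2}

def reroute(shares, failed):
    """Drop the failed server and rescale the survivors' shares so
    they again sum to 1.0, spreading its traffic proportionally."""
    survivors = {s: w for s, w in shares.items() if s != failed}
    total = sum(survivors.values())
    return {s: w / total for s, w in survivors.items()}

new_pool = reroute(pool, "web3")
```

In Ravi's airplane terms, this is the autopilot trimming the remaining engines after one fails: the pool keeps meeting its performance objective with the capacity it still has.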
"What we do is make sure that the plane is still meeting its performance
objectives even if the engine has fallen off," he said. "The challenge of
delivering good service quality today is really the ability to dynamically
provision and allocate capacity to meet the traffic demands. It allows you
to guarantee service quality."
For federal agencies easing into the e-commerce arena, the importance
of investing in Web site management tools depends on the nature of the applications,
said Caryn Gillooly, senior analyst at Hurwitz Group Inc.
"If having an instant response is important for that type of business,
then making sure performance is on par is a big part of that," she said.
In addition, agencies need to weigh the "soft money" factors, where
performance may not directly affect core operations but may influence people
to use paper instead of Web offerings, thus bumping up operation costs.
"If you go long enough without giving people good customer service, they're
going to go somewhere else," she said.
Despite advances in the market, agencies still may find that the available
tools do not solve all their problems.
The Defense Technical Information Center, which hosts about 90 Web sites,
uses a monitoring service to measure how its sites are performing compared
to others, and a mix of homegrown tools and commercial offerings to monitor
wide-area network connections, local-area network capacity, server capacity
and storage capacity. In addition, DTIC has hired a consultant to look at
performance measurements for capacity management and customer service.
"The thing that I wish is that some of the tools would be able to handle
large, massive volumes," said Kurt Molholm, DTIC administrator. "We do have
tools, but in many cases, they're really hard to use or to get in a framework
where you can analyze them. We need to have better tools for measuring bigger
things like what [RealNetworks Inc.'s] RealMedia is going to do to you."
—Harreld is a freelance writer based in Cary, N.C.