WebTrends puts log files to work
- By Steve Jefferson
- Mar 13, 2001
WebTrends Enterprise Suite is one of those products that, once implemented, you'll wonder how you ever got along without it.
WebTrends works by mining the plethora of information your Web server logs have gathered since the moment your first visitor clicked over to your site. Among that information are the Internet Protocol addresses of every visitor; the dates, times and durations of every visit; the pages the visitors viewed; the files they downloaded; the URL that referred them; and the type of browser and operating system that they were using.
Useful information? Yes. But unless you have the right application to process it, that information goes to waste sitting in log files.
To test WebTrends Enterprise Suite, we simply created a profile specifying where the server's log file was stored, the name of the home page and a few other details. In a few seconds, WebTrends assembled a comprehensive series of reports detailing everything we could want to know about that server's Web traffic. The reports were highly organized and loaded with charts and graphs to help make sense of the data. Output was available in several formats, including HTML and Microsoft Corp.'s Word and Excel.
In fact, we were startled to find that several machines from outside our firewall had visited the server more than once, which means it's time to adjust some security policies. When we clicked an IP address in the demographics reports, WebTrends performed a reverse DNS lookup to tell us what domain that machine belonged to. While we didn't get a name, we did find that someone on the home.com network is awfully fond of our MP3 files.
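The lookup WebTrends performs behind that click is a standard reverse DNS query. As a rough sketch of the idea (not WebTrends' actual code), the standard library can resolve an address back to its domain in a couple of lines:

```python
# Illustrative sketch of a reverse DNS lookup like the one WebTrends
# runs when you click an IP address in a demographics report.
# socket.gethostbyaddr issues a PTR query for the address.
import socket

def reverse_dns(ip):
    """Return the hostname registered for an IP address, or None if no PTR record exists."""
    try:
        hostname, _aliases, _addresses = socket.gethostbyaddr(ip)
        return hostname
    except socket.herror:
        # No reverse record for this address
        return None
```

If the network's administrator never registered a PTR record, the lookup fails, which is why a report sometimes shows only the bare IP address.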
More important, after perusing each of the reports, we had a detailed understanding of the server and its daily workload. Not bad for a bunch of entries that look like this: 192.168.1.200 - - [03/Jan/2001:10:14:32 -1000] "GET index.html HTTP/1.0" 14322 413 "http://www.somewebsite.com/" "Mozilla/4.01 (WinNT; I)"
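An entry like the one above follows the common combined log format: client address, timestamp, request line, status and byte counts, referring URL and browser string. A minimal sketch of how such a line can be split into fields with the standard library (field order follows Apache's combined format; the sample line here is our own, and real logs vary):

```python
# A minimal sketch of pulling fields from a combined-format log entry.
import re

LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ '               # client IP, identd, user
    r'\[(?P<when>[^\]]+)\] '              # timestamp
    r'"(?P<request>[^"]*)" '              # request line
    r'(?P<status>\d{3}) (?P<bytes>\S+) '  # status code, bytes sent
    r'"(?P<referrer>[^"]*)" '             # referring URL
    r'"(?P<agent>[^"]*)"'                 # browser and OS string
)

def parse_entry(line):
    """Return a dict of named fields, or None if the line doesn't match."""
    match = LOG_PATTERN.match(line)
    return match.groupdict() if match else None

entry = parse_entry(
    '192.168.1.200 - - [03/Jan/2001:10:14:32 -1000] '
    '"GET /index.html HTTP/1.0" 200 14322 '
    '"http://www.somewebsite.com/" "Mozilla/4.01 (WinNT; I)"'
)
# entry["ip"] is "192.168.1.200"; entry["referrer"] is the referring URL
```

Every report WebTrends produces is built by aggregating fields like these across millions of such lines.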
WebTrends' only weakness is the material it has to work with. There is only so much you can do with a fairly limited log file entry, yet it is surprising how much juice the program can squeeze from those lemons.
In addition to Web traffic analysis, the suite can perform link, proxy traffic and streaming media analysis, as well as alerting and monitoring. Keep in mind that running the link analyzer will skew your traffic statistics: to test the site's integrity, it downloads every file and page on the site, generating a tremendous amount of traffic. The effect is amplified if you have large, rarely downloaded files. To avoid confusion, schedule the link analyzer to run at regular intervals so that at least the additional traffic is consistent and can be factored into the results.
We also found that while the link analyzer's findings were precise, they were not always accurate. The vast majority of the tested site is an HTML-based file server. After six hours of checking every file and link, the tool reported that 132 of the 165,466 links pointed to nonexistent pages. Upon verification, we found the links were all good; irregular characters in the file names (such as spaces, commas and ampersands) appear to have been the source of the discrepancy.
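The likely culprit is URL encoding: characters such as spaces, commas and ampersands have special meanings in URLs and must be percent-encoded, and a checker that requests the raw path will be told the file does not exist. A short sketch of the encoding (the file name here is hypothetical):

```python
# Sketch of why irregular characters trip up a link checker: names with
# spaces, commas or ampersands must be percent-encoded before they
# appear in a URL, or a naive checker requests the wrong path.
from urllib.parse import quote

raw = "reports/Q4 summary, final & revised.html"
encoded = quote(raw)  # '/' is left alone by default; unsafe characters become %XX escapes
```

Here `quote` turns the space into `%20`, the comma into `%2C` and the ampersand into `%26`, producing a path a Web server can actually match against the file.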
As with the Web analysis tool, the link analysis tool produced comprehensive reports in a variety of formats, complete with graphs, charts and inline help tools. Further, these tools also can analyze a site that is spread over several servers.
Rounding out the suite are a tool that analyzes your streaming media traffic and a simple but handy Alerting/Monitoring module that watches disk space, Windows NT systems, SNMP gets and traps, and other IP devices attached to your LAN. When a change in the system crosses a user-defined alert level, the module notifies you and/or performs a series of actions, depending on how you configure it.
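The disk-space check is the simplest of these to picture. As a hypothetical sketch of the kind of threshold comparison such a module performs (not WebTrends' own logic, and the threshold value is our assumption):

```python
# Hypothetical sketch of a user-defined disk-space alert: compare free
# space on a volume against a threshold and report when it is crossed.
import shutil

def check_disk(path, min_free_fraction=0.10):
    """Return an alert message if free space falls below the threshold, else None."""
    usage = shutil.disk_usage(path)
    free_fraction = usage.free / usage.total
    if free_fraction < min_free_fraction:
        return f"ALERT: only {free_fraction:.0%} free on {path}"
    return None
```

A monitoring module runs checks like this on a schedule and, when one returns an alert, fires off the notification or follow-up actions the administrator configured.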
On the whole, I was impressed with the quality and comprehensiveness of the WebTrends suite. While Web log analysis is not rocket science, the suite does a great job of presenting system information that will help you make better sense of your Web site.
Best of all, you will have all the information, analysis and facts you need to make sound decisions about your agency's investments in Web-based systems.
Jefferson is a freelance analyst and writer based in Honolulu. He has been covering technology for seven years.