How we tested HTML conversion software
We decided not to perform formal speed tests on these HTML conversion tools, for two reasons. First, the programs could not all handle the same sets of legacy file formats, so it was impossible to assemble a single meaningful set of test files. Second, because HTML conversion is generally a chore performed by a Webmaster or systems administrator rather than by end users, small differences in processing time are not critical.
Similarly, because end users are not involved, ease of installation did not rate highly enough to warrant treatment as a separate category.
HTML conversion received the heaviest weighting. To receive a score of satisfactory, a program must allow the user to batch-convert multiple files to HTML format. To receive a score of good, it also must be able to convert embedded images and tables. We tested each program with a set of files in different formats, including older WordPerfect for DOS formats. The files contained a variety of graphic images, tables and charts. Programs that translated these files to HTML more accurately earned extra points.
Programs that can generate tables of contents, indexes and navigational buttons and that demonstrate high ease of use also earned extra points.
Format support is critical for people with legacy documents that need to move to the World Wide Web. To receive a score of satisfactory, a program must offer a conversion path for the most popular word processor formats: Microsoft Word and Corel Corp.'s WordPerfect, for both Windows and DOS. Programs that provide direct support for additional file formats received higher scores.
To earn a score of satisfactory in the updating category, a program need only provide some means of updating the set of HTML files that results from conversion. Programs that automate the process, support incremental updates or make the process notably easy to perform received higher scores. Built-in search and editing tools also earned extra points.
At a minimum, documentation had to tell us how to install the program and make use of its features. Comprehensive, well-organized and well-written manuals received higher scores. We lowered the score if a manual was poorly organized, lacked a table of contents or index, omitted needed information or contained factual errors.
We based technical support scores on the quality of service we received during multiple anonymous support calls. Busy signals, voice-mail-only service and excessive resolution times all resulted in lower scores.
To receive a score of satisfactory, the company must provide technical support via telephone. We awarded bonus points for unconditional money-back guarantees, extended support hours, bulletin board support (such as a CompuServe forum) and a toll-free number. We subtracted points for a lack of technical support or a limited support period.
Given that corporate purchases of these products are generally small, we did not weight this category heavily. We scored pricing for single-user packages as follows:
$0 to $249: Excellent
$250 to $399: Very Good
$400 to $549: Good
The program's score may have been modified to account for the price of extensions or server utilities.
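The pricing scale above can be sketched as a simple lookup. This is a hypothetical illustration, not part of the review itself; the tier boundaries come from the list above, and because no rating is published for prices of $550 and up, the sketch returns nothing for those.

```python
def price_rating(price):
    """Map a single-user package price (in dollars) to the review's
    pricing rating, per the published tiers."""
    if price < 0:
        raise ValueError("price cannot be negative")
    if price <= 249:
        return "Excellent"
    if price <= 399:
        return "Very Good"
    if price <= 549:
        return "Good"
    return None  # no tier published for $550 and up

print(price_rating(199))  # Excellent
print(price_rating(450))  # Good
```

Per the article, this base rating could then be adjusted for the cost of extensions or server utilities.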