Comparison of HTML Produced by Several HATs
Recently, there was a lively discussion on the Help Authoring Tools and Techniques (HATT) mailing list about the relative compactness and efficiency of the HTML code produced by various Help authoring tools. As a result of these discussions, several industry consultants decided to collaborate on a project to compare the HTML, CSS, and CHM files produced by a variety of Help authoring tools. On this page, you'll find:
- Why this test matters
- How we tested the tools
- Summary of results
- Comparison of output file sizes (tables and charts)
- Comparison of validation results
- Detailed output samples and validation reports
Note: This comparison was last updated in October 2003. The data and samples presented here do not necessarily reflect the most recent versions of the products compared.
Why this test matters
Compact HTML is important for a variety of reasons. Bloated HTML slows downloads -- a concern for all users, and especially for the many users who are still limited to dial-up network access -- and increases the time required for browsers to display a downloaded page.
In addition, valid HTML and CSS—that is, HTML and CSS that conform to the HTML, CSS, and related standards published by the World Wide Web Consortium (W3C)—is an important factor for individuals and organizations concerned about interoperability. Producing valid HTML and CSS makes it easier to create, edit, and maintain content using the tool of your choice, rather than be locked into a proprietary solution from a single vendor. In addition, valid HTML and CSS make it easier to publish content on multiple platforms and for users with varying operating systems, browsers, and other types of viewer applications (cell phones, PDAs, and so forth).
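As a rough illustration of what compact, valid markup looks like (this example is ours, not output from any of the tools tested), a topic built this way declares a document type, keeps the markup structural, and pushes formatting into a small external style sheet. The file names and class names below are illustrative only.

```html
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
    "http://www.w3.org/TR/html4/strict.dtd">
<html>
<head>
  <meta http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">
  <title>Sample Topic</title>
  <!-- One small, shared style sheet instead of repeated inline formatting -->
  <link rel="stylesheet" type="text/css" href="styles.css">
</head>
<body>
  <h1>Sample Topic</h1>
  <p class="intro">Formatting comes from the style sheet, not from
  repeated FONT tags or tool-specific attributes.</p>
  <ul>
    <li>Compact markup downloads and renders faster.</li>
    <li>Valid markup behaves predictably across browsers and editing tools.</li>
  </ul>
</body>
</html>
```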
Note: Compact HTML, valid HTML, and valid CSS are only some of the factors to consider when choosing an authoring tool. The test we performed is not an overall evaluation of any of the authoring tools, nor are the results of the test intended as an endorsement of any of the authoring tools involved.
How we tested the tools
Several industry consultants decided to collaborate on a short project to compare the HTML and CSS output produced by several Help authoring tools. Each consultant took responsibility for creating a test project and generating an HTML file, a CSS file, and a compiled Microsoft HTML Help (CHM) file using one or more authoring tools. The following table lists the tools and the consultants who tested them:
Authoring Tool | Consultant |
---|---|
AuthorIT 3.2* | Char James-Tanny |
AuthorIT 4.0 | Char James-Tanny |
Doc-To-Help 2000* | Paul Neshamkin |
Doc-To-Help 6.0 | Paul Neshamkin |
Dreamweaver MX | David Knopf |
ForeHTML 5.02a* | Dana Cline |
Mif2Go 33u31 | Alison White |
RoboHelp Classic 2002* | MJ Plaster |
RoboHelp HTML 2002* | MJ Plaster |
RoboHelp HTML X3 | MJ Plaster |
WebWorks Publisher Professional Edition 7.0.5 | David Knopf |
WebWorks Publisher WordHelp 1.0 | David Knopf |
* Product or version is no longer commercially available.
The consultants worked as a group to agree on a simple test project consisting of a single HTML file containing such typical elements as paragraphs, text formatted with CSS styles, text formatted manually, numbered and bulleted lists, tables, and images. We decided to use the HTML file to produce a compiled Microsoft HTML Help system, so we also included two elements specific to HTML Help: a text-only popup and a shortcut control, both created using the HTML Help ActiveX control.
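For readers unfamiliar with these HTML Help-specific elements, the sketch below shows roughly how a shortcut control is embedded in a topic using the HTML Help ActiveX control (hhctrl.ocx); the text-only popup is inserted with the same OBJECT element. This is a generic, hedged sketch based on the HTML Help documentation -- the button label and program value are placeholders, not a sample of any tool's generated code.

```html
<!-- Shortcut control: a button in the topic that launches an external
     program via the HTML Help ActiveX control. Values are illustrative;
     see the HTML Help documentation for the exact parameter syntax. -->
<OBJECT id="shortcut" type="application/x-oleobject"
        classid="clsid:adb880a6-d8ff-11cf-9377-00aa003b7a11"
        width="100" height="30">
  <PARAM name="Command" value="ShortCut">
  <PARAM name="Button"  value="Text:Open Notepad">
  <PARAM name="Item1"   value=",notepad.exe,">
</OBJECT>
```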
Each consultant implemented the test project from scratch, avoided "guru-level tricks" to optimize the output, and, as much as possible, created the HTML page and accompanying CSS file in the same way. To a certain extent, this was impossible because the tools themselves operate differently. For example, tools like Dreamweaver and RoboHelp HTML are largely based on a WYSIWYG model, while tools like Doc-To-Help, AuthorIT, and WebWorks rely principally on styles and templates. Because of such differences, the test we performed is scientifically imperfect. Nonetheless, we believe that valid conclusions can be drawn from the results.
Summary of results
HTML file sizes
Of the tools that we tested, Dreamweaver produced the most compact code; the HTML file we created with Dreamweaver was 7 KB. In our view, even hand-coded HTML would not be noticeably more compact than the HTML we produced with Dreamweaver, although Mif2Go made an impressive second-place showing with an HTML file of only 8 KB.
Most of the other tools yielded a single HTML file of 10-12 KB. RoboHelp Classic and especially RoboHelp HTML produced the least compact code of the tools we tested, with HTML files of 17 KB and 27-28 KB, respectively. The output from RoboHelp HTML X3 was four times the size of the Dreamweaver output and nearly three times the size of the files produced by most of the other non-RoboHelp tools.
CSS file sizes
The size of the CSS files varied widely. The most compact were produced by Dreamweaver, Mif2Go, ForeHTML, and RoboHelp HTML—all under 5 KB. In the midrange were RoboHelp Classic, Doc-To-Help, WebWorks Publisher Professional, and WebWorks Publisher WordHelp, ranging from roughly 6 to 15 KB. The least compact CSS files were produced by AuthorIT (18.4 KB for AuthorIT 4.0 and 30.4 KB for AuthorIT 3.2).
Valid HTML and CSS
Of the tools that we tested, only AuthorIT 4.0, Dreamweaver, and Mif2Go produced valid HTML. None of the HTML code produced by the other tools complied with the relevant W3C specifications. In our view, the HAT vendors have some work to do in order to comply with the W3C's published HTML standards.
Five of the tools we tested produced valid CSS files: AuthorIT, Dreamweaver, Mif2Go, WebWorks Publisher Professional, and WebWorks Publisher WordHelp.
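To give a sense of what typically causes validation failures, the snippet below shows a few composite, illustrative examples of markup that the W3C validator rejects, together with valid equivalents. These are generic examples, not excerpts from any tool's output.

```html
<!-- Constructs that commonly fail W3C validation (illustrative composites).
     A missing DOCTYPE declaration also prevents meaningful validation. -->
<img src="logo.gif">                   <!-- IMG requires an ALT attribute -->
<a href="topic.htm?a=1&b=2">link</a>   <!-- bare & must be written as &amp; -->
<p><font face="Arial">text</font></p>  <!-- FONT is not allowed in HTML 4.01 Strict -->

<!-- Valid equivalents: -->
<img src="logo.gif" alt="Company logo">
<a href="topic.htm?a=1&amp;b=2">link</a>
<p class="bodytext">text</p>
```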
Comparison of output file sizes
The following table shows the size of the individual HTML file, the generated CSS file, and the compiled HTML Help (CHM) file produced by each of the authoring tools we tested.
Authoring Tool | HTML | CSS | CHM |
---|---|---|---|
AuthorIT 3.2* | 12 KB | 30.4 KB | 23 KB |
AuthorIT 4.0 | 13 KB | 18.4 KB | 24 KB |
Doc-To-Help 6.0 | 12 KB | 15.4 KB | 22 KB |
Doc-To-Help 2000* | 9 KB | 5.9 KB | 23 KB |
Dreamweaver MX | 7 KB | 0.5 KB | 20 KB |
ForeHTML 5.02a* | 10 KB | 1.9 KB | 24 KB |
Mif2Go 33u31 | 8 KB | 1.1 KB | 21 KB |
RoboHelp Classic 2002* | 17 KB | 9.0 KB | 47 KB |
RoboHelp HTML 2002* | 27 KB | 4.2 KB | 48 KB |
RoboHelp HTML X3 | 28 KB | 4.1 KB | 48 KB |
WebWorks Publisher Professional 7.0.5 | 11 KB | 13.8 KB | 22 KB |
WebWorks Publisher WordHelp 1.0 | 11 KB | 11.6 KB | 27 KB |

* Product or version is no longer commercially available.
Note: For most Help authors, the size of the HTML files is far more significant than the size of the CSS files. The CHM file is, for the most part, a consolidated collection of all the individual HTML and CSS files. In a typical Help system, the majority of individual files are HTML files. A typical 500- or 1,000-topic Help system would include 500 or 1,000 HTML files but only one or two CSS files. Therefore, large HTML files lead to large Help systems, while the effect of one or two large CSS files is limited. This is clearly illustrated in the table above and the charts below.
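To illustrate why the CHM is essentially a consolidated package of the project's topic and style files, here is a minimal HTML Help Workshop project (.hhp) file of the kind compiled into a CHM. The file names and title are hypothetical, not taken from our test project.

```
[OPTIONS]
Compiled file=Sample.chm
Contents file=Contents.hhc
Index file=Index.hhk
Default topic=topic1.htm
Title=Sample Help System

[FILES]
topic1.htm
topic2.htm
styles.css
```

Each file listed under [FILES] ends up inside the single compiled CHM, which is why the per-topic HTML size dominates the size of a large Help system.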
The following charts visually represent the differences in the file sizes produced by each of the authoring tools we tested.
Comparison of validation results
The following table shows whether or not each tool produced HTML and CSS that are valid, according to the published standards and specifications of the W3C.
Authoring Tool | Valid HTML? | Valid CSS? |
---|---|---|
AuthorIT 3.2* | NO | YES |
AuthorIT 4.0 | YES | YES |
Doc-To-Help 6.0 | NO | NO |
Doc-To-Help 2000* | NO | NO |
Dreamweaver MX | YES | YES |
ForeHTML 5.02a* | NO | NO |
Mif2Go 33u31 | YES | YES |
RoboHelp Classic 2002* | NO | NO |
RoboHelp HTML 2002* | NO | NO |
WebWorks Publisher Professional 7.0.5 | NO | YES |
WebWorks Publisher WordHelp 1.0 | NO | YES |

* Product or version is no longer commercially available.
Output samples and validation reports
Use the following links to see the CHM file, HTML file, and validation report for the output from each of the authoring tools we tested: