Performance Tools for Gecko

Performance Monitoring for Gecko

maintainer: marc attinasi <attinasi@netscape.com>

Brief Overview

Gecko should be fast. To help us make sure that it is, we monitor the performance of the system, specifically in terms of Parsing, Content Creation, Frame Creation and Style Resolution - the core aspects of layout. Monitoring performance across build cycles is facilitated by a small set of tools that work in conjunction with program output from the Mozilla or Viewer applications to produce tables of performance values and historical comparisons with builds analysed in the past. The tools, their dependencies, and their general care and feeding are the topics of this document.

Usage: A five-step plan to enlightenment


The PerfTools

IMPORTANT: The tools created for monitoring performance are very tightly coupled to output from the layout engine. As Viewer (or Mozilla) is run, it spits out various timing values to the console. These values are captured to files, parsed, and assembled into HTML tables showing the amount of CPU time dedicated to parsing the document, creating the content model, building the frame model, and resolving style during the building of the frame model. All of the scripts that make up the perftool are located in the directory \mozilla\tools\performance\layout. Running them from another location may work, but it is best to run from there.

The perl script, perf.pl, is used to invoke Viewer and direct it to load various URLs. The URLs to load are contained in a text file, one per line. The file 40-URL.txt is the baseline file and contains a listing of file-URLs that are static, meaning they never change, because they are snapshots of popular sites. As the script executes it does two things (a rough sketch of this flow appears after the list):

  1. Invoke Viewer and feed it the URL-file, capturing the output to another file.
  2. Invoke other perl scripts to process the Viewer output into HTML tables.
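A minimal sketch of that flow, in perl, is below. It is an illustration only: the -f flag, the file names, and the genTables.pl script name are assumptions for the sketch, not the actual arguments and scripts perf.pl uses.

  #!/usr/bin/perl -w
  # Hedged sketch of the perf.pl flow; flags and file names are assumed.
  use strict;

  my $urlFile = "40-URL.txt";          # baseline list of file-URLs, one per line
  my $rawFile = "viewer-output.txt";   # captured console output from Viewer

  # Step 1: run Viewer over the URL file and capture its console output.
  # The "-f" flag is hypothetical; the real invocation may differ.
  system("viewer -f $urlFile > $rawFile 2>&1") == 0
      or die "Viewer run failed: $?\n";

  # Step 2: hand the captured output to the parsing scripts that build
  # the HTML tables (script name is illustrative).
  system("perl genTables.pl $rawFile") == 0
      or die "Table generation failed: $?\n";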
A set of perl scripts is used to parse the output of the Viewer application. These scripts expect the format of the performance data to be stable; in other words, it should not change, or else the scripts need to be updated. Here are the files involved in parsing the data and generating the HTML table:
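Whichever scripts are involved, the heart of the parsing stage can be sketched as follows. The timing-line format matched here ("Parsing: 123 ms" and the like) is purely an assumption for illustration; the real scripts depend on Viewer's actual, fixed output format.

  #!/usr/bin/perl -w
  # Hedged sketch of the parsing stage; the input line format is assumed.
  use strict;

  my %totals;
  open(IN, "viewer-output.txt") or die "cannot open captured output: $!\n";
  while (<IN>) {
      # Accumulate time per layout phase, e.g. "Frame Creation: 456 ms".
      if (/^(Parsing|Content Creation|Frame Creation|Style Resolution):\s*(\d+)/) {
          $totals{$1} += $2;
      }
  }
  close(IN);

  # Emit a simple HTML table of the accumulated totals.
  print "<table>\n";
  foreach my $phase (sort keys %totals) {
      print "  <tr><td>$phase</td><td>$totals{$phase} ms</td></tr>\n";
  }
  print "</table>\n";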

The URLs

It is critical that the URLs we load while measuring performance do not change. This is because we want to compare performance characteristics across builds, and if the URLs changed we could not make valid comparisons. Also, as URLs change they exercise different parts of the application, so we really want a consistent set of pages to measure performance against. The builds change; the pages do not.

On February 3, 2000 the top 40 sites were 'snaked' using the tool WebSnake. These sites now reside in disk-files and are loaded from those files during the load test. The file 40-URL.txt contains a listing of the file-URLs created from the web sites. The original web sites should be obvious from the file-URLs.
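For illustration, the entries in 40-URL.txt are plain file-URLs, one per line; the paths below are hypothetical, not the actual snapshot locations:

  file:///C:/websites/www.example.com/index.html
  file:///C:/websites/home.example.net/index.html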

NOTE: There are some links to external images in the local websites. These should have been resolved by WebSnake but were not, for some reason. They should be made local at some point so that we can run without a connection to the internet.

Historical Data and Trending

Historical data will be gathered and presented to make it easy for those concerned to see how the relative performance of various parts of the product changes over time. This historical data is kept in a flat file of comma-delimited values where each record is indexed by the pull-date/milestone and buildID (note that the buildID is not always reliable; however, the pull-date/milestone is provided by the user when the performance package is run, so it can be made to be unique). The Historical Data and Trending table will show the averages for Parsing, Content Creation, Frame Creation, Style Resolution, Reflow, Total Layout and Total Page Load time for each build, along with a simple bar graph representation of each record's weight relative to the other records in the table. At a later date this can be extended to trend individual sites; however, for most purposes the roll-up of overall averages is sufficient to track the performance trends of the engine.
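As a purely hypothetical illustration (the actual field order and format of history.txt may differ), a record carrying the averages named above might look like this, with all times in milliseconds:

  pull-date/milestone, buildID, Parsing, Content Creation, Frame Creation, Style Resolution, Reflow, Total Layout, Total Page Load
  M14-02/04/2000,2000020408,1320,540,860,410,980,2790,4650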

The Execution Plan

Performance monitoring will be run on a weekly basis, and against all Milestone builds. The results of the runs will be published for all interested parties to see. Interested and/or responsible individuals will review the performance data to raise or lower developer awareness of performance problems and issues as they arise.

Currently, the results are published weekly at http://techno/users/attinasi/publish

Revision Control and Archiving

The scripts are checked into cvs in the directory \mozilla\tools\performance\layout. The history.txt file is also checked in to cvs after every run, as are the tables produced by the run. Committing the files to cvs is a manual operation and should be completed only when the data has been analysed and appears valid. Be sure to (example commands follow the list):
  1. Commit history.txt after each successful run.
  2. Add / commit the new table and new trend-table after each successful run (in the Tables subdirectory).
  3. Commit any changes to the scripts or this document.
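For example, a post-run check-in might look like the following; the table file names are illustrative only:

  cvs commit -m "Weekly perf run: update history" history.txt
  cvs add Tables/table-03-17-2000.html Tables/trend-03-17-2000.html
  cvs commit -m "Weekly perf run: add tables" Tables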

History:

  02/04/2000   Created - attinasi
  03/17/2000   Removed QA Partner stuff - no longer used