Performance Tools for Gecko


Performance Monitoring for Gecko

- maintainer: marc attinasi
- attinasi@netscape.com

Brief Overview

Gecko should be fast. To help us make sure that it is, we monitor the
performance of the system, specifically in terms of Parsing, Content
Creation, Frame Creation and Style Resolution - the core aspects of layout.
Monitoring performance across build cycles is facilitated by a small set of
tools that work in conjunction with program output from the Mozilla or
Viewer applications to produce tables of performance values and historical
comparisons with builds analysed in the past. The tools, their dependencies
and their general care and feeding are the topics of this document.

Usage: A five-step plan to enlightenment

- First, the tools are all designed to run only on Windows. That is really
  a bummer, but since most of what we are measuring is XP (cross-platform)
  code it should not really matter. Get a Windows NT machine if you want to
  run the tools.
- Next, you need a build that was created with performance monitoring
  enabled. To create such a build you must compile the Mozilla source with
  a special environment variable set. This environment variable turns on
  code that accumulates and dumps performance metrics data. The environment
  variable is MOZ_PERF=1. Set this environment variable and then build all
  of Mozilla. If you can obtain a build that was built with MOZ_PERF=1 set,
  you can simply use that build.
- Third, run the script perf.pl to execute Viewer and run through the test
  sites, gathering performance data.
- Fourth, make sure the script completed, and then open the resultant HTML
  file, which is dropped in the Tables subdirectory.
- Lastly, stare at the table and the values in it and decide whether
  performance is getting better, worse, or staying the same.

The PerfTools

IMPORTANT: The tools created for monitoring performance are very tightly
coupled to the output from the layout engine. As Viewer (or Mozilla) is run
it spits out various timing values to the console. These values are captured
to files, parsed and assembled into HTML tables showing the amount of CPU
time dedicated to parsing the document, creating the content model, building
the frame model, and resolving style during the building of the frame model.
All of the scripts that make up the perftool are located in the directory
\mozilla\tools\performance\layout. Running them from another location may
work, but it is best to run from there.

The perl script, perf.pl, is used to invoke Viewer and direct it to load
various URLs. The URLs to load are contained in a text file, one per line.
The file 40-URL.txt is the baseline file and contains a listing of file-URLs
that are static, meaning they never change, because they are snapshots of
popular sites. As the script executes it does two things:

- Invokes Viewer and feeds it the URL-file, capturing the output to another
  file
- Invokes other perl scripts to process the Viewer output into HTML tables

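As a rough illustration of that flow, the sketch below runs Viewer against
the URL file while capturing its console output, then hands the captured log
to the processing scripts. It is only a sketch: the Viewer command-line
arguments, the log and table file names, and the way the scripts are chained
here are assumptions for the example, not the actual contents of perf.pl.

    #!/usr/bin/perl
    # Sketch of a perf.pl-style driver (illustrative only; the real script's
    # Viewer arguments, file names, and script chaining may differ).
    use strict;
    use warnings;

    my ($buildName, $buildRoot, $timingMode) = @ARGV;
    die "Usage: perl perf.pl <build-name> <build-root> <cpu|clock>\n"
        unless defined $timingMode;

    my $viewer  = "$buildRoot\\bin\\viewer.exe";   # the MOZ_PERF-enabled build
    my $urlFile = '40-URL.txt';                    # one file-URL per line
    my $rawLog  = "$buildName-perf.log";           # combined timing output

    # Run Viewer over the URL list and capture everything it prints to the
    # console ("-f" is a placeholder for however Viewer is told which URL
    # file to load).
    system("\"$viewer\" -f $urlFile > $rawLog 2>&1") == 0
        or die "Viewer run failed: $?";

    # Hand the captured output to the scripts that build the HTML table.
    my $table = "Tables\\$buildName.html";
    system("perl Uncombine.pl $rawLog")               == 0 or die "Uncombine.pl failed";
    system("perl Header.pl $buildName > $table")      == 0 or die "Header.pl failed";
    system("perl AverageTable2.pl $rawLog >> $table") == 0 or die "AverageTable2.pl failed";
    system("perl Footer.pl >> $table")                == 0 or die "Footer.pl failed";
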
A set of perl scripts is used to parse the output of the Viewer application.
These scripts expect the format of the performance data to remain stable; in
other words, if the output format changes, the scripts need to be updated.
Here are the files involved in parsing the data and generating the HTML
table:

- perf.pl : The main script that orchestrates the running of Viewer and the
  invocation of the other scripts, and finally copies files to their correct
  final locations. An example invocation of the perf.pl script is:
  'perl perf.pl Daily-0215 s:\mozilla\0215 cpu'
  - Daily-0215 is the name of the build and can be anything you like.
  - s:\mozilla\0215 is the location of the build. There must be a bin
    directory under the directory you specified, and it must contain the
    MOZ_PERF-enabled build.
  - cpu indicates that we are timing CPU time. The other option is clock,
    but that is not currently functional because of the clock resolution.
- Header.pl : a simple script that generates the initial portion of the HTML
  file that will show the performance data for the current build.
- AverageTable2.pl : a slightly more complicated script that parses the
  output from Viewer, accumulates data for averaging, and generates a row in
  the HTML table initialized by Header.pl. This file must be modified if the
  performance data output format changes. (A rough sketch of this kind of
  parsing follows this list.)
- Footer.pl : a simple script that inserts the last row in the HTML table,
  the averages row. It also terminates the table and closes the HTML
  document.
- GenFromLogs.pl : a script that generates the HTML table from already
  existing logs. This is used to regenerate a table after the QA Partner
  script has run, in case the table file is lost or otherwise needs to be
  recreated. Also, if old logs are kept, they can be used to regenerate
  their corresponding tables.
- Uncombine.pl : a script that breaks up a single text file containing all
  of the timing data for all of the sites into a separate file for each
  individual site. (A sketch of this is also shown after the list.)
- History.pl : a script that generates an HTML file showing a historical
  comparison of average performance values for current and previous builds.

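As a rough illustration of the kind of parsing and averaging that
AverageTable2.pl performs, the sketch below scans captured timing logs,
accumulates per-phase totals, and prints one HTML table row per site plus an
averages row. The timing-line format matched here ('Parsing: <milliseconds>'
and so on) is an assumption for the example; the real scripts depend on
whatever format the layout engine actually emits.

    #!/usr/bin/perl
    # Sketch of AverageTable2.pl-style processing (illustrative only; the
    # timing-line format matched below is assumed, not the engine's real output).
    use strict;
    use warnings;

    my @phases = ('Parsing', 'Content Creation', 'Frame Creation', 'Style Resolution');
    my %total  = map { $_ => 0 } @phases;
    my $sites  = 0;

    foreach my $logFile (@ARGV) {            # one captured log per site
        open my $fh, '<', $logFile or die "Cannot open $logFile: $!";
        my %time;
        while (my $line = <$fh>) {
            # Assumed line format: "<phase>: <milliseconds>"
            foreach my $phase (@phases) {
                $time{$phase} = $1 if $line =~ /^\Q$phase\E:\s*([\d.]+)/;
            }
        }
        close $fh;

        $sites++;
        $total{$_} += ($time{$_} || 0) for @phases;

        # Emit one table row for this site.
        print "<tr><td>$logFile</td>",
              (map { '<td>' . ($time{$_} || 0) . '</td>' } @phases),
              "</tr>\n";
    }

    # Emit the averages row that closes out the table body.
    if ($sites) {
        print '<tr><td>Average</td>',
              (map { sprintf('<td>%.1f</td>', $total{$_} / $sites) } @phases),
              "</tr>\n";
    }
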
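The splitting done by Uncombine.pl might look roughly like the following;
the marker line used here to detect the start of a new site's data is an
assumption for the example.

    #!/usr/bin/perl
    # Sketch of Uncombine.pl-style splitting (illustrative only; the marker
    # that separates one site's timing data from the next is assumed here).
    use strict;
    use warnings;

    my $combined = shift or die "Usage: perl Uncombine.pl <combined-log>\n";
    open my $in, '<', $combined or die "Cannot open $combined: $!";

    my $out;
    my $count = 0;
    while (my $line = <$in>) {
        # Assumed marker: a line such as "Site: <name>" begins a new site's data.
        if ($line =~ /^Site:\s*(\S+)/) {
            close $out if $out;
            (my $name = $1) =~ s/[^\w.-]/_/g;            # make a safe file name
            open $out, '>', sprintf('site%02d-%s.txt', ++$count, $name)
                or die "Cannot create per-site file: $!";
        }
        print {$out} $line if $out;                      # copy into the current site's file
    }
    close $out if $out;
    close $in;
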
The URLs

It is critical that the URLs that we load while measuring performance do not
change. This is because we want to compare performance characteristics
across builds, and if the URLs changed we could not really make valid
comparisons. Also, as URLs change they exercise different parts of the
application, so we really want a consistent set of pages to measure
performance against. The builds change, the pages do not.

On February 3, 2000 the top 40 sites were 'snaked' using the tool WebSnake.
These sites now reside in disk-files and are loaded from those files during
the load test. The file 40-URL.txt contains a listing of the file-URLs
created from the web sites. The original web sites should be obvious from
the file-URLs.

NOTE: There are some links to external images in the local websites. These
should have been resolved by WebSnake but were not, for some reason. They
should be made local at some point so we can run without a connection to
the internet.

Historical Data and Trending

Historical data will be gathered and presented to make it easy for those
concerned to see how the relative performance of various parts of the
product changes over time. This historical data is kept in a flat file of
comma-delimited values where each record is indexed by the
pull-date/milestone and buildID (note that the buildID is not always
reliable; however, the pull-date/milestone is provided by the user when the
performance package is run, so it can be made to be unique). The Historical
Data and Trending table will show the averages for Parsing, Content
Creation, Frame Creation, Style Resolution, Reflow, Total Layout and Total
Page Load time for each build, along with a simple bar graph representation
of each record's weight relative to the other records in the table. At a
later date this can be extended to trend individual sites; however, for most
purposes the roll-up of overall averages is sufficient to track the
performance trends of the engine.

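A minimal sketch of how a History.pl-style script might append a record to
that comma-delimited file and render the relative bars is shown below. The
exact column layout of history.txt and the bar image used here are
assumptions for the example.

    #!/usr/bin/perl
    # Sketch of History.pl-style trending (illustrative only; the record layout
    # assumed here is the build name followed by the seven averages, with
    # Total Page Load last, and that last value drives the bar width).
    use strict;
    use warnings;

    my $historyFile = 'history.txt';

    # Append the current run as one comma-delimited record, keyed by the
    # user-supplied build name (e.g. 'Daily-0215').
    sub append_record {
        my ($buildId, @averages) = @_;
        open my $fh, '>>', $historyFile or die "Cannot append to $historyFile: $!";
        print {$fh} join(',', $buildId, @averages), "\n";
        close $fh;
    }

    # Read every record back and emit one HTML row per build, with a bar whose
    # width reflects that build's Total Page Load time relative to the largest
    # value in the file.
    sub emit_trend_rows {
        open my $fh, '<', $historyFile or die "Cannot read $historyFile: $!";
        my @records = map { chomp; [ split /,/ ] } <$fh>;
        close $fh;

        my ($max) = sort { $b <=> $a } map { $_->[-1] } @records;
        foreach my $rec (@records) {
            my ($buildId, @values) = @$rec;
            my $width = $max ? int(100 * $values[-1] / $max) : 0;
            print "<tr><td>$buildId</td>",
                  (map { "<td>$_</td>" } @values),
                  "<td><img src=\"bar.gif\" height=\"10\" width=\"$width\"></td></tr>\n";
        }
    }

    append_record(@ARGV) if @ARGV;    # e.g. perl History.pl Daily-0215 120 340 ...
    emit_trend_rows();
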
The Execution Plan

Performance monitoring will be run on a weekly basis, and against all
Milestone builds. The results of the runs will be published for all
interested parties to see. Interested and/or responsible individuals will
review the performance data to raise or lower developer awareness of
performance problems and issues as they arise.

Currently, the results are published weekly at
http://techno/users/attinasi/publish

Revision Control and Archiving

The scripts are checked into cvs in the directory
\mozilla\tools\performance\layout. The history.txt file is also checked in
to cvs after every run, as are the tables produced by the run. Committing
the files to cvs is a manual operation and should be completed only when the
data has been analysed and appears valid. Be sure to:

- Commit history.txt after each successful run.
- Add / commit the new table and new trend-table after each successful run
  (in the Tables subdirectory).
- Commit any changes to the scripts or this document.

History:

02/04/2000 | Created - attinasi
03/17/2000 | Removed QA Partner stuff - no longer used