<!-- This Source Code Form is subject to the terms of the Mozilla Public
   - License, v. 2.0. If a copy of the MPL was not distributed with this
   - file, You can obtain one at http://mozilla.org/MPL/2.0/. -->

<!doctype html public "-//w3c//dtd html 4.0 transitional//en">
<html>
<head>
  <meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
  <meta name="Author" content="Marc Attinasi">
  <meta name="GENERATOR" content="Mozilla/4.7 [en] (WinNT; U) [Netscape]">
  <title>Performance Tools for Gecko</title>
<style>
  BODY { margin: 1em 2em 1em 2em; background-color: bisque }
  H1, H2, H3 { background-color: black; color: bisque; }
  TABLE.boxed { border-width: 1px; border-style: dotted; }
  </style>
</head>
<body>

<dl>
<table WIDTH="100%" >
<tr>
<td>
<center><img SRC="mozilla-banner.gif" height=58 width=600></center>
</td>
</tr>
</table>

<center><table COLS=1 WIDTH="80%" class="boxed" >
<tr>
<td>
<center>
<h2>
Performance Monitoring for Gecko</h2></center>

<center>
<dd>
<i>maintainer: marc attinasi</i></dd></center>

<center>
<dd>
<i><a href="mailto:attinasi@netscape.com">attinasi@netscape.com</a></i></dd></center>
</td>
</tr>
</table></center>
</dl>

<h3>
Brief Overview</h3>
Gecko should be <i>fast</i>. To help us make sure that it is, we monitor
the performance of the system, specifically in terms of Parsing, Content
Creation, Frame Creation and Style Resolution - the core aspects of layout.
A small set of tools facilitates this monitoring across build cycles: the
tools work on program output from the Mozilla or Viewer applications to
produce tables of performance values and historical comparisons with builds
analysed in the past. The tools, their dependencies and their general care
and feeding are the topics of this document.
<h4>
Usage: A five-step plan to enlightenment</h4>

<ul>
<li>
First, the tools are all designed to run only on Windows. That is really
a bummer, but since most of what we are measuring is XP it should not really
matter. Get a Windows NT machine if you want to run the tools.</li>

<li>
Next, you need a build that was created with performance monitoring enabled.
To create such a build you must compile the Mozilla source with a special
environment variable set. This environment variable turns on code that
accumulates and dumps performance metrics data. The environment variable
is: <b>MOZ_PERF=1</b>. Set this environment variable and then build all
of Mozilla. If you can obtain a build that was built with MOZ_PERF=1 set
then you can just use that build.</li>

<li>
Third, run the script <b>perf.pl</b> to execute Viewer and run through
the test sites, gathering performance data (see the sketch after this
list).</li>

<li>
Fourth, make sure the script completed and then open the resultant HTML
file, which is dropped in the Tables subdirectory.</li>

<li>
Lastly, stare at the table and the values in it and decide if performance
is getting better, worse, or staying the same.</li>
</ul>
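As a concrete illustration, the whole cycle from a Windows NT command
prompt might look something like the lines below. This is only a sketch:
the build step itself is elided, and the build name <tt>Daily-0215</tt>
and build location <tt>s:\mozilla\0215</tt> are just the placeholder
values from the example invocation later in this document.
<pre>
rem enable the performance metrics code, then do a full build of Mozilla
set MOZ_PERF=1
rem (run your normal full Mozilla build here)

rem run the performance script against the MOZ_PERF-enabled build
cd \mozilla\tools\performance\layout
perl perf.pl Daily-0215 s:\mozilla\0215 cpu

rem when it finishes, open the generated table in the Tables subdirectory
</pre>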
<h3>
The PerfTools</h3>
IMPORTANT: The tools created for monitoring performance
are very tightly coupled to output from the layout engine. As Viewer (or
Mozilla) is run it spits out various timing values to the console. These
values are captured to files, parsed and assembled into HTML tables showing
the amount of CPU time dedicated to parsing the document, creating the
content model, building the frame model, and resolving style during the
building of the frame model. All of the scripts that make up the perftool
are located in the directory <tt>\mozilla\tools\performance\layout.</tt>
Running them from another location <i>may</i> work, but it is best to run
from there.
<p>The perl script, <tt>perf.pl</tt>, is used to invoke Viewer and direct
it to load various URLs. The URLs to load are contained in a text file,
one per line. The file <tt>40-URL.txt</tt> is the baseline file and contains
a listing of file-URLs that are static, meaning they never change, because
they are snapshots of popular sites. As the script executes it does two
things:
<ol>
<li>
Invoke Viewer and feed it the URL-file, capturing the output to another
file</li>

<li>
Invoke other perl scripts to process the Viewer output into HTML tables</li>
</ol>
A set of perl scripts is used to parse the output of the Viewer application.
These scripts expect the format of the performance data to stay the same;
if it changes, the scripts need to be updated. Here are the files involved
in parsing the data and generating the HTML table (a sketch of the kind
of parsing they do follows the list):
<ul>
<li>
<tt><b>perf.pl</b> : </tt>The main script that orchestrates the running
of Viewer and the invocation of other scripts, and finally copies files
to their correct final locations. An example of an invocation of the perf.pl
script is: '<b><tt><font color="#000000">perl perf.pl</font><font color="#000099">
Daily-0215 s:\mozilla\0215 cpu</font><font color="#000000">'</font></tt></b></li>

<ul>
<li>
<tt><b><font color="#000099">Daily-0215 </font></b><font color="#000000">is
the name of the build and can be anything you like.</font></tt></li>

<li>
<tt><b><font color="#000099">s:\mozilla\0215 </font></b><font color="#000000">is
the location of the build. There must be a bin directory under the directory
you specified, and it must contain the MOZ_PERF enabled build.</font></tt></li>

<li>
<tt><b><font color="#000099">cpu </font></b><font color="#000000">indicates
that we are timing CPU time. The other option is clock, but that is not
currently functional because of the clock resolution.</font></tt></li>
</ul>

<li>
<b><tt>Header.pl</tt></b> : a simple script that generates the initial
portion of the HTML file that will show the performance data for the current
build.</li>

<li>
<tt><b>AverageTable2.pl</b> </tt>: a slightly more complicated script that
parses the output from Viewer, accumulates data for averaging, and generates
a row in the HTML table initialized by Header.pl. This script <b>must</b>
be modified if the performance data output format changes.</li>

<li>
<tt><b>Footer.pl</b> </tt>: a simple script that inserts the last row in
the HTML table, the averages row. It also terminates the table and closes
the HTML document.</li>

<li>
<tt><b>GenFromLogs.pl</b> </tt>: a script that generates the HTML table
from already existing logs. This is used to regenerate a table after the
QA Partner script has run, in case the table file is lost or otherwise
needs to be recreated. Also, if old logs are kept, they can be used to
regenerate their corresponding tables.</li>

<li>
<b><tt>Uncombine.pl</tt></b> : a script that breaks up a single text file
containing all of the timing data for all of the sites into a separate
file for each individual site.</li>

<li>
<b><tt>History.pl</tt></b> : a script that generates an HTML file showing
a historical comparison of average performance values for current and
previous builds.</li>
</ul>
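To make the shape of that processing concrete, here is a minimal sketch
in the spirit of <tt>AverageTable2.pl</tt>: it accumulates per-phase
timings from a captured Viewer log and prints one HTML table row of
averages. The "Phase: value" line format it matches is purely an
assumption for illustration; the real scripts track the exact strings the
MOZ_PERF build prints, which is why they must be updated whenever that
output changes.
<pre>
#!/usr/bin/perl
# Sketch only: accumulate per-phase timings from a captured Viewer log
# and emit one HTML table row of averages. The "Phase: value" format
# matched below is hypothetical; the real scripts match the actual output
# of the MOZ_PERF-enabled build.
use strict;

my $logfile = $ARGV[0] or die "usage: $0 viewer-log\n";
open(LOG, "&lt;", $logfile) or die "cannot open $logfile: $!\n";

my (%total, %count);
while (my $line = readline(LOG)) {
    if ($line =~ /^(Parsing|Content Creation|Frame Creation|Style Resolution|Reflow):\s*([\d.]+)/) {
        $total{$1} += $2;    # sum the timing for this phase
        $count{$1}++;        # and remember how many samples we saw
    }
}
close(LOG);

# one table cell per phase, in a fixed order
print "&lt;tr&gt;\n";
foreach my $phase ('Parsing', 'Content Creation', 'Frame Creation',
                   'Style Resolution', 'Reflow') {
    my $avg = $count{$phase} ? $total{$phase} / $count{$phase} : 0;
    printf "  &lt;td&gt;%.2f&lt;/td&gt;\n", $avg;
}
print "&lt;/tr&gt;\n";
</pre>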
<h3>
The URLs</h3>
It is critical that the URLs that we load while measuring performance do
not change. This is because we want to compare performance characteristics
across builds, and if the URLs changed we could not really make valid comparisons.
Also, as URLs change, they exercise different parts of the application,
so we really want a consistent set of pages to measure performance against.
The builds change, the pages do not.
<p>On February 3, 2000 the top 40 sites were 'snaked' using the tool WebSnake.
These sites now reside in disk-files and are loaded from those files during
the load test. The file <tt>40-URL.txt</tt> contains a listing of the file-URLs
created from the web sites. The original web sites should be obvious from
the file-URLs; a sample of the file's format appears after the note below.
<br>
<blockquote><i><b>NOTE</b>: There are some links to external images in
the local websites. These should have been resolved by WebSnake but were
not for some reason. These should be made local at some point so we can
run without a connection to the internet.</i></blockquote>
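For reference, <tt>40-URL.txt</tt> is just a plain text file of file-URLs,
one per line, each pointing at a local snapshot. The entries below are
invented placeholders that only illustrate the shape of the file; the real
paths and site names are whatever WebSnake produced.
<pre>
file:///s|/mozilla/tools/performance/layout/sites/site01/index.html
file:///s|/mozilla/tools/performance/layout/sites/site02/index.html
file:///s|/mozilla/tools/performance/layout/sites/site03/index.html
</pre>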
<h3>
Historical Data and Trending</h3>
Historical data will be gathered and presented to make it easy for those
concerned to see how the relative performance of various parts of the product
changes over time. This historical data is kept in a flat file of comma-delimited
values where each record is indexed by the pull-date/milestone and buildID
(note that the buildID is not always reliable; however, the pull-date/milestone
is provided by the user when the performance package is run, so it can
be made to be unique). The Historical Data and Trending table will show
the averages for Parsing, Content Creation, Frame Creation, Style Resolution,
Reflow, Total Layout and Total Page Load time for each build, along with
a simple bar graph representation of each record's weight relative to the
other records in the table. At a later date this can be extended to trend
individual sites; however, for most purposes the roll-up of overall averages
is sufficient to track the performance trends of the engine.
<h3>
The Execution Plan</h3>
Performance monitoring will be run on a weekly basis, and against all Milestone
builds. The results of the runs will be published for all interested parties
to see. Interested and/or responsible individuals will review the performance
data to raise or lower developer awareness of performance problems and
issues as they arise.
<p>Currently, the results are published weekly at <a href="http://techno/users/attinasi/publish">http://techno/users/attinasi/publish</a>
<h3>
Revision Control and Archiving</h3>
The scripts are checked into cvs in the directory \mozilla\tools\performance\layout.
The history.txt file is also checked in to cvs after every run, as are
the tables produced by the run. Committing the files to cvs is a manual
operation and should be completed only when the data has been analysed
and appears valid. Be sure to:
<ol>
<li>
Commit history.txt after each successful run.</li>

<li>
Add / commit the new table and new trend-table after each successful run
(in the Tables subdirectory).</li>

<li>
Commit any changes to the scripts or this document.</li>
</ol>

<hr WIDTH="100%">
<h3>
History:</h3>

<table BORDER WIDTH="50%" >
<tr>
<td WIDTH="25%">02/04/2000</td>

<td>Created - attinasi</td>
</tr>

<tr>
<td>03/17/2000</td>

<td>Removed QA Partner stuff - no longer used</td>
</tr>
</table>

</body>
</html>