tools/performance/layout/perf-doc.html

<!-- This Source Code Form is subject to the terms of the Mozilla Public
   - License, v. 2.0. If a copy of the MPL was not distributed with this
   - file, You can obtain one at http://mozilla.org/MPL/2.0/. -->

<!doctype html public "-//w3c//dtd html 4.0 transitional//en">
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
<meta name="Author" content="Marc Attinasi">
<meta name="GENERATOR" content="Mozilla/4.7 [en] (WinNT; U) [Netscape]">
<title>Performance Tools for Gecko</title>
<style>
BODY { margin: 1em 2em 1em 2em; background-color: bisque }
H1, H2, H3 { background-color: black; color: bisque; }
TABLE.boxed { border-width: 1; border-style: dotted; }
</style>
</head>
<body>

<dl>&nbsp;
<table WIDTH="100%" >
<tr>
<td>
<center><img SRC="mozilla-banner.gif" height=58 width=600></center>
</td>
</tr>
</table>

<center><table COLS=1 WIDTH="80%" class="boxed" >
<tr>
<td>
<center>
<h2>
Performance Monitoring for Gecko</h2></center>

<center>
<dd>
<i>maintainer:&nbsp; marc attinasi&nbsp;</i></dd></center>

<center>
<dd>
<i><a href="mailto:attinasi@netscape.com">attinasi@netscape.com</a></i></dd></center>
</td>
</tr>
</table></center>

<center>
<dd>
</dd></center>
</dl>

<h3>
Brief Overview</h3>
Gecko should be <i>fast</i>. To help us make sure that it is, we monitor
the performance of the system, specifically in terms of Parsing, Content
Creation, Frame Creation and Style Resolution - the core aspects of layout.
A small set of tools facilitates monitoring performance across build cycles;
these tools work in conjunction with program output from the Mozilla or
Viewer applications to produce tables of performance values and historical
comparisons of builds analysed in the past. The tools, their dependencies,
and their general care and feeding are the topics of this document.
<h4>
Usage: A five-step plan to enlightenment</h4>

<ul>
<li>
First, the tools are all designed to run only on Windows. That is really
a bummer, but since most of what we are measuring is XP it should not really
matter. Get a Windows NT machine if you want to run the tools.</li>

<li>
Next, you need a build that was created with performance monitoring enabled.
To create such a build you must compile the Mozilla source with a special
environment variable set. This environment variable turns on code that
accumulates and dumps performance metrics data. The environment variable
is: <b>MOZ_PERF=1</b>. Set this environment variable and then build all
of Mozilla. If you can obtain a build that was built with MOZ_PERF=1 set,
then you can just use that build.</li>

<li>
Third, run the script <b>perf.pl</b> to execute Viewer and run through
the test sites, gathering performance data.</li>

<li>
Fourth, make sure the script completed, and then open the resultant HTML
file, which is dropped in the Tables subdirectory.</li>

<li>
Lastly, stare at the table and the values in it and decide if performance
is getting better, worse, or staying the same.</li>
</ul>
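<p>The build-and-run steps above can be condensed into a small helper. The sketch
below is Python (the tools themselves are Perl scripts) and only assembles the
command line and environment described in this document; it is an illustration,
not the real perf.pl interface.</p>

```python
import os

def perf_command(build_name, build_dir, mode="cpu"):
    """Assemble the perf.pl invocation: perl perf.pl <name> <dir> <mode>.

    build_name -- a label for the build; can be anything (e.g. "Daily-0215")
    build_dir  -- build root; must contain a bin directory holding the
                  MOZ_PERF-enabled build
    mode       -- "cpu" for CPU time ("clock" exists but is not currently
                  functional because of the clock resolution)
    """
    if mode not in ("cpu", "clock"):
        raise ValueError("mode must be 'cpu' or 'clock'")
    return ["perl", "perf.pl", build_name, build_dir, mode]

def perf_build_environment():
    """Environment for compiling Mozilla with performance metrics enabled:
    MOZ_PERF=1 turns on the code that accumulates and dumps the data."""
    env = dict(os.environ)
    env["MOZ_PERF"] = "1"
    return env
```

For example, <tt>perf_command("Daily-0215", r"s:\mozilla\0215")</tt> yields the
invocation shown later in this document.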

<h3>
The PerfTools</h3>
IMPORTANT: The tools created for monitoring performance
are very tightly coupled to output from the layout engine. As Viewer (or
Mozilla) is run, it spits out various timing values to the console. These
values are captured to files, parsed, and assembled into HTML tables showing
the amount of CPU time dedicated to parsing the document, creating the
content model, building the frame model, and resolving style during the
building of the frame model. All of the scripts that make up the perftool
are located in the directory <tt>\mozilla\tools\performance\layout.</tt>
Running them from another location <i>may</i> work, but it is best to run
from there.
<p>The perl script, <tt>perf.pl</tt>, is used to invoke Viewer and direct
it to load various URLs. The URLs to load are contained in a text file,
one per line. The file <tt>40-URL.txt</tt> is the baseline file and contains
a listing of file-URLs that are static, meaning they never change, because
they are snapshots of popular sites. As the script executes it does two
things:
<ol>
<li>
Invoke Viewer and feed it the URL-file, capturing the output to another
file</li>

<li>
Invoke other perl scripts to process the Viewer output into HTML tables</li>
</ol>
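<p>Step 1 can be sketched as follows. This is an illustrative Python rendering,
not the actual perf.pl code, and the <tt>-f</tt> flag used to hand Viewer the
URL-file is an assumption for illustration only; check perf.pl for the real
command line.</p>

```python
import subprocess

def parse_url_list(text):
    """The URL-file convention: one URL per line; ignore blank lines."""
    return [line.strip() for line in text.splitlines() if line.strip()]

def run_viewer(viewer_exe, url_file, log_path):
    """Invoke Viewer on the URL-file and capture its console output --
    where the timing values appear -- to a log file.

    The "-f" flag is hypothetical; see perf.pl for the real invocation.
    """
    with open(log_path, "w") as log:
        subprocess.run([viewer_exe, "-f", url_file], stdout=log, stderr=log)
```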
A set of perl scripts is used to parse the output of the Viewer application.
These scripts expect the format of the performance data to remain fixed;
if the format changes, the scripts must be updated.
Here are the files involved in parsing the data and generating the HTML
table:
<ul>
<li>
<tt><b>perf.pl</b> : </tt>The main script that orchestrates the running
of Viewer and the invocation of other scripts, and finally copies files
to their correct final locations. An example of an invocation of the perf.pl
script is: '<b><tt><font color="#000000">perl perf.pl</font><font color="#000099">
Daily-0215 s:\mozilla\0215 cpu</font><font color="#000000">'</font></tt></b></li>

<ul>
<li>
<tt><b><font color="#000099">Daily-0215 </font></b><font color="#000000">is
the name of the build and can be anything you like.</font></tt></li>

<li>
<tt><b><font color="#000099">s:\mozilla\0215 </font></b><font color="#000000">is
the location of the build. There must be a bin directory under the directory
you specified, and it must contain the MOZ_PERF-enabled build.</font></tt></li>

<li>
<tt><b><font color="#000099">cpu </font></b><font color="#000000">indicates
that we are timing CPU time. The other option is clock, but that is not
currently functional because of the clock resolution.</font></tt></li>
</ul>

<li>
<b><tt>Header.pl</tt></b> : a simple script that generates the initial
portion of the HTML file that will show the performance data for the current
build.</li>

<li>
<tt><b>AverageTable2.pl</b> </tt>: a slightly more complicated script that
parses the output from Viewer, accumulates data for averaging, and generates
a row in the HTML table initialized by Header.pl. This script <b>must</b>
be modified if the performance data output format changes.</li>

<li>
<tt><b>Footer.pl</b> </tt>: a simple script that inserts the last row in
the HTML table, the averages row. It also terminates the table and ends
the HTML tag.</li>

<li>
<tt><b>GenFromLogs.pl</b> </tt>: a script that generates the HTML table
from already existing logs. This is used to regenerate a table after the
QA Partner script has run, in case the table file is lost or otherwise
needs to be recreated. Also, if old logs are kept, they can be used to
regenerate their corresponding tables.</li>

<li>
<b><tt>Uncombine.pl</tt></b> : a script that breaks up a single text file
containing all of the timing data for all of the sites into a separate
file for each individual site.</li>

<li>
<b><tt>History.pl</tt></b> : a script that generates an HTML file showing
a historical comparison of average performance values for current and previous
builds.</li>
</ul>

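<p>To make the parse-and-average stage concrete, here is a minimal Python sketch
of the kind of work AverageTable2.pl does. The line format
("<tt>Parsing: 0.271 sec</tt>") is invented for illustration; the real MOZ_PERF
output format is whatever Viewer emits, and the scripts are tightly coupled to it.</p>

```python
from collections import defaultdict

def accumulate_times(lines):
    """Accumulate per-phase totals and counts from timing output.

    Assumes lines shaped like "Parsing: 0.271 sec" -- a hypothetical
    format standing in for the real MOZ_PERF console output.
    """
    totals, counts = defaultdict(float), defaultdict(int)
    for line in lines:
        phase, sep, rest = line.partition(":")
        if not sep:
            continue  # no colon: not a timing line
        try:
            seconds = float(rest.split()[0])
        except (IndexError, ValueError):
            continue  # no numeric value after the colon; skip it
        totals[phase.strip()] += seconds
        counts[phase.strip()] += 1
    return totals, counts

def phase_averages(totals, counts):
    """The per-phase averages that become the final (averages) table row."""
    return {phase: totals[phase] / counts[phase] for phase in totals}
```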
<h3>
The URLs</h3>
It is critical that the URLs that we load while measuring performance do
not change. This is because we want to compare performance characteristics
across builds, and if the URLs changed we could not really make valid comparisons.
Also, as URLs change, they exercise different parts of the application,
so we really want a consistent set of pages to measure performance against.
The builds change; the pages do not.
<p>On February 3, 2000 the top 40 sites were 'snaked' using the tool WebSnake.
These sites now reside in disk-files and are loaded from those files during
the load test. The file <tt>40-URL.txt</tt> contains a listing of the file-URLs
created from the web sites. The original web sites should be obvious from
the file-URLs.
<br>&nbsp;
<blockquote><i><b>NOTE</b>: There are some links to external images in
the local websites. These should have been resolved by WebSnake but were
not for some reason. They should be made local at some point so we can
run without a connection to the internet.</i></blockquote>

<h3>
Historical Data and Trending</h3>
Historical data will be gathered and presented to make it easy for those
concerned to see how the relative performance of various parts of the product
changes over time. This historical data is kept in a flat file of comma-delimited
values where each record is indexed by the pull-date/milestone and buildID
(note that the buildID is not always reliable; however, the pull-date/milestone
is provided by the user when the performance package is run, so it can
be made to be unique). The Historical Data and Trending table will show
the averages for Parsing, Content Creation, Frame Creation, Style Resolution,
Reflow, Total Layout and Total Page Load time for each build, along with
a simple bar graph representation of each record's weight relative to the
other records in the table. At a later date this can be extended to trend
individual sites; however, for most purposes the roll-up of overall averages
is sufficient to track the performance trends of the engine.
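<p>The history-file handling described above can be sketched as follows. The
column layout (milestone/pull-date, buildID, then one per-phase average per
column) is an assumption consistent with the description, not the documented
format of history.txt; the bar weights are computed relative to the slowest
record.</p>

```python
def parse_history(text):
    """Parse comma-delimited history records: milestone/pull-date,
    buildID, then per-phase averages (assumed column order)."""
    records = []
    for line in text.splitlines():
        if not line.strip():
            continue
        fields = line.split(",")
        records.append((fields[0], fields[1],
                        [float(f) for f in fields[2:]]))
    return records

def bar_weights(records, phase_index):
    """Each record's weight relative to the slowest record for one phase,
    suitable for a simple bar graph (1.0 = longest time)."""
    values = [r[2][phase_index] for r in records]
    top = max(values)
    return [v / top for v in values]
```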
<h3>
The Execution Plan</h3>
Performance monitoring will be run on a weekly basis, and against all Milestone
builds. The results of the runs will be published for all interested parties
to see. Interested and/or responsible individuals will review the performance
data to raise or lower developer awareness of performance problems and
issues as they arise.
<p>Currently, the results are published weekly at <a href="http://techno/users/attinasi/publish">http://techno/users/attinasi/publish</a>
<h3>
Revision Control and Archiving</h3>
The scripts are checked into cvs in the directory \mozilla\tools\performance\layout.
The history.txt file is also checked in to cvs after every run, as are
the tables produced by the run. Committing the files to cvs is a manual
operation and should be completed only when the data has been analysed
and appears valid. Be sure to:
<ol>
<li>
Commit history.txt after each successful run.</li>

<li>
Add / commit the new table and new trend-table after each successful run
(in the Tables subdirectory).</li>

<li>
Commit any changes to the scripts or this document.</li>
</ol>

<hr WIDTH="100%">
<h3>
History:</h3>

<table BORDER WIDTH="50%" >
<tr>
<td WIDTH="25%">02/04/2000</td>

<td>Created - attinasi</td>
</tr>

<tr>
<td>03/17/2000</td>

<td>Removed QA Partner stuff - no longer used</td>
</tr>
</table>

</body>
</html>
