TullyRunners.com - Article
Evaluating XC Team Strength with Composite / Average Times
by Bill Meylan
What are Composite / Average Times in Cross
Country?? ... Composite Time is the total time of the top five
runners from each team - that is, take the individual times of the top
five runners and add them together ... Average Time is the Composite
Time divided by five (which is simply the average of the five individual
times). Some computerized XC results list both the composite and
average times (such as results from the American XC Festival, among
others).
Most people are more comfortable discussing average time (as opposed to
composite time), so that will be my approach for the remainder of the
article ... The purpose of this article is to familiarize readers with
the use of average time to evaluate team strength ... positive and
negative aspects of average time evaluation will be discussed through
examples ... and some important considerations concerning the proper
application of average time ranking will be noted.
Coaches and serious observers have used average times to compare XC teams for many years ... Some meets (and courses) even keep "average time" records; for example, the Shenedehowa boys (12:58.7) and Saratoga girls (14:57.5) set "average time" records at the 2003 Manhattan Invitational on the Van Cortlandt Park 2.5 mile course. If cross country meets were scored using average times, it would effectively make an XC race similar to a track meet relay (where all relay competitors run at the same time) ... I am not familiar with any XC meet that officially scores the meet solely by average time (or composite time).
Why am I writing an article at this time??
... Two upcoming events prompted me to write this article ... First,
the Nike Team Nationals (NTN) are beginning this year (see
related information at DyeStat) ... According to the
qualifying procedures discussed by Marc Bloom with respect to team
evaluation, "One important pillar of strength is team times, and
since 5 runners score, 5-runner team time averages are analyzed. For
example, a boys team averaging 16:20 for 5,000 meters (5k), or a girls
team averaging 19:20, may be worthy of consideration, depending on
whether the course is flat and fast or hilly and slow." ...
Therefore, average time will be used in ranking teams regionally and
nationally ... Second, I have seen proposals that use average
time as part of the process of selecting the 16 NYSPHSAA teams that get
invited to the NY Federation Meet (note ... just
proposals, nothing definite).
Team Strength Evaluation ... For starters, I am not advocating that "average time" be used to score XC meets ... I like the way XC meets are currently scored by finish-place ... But the speed of the individual runners (and of the team as a whole) is a valid basis for ranking teams ... track leaderboards "rank" individual runners by fastest times - and if the four fastest times from each team are added together, you get a decent idea of which relay team might be faster than another relay team ... by analogy in cross country, add together the times of the top five runners from each team to see which team might be faster.
So, if cross country meets were scored by average time, how would it compare to regular XC scoring?? ... This is a fairly easy concept to test because the results of thousands of meets are readily available through the Internet and other sources ... It is simply a matter of gathering the results and compiling them into comparable statistics (I am just one of many people who have done this over a period of years) ... For the purpose of this article, I have placed some relevant examples on a separate web-page (Comparison of XC Team Scores vs. Team Average Time) - and I will refer to the information on that web-page.
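As a sketch of the comparison, both scoring methods can be computed from the same finish list. The teams, names, and times below are invented for illustration (and real XC scoring also uses 6th/7th runners as displacers and tie-breakers, which this sketch ignores):

```python
# Compare place-based team scoring with average-time scoring
# on a small invented result list (illustration only).

def score_teams(results):
    """results: list of (team, time_seconds), sorted by finish time.
    Returns {team: (place_score, avg_time)} using each team's top five."""
    by_team = {}
    for place, (team, t) in enumerate(results, start=1):
        by_team.setdefault(team, []).append((place, t))
    scores = {}
    for team, finishers in by_team.items():
        top5 = finishers[:5]
        if len(top5) < 5:
            continue  # incomplete teams don't score
        place_score = sum(p for p, _ in top5)       # regular XC score
        avg_time = sum(t for _, t in top5) / 5      # average-time score
        scores[team] = (place_score, avg_time)
    return scores

def fmt(sec):
    return f"{int(sec // 60)}:{sec % 60:04.1f}"

# Invented two-team race: team A has two very fast runners up front
# and a big gap back to its 3rd man (a Tully-like pattern).
results = [("A", 850), ("A", 860), ("B", 980), ("B", 985), ("B", 990),
           ("B", 995), ("B", 1000), ("A", 1060), ("A", 1065), ("A", 1070)]

for team, (score, avg) in sorted(score_teams(results).items(),
                                 key=lambda kv: kv[1][0]):
    print(team, score, fmt(avg))
```

In this made-up race, team B wins the place-score (25 to 30) while team A has the faster average (16:21.0 vs 16:30.0) - exactly the kind of exception discussed below.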
General Conclusion: ... The results of "team scoring" versus "average time" scoring are very similar ... there can be some place-switching when team scores are close, but the average times are usually close as well ... this applies to both "head-to-head" competition and "merged" competition ... So team strength evaluations by team score and team average time are nearly the same (for "head-to-head" and "merged" competitions) ... But one exception needs to be noted...
Exception: ... the 2003 Tully boys team is a perfect example of the exception ... at the 2003 NY Federation Championship, Tully finished 14th by team score but 4th by team average time (see comparison web-page) ... the team places tell the story (1-2-72-135-137) - Lopez Lomong & Dominic Luka finished 1st & 2nd, and the next Tully place was 72nd ... this is similar to a track relay team where two super-fast runners can make up the time for the slower runners (and then some) ... Another Tully example is the Brown University Invitational:
Place  Team                   Score  Average   Team Places
=====  =====================  =====  ========  ===============
  1    Bis Hendricken-Ri       148   16:22.02   5 30 33 36 44
  2    St.Anthony-Ny           153   16:21.24   6 14 37 45 51
  3    Northport-NY            155   16:24.60   7 19 25 32 72
  4    Amherst Regional-Ma     164   16:27.22  10 21 34 42 57
  5    North Kingstown-Ri      174   16:32.62  17 27 35 46 49
  6    Cumberland-Ri           179   16:33.42  18 28 29 39 65
  7    St.John's Prep-Ma       186   16:34.44  11 26 38 41 70
  8    Seton Hall Prep         194   16:33.80   9 23 43 58 61
  9    Tully Central School    200   16:22.86   1  2 56 66 75
At Brown, Tully was 9th by score but a very close 3rd by average (and in that race, Lomong strained a hamstring in the final 800 meters which probably cost him at least 15 seconds in finish time, and Tully would have had the best average time) ... Tully is the extreme example with two runners like Lomong & Luka ... But the comparison web-page has some other examples where teams with one or two exceptionally fast runners (compared to other team members) benefit from "average time" evaluation - for example, North Rockland boys at Federations and the Cicero-North Syracuse girls at the OHSL Championships.
Important Considerations for Using Average Time Team Ranking
All examples on the comparison web-page illustrate teams that raced head-to-head or raced on the same course on the same day ... National or State-wide team ranking requires evaluating performances from different courses, so extending the team "average time" concept across different courses - to compare teams that have not raced each other, or raced on the same course on the same day - is a very useful goal ... but this extension (if real accuracy is necessary) is not a simple process - it requires appropriate application.
Absolute Time ... is the actual time recorded by a stopwatch ... the times on a track leaderboard are absolute times ... Absolute time is appropriate for ranking outdoor track performances because everybody runs the same distance on a flat surface that is similar to other track surfaces (ignoring the weather) ... But using absolute time for ranking cross country performances is totally inappropriate for the vast majority of races for a variety of reasons, including the following:
(1) Actual Length of XC Courses ... the exact length of many XC courses is not known (or available) ... many are rated as a 5K course or a 3-mile course, but in reality, their exact distance varies slightly from the rated distance ... for example, Baldwinsville has both a 3-mile course and a 5K course - the actual distance of the 3-mile course is about 2.97 miles (or about 50 meters short of 3-miles) which would lower a team average time by roughly 10 seconds (compared to a full 3 miles) ... Another example is Saratoga State Park which I've seen rated as both a 3-mile course and a 5K course - in reality, the actual distance of the course normally used for high school races is closer to 3.02 miles.
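The rough size of such a distance error can be checked with a simple pace calculation - scale the runner's average pace by the missing distance (a sketch assuming roughly even pacing; the 16:00 three-mile time is an invented example):

```python
# Estimate how much a short course lowers times: multiply the
# runner's average pace (seconds per meter) by the missing distance.

METERS_PER_MILE = 1609.34

def short_course_saving(race_seconds, rated_meters, actual_meters):
    """Seconds saved per runner when the course runs short of its rating."""
    pace = race_seconds / rated_meters            # seconds per meter
    return pace * (rated_meters - actual_meters)  # seconds over the gap

three_miles = 3 * METERS_PER_MILE                 # ~4828 m
saving = short_course_saving(16 * 60, three_miles, three_miles - 50)
print(f"{saving:.1f} seconds")                    # ~10 seconds for a 16:00 runner
```

Since all five scorers save roughly the same amount, the team average time drops by about the same ~10 seconds, matching the Baldwinsville figure above.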
Some courses are very consistent with regard to their start-finish line positions and distance from year to year ... many courses are not ... I've seen numerous examples where the start or finish line positions change from year to year (construction projects are a common cause) ... seemingly minor changes can affect team average times.
(2) Surfaces of XC Courses ... surface composition can vary from asphalt to grass to wood chips and mud ... even if an XC course is perfectly flat, the surface can make a big difference in time ... so comparing absolute times from two courses of the exact same distance but different surfaces can be very misleading.
(3) Surface Topography ... comparing absolute times from a flat course to a very hilly course is obviously not a good idea.
Bottom-Line ... Due to the variation of XC distances and surfaces, absolute time ranking is practically meaningless if applied on a national or state-wide basis ... So, how can team averages be compared on a national or state-wide basis??...
Adjusting Absolute Times So Comparisons Are More Accurate
The most common method of comparing times from one XC course to another is the application of "comparison tables" developed from data and observation over a period of time ... Here is the example cited by Marc Bloom on DyeStat for the Nike Team Nationals - "In comparing New York and New Jersey teams in the Northeast Region, there is a time difference of about 3:20 between performances at Van Cortlandt (2.5 mile course) and those at New Jersey's Holmdel Park, a hilly 5k where the state meet and other events are held. A top boys team averaging 13:10 at Van Cortlandt equates to about a 16:30 at Holmdel. Years of performance comparisons bear this out."
Over the years, a number of coaches and serious observers have developed course comparison tables that compare the speed of different race courses to each other ... I have my own comparison tables for most of the courses I evaluate (in NY State and elsewhere) - here are a few examples using the SUNY Utica course as a starting point with a 17:00 rating:
SUNY Utica                  17:00
Bowdoin Park                16:50-16:57
Sunken Meadows              16:50-17:00
Westchester CC              17:20-17:25
Saratoga Park               15:45-15:50
Bear Mountain               15:35-15:40
McQuaid                     15:35-15:45
Van Cortlandt Park (2.5)    13:10-13:20
Van Cortlandt Park (3.1)    16:40-16:50
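One way to apply such a table is to treat each course rating as an equivalent time for the same effort, so converting a time between courses just shifts it by the difference in ratings (this is the same additive logic as the 3:20 Van Cortlandt / Holmdel example above). A sketch, using midpoints of the ranges in my table and assuming normal conditions:

```python
# Convert a time from one course to another using a comparison table.
# Each rating is the equivalent time (seconds) for a 17:00-at-Utica
# effort, so the conversion shifts by the difference in ratings.

COURSE_RATING = {
    "SUNY Utica": 17 * 60,                        # 17:00 baseline
    "Bowdoin Park": 16 * 60 + 53.5,               # midpoint of 16:50-16:57
    "Saratoga Park": 15 * 60 + 47.5,              # midpoint of 15:45-15:50
    "Van Cortlandt Park (2.5)": 13 * 60 + 15,     # midpoint of 13:10-13:20
}

def convert(time_seconds, from_course, to_course):
    """Equivalent time on to_course for a performance run on from_course."""
    return time_seconds + COURSE_RATING[to_course] - COURSE_RATING[from_course]

def fmt(sec):
    return f"{int(sec // 60)}:{sec % 60:04.1f}"

# A 16:00 at Saratoga Park expressed as a SUNY Utica time:
print(fmt(convert(16 * 60, "Saratoga Park", "SUNY Utica")))   # 17:12.5
```

This is only as good as the table itself - the limitations below (weather, course changes) apply to any such conversion.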
Limitations of Course Comparison Tables
My comparison tables (and anybody else's) are only approximations ... they typically assume normal/good running conditions ... and you must assume the course distance doesn't change from year to year ... Factor in the potential effect of weather and other variables (such as soft ground, newly paved trails, etc.) and the accuracy can become suspect.
As noted in previous articles, I rarely use comparison tables when calculating my speed ratings because, at times, they are not accurate enough ... Here is an obvious example - consider the difference at Sunken Meadows on a nice-weather day versus a day with high winds and rain - times can easily be 15-30 seconds slower on average ... direct application of the comparison table (without correcting for the weather variable) will give very misleading team average times. Another example is the NY State Class Meet (at Marcus Whitman) ... compared to the earlier invitational at Marcus Whitman, the times were much slower at States due to the very cold, windy conditions (I was turning into a popsicle after 4 hours at the finish line) - but conditions improved near the end of the meet, and the course speed picked up accordingly.
Whenever possible, I calculate course speed for that particular day (and sometimes from race to race) because significant variations do occur ... often, my calculations fall within the range on the comparison tables, and that's actually reassuring.
How will the Nike Team Nationals calculate average times on a national and regional basis?? ... I don't know because I'm not part of the process ... Reading the DyeStat article, it appears comparison tables will be used to compare courses when possible ... other than that, it appears races will be separated by distance (such as 5K or 3 miles) and further separated into categories such as flat and hilly ... they have a conversion time for the difference between a 5K and a 3-mile course ... Overall, I think many team average times will have questionable accuracy ... However, I wonder how important those average times are in the final ranking of teams? (maybe it's a minor factor)
In a follow-up article concerning the Saratoga girls team (plus other NY teams and some nationally ranked teams), I illustrate the use of average times in evaluating team strength ... and demonstrate the exceptional strength of Saratoga ... The article also demonstrates how absolute times can be used.
(Because I was asked) - Example use of speed ratings in calculating team average times ... below are my overall speed ratings for the top five Saratoga girls after the State Class meet ... Remember, a speed rating is just a 5K race time converted to a number ... the average rating is 142, which equals an 18:54 time at SUNY Utica ... one point equals 3 seconds - from below, Nicole Blood is rated 10 points higher than Lindsey Ferguson, which equals 30 seconds, so Blood had been running about 30 seconds faster in races than Ferguson ... Note: that average time is not an absolute time, it's an adjusted time.
Nicole Blood        10  Saratoga  157
Lindsey Ferguson    10  Saratoga  147
Ruby Solomon        11  Saratoga  142
Cameron Vahanian    10  Saratoga  137
Karyn Delay         11  Saratoga  128
                              ------
                    AVERAGE =    142

speed rating of 142 = 18:54 at SUNY Utica
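From the two facts above - one rating point equals 3 seconds, and a 142 rating equals 18:54 at SUNY Utica - the rating-to-time conversion can be sketched as a simple linear formula. This is my reconstruction from those two anchor values, not necessarily the exact formula used for the ratings:

```python
# Convert a speed rating to an equivalent SUNY Utica 5K time,
# anchored on the stated facts: rating 142 <-> 18:54, 1 point = 3 s.

ANCHOR_RATING = 142
ANCHOR_SECONDS = 18 * 60 + 54    # 18:54
SECONDS_PER_POINT = 3

def rating_to_time(rating):
    """Higher rating = faster; each point is worth 3 seconds."""
    return ANCHOR_SECONDS + SECONDS_PER_POINT * (ANCHOR_RATING - rating)

def fmt(sec):
    return f"{sec // 60}:{sec % 60:02d}"

ratings = [157, 147, 142, 137, 128]      # the five Saratoga girls above
print(sum(ratings) / len(ratings))       # 142.2, which rounds to the 142 average
print(fmt(rating_to_time(142)))          # 18:54, matching the table
print(fmt(rating_to_time(157)))          # 18:09 - 30 seconds faster than a 147
```

Note that the formula runs backwards just as easily: a race time on any rated course can be turned into a rating, which is what makes the ratings comparable across courses.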