Thursday, August 27, 2009

Schedules – The Decade’s Toughest

As I was compiling my statistics-based list of the best teams of the decade (so far) and thinking about how to rank the teams overall, strength of schedule naturally came to mind.

I use a strength of schedule component in my own Power Rankings each year, and I choose to use the NCAA’s system (last year’s results here).

I prefer the NCAA’s method of schedule ranking for its transparency, lack of bias and simplicity. I also think that, for the most part, it is an accurate representation of overall schedule strength. “Black box” systems like Sagarin’s don’t reveal how they calculate SOS and seem (to my eyes at least) to favor certain conferences without adequately explaining why.

The NCAA method merely looks at the win-loss records of your opponents. As such, mid-major teams tend to fare poorly, as they play in conferences against teams that tend to have a lot of losses. Take last year’s final NCAA strength of schedule ratings, where the top 20 slots went to major-conference teams. The highest mid-major was TCU at 22nd, and TCU just happened to play Oklahoma, BYU, Utah and Boise State this past season. Moreover, the bottom slots are all filled with mid-major conference teams. Oklahoma, which faced a tough Big 12 schedule as well as Florida in the BCS title game, had the toughest overall NCAA schedule, a verdict with which I firmly agree (Florida was 2nd).
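To make the method concrete, here is a minimal sketch of the calculation as I understand it: pool every opponent’s wins and losses and take the combined winning percentage. The team names and records below are made up for illustration; this is not the NCAA’s actual code.

```python
# Minimal sketch of an NCAA-style SOS: the combined win-loss
# percentage of a team's opponents. Team names and records below
# are illustrative, not real data.

def strength_of_schedule(team, schedules, records):
    """Return opponents' combined wins / (wins + losses)."""
    wins = losses = 0
    for opponent in schedules[team]:
        w, l = records[opponent]
        wins += w
        losses += l
    return wins / (wins + losses)

# Toy example: team "X" played opponents "A", "B" and "C".
records = {"A": (10, 2), "B": (8, 4), "C": (3, 9)}
schedules = {"X": ["A", "B", "C"]}
print(round(strength_of_schedule("X", schedules, records), 4))  # 0.5833
```

Note that pooling all games this way (rather than averaging each opponent’s percentage) weights opponents who played more games slightly more heavily.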

So which teams have faced the toughest schedules over the past decade (from 2000)? I have compiled the records of all 1,054 teams, and the toughest schedules were (with their opponents’ win-loss percentage) –

1-2005 Oklahoma 0.6885
2-2004 Texas A&M 0.6719
3-2006 Florida 0.6711
4-2002 Southern California 0.6622
5-2003 Alabama 0.6622
6-2008 Oklahoma 0.6552
7-2004 North Carolina 0.6549
8-2001 California 0.6514
9-2003 Florida 0.6496
10-2003 Notre Dame 0.6449


The 2005 Oklahoma team comes out at the top of our list with a group of opponents that won nearly 69% of their games, going a combined 84-38. Oklahoma played 6 teams that season that ended up ranked in the final poll – Nebraska (24th), Texas Tech (19th), UCLA (13th), Oregon (12th), TCU (9th) and BCS Champion Texas – and finished the season 8-4.

Oklahoma and Florida are the only teams to appear twice in the top 10, and the 2006 Gators are the only team to win the BCS title with a schedule this tough.

Since we are a Gator blog let’s look at the two UF teams that make an appearance here –

2003 Florida

Poor Ron Zook and his 8-5 Gators in 2003. They faced (and lost to) these teams ranked in the final poll – Tennessee (16th), Ole Miss (14th), FSU (10th), Miami (12th) and Iowa (8th). You can say what you want about 2003, but Florida lost only to ranked teams. And the other two ranked teams UF played, Georgia (6th) and LSU (1st and BCS Champs), the fighting Zooks managed to beat! That’s 7 ranked teams in one season – and one really hard schedule.

2006 Florida

On their way to the BCS title Florida faced a murderer’s row of teams for the toughest schedule that year, including (with final rankings) –

At 9-4 Tennessee (23rd)
11-2 LSU (3rd)
At 11-2 Auburn (8th)
9-4 Georgia
10-4 Arkansas (16th)
12-1 Ohio State (2nd)

The Gators, who naturally wound up number 1, played the teams that finished 2nd and 3rd in the final rankings.


This leads us to the final step in this project – the top 10 teams of the decade (so far) using statistical analysis. Since I have all the data I use for my Power Poll for every team this decade, I’m going to use the same formula to reveal the top teams of the decade. In doing so, whether a team won the BCS title is irrelevant, and some teams may have more than 1 loss. Also, with a strength of schedule component, perhaps some teams have been overlooked for their greatness – like the 2006 Gators (hint).

Next: The Decade’s (thus far) top teams.

4 comments:

Clark said...

Strength of schedule is a very tough thing to measure, which is why so many people have such a convoluted way of doing it. While your method is certainly simple and transparent, it has drawbacks of its own.

1: I hope D-1AA opponents are excluded from the strength of schedule. If every team in a conference plays a D-1AA opponent (and wins), then every other team in the conference picks up 8 easy wins for their strength of schedule calculation, which are really pretty meaningless. (Specific example: LSU's SoS gets a boost because Florida beat up on the Citadel rather than playing a real opponent.)

2: Non-BCS teams fare poorly in this calculation because they have losses to top teams. Utah State has a pretty bad football program. When scheduling non-conference games, they could schedule teams like North Texas and have a decent shot at beating them. I'd pick USU over UNT in 6 games out of 10. But that doesn't really do anything for USU. Instead, they look to schedule teams like OK who will pay them big bucks to fall down, essentially. This decision to go for money over wins ends up hurting everyone else in the conference, to the tune of 0.6 wins per season. And if multiple teams are doing that in the conference, that starts to add up.

3: I'm just thinking off the top of my head, but I'd like to see a SOS system that goes one layer deeper and looks at the record of the opponents of a team's opponents. Yes, it starts to get downright recursive in conference play, but I'd be curious to see how it works out. Because we all know that beating an undefeated Hawaii team from 2006(?) is less impressive than defeating a one-loss SEC champion. This system would drop teams down that are beating up on teams with good records against cream puffs (see: non-conference schedules of significant portions of Big12, Big10, etc.) and would reward conferences which are on the whole attempting to work their way up in the world by taking on challenging non-conference opponents.
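[Editor's note: the two-level idea in point 3 can be sketched in a few lines. The 2/3–1/3 weighting below and all of the team data are arbitrary illustrative choices, not anything the NCAA or this blog actually uses.]

```python
# Sketch of a two-level SOS: blend the opponents' win percentage
# with the opponents' opponents' win percentage. The weighting and
# all records below are illustrative assumptions only.

def win_pct(team, records):
    w, l = records[team]
    return w / (w + l)

def opp_win_pct(team, schedules, records):
    """Average win percentage of a team's opponents."""
    opponents = schedules[team]
    return sum(win_pct(o, records) for o in opponents) / len(opponents)

def two_level_sos(team, schedules, records, weight=2/3):
    level1 = opp_win_pct(team, schedules, records)      # opponents
    level2 = (sum(opp_win_pct(o, schedules, records)    # opponents'
                  for o in schedules[team])             # opponents
              / len(schedules[team]))
    return weight * level1 + (1 - weight) * level2

# Toy example: "X" played "A" and "B", who also played "C".
records = {"A": (2, 1), "B": (1, 2), "C": (3, 0), "X": (2, 1)}
schedules = {"X": ["A", "B"], "A": ["B", "C"], "B": ["A", "C"]}
print(round(two_level_sos("X", schedules, records), 4))  # 0.5833
```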

Anonymous said...

I wonder if part of the reason the 2006 Gator opponents ended up ranked so highly is that Florida barely beat so many of them.

The 2008 team crushed good teams, and thus those opponents ended up dropping further than they would have after a close loss.

How about some fun statistics with the impact on weekly ranking of the two years?

Mergz said...

Clark - I agree it has drawbacks, as every system must. But to your points -

1. It only includes games from teams in FBS (the former Division I-A).

2. The non-BCS conference teams end up with bad SOS for the reasons you cite. The typical mid-major plays 3 non-conference games against major conference teams for money, and loses those games. As a result every team in their own conference has 3 losses from non conference, then whatever conference losses it suffers. But in my opinion that is the way it should be, as they are, by game example, clearly inferior to the big teams.

3. So would I.

Clara - I think they were highly ranked mostly by their records, not the close losses. The teams Florida played in '06 were a cumulative 100-49, the third best opponents' winning percentage of the decade. By comparison, the 2nd toughest '06 schedule, Michigan's, was against teams with a cumulative 9 fewer wins, or almost 1 fewer win per game played.

Andy said...

I really enjoy the statistical analysis done on this site and the various approaches taken when doing the analysis. Keep up the good work and go Gators!