Week 12 Power Rankings
Our top 25 Power Rankings using the methodology found here.
Rank  Team            W   L   Rating
1     Alabama         11  0   2.1913
2     Florida         11  0   2.1673
3     TCU             11  0   2.0969
4     Texas           11  0   2.0249
5     Virginia Tech   8   3   1.8242
6     Cincinnati      10  0   1.8185
7     Ohio St.        10  2   1.8025
8     Oregon          9   2   1.7630
9     Penn St.        10  2   1.7629
10    North Carolina  8   3   1.6541
11    Oklahoma        6   5   1.6375
12    Clemson         8   3   1.6324
13    Oklahoma St.    9   2   1.6138
14    Arkansas        7   4   1.6118
15    Miami (FL)      8   3   1.5859
16    Iowa            10  2   1.5825
17    Boise St.       11  0   1.5639
18    Georgia Tech    10  1   1.5632
19    Pittsburgh      9   1   1.5463
20    Texas Tech      7   4   1.5426
21    Mississippi     8   3   1.5220
22    Nebraska        8   3   1.5165
23    Arizona         6   4   1.5101
24    Wisconsin       8   3   1.5048
25    South Fla.      7   3   1.4339
Alabama takes the number 1 slot from Florida, but the margin is very narrow. By this poll the Tide are 1.1% “better” than Florida at a neutral site.
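The 1.1% figure follows directly from the two ratings in the table above. A quick check (just arithmetic on the published numbers, not part of the ranking system itself):

```python
# Ratings from the Week 12 table above.
alabama = 2.1913
florida = 2.1673

# Relative margin: how much "better" the Tide rate than the Gators.
margin = alabama / florida - 1
print(f"{margin:.1%}")  # prints "1.1%"
```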
Oddities include Virginia Tech in 5th, North Carolina in 10th, and most especially 5-loss Oklahoma in 11th. The Sooners just came off a total thumping by the team ranked 20th here (Texas Tech), at Tech. (The poll does indicate the Red Raiders would have only a slight edge over Oklahoma at home.) The reasons they continue to rank so high are –
28th ranked Offense
8th ranked Defense
10th ranked Schedule
I admit it is driving me nuts, but this is the system that has seemed to work so well in the past, and I’m not going to manipulate the data to get the results I want. Maybe it will sort out a little better in the end.
The rest of the poll passes the “eye test” for me. Other notables’ ranks –
28. LSU
36. USC
39. FSU
40. Tennessee
42. Georgia
48. Notre Dame
50. South Carolina
UPDATE: As noted, I’ve been questioning my methodology that resulted in ranking Oklahoma 11th.
Perhaps not anymore.
Jeff Sagarin, whose computer rankings feed into the BCS, ranks the Sooners 7th in his “predictor” rankings, which he considers his most accurate measure of team quality. (The Sooners are 30th in the rankings he submits to the BCS.)
Apparently Sagarin’s method sees something in Oklahoma too.
6 comments:
Doesn't pass our eye test at all. Not close.
The point is not "does it pass an eye test," but how does one account for the apparent disparity?
Perhaps the methodology is flawed, or perhaps there is some other explanation.
Floridian,
I'm not sure there is a disparity. When you are talking about a team's overall "goodness" (quality?), one can easily imagine a scenario where the "better" team, through bad luck or whatever, has more losses than a team that is not as good.
In last year's final power rankings there was a team with 5 losses ranked in the top 25 (Clemson at 22), but that was after all the games, and certainly nowhere near where Oklahoma comes in at 11th.
Maybe it is correct, maybe Oklahoma is the 11th "best" team in the land, but it doesn't feel correct. Maybe I need to tweak the methodology, but that makes me somewhat uncomfortable.
As far as "tweaks" go, do I make SOS count less (that would send Oklahoma down), or make losses count more (which would do the same)? Should I be making adjustments because one team makes me uncomfortable?
Rather than making SOS count less, I would suggest making multiple losses count off more. For example, deduct an additional .1 for each loss after the first. That way a team that loses one game doesn't get dropped too far, but a team with multiple losses, unlucky or not, won't stay at the top of the pile.
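The suggested tweak is easy to sketch. This is only an illustration of the commenter's idea applied to the ratings in the Week 12 table; the function name and the penalty size are the commenter's proposal, not the actual ranking code:

```python
def adjusted_rating(rating, losses):
    """Hypothetical tweak: deduct an extra 0.1 for each loss after the first."""
    penalty = 0.1 * max(losses - 1, 0)
    return rating - penalty

# Ratings and loss counts from the Week 12 table above.
oklahoma = adjusted_rating(1.6375, 5)  # 4 extra losses -> -0.4, drops to ~1.2375
boise_st = adjusted_rating(1.5639, 0)  # undefeated, unchanged at 1.5639
```

Under this tweak a one-loss team is untouched, while 5-loss Oklahoma would fall well out of the top 25 and undefeated teams like Boise St. would be unaffected.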
The specific problem with Oklahoma is that they haven't beaten anyone with a pulse, and their relatively high offensive ranking comes solely from beating up on the little sisters of the poor. Take out Idaho State and Tulsa (I realize your system isn't set up to do that), and their scoring offense rank would take care of the other issue.
Robert -
Interesting thought. I'm going to tinker with it, with a bit of caution (I fear it might over-elevate the Boise States of the world).