You can now load past years and either click through real results one game at a time, or fill in the whole year with Reset and tweak key results to see what changes.
- Load a past season: FIXTURE ➡️ Allow tipping of past games.
- Progress one real result at a time: keep tapping ACTUAL.
- Fill in the whole season: RESET ➡️ RESET TO ACTUAL.
First, the headlines: Geelong had the easiest fixture, GWS the hardest. But before we go any further, an important disclaimer: the Cats were so comfortably in front of everyone else that not even the league’s hardest fixture would have kept them from the minor premiership.
Home advantage is important, but not that important. If home advantage were as important as people say, the left column below would be all wins and the right column all losses:
Games Won & Lost in 2022 (incl finals)

| # | Team | With Significant Home Advantage | Neutral-ish | With Significant Away Disadvantage |
|---|------|---------------------------------|-------------|------------------------------------|
| 1 | Geelong | WWWWLWWWWWW | WWWW | WWWLLWWWWL |
| 2 | Sydney | LWWWWWLWWWW | WWW | WLWLLWLWWWL |
| 3 | Brisbane | WWWLLWWWWWWW | W | LWWLWLWWWLLL |
| 4 | Collingwood | WLWWWWLW | WWWWLLWWWWL | LLWWWL |
| 5 | Fremantle | LWWLWLLWWWW | WW | WLLWWWDLWWW |
| 6 | Melbourne | WLLLWL | WLWWWWLLWWW | WWWLWWW |
| 7 | Richmond | WWWWDW | WWLWWLWLLWL | LLWLWL |
| 8 | Bulldogs | WLWWLL | WWLWWWLW | WWLLWLLLL |
| 9 | Carlton | WWWWLW | WLWLWWWLLL | WLWLLL |
| 10 | St Kilda | WLLLW | LLWLWWWLLW | LWLWWLW |
| 11 | Port Adelaide | LLWWWWLWWL | WLWL | LLLLLWLW |
| 12 | Gold Coast | LLWWWLWW | LWW | LLLLLLWLLWW |
| 13 | Hawthorn | WWWWLWW | LLLLLLLLLLLW | LWL |
| 14 | Adelaide | LLLLWLWLWW | WL | LLLWWLLLLW |
| 15 | Essendon | WWWLLL | LWWLLLLLLLW | LLLLW |
| 16 | GWS | LLLLLLWWWW | LL | LLLLLLWWLL |
| 17 | West Coast | LLLLWLLLLL | LL | LWLLLLLLLL |
| 18 | North Melbourne | LLLLLW | LLLLLLLLW | LLLLLLL |
There is a bias there – home advantage is worth something – but it’s not a guaranteed ride to the Top Eight, or even a single extra win. You still actually have to be a good team.
(In the above table, “Significant Home Advantage” means games between interstate teams at a home ground, Geelong playing anyone at Kardinia Park, and Hawthorn or North dragging anyone off to Tassie.)
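If you want to put a number on that bias, here's a quick tally of the table above. This is just a rough sketch in Python; note that draws ("D") count in the denominator:

```python
# W/L strings copied from the table above; columns are (significant home
# advantage, neutral-ish, significant away disadvantage). "D" is a draw.
RESULTS_2022 = {
    "Geelong":         ("WWWWLWWWWWW",  "WWWW",         "WWWLLWWWWL"),
    "Sydney":          ("LWWWWWLWWWW",  "WWW",          "WLWLLWLWWWL"),
    "Brisbane":        ("WWWLLWWWWWWW", "W",            "LWWLWLWWWLLL"),
    "Collingwood":     ("WLWWWWLW",     "WWWWLLWWWWL",  "LLWWWL"),
    "Fremantle":       ("LWWLWLLWWWW",  "WW",           "WLLWWWDLWWW"),
    "Melbourne":       ("WLLLWL",       "WLWWWWLLWWW",  "WWWLWWW"),
    "Richmond":        ("WWWWDW",       "WWLWWLWLLWL",  "LLWLWL"),
    "Bulldogs":        ("WLWWLL",       "WWLWWWLW",     "WWLLWLLLL"),
    "Carlton":         ("WWWWLW",       "WLWLWWWLLL",   "WLWLLL"),
    "St Kilda":        ("WLLLW",        "LLWLWWWLLW",   "LWLWWLW"),
    "Port Adelaide":   ("LLWWWWLWWL",   "WLWL",         "LLLLLWLW"),
    "Gold Coast":      ("LLWWWLWW",     "LWW",          "LLLLLLWLLWW"),
    "Hawthorn":        ("WWWWLWW",      "LLLLLLLLLLLW", "LWL"),
    "Adelaide":        ("LLLLWLWLWW",   "WL",           "LLLWWLLLLW"),
    "Essendon":        ("WWWLLL",       "LWWLLLLLLLW",  "LLLLW"),
    "GWS":             ("LLLLLLWWWW",   "LL",           "LLLLLLWWLL"),
    "West Coast":      ("LLLLWLLLLL",   "LL",           "LWLLLLLLLL"),
    "North Melbourne": ("LLLLLW",       "LLLLLLLLW",    "LLLLLLL"),
}

for col, label in enumerate(("home advantage", "neutral-ish", "away disadvantage")):
    games = "".join(row[col] for row in RESULTS_2022.values())
    print(f"{label}: {games.count('W')} wins from {len(games)} games")
```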
Of course, there are different degrees of home advantage. In Round 19 alone, we had:
- West Coast vs St Kilda @ Perth Stadium (WA) – an interstate game with fervent crowd support for the home team. That's about as extreme a case as you'll find, worth 13.3 pts by Squiggle's model, which is calibrated largely to the level of crowd support.
- Carlton vs GWS @ Docklands (Victoria) – an interstate game with good home crowd support, at a venue the away team visits fairly often. That's 7.9 pts.
- Brisbane vs Gold Coast @ the Gabba (Queensland) – two teams with smaller fan bases from the same state, at one's home ground. 2.8 pts.
- Collingwood vs Essendon @ MCG (Victoria) – an extremely well-supported team hosting a very well-supported team at the Magpies' home ground. 2.6 pts.
- North Melbourne vs Hawthorn @ Bellerive Oval (Tasmania) – two teams in their secondary state, at a ground the Kangaroos play more often. 2.0 pts.
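To give a sense of how those degrees might feed a prediction, here's a toy margin formula. The scenario names and the rating-gap structure are mine for illustration; the point values are just the Round 19 figures above, not Squiggle's full crowd-based calibration:

```python
# Hypothetical scenario buckets; the values are the Round 19 examples
# above, not Squiggle's actual (more granular, crowd-based) calibration.
HOME_ADVANTAGE_PTS = {
    "interstate_fortress": 13.3,       # West Coast v St Kilda @ Perth Stadium
    "interstate_familiar_venue": 7.9,  # Carlton v GWS @ Docklands
    "same_state_small_fanbases": 2.8,  # Brisbane v Gold Coast @ the Gabba
    "shared_home_ground": 2.6,         # Collingwood v Essendon @ MCG
    "secondary_state": 2.0,            # North Melbourne v Hawthorn @ Bellerive
}

def predicted_margin(home_rating: float, away_rating: float, scenario: str) -> float:
    """Toy expected margin: rating gap plus a scenario-based home-ground term."""
    return home_rating - away_rating + HOME_ADVANTAGE_PTS[scenario]

# e.g. a 5-point-better team hosting in an interstate fortress:
# predicted_margin(10.0, 5.0, "interstate_fortress") -> 18.3
```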
There’s a real hodge-podge of scenarios, which over the season shake out a bit like this:
Don’t stare at that too long, though; there’s not much to be gleaned from it. The Squiggle model considers Collingwood and Richmond to enjoy many games of mild home advantage, by virtue of their large crowds at MCG games. The South Australian & West Australian teams usually have 10 games of extreme home advantage but fewer games of extreme disadvantage, as they revisit the same venues repeatedly (especially Docklands). NSW and Queensland teams essentially never create the same level of home advantage as the rest of the league, due to their lack of fan-filled cauldrons. And the Cats have a cauldron as well as warm fan support at many of their away games, which is a pretty handy setup.
Let’s now throw in Opposition Strength, because that’s the other big piece of the puzzle. As you know, each year the AFL carefully divides the previous year’s ladder into blocks of 6 teams, and assigns double-up games based on an equalisation strategy, so that weaker teams receive gentler match-ups.
Ha ha! We know that never works, since it only takes a couple of teams to shoot up or down the ladder to throw the whole thing out. But it may never have worked worse than this year, with Geelong, the eventual premier (and last year’s preliminary finalist), receiving quite gentle double-up games, while back-to-back wooden spooners North Melbourne faced a much sterner test. To some extent this happens because teams can’t play themselves – you can’t fixture the wooden spooner against the wooden spooner – but still, things have not gone well when the premier gets double-up games against the bottom two teams (4 wins between them), while the bottom team faces both Grand Finalists (34 wins between them).
Overall, Adelaide did well out of the 2022 fixture – which, as a bottom-6 team, was at least to plan. Gold Coast, also lowly ranked in 2021, received a terrific set of double-up games, but lost it all to home advantage, as they hosted interstate teams at Carrara only 8 times while flying out 10 times themselves – and not just to familiar Docklands; the Suns were dispatched to every state plus the Northern Territory (twice), and even country Victoria.
St Kilda had terrible everything, as usual; St Kilda always have a terrible fixture, to the point where I’m starting to think it must be written into the AFL constitution. They hosted just 4 interstate teams (at Docklands, which their opponents visit often) while taking 6 interstate trips, including two to Perth, plus a bus to Kardinia. Their five double-up games – which should have been mild, as a middle-6 team – included both Grand Finalists, a Preliminary Finalist, and a Semi-Finalist. This combination of bad luck and bad design is very St Kilda, as was the Round 7 home game the Saints sold to play in Cairns and subsequently lost by a single point: a rare sighting of the case where a team’s unfair fixture really did cost them the match.
GWS also had four finalists in its five double-up games, and its fifth opponent was Carlton, who missed finals by a point. That’s enough for the Giants to take the booby prize for the worst set of match-ups.
Geelong’s bounty, while appreciated, I’m sure, was mostly wasted, since they finished two wins and percentage clear on top of the ladder, and were decidedly the best team in finals as well as the second half of the year in general (after Melbourne’s slide). It’s unlikely their fixture affected anything, and the Cats almost had a case for being dudded, escaping by 3 points against the Tigers in a home game played at the MCG, and by a goal against Collingwood in a home final at the same venue.
The 2023 AFL fixture will be released in the near future, and I have some thoughts. Chief among them: we are not actually achieving much equalisation when we focus on the 6-6-6 system – which is obviously flawed and often produces the opposite effect – while ignoring systemic, completely predictable imbalances, such as:
- Poor teams sell home games.
- Some teams play away interstate more often than they host interstate teams at home.
- Some teams have many more games at the Grand Final ground – which doesn’t matter if you don’t make it, but can matter quite a lot if you do.
- Teams with smaller fan bases generate less home advantage.
- Geelong generate more home advantage playing any Melbourne-based team at Kardinia Park than they give away in the reverse match-up.
- Some teams have home games shifted to their opponents’ home ground.
To be fair, the fixture-makers do seem to be aware of most of the above, and I think they make some effort to prevent any of them from becoming too egregious. But the priority is clearly the double-up games, which are the least predictable part of the equation. The result is that Docklands teams – especially St Kilda! – are almost guaranteed a bottom-4 fixture every year.
And maybe we can’t fix that; maybe the world isn’t ready for a system that hands kinder fixtures to poor teams with smaller fan bases. But it should be part of the conversation. Today, any talk of fixture fairness quickly shifts to how many times each team should play each other, and stops there, as if that’s the whole problem. It’s not: a 17-round fixture (or 34 rounds) won’t stop teams selling games, or being shifted to the MCG to face Richmond and Collingwood, or being sent to country Victoria; or, for that matter, being lucky enough to play a team when they have a bunch of outs versus when they don’t.
It’s a grab-bag of factors, and there’s no way to smooth them all out. Teams will inevitably have good fixtures and bad fixtures. But we can do better if we don’t rest the whole thing on 6-6-6 and the clearly wrong assumption that next year’s ladder will look just the same as today’s.
Oh sure, now, everyone looks back on the preseason ladders and mocks how wrong they were. “Essendon to make finals,” they say, shaking their heads. “Not even close.”
But no-one was close, of course; everyone’s ladder has a howler or two. If you picked Essendon to fall, you probably didn’t also pick Collingwood to rise, or Port Adelaide to miss.
That doesn’t mean they’re all equally bad, though. Here at Squiggle, we value the signal in the noise, even if there’s still a lot of noise. And ladder predictions that were less wrong than everyone else’s are to be celebrated.
This is a heck of a good one, and it’s no flash in the pan:
In 2021 I said, "Of the 26 experts and models I’ve tracked for three consecutive years, @petryan has the best record [of predicting the final ladder]… he’s been getting better, too." This year Peter's was the most accurate out of 45 experts & models. https://t.co/rKPYyuPGam
Ryan’s ladder managed to get 7/8 finalists, which is fantastic given that three of them finished last year in 11th, 12th, and 17th. (His tip of Fremantle for 6th — a single rung too low — was especially good.) Like everyone else, he missed Collingwood, but correctly foresaw exits by Port Adelaide, Essendon and GWS. He also resisted the popular urge to push Geelong down the ladder, and wisely slotted the Eagles into the bottom 4.
Damian Barrett also registered a good ladder this year, with 6/8 finalists and three teams in the exact right spot. There was a fair gap from these two to Jake Niall in third.
Squiggle nudged out other models with some optimism on Sydney and pessimism on Port Adelaide, but not enough of the former on Collingwood and not enough of the latter on GWS and the Bulldogs.
Not everyone publishes a ladder prediction every year — it’s a little shocking how frequently journalists come and go from the industry — so although I always have a bag of 40 or 50 experts and models to rank, only half appear in all four of the years I’ve been doing this. Of those, Peter Ryan has the best record, finishing 19th (out of 45), 9th (/56), 3rd (/42) and 1st (/45). That’s an average rank of 8th, making him the only one to outperform Squiggle over the same period.
Squiggle pipped AFLalytics and Wheelo Ratings on the Ladder Scoreboard this year, mostly thanks to some solid returns in the early rounds.
Throughout the year — but especially early — the teams the models overrated most were GWS and Hawthorn, while they underrated Collingwood and Fremantle.
So I’m not super familiar with Discord, but Elo Predicts! is, and as a result we have a Squiggle Discord server. What will it be used to discuss? I don’t know. But here is an invite link: https://discord.gg/2ac6SBRnDD
Last week, in the Squiggle models group chat – of course there’s a group chat – Rory had a good idea:
Rory’s idea
It turned out that everybody had data on hand for this, because if you have a model, you also have a rating system. So I began collecting the ratings, and now there’s a page to view them.
There’s also a widget here on the site, to the right of this post, or else above it.
On the main page, you can see how ratings change over time, and compare ratings from different models.
Power Rankings measure team strength at a point in time. They ignore the fixture, home ground advantage, and all the other factors that go into predicting the outcome of a match or a season. Instead, they’re a simple answer to Rory’s question: Which teams are actually good?
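As a minimal sketch of the idea (not any particular model's actual method), an Elo-style update ignores venue and fixture entirely and moves a team's rating on results alone:

```python
def elo_update(rating_a: float, rating_b: float, a_result: float,
               k: float = 32.0) -> tuple[float, float]:
    """One Elo-style update with no fixture, venue or availability terms.
    a_result is 1.0 if team A won, 0.5 for a draw, 0.0 for a loss.
    (k = 32.0 is a hypothetical learning rate, not a tuned value.)"""
    expected_a = 1 / (1 + 10 ** ((rating_b - rating_a) / 400))
    change = k * (a_result - expected_a)
    return rating_a + change, rating_b - change
```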
I enjoy a useless AFL stat as much as the next person, but this kind of thing tests me:
“Curse” is a bit of a tell in footy. It usually means “coincidence.” If it were a real effect, we’d have a decent theory about why. People love to invent theories. There’s no effect we won’t try to pair with a cause, no matter how thin the evidence. So when there’s an effect and no cause, I tend to doubt it’s due to the spooky unseen hand of an unnamed force.
Usually a “curse” is an odd stat that, at first glance, seems like it can’t be the result of random chance, but that’s only because we don’t understand randomness. Our gut tells us that flipping five heads in a row is basically impossible, for example, when in fact true randomness tends to contain a lot more natural variation than people think. (You can flip ten heads in a row, if you’re willing to toss coins for a few hours, and people will think you’re a magician.)
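If the ten-heads claim sounds far-fetched, you can check it: the expected wait for a run of ten with a fair coin is 2^11 - 2 = 2,046 tosses, which at a couple of seconds per toss really is an hour or two. A quick simulation:

```python
import random

def tosses_until_run(n: int) -> int:
    """Toss a fair coin until we've seen n heads in a row."""
    tosses = run = 0
    while run < n:
        tosses += 1
        run = run + 1 if random.random() < 0.5 else 0
    return tosses

# Averages around 2**(n + 1) - 2, i.e. about 2,046 tosses for a run of ten.
print(sum(tosses_until_run(10) for _ in range(1_000)) / 1_000)
```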
Here’s the 0-2 stat:
I have a few problems with this.
First, I have to point out it’s technically wrong, because we’ve had nine finalists from 0-2, counting Carlton in 2013, who were elevated from ninth after Essendon’s disqualification.
But more importantly, the underlying effect sounds suspiciously like “It’s harder to make finals if you lose games.” And we knew that already. Is there anything magical about the first two games? Because if not, it’s just saying that dropping games hurts your finals chances.
Then there are two cherry-picked parameters: the starting point (2010), and the number of games (2). If there’s a genuinely interesting effect here, and not a coincidence, we should expect to see not-quite-as-dramatic-but-still-suggestive numbers when those key numbers are varied a little.
Instead, it vanishes pretty abruptly. If you look at a longer time period, you see about 20% of 0-2 teams making finals, and if you look at 0-1 or 0-3 or 0-4 teams, the numbers again are about what you’d expect: about one-third of 0-1 teams make it, about one-in-ten 0-3 teams, and only Sydney 2017 has made it from 0-4 this century. So the more games you lose, the harder it is to make finals, in a steady and predictable way.
Because what actually happened here – the whole reason this stat became popular – is that between 2008 and 2016, there was a patch where only two 0-2 teams made finals (Carlton 2013 and Sydney 2014). This hit rate was quite a bit lower than the years before and after, although not wildly so:
| Year | Finalists from 0-2 |
|------|--------------------|
| 2000 | 2 out of 5 |
| 2001 | 1 out of 5 |
| 2002 | 1 out of 5 |
| 2003 | 1 out of 3 |
| 2004 | 2 out of 4 |
| 2005 | 0 out of 3 |
| 2006 | 2 out of 3 |
| 2007 | 1 out of 4 |
| 2008 | 0 out of 4 |
| 2009 | 0 out of 4 |
| 2010 | 0 out of 6 |
| 2011 | 0 out of 4 |
| 2012 | 0 out of 5 |
| 2013 | 1 out of 7 |
| 2014 | 1 out of 6 |
| 2015 | 0 out of 5 |
| 2016 | 0 out of 4 |
| 2017 | 1 out of 8 |
| 2018 | 1 out of 4 |
| 2019 | 1 out of 5 |
| 2020 | 1 out of 4 |
| 2021 | 3 out of 5 |
Eyeballing that, you might notice something else about the middle years: there are more 0-2 teams. And indeed we had a number of clubs at historical lows in this period, including the two newly admitted expansion teams. Fourteen of those 0-2 non-finalist seasons from 2008-2016 belong to just four clubs failing over and over: the two expansion teams plus Melbourne and Richmond.
So this always looked a fair bit like random variation plus an unusually weak bottom end of the comp. But somehow it gave birth to a “curse” that meant flag contenders couldn’t afford to drop their second game.
And now that regular service has resumed – implying that there was never much to see in the first place – “a new trend is emerging.”
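For the record, tallying the table above gives both the long-run base rate and the 2008-2016 dip:

```python
# (finalists, teams that started 0-2) per year, from the table above.
FROM_0_2 = {
    2000: (2, 5), 2001: (1, 5), 2002: (1, 5), 2003: (1, 3), 2004: (2, 4),
    2005: (0, 3), 2006: (2, 3), 2007: (1, 4), 2008: (0, 4), 2009: (0, 4),
    2010: (0, 6), 2011: (0, 4), 2012: (0, 5), 2013: (1, 7), 2014: (1, 6),
    2015: (0, 5), 2016: (0, 4), 2017: (1, 8), 2018: (1, 4), 2019: (1, 5),
    2020: (1, 4), 2021: (3, 5),
}

def finals_rate(years) -> str:
    made = sum(FROM_0_2[y][0] for y in years)
    teams = sum(FROM_0_2[y][1] for y in years)
    return f"{made}/{teams} ({made / teams:.0%})"

print("2000-2021:", finals_rate(FROM_0_2))           # roughly the 20% base rate
print("2008-2016:", finals_rate(range(2008, 2017)))  # the "curse" window
```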
You can now use the ladder predictor on seasons as far back as 2000. Relatedly, the Squiggle API now serves fixture info on games dating back to 2000, and you can also use it to get a list of which teams were playing in any of those years.
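If you'd rather pull that data yourself, something like this should work; I'm assuming the standard q=games / q=teams queries and JSON response shapes here, so check the API docs before leaning on it:

```python
import json
import urllib.request

def squiggle(query: str) -> dict:
    """Query the Squiggle API, which asks callers to send an identifying User-Agent."""
    req = urllib.request.Request(
        "https://api.squiggle.com.au/?q=" + query,
        headers={"User-Agent": "example-ladder-script (you@example.com)"},  # hypothetical ID
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

games_2000 = squiggle("games;year=2000")["games"]  # fixture info back to 2000
teams_2000 = squiggle("teams;year=2000")["teams"]  # who was in the league that year
```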
You might be wondering why you’d ever want to predict past ladders. To be honest, I’m not sure. I just know that people write in sometimes asking if the site can let them do that.
This particular addition was triggered by Jake, who emailed me to say he’d been in iso for a month, and he kept busy by re-entering past seasons into the predictor one game at a time to see how the ladder changed. Jake had done this for 2011-2022, but wanted to go back further.
So now you can. I am all about football as a mental escape from reality, Jake. That’s the best possible use of football.
Heading into 2021, there was a bit of hive mind syndrome going around:
You can order 18 teams in 6.4 quadrillion different ways, but after reviewing a whole lot of 2021 ladder predictions, I see we're all picking the same 3 or 4.
So everybody had Richmond way too high, and Melbourne, Sydney and Essendon too low. Collingwood were generally tipped for somewhere around mid-table, often pushing into the Eight, as were St Kilda.
This same-same field of predictions delivered neither a spectacularly good nor spectacularly bad ladder. Instead, everyone was just kind of okay. The average was better than just tipping a repeat of 2020, but not by much.
All year long, the Western Bulldogs looked a deserving top-2 team. Then they plunged from 1st to 5th in the final three rounds, upending a lot of ladder predictions along the way. One beneficiary was Daniel Cherny, who’d tipped them for 6th, and suddenly had the best projection of anyone. He had 6 of the Top 8, missing Sydney & Essendon for Richmond & St Kilda, and half the Top 4. He also wisely tipped Collingwood to fall further than most (although not as far as they actually did).
Of the 26 experts and models I’ve tracked for three consecutive years, Peter Ryan has the best record, averaging 65.03 points across that period. He’s been getting better, too, finishing 19th in 2019, 9th in 2020, and 3rd this year.
Honourable Mention: Squiggle (5th in 2019, 20th in 2020, 9th in 2021)
Mid-Season Predictions
If you’re interested in how models predicted the final ladder during the season, head on over to the Ladder Scoreboard. New model Glicko Ratings scored best this year, while as usual all models significantly outperformed the actual ladder.