AFL Prediction & Analysis

How the Fixture Screwed St Kilda

It’s hard to prove the fixture affects any team’s finishing position. To make that claim, you need to find games so close that the result could easily have gone the other way, and establish that they were scheduled in a lopsided way.

The only really clear examples of this tend to be self-inflicted wounds, where a club sells a home game and loses it narrowly — such as Richmond’s famous after-the-siren loss in Cairns to the Gold Coast in 2012, or St Kilda’s 3-point defeat by Brisbane in New Zealand in 2014.

These cases are nice and clear: Computer modeling can tell us with a high degree of certainty that Richmond’s home ground advantage at the M.C.G. is worth more than 2 points, and therefore they should have won the game if it had been played there. Likewise, St Kilda should have defeated Brisbane if they’d stuck to their regular home ground in the Docklands. Of course, you can point to many things that could have changed a close result, but one of them is the venue.
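The counterfactual arithmetic behind that claim is simple enough to sketch. The numbers below are placeholders for illustration, not the model’s actual estimates:

```python
def counterfactual_margin(actual_margin, hga_actual_venue, hga_proper_venue):
    """Re-estimate a margin as if the game had been played at the proper home venue.

    Margins and HGA values are in points, from the wronged team's perspective.
    """
    # Remove the venue effect of where the game was actually played,
    # then add the venue effect of where it should have been played.
    return actual_margin - hga_actual_venue + hga_proper_venue

# Illustrative only: a 3-point loss at a venue worth nothing to you becomes
# a 2-point win if your proper home ground is worth 5 points.
print(counterfactual_margin(-3, 0, 5))   # 2
```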

Otherwise, though, the picture is muddier. You can establish that a team had a favourable fixture — weaker opponents than average, or games at friendlier venues — but you can’t really say it was worth a certain number of wins. When Fremantle played Gold Coast “away” in Perth this year, due to the unavailability of Carrara, that was certainly unbalanced fixturing… but the Dockers won the game by 28 points, so would the result have been different in Queensland? Modeling suggests probably not.

However, you can say one thing for sure: St Kilda got screwed.

St Kilda (2018)
Net Benefit from Fixture: -123 points (ranked 18th)
Opposition: 18th · Timing: 12th · Home Ground Advantage: 18th

St Kilda is no stranger to scheduling screwings. It had the AFL’s worst fixture in 2014, the worst in 2015, the 9th best in 2016, and the worst in 2017. This year it was the worst again, and not merely the worst of 2018: it was the worst fixture handed to any team in the last five years.

This analysis rests on three key factors: who you play (Opposition), when you play them (Timing), and where you play (Home Ground Advantage, or HGA).

(Two factors that come up sometimes in these kinds of discussions, and which aren’t included because they don’t matter, are Six-Day Breaks and The Cumulative Effect of Travel. There is more on these at the end of the article.)

Opposition

The simplest factor is the strength of opposition. All clubs face each other at least once, of course, and the AFL attempts to equalize the competition by ensuring that Bottom 6 teams have no more than one double-up game against a Top 6 opponent.

This is fine in theory, but since this season’s ladder isn’t a reliable guide to next season’s, it often works out less well in practice.

St Kilda finished 11th in 2017, making them a Middle 6 team, and they were duly given a 2/2/1 split of double-up games: that’s two matches against Top 6 opponents, two against Middle 6 opponents, and one against a Bottom 6 opponent.

This was already a touch on the mean side, since two other Middle 6 teams were assigned the more favourable 1/2/2 split.

But the Saints’ real misfortune came from how their double-up opponents — Richmond, GWS, Melbourne, Hawthorn, and North Melbourne — performed unexpectedly well. This turned their 2/2/1 split into 4/1/0: four double-up games against this year’s Top 6, one against the Middle 6, and none against the Bottom 6.

And that put the Saints into a whopping 95-point hole for the season.

You can also see that in practice there isn’t a whole lot of equalization going on here. Many of 2017’s stronger teams had weaker than average double-up opponents (Melbourne, Port Adelaide, GWS, Hawthorn, Richmond), while many of last year’s lower teams faced harder than average opposition (Gold Coast, Brisbane, Fremantle, Western Bulldogs, St Kilda).

At least North Melbourne, the recipient of this year’s most beatable opponents, had a generous fixture by design. After finishing 15th in 2017, the Kangaroos were given a 1/2/2 split, which turned out to be 0/1/4, with repeat games against Brisbane, Gold Coast, Western Bulldogs, St Kilda, and Sydney.
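The tiering is mechanical enough to sketch in code. Here’s a toy version, with my own function names and data shapes; the input positions are the 2017 ladder finishes consistent with the splits described above:

```python
def tier(ladder_position):
    """Map a final ladder position (1-18) to a fixture tier."""
    if ladder_position <= 6:
        return "Top 6"
    if ladder_position <= 12:
        return "Middle 6"
    return "Bottom 6"

def split(double_up_positions):
    """Summarise five double-up opponents as a Top/Middle/Bottom count, e.g. '2/2/1'."""
    counts = {"Top 6": 0, "Middle 6": 0, "Bottom 6": 0}
    for pos in double_up_positions:
        counts[tier(pos)] += 1
    return "/".join(str(counts[t]) for t in ("Top 6", "Middle 6", "Bottom 6"))

# St Kilda's 2018 double-up opponents by 2017 finish: Richmond (3rd), GWS (4th),
# Melbourne (9th), Hawthorn (12th), North Melbourne (15th).
print(split([3, 4, 9, 12, 15]))   # 2/2/1
```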

Timing

There’s a problem with considering strength of opposition like this: It assumes that teams are equally strong all season long. That’s clearly not the case. Team strength fluctuates both on a weekly basis, as star players get injured and miss games, and over the medium and long term, as a club gets stronger or weaker for any number of reasons: fundamental gameplan changes (Essendon), deciding the season is lost and looking to the future (Carlton), ramping up toward finals (West Coast, Melbourne), or simply having the wheels fall off for no discernible reason (Port Adelaide).

Each club will happen to play some opponents around their weakest point and meet others near their peak. Lucky clubs will meet more opponents at times of relative weakness; unlucky clubs will run into more at times of relative strength. It should average out, but doesn’t always. And since team form can rise and fall quite a lot, it can make a real difference.

Take Essendon. After a stirring Round 1 victory over Adelaide, the Bombers lurched from one unconvincing performance to another, culminating in a Round 8 defeat at the hands of Carlton. This led to a very public dissection, staff changes, and a very different-looking team for the remainder of the season.

As such, it was quite a lot better to face Essendon early in the year. In fact, it’s possible to identify the worst possible time to play Essendon: Round 21. This is when the Bombers were performing well as a team, but just before they lost Orazio Fantasia (Round 22) and Tom Bellchambers (Round 23). As it happened, the team they played in Round 21 was St Kilda.

(Note: This is a naive rating, which means it rates the apparent strength of a team each round before they played, in order that it remain uncontaminated by the performance of the team they played against. It means there’s often a real difference between how beatable a team appeared to be and how they performed on the day. Essendon provide a fine example of this, too, in Round 9, when, after looking abysmal against the Blues and then losing Hurley and Parish, they upset Geelong.)
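If you wanted to score timing luck yourself, the core calculation looks something like the sketch below: for each game, compare the opponent’s naive (pre-game) rating in the round you actually met them against their season-long average. The function and data shapes are my own illustration, not the model’s internals:

```python
from statistics import mean

def timing_benefit(fixture, naive_ratings):
    """Total timing luck for one club, in rating points.

    fixture:       list of (round, opponent) tuples for the club.
    naive_ratings: dict mapping opponent -> list of that team's pre-game
                   ratings, one per round (index 0 = Round 1).
    """
    total = 0.0
    for rnd, opp in fixture:
        season_avg = mean(naive_ratings[opp])
        at_the_time = naive_ratings[opp][rnd - 1]
        # Facing a team below its own season average is good timing.
        total += season_avg - at_the_time
    return total

# Toy data: an opponent who started weak and finished strong.
ratings = {"Essendon": [-12, -10, -8, -6, 0, 4, 8, 10]}
print(timing_benefit([(2, "Essendon")], ratings))   # 8.25: met them early, lucky
print(timing_benefit([(8, "Essendon")], ratings))   # -11.75: met them at their peak
```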

In truth, though, St Kilda weren’t particularly screwed by timing; not this year. They rank around the middle of the league on that metric, receiving a mix of good luck (Geelong and North Melbourne early in the season) and bad (Adelaide early, Essendon and Hawthorn late).

The worst timing belongs to Fremantle, who managed to encounter a string of opponents near their peak: Collingwood (Round 23), Hawthorn (Round 19), Essendon (Round 18), Port Adelaide (Round 17), Sydney (Round 9), Carlton (Round 13), and Gold Coast (Round 3).

The Blues had the most fortunate timing, thanks to repeatedly running into teams who were losing key players to injury — although perhaps this is less to do with good fortune than with their opponents seizing the opportunity to rest players. The Blues also had early-season games against teams who would go on to dominate later in the year: West Coast, Collingwood, Richmond, and Melbourne.

Home Ground Advantage

But back to the screwing. In theory, home ground advantage (HGA) is balanced: every team receives roughly the same advantage from its home games that it must face in its away games.

West Coast, for example, play ten home games against opponents traveling from interstate, and ten games to which they must travel, plus two local derbies. There’s no real HGA in a derby, and the benefit the Eagles receive from playing interstate sides at home is neatly counter-balanced by the penalty of traveling interstate to play away.

A Melbourne-based team such as Collingwood, by contrast, plays many games against other local opponents at the relatively neutral venues of the M.C.G. and Docklands, while hosting interstate sides about five times and traveling away about the same number.

Either way, the net benefit is roughly zero.

But there are exceptions. Sometimes teams give up a home venue they’re entitled to, such as Melbourne playing in the Northern Territory. Hawthorn and North Melbourne occasionally drag a Melbourne-based team to Tasmania, creating more venue-based advantage than there would otherwise be. And occasionally there are weird situations like the Commonwealth Games depriving Gold Coast of a home ground, sending the Suns to play a home game against Fremantle in Perth.

Also, sometimes a team is given unbalanced travel.

Now, HGA is hard to define, and various models use different methods. You can get a reasonable approximation simply by assigning 10 or 12 points of HGA whenever a team hosts an opponent visiting from out of state. A more sophisticated, but also more fragile, strategy is to attempt to divine particular teams’ affinity for particular venues from the historical record of how they under- or over-perform there.

I employ something in between, which is essentially a Ground Familiarity model. This awards HGA based on how familiar a team is with the venue and location; in practice, it serves as a proxy for an array of factors that probably comprise HGA in the real world, including crowd noise, travel time, and psychological disruption.
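As a rough illustration of how such a model can work (the function, parameters, and saturation idea here are my own assumptions, not Squiggle’s actual implementation):

```python
def familiarity_hga(home_games_at_venue, away_games_at_venue,
                    max_hga=12.0, saturation=20):
    """Toy Ground Familiarity HGA, in points.

    home/away_games_at_venue: matches each side has played at this venue over
    some recent window. HGA scales with the familiarity gap and saturates:
    the 50th game at a ground teaches you little that the 20th didn't.
    """
    home_fam = min(home_games_at_venue, saturation) / saturation
    away_fam = min(away_games_at_venue, saturation) / saturation
    return max_hga * (home_fam - away_fam)

# An interstate visitor who almost never plays here concedes close to full HGA...
print(round(familiarity_hga(20, 2), 1))    # 10.8
# ...while a frequent visitor (an Eagle on their fifth Docklands trip of the
# year, say) concedes much less.
print(round(familiarity_hga(20, 12), 1))   # 4.8
```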

There’s a fair argument that Sydney wasn’t actually the second-most HGA-advantaged team this year, because the Swans didn’t actually perform well at their home ground. Similarly, many believe that Richmond received more advantage from M.C.G. games this year than the average M.C.G. tenant. Such ideas are popular, but tend to be transient and based on few data points. For me, Richmond’s much-discussed four interstate losses are more easily explained by the fact that those were its four hardest games. So there is no attempt here to model that kind of theory.

Then again, a Ground Familiarity model has quirks of its own. Much of the reason Sydney scores well on this measure is that the Swans played 18 matches at just three grounds: the S.C.G., the M.C.G., and Docklands. They traveled to Western Australia just once and South Australia not at all. This means the Swans frequently play away at grounds they’re reasonably familiar with, while their opponents don’t have the same experience at the S.C.G.

This small but persistent imbalance affects Docklands tenants in reverse: They are almost always less familiar with their opponents’ home grounds than their opponents are with theirs. For example, the Saints are relatively familiar with Perth for a Victorian team, being dispatched there at least once a year, and four times in the last two years. But when they last met the Eagles in Melbourne, that was the Eagles’ fifth trip to Docklands that year alone. The venue still offered St Kilda an advantage (especially compared to Perth), but it was a little less than it might have been.

Therefore, under a Ground Familiarity model, being based at the Docklands is the worst option. You probably don’t even get to play home finals there.

But the real culprit behind St Kilda’s poor rating on home advantage is their persistent travel imbalance:

St Kilda’s Games vs Non-Melbourne Opponents

Year Home Away Difference
2014 5 7 -2
2015 5 7 -2
2016 5 6 -1
2017 6 6 0
2018 4 7 -3
Average 5.0 6.6 -1.6

Having to travel to your opponents more often than they travel to you is a clear source of home ground advantage disparity. This is rare for non-Victorian teams, who almost always have a 10/0/10 split, but common for those in Melbourne: They will often benefit a little more or a little less from travel than their opponents. For the Saints, almost every year, it’s less.

This year’s version was the most extreme yet. St Kilda enjoyed home advantage from hosting travelling teams only four times (Brisbane, Sydney, GWS, and Adelaide), while facing disadvantage from travelling interstate five times (GWS, West Coast, Port Adelaide, Fremantle, and Gold Coast), as well as having to visit Geelong at Kardinia Park (with no return home match), and Hawthorn in Tasmania.

The Saints have actually been sent to play North Melbourne or Hawthorn in Tasmania for five years in a row now, each time turning what would be a neutral game at Docklands or the M.C.G. into one where the venue favours the opposition.

Three extra games of significant disadvantage is quite a lot. The ratio is eyebrow-raising, too: scale the Saints’ 4 home and 7 away travel-affected games up to a full 22-game season and it’s the equivalent of a non-Victorian team playing 8 home games and 14 away.

Sooner or later, the law of averages will ensure that St Kilda get lucky with their double-up opponents, or else their timing. But their unbalanced travel is an enduring, deliberately fixed anchor. It ensures that whenever fortune smiles on the Saints, it’s more of a smirk.

St Kilda’s Fixture: A History

Year Fixture Rank Oppo Timing HGA
2014 18th 12th 11th 18th
2015 18th 12th 17th 18th
2016 9th 2nd 10th 14th
2017 18th 13th 16th 14th
2018 18th 18th 12th 18th

Overall

When the three factors are combined, most teams arrive at a net fixture benefit between -50 and +50 points. That’s not a lot over a season: only a couple of points per game.

And teams tend to have a mix of fortunes. For example, Melbourne had an advantageous set of double-up opponents (Adelaide, Geelong, St Kilda, Western Bulldogs, Gold Coast) and fortuitous timing, but this was offset by poor HGA due to playing home games in the Northern Territory.

Geelong had good HGA and very lucky timing, but somewhat harder than average double-up opponents. Fremantle had the league’s best HGA and the league’s worst timing. Richmond also had terrible timing, as well as an interstate travel disparity, but only an average set of double-up opponents.

St Kilda had bad everything. More specifically, it had league-worst opponents, league-worst HGA, and below-average timing.

St Kilda’s 2018 games rated from easiest to hardest

Rating Rnd Res Vs Venue Oppo Timing HGA
+37.2 1 W Brisbane Lions Docklands (VIC) +16.4 +12.9 +7.9
+33.3 17 W Carlton Docklands (VIC) +28.6 +3.5 +1.1
+22.7 2 L North Melbourne Docklands (VIC) -2.3 +24.2 +0.9
+22.3 13 W Gold Coast Carrara (QLD) +31.4 +3.7 -12.8
+18.5 20 L Western Bulldogs Docklands (VIC) +14.8 +3.8 -0.1
+2.9 8 L Fremantle Perth Stadium (WA) +14.9 -0.1 -12.0
+2.7 7 L Melbourne Docklands (VIC) -11.7 +12.4 +2.0
+0.1 12 L Sydney Docklands (VIC) -7.7 -0.5 +8.4
-5.0 23 L North Melbourne Docklands (VIC) -2.3 -3.4 +0.8
-8.4 5 D Greater Western Sydney Docklands (VIC) -14.0 -3.3 +8.8
-10.2 21 L Essendon Docklands (VIC) +0.4 -11.1 +0.5
-12.2 6 L Hawthorn York Park (TAS) -7.0 +2.4 -7.5
-13.0 3 L Adelaide Docklands (VIC) -7.9 -13.6 +8.5
-13.2 9 L Collingwood Docklands (VIC) -11.3 -3.6 +1.7
-13.4 22 L Hawthorn Docklands (VIC) -7.0 -8.7 +2.3
-13.9 4 L Geelong Kardinia Park (VIC) -12.7 +10.9 -12.0
-15.3 15 W Melbourne M.C.G. (VIC) -11.7 -1.7 -1.9
-26.5 16 L Port Adelaide Adelaide Oval (SA) -13.5 -1.7 -11.4
-27.6 11 L West Coast Perth Stadium (WA) -9.2 -7.2 -11.3
-30.6 19 L Greater Western Sydney Sydney Showground (NSW) -14.0 -3.3 -13.3
-36.1 10 L Richmond M.C.G. (VIC) -34.8 +1.4 -2.7
-37.0 18 L Richmond Docklands (VIC) -34.8 -3.6 +1.4
TOTALS -95.5 +13.4 -40.8
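The table’s arithmetic ties the whole article together: each game’s rating is the sum of its three components, and the column totals sum to the net benefit quoted at the top. A quick check, using the figures above:

```python
# Per-game components (oppo, timing, hga), as in the table above.
games = [
    (16.4, 12.9, 7.9),    # Rd 1 v Brisbane, the season's easiest game
    (-34.8, -3.6, 1.4),   # Rd 18 v Richmond, the hardest
    # ... and the other 20 games
]
ratings = [sum(g) for g in games]
print(round(ratings[0], 1))   # 37.2, matching the table

# Column totals from the full table: Oppo -95.5, Timing +13.4, HGA -40.8.
print(round(-95.5 + 13.4 + -40.8, 1))   # -122.9: the -123 points quoted up top
```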

The Saints’ third-easiest game in 2018 was against North Melbourne at Docklands. Think about that for a moment: North won 12 games and were in finals contention, and Docklands is their home ground. (This was the game in which both teams sat at 2.10 apiece at half-time; the Saints had a chance to seize control before North Melbourne put it together, but missed it.)

Five of St Kilda’s games were significantly easier than league average; fourteen were significantly harder. The Saints had one cheapie — a game with home ground advantage against lowly opposition — against Brisbane in Round 1, which they won. Their only other game with significant home advantage against a non-finalist was Adelaide in Round 3, because, remarkably, in 2018 the entire bottom 10 was otherwise made up of Docklands tenants and interstate teams that the Saints had to play away.

Conclusion

To be clear, St Kilda are a bad team.

They wouldn’t have made finals with the most generous fixture in the world, and didn’t deserve to. They might not even have won any more games.

But they were screwed by the fixture as hard as any team has been screwed in the last five years. And it’s not unusual. St Kilda are on the receiving end of fixture disadvantage year after year.

It’s time that changed.

And Another Thing: Two Other Factors

Brief notes on two other factors. This analysis is based on a Squiggle model that predicts the outcome of football games. Any factor that helps the model make more accurate predictions is employed; factors that don’t help are ignored. This is an objective way to establish what does and doesn’t significantly affect the outcome of a football match. So the short answer to “Why isn’t ____ considered?” is that it doesn’t make the model more accurate.

You can review the performance of the model against other well-known public models here and find more detail here.

Non-Factor #1: The Cumulative Effect of Travel

An argument goes like this: It’s easy to figure out who has the hardest fixture. It’s West Coast, because they traveled 59,000km. And by that logic, it’s West Coast or Fremantle every year.

The cumulative effect of travel is undoubtedly a real thing. There is a particularly interesting analysis by Matt Cowgill in Footballistics that suggests Perth-based players have shorter careers due to the wear and tear of maintaining an extreme travel regime.

However, there’s no way objectively to conclude that West Coast would be a better team if only they were based in Melbourne. I don’t know how you would run that experiment. It’s also slightly counter-intuitive, since high-travel teams have been quite a lot more successful than average in the AFL era, and this proposition would mean they are actually even better than that, because they’ve also been fighting a cumulative travel penalty.

Who knows; maybe it’s so. But it’s conjecture, so inadmissible.

Non-Factor #2: Six-Day Breaks

It seems reasonable to assume that teams are at a disadvantage if they have fewer days to prepare than their opponents. And indeed, the number of short breaks usually comes up in discussions on fixture difficulty, or ahead of important matches.

Quantifying this effect, though, is beyond me. I can’t find it in the data. And if it were anything like the magnitude that people suggest (and, as with the Cumulative Effect of Travel, it is sometimes suggested to be the defining determinant of overall fixture difficulty), it should be easy to spot. But it isn’t.

If you’re interested, though, St Kilda also had the equal second-most six-day breaks (6) and the equal most consecutive six-day breaks (2).

Find Max Barry @SquiggleAFL on Twitter

Models Leaderboard: Winners 2018

Congratulations to Darren O’Shaughnessy at AFL Live Ladders for topping the 2018 Squiggle computer models leaderboard with 147 tips!

147 was a popular number: That’s also the number of correct tips recorded by Punters (an average of many bookies’ closing lines) and Aggregate (an average of models in the competition). I noticed The Roar’s “The Crowd” scored 147, too.

That’s a good showing, and one that demonstrates, I think, a real “wisdom of crowds” effect, where the average tip is smarter than most of the individual tipsters it’s averaging. There was a lot of variation between models across the three tracked metrics (correct tips, Bits, and Mean Absolute Error), but Aggregate finished near the top on all of them. In fact, had it not been for one wild model tip in the West Coast v Melbourne prelim, Aggregate would have finished clear on top with 148.


Massey Rankings, new to the Squiggle leaderboard in 2018, had a great year, correctly tipping the Grand Final to equal AFL Live Ladders on 147 tips. (We tie-break on Bits.)

In terms of MAE, AFL Live Ladders was #1 here, too, with a delightful 26.55 for the season, narrowly beating out Matter of Stats (26.61) and Aggregate (26.63).

On Bits, Squiggle was the best computer model with 39.27, although it was narrowly pipped by Punters (39.76) overall.

Tony Corke at Matter of Stats had a year that demonstrates why the number of correct tips is a noisy metric, jostling for top spot on MAE and Bits but coming dead last in tips. This was never more evident than in Round 20, where Matter of Stats got the wrong end of all four of Super Saturday’s 50/50 games. Sometimes football is a harsh mistress.

HPN’s new model PERT provided tons of contrarian excitement, being the only one to tip 9/9 more than once — it did it three times — and finishing a solitary tip behind the leaders. It managed this despite blowout MAE and Bits numbers, so it will be interesting to see how it backs up in 2019.

Stattraction, also new this year, stumbled early, then made an extraordinary 24/27 run to close out the home & away season.

plusSixOne recorded the most correct finals tips with 7, including the Grand Final. It also finished only 1 behind the leaders for the year, along with Footy Maths Institute, who fought a highly entertaining running battle for outright leadership that came unstuck only in the finals.


Heartfelt thanks to all the model authors — as well as those I’ve named, there’s also The Arc, Graft, and Swinburne University — who allow me to aggregate their tips! They do fantastic work and I hope that a comp like this means more people get to see it.

I also hope this site is useful to model builders out there who just enjoy playing around with footy numbers and want to know how their tips compare. (See: the Squiggle API.)

And thank you to the regular human visitors, who just like to keep an eye on what the models are saying.

The off-season is time for model tuning, so I hope our current crop will be back in 2019 with new and improved engines! I’d also love to add new competitors to keep them on their toes — if you’d like to join the fun, please check out this wishlist and contact me via Twitter.

Have a happy off-season! May your trades be fruitful, your drafts foresighted, and your spuds delisted.

Max.

What If: The Top Team Always Hosted The Grand Final (An Alternate History)

Rule: The Grand Final shall be hosted at the home ground of the participating side that finishes highest in the regular season and wins all its finals.

Year Result HGA* Adjusted venue Adjusted HGA Adjusted result
1991 HAW by 53 HAW +6.3 Waverley Park

Disaster in the west as the Eagles, seemingly on track to host the first ever Grand Final outside Victoria, lose their Qualifying Final at Subiaco to Hawthorn. It delivers the Hawks the right to host the Grand Final at their home ground of Waverley Park, which, in a convenient twist of fate, is where it would have been held anyway, as the M.C.G. is unavailable due to renovations.

Hawthorn duly defend Victorian pride, keeping both the premiership and the Grand Final a local affair for one more year.

Year Result HGA Adjusted venue Adjusted HGA Adjusted result
1992 WCE by 28 GEE +3.7 Subiaco WCE +12.1 WCE by 44

A series of unexpected results sends the Grand Final to Perth after the Eagles upset the Cats in a semi-final. A fortnight later, the Eagles repeat the performance and set Perth alight with a historic home premiership.

Continue reading “What If: The Top Team Always Hosted The Grand Final (An Alternate History)”

Squiggle: Now with Team Ins/Outs Awareness!

Earlier this year, HPN unveiled a new model named PERT based on player ratings, instead of team ratings like most other models. And it’s landed with a splash, currently sitting atop the models leaderboard on 74 correct tips.

It’s doing less well on Bits* and MAE*, which is a little suspicious, since those metrics tend to be better indicators of underlying model accuracy. But still! It’s enough to suggest there might be something in this crazy idea of considering who’s actually taking to the field.

So I’m hopping aboard. Starting this week, Squiggle’s in-house model considers selected teams and adjusts tips accordingly.

The difference that team selections make to each tip can be seen in the TIPS section of Live Squiggle.

In most cases, team selections will make a difference of only a few points to the Squiggle tip, which remains focused on team-based ratings. The adjustment is derived from a simple comparison of scores from AFL Player Ratings. So it will only swing a tip when the game is already close to a 50/50 proposition.

Over the last six years, this seems to deliver about a 0.40 point improvement in MAE. Naturally, though, 2018 will be the year it all goes to hell.
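For the curious, the mechanics can be sketched like this. The scaling constant and data shapes are my own illustration; Squiggle’s actual derivation from AFL Player Ratings differs in detail:

```python
def selection_adjustment(player_ratings, ins, outs, scale=0.1):
    """Nudge a tipped margin by the net quality of players coming in vs going out.

    player_ratings: dict of player -> rating points (assumed data shape).
    Returns an adjustment in points, added to the team-based tip.
    """
    net = (sum(player_ratings.get(p, 0.0) for p in ins)
           - sum(player_ratings.get(p, 0.0) for p in outs))
    return scale * net

# Hypothetical players and ratings, purely for illustration.
ratings = {"star_mid": 18.0, "fringe_def": 4.0}
tip_margin = 1.5   # a near-50/50 game, on team ratings alone
tip_margin += selection_adjustment(ratings, ins=["fringe_def"], outs=["star_mid"])
print(round(tip_margin, 2))   # 0.1: losing a star pulls a coin-toss game closer still
```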

*  “Bits”: Models score points based on how confidently they predicted the correct winner. Confident & correct = gain points, unsure = no points, confident & wrong = lose points.

* “MAE”: Mean absolute error, which is the average difference between predicted and actual margins, regardless of whether the correct winner was tipped.
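Both metrics are straightforward to compute if you’re building a model. A sketch using the definitions above; the Bits formula shown is the convention common in footy model comps (1 + log2 of the probability assigned to what actually happened), offered as my illustration rather than official scoring code:

```python
import math

def bits(p_home, home_won, drawn=False):
    """Info score per game, one common footy-comp convention.

    p_home: model's pre-game probability that the home side wins.
    Confident & correct gains up to 1 bit; confident & wrong loses heavily.
    """
    if drawn:
        return 1 + 0.5 * math.log2(p_home * (1 - p_home))
    p_correct = p_home if home_won else 1 - p_home
    return 1 + math.log2(p_correct)

def mae(predicted_margins, actual_margins):
    """Mean absolute error between predicted and actual margins, in points."""
    return sum(abs(p - a) for p, a in zip(predicted_margins, actual_margins)) \
        / len(predicted_margins)

print(round(bits(0.75, home_won=True), 3))    # 0.585: confident and correct
print(round(bits(0.75, home_won=False), 3))   # -1.0: confident and wrong
print(mae([12, -6, 30], [20, -2, 3]))         # 13.0
```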

How to Become a Squiggle-approved Model

The Squiggle stable of models has grown to ten this year, and now includes almost all of the well-known public tipping models from across the web.

This site aims to bring together and promote the best of such models, and I’m interested in adding more. To keep the quality high, this is what I look for:

  1. An official site where tips are posted
  2. A history of doing so for at least a year
  3. Public discussion and analysis of football
  4. Some transparency about the model

These aren’t all mandatory. There are currently some very mysterious models included in the stable because they are widely known and respected. But this is the ideal.

I know there are many solid tipsters out there with an Excel spreadsheet and a Twitter handle, but I don’t plan to list them, since they don’t meet the above criteria.

Ultimately my goal is to expose more insight into the how and why of football analysis, looking not only at who is tipping best, but why. What is the best way to treat Home Ground Advantage? How much difference does the weather make? How important is it to consider player rankings, or recent form? What factors matter most in determining which side wins a game of football?

Squiggle Ladder Predictor

The official AFL ladder predictor has a few problems:

  • Not available until late in the season
  • Requires ten thousand clicks to complete
  • Created by Telstra monkeys in 1994

So I made a new one!

Squiggle AFL Ladder Predictor

Now you can tip as few or as many games as you like and lean on the world’s best computer models to fill in the rest.

You can even go back and change some of the computer tips if you want.

I believe the world is a better place when people can generate wildly optimistic ladder projections with ease.  At the moment it’s only for the home & away season, but I’ll probably add finals sometime later.

How AutoTip Works

The Predictor fetches data from the Squiggle API, including Aggregate tips from computer models such as The Arc, Matter of Stats, FMI, HPN and many more. This provides a tip for each match, as well as a confidence rating: how likely the tipped team is to win.

It’s no good to simply tip the favourite in each game, though, because in real life, favourites don’t always win. For example, if two people play 10 games of chess, and one player is 60% likely to win each match, we want to be able to predict that the final tally will be about 6-4, and not 10-0, like we’d get if we tipped the favourite to win each time.

So AutoTip runs several thousand simulations, each time applying an amount of randomness to the predicted result, so that a team that is 70% likely to win a match will only win it in about 70% of simulations.

The simulations are then analyzed to determine the average finishing rank, number of premiership points, and percentage of each team.

This provides a pretty reliable estimation of where each team is likely to wind up on the ladder. However, we don’t merely want average numbers: We want to see specific tips for each game. So next AutoTip scores each simulation based on how closely its ladder resembles the average. Then it selects the “most normal” one.

This means AutoTip will contain many upsets, and which upsets they are will change each time. But the upsets will be spread around evenly, so that each team finishes the season about where they’re expected.
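Put together, the whole procedure looks something like this sketch — my own simplified version tracking only wins, where the real thing also handles premiership points and percentage:

```python
import random
from statistics import mean

def autotip(games, n_sims=5000):
    """games: list of (home, away, p_home) tuples, e.g. built from Aggregate tips.

    Simulates the season many times, then returns the per-game winners from the
    single simulation whose final ladder is closest to the average ladder: the
    'most normal' season.
    """
    teams = {t for g in games for t in g[:2]}
    sims = []
    for _ in range(n_sims):
        wins = dict.fromkeys(teams, 0)
        results = []
        for home, away, p_home in games:
            # Apply randomness so a 70% favourite wins ~70% of simulations.
            winner = home if random.random() < p_home else away
            wins[winner] += 1
            results.append(winner)
        sims.append((wins, results))

    # Average wins per team across all simulations.
    avg_wins = {t: mean(s[0][t] for s in sims) for t in teams}

    def distance(sim):
        wins, _ = sim
        # Squared distance between this simulated ladder and the average.
        return sum((wins[t] - avg_wins[t]) ** 2 for t in teams)

    return min(sims, key=distance)[1]

fixture = [("Richmond", "Carlton", 0.80),
           ("St Kilda", "Sydney", 0.35),
           ("Richmond", "Sydney", 0.55)]
print(autotip(fixture))   # e.g. ['Richmond', 'Sydney', 'Richmond']
```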

Richmond v Daylight

Richmond have been doing two things Squiggle particularly likes: Holding oppositions to low scores, and generating plenty of scoring shots.

The Tigers have been kicking plenty of behinds this season right from the start: In Round 1, they defeated Carlton by 2 goals and 14 behinds, while in losing to Adelaide the following round, they went down by 6 goals and 0 behinds.

On the surface, that’s a close-ish 26-point win followed by a heavier 36-point defeat; in terms of scoring shots, it’s a +16 smashing followed by a much closer -6.

Squiggle’s model considers the reality to be somewhere in between. As a result, it considers the Tigers’ only loss of the season so far to be a relatively close one, away interstate to a very good team – the kind of game that even a top team will often drop. The Tigers’ wins, on the other hand, have included some extraordinary smashings when viewed in terms of scoring shots.
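One way to implement that “somewhere in between” is to convert the scoring-shot differential into points at an average conversion rate and blend it with the actual margin. The weight and points-per-shot values below are illustrative assumptions, not Squiggle’s actual parameters:

```python
def adjusted_margin(points_margin, shot_margin, pts_per_shot=3.6, weight=0.5):
    """Blend the scoreboard margin with a scoring-shot-implied margin.

    shot_margin:   scoring shots for minus scoring shots against.
    pts_per_shot:  rough league-average value of a scoring shot (assumed).
    weight:        how much to trust shots over the scoreboard (assumed).
    """
    shot_implied = shot_margin * pts_per_shot
    return (1 - weight) * points_margin + weight * shot_implied

# Round 1 v Carlton: +26 points but +16 shots -> better than the scoreboard says.
print(round(adjusted_margin(26, 16), 1))    # 41.8
# Round 2 v Adelaide: -36 points but only -6 shots -> closer than it looks.
print(round(adjusted_margin(-36, -6), 1))   # -28.8
```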

Richmond’s opposition to date has mostly comprised mid- to upper-tier teams: Adelaide, Collingwood, Hawthorn, and Melbourne. Yet across the season’s six rounds, the Tigers have averaged 50% more scoring shots than their opponents. The result is a lot of Squiggle love for the yellow and black.