
Stats vs. Eye-Test is Dead – There is Only the Acceptance or Denial of Evidence-Based Analysis

Photo credit: Kelley L Cox - USA TODAY Sports
Jeremy Davis
6 years ago
For as long as sports have been around, they’ve been romanticized. We like to weave stories around athletes who perform impressive feats at the unlikeliest of times, and the amalgamation of these tales, passionately stitched together, produces the mythology of a player, a team, or a sport.
For a long time now, statistics have disrupted this mythology. They don’t buy into the big moments, treating them like any other event. They aren’t fooled by fluctuations in samples known to be affected mainly by luck. And, in general, they’re all business and very little fun.
I can see why the average sports fan, and hockey fans in particular, would be wary of statistics. Not only do they tend to poke holes in their favourite storylines, but there has also been a tendency to poke holes in beliefs that are not only commonly held, but seem to be backed by common sense.
Among the targets of the advanced stats movement have been face-off percentages, blocked shots, zone start ratios, quality of competition, and of course plus-minus. It’s generated a significant amount of skepticism in the uninitiated, many of whom often wonder if stats folk are watching the games at all.

The Dichotomy

Thus the stats versus eye-test debate was born. It was framed as a dichotomy, with each faction holding an opposing opinion on every event and concept within the game. As the story goes, the stats community thought the eye-test community placed too much emphasis on intangible elements, while the eye-test community felt that numbers couldn’t possibly capture the randomness and nuance of a continuous, flowing sport such as hockey.
Perhaps both “sides” have a point. Statistics will always struggle to explain every single thing that happens out on the ice. There will also always be a debate as to how much weight should be given to certain events and characteristics, and whether aspects like leadership and clutch play will ever be measurable, or whether they’re even worth measuring.
Unfortunately, this whole two-sided debate is built on a lie: there is no such thing as Stats versus Eye-Test.
The entire structure is based on a false dichotomy. The problem is that there are no fans worth their salt on the “stats side” that don’t also watch the games and generate opinions based on things that they see. To suggest such a thing is merely a tactic designed to undermine the power of a stats-driven argument. And without a doubt, this tactic is frequently used.
Both sides are capable of using the eye-test, and indeed both sides will on many occasions. The only actual debate is the acceptance of statistics as a trustworthy explanation of what fans are seeing, or aren’t seeing.

Denial of the Scientific Method

If you are a card-carrying member of the eye-test-only camp, what you are really declaring is your denial of evidence-based analysis: the denial of the scientific method.
It’s a bold stance to take. History hasn’t been kind to those who refuse to go along with an evidence-based approach. Let’s go through a brief and non-exhaustive review of some of the casualties of the scientific method.
  • Geocentrism – the concept that the sun and the stars revolve around the Earth, rather than the other way around.
  • Flat Earth Theory – the idea that the Earth is flat instead of spherical.
  • Alchemy – the tradition whose claims included, among others, that one could convert base metals like lead into noble metals like gold.
  • Astrology – the study of how the movement of celestial objects affects human traits and affairs.
  • Phrenology – the belief that one could read the bumps on a person’s skull to learn about their personality.
Each of these once widely accepted theories was put to rest when testable and repeatable evidence was mounted against it, overcoming commonly held beliefs. This is the function of the scientific method, one of the most essential techniques in human history. The method is simple, yet profoundly effective. You test, observe, and record. Make a note of what the evidence indicates. Build on the ideas that are supported, and discard those that are not, even if you favoured them.
If you continue to refuse to believe that evidence-based analysis can see things you can’t and reach conclusions that don’t jibe with your assumptions, the ideas above are the kinds of ideas with which you’ve aligned yourself: pseudoscience, disproven notions that look utterly ridiculous with the knowledge we have now.
Society isn’t done with these types of controversies either. We’re witnessing one of the biggest ones right now in the climate change divide. One side has a figurative mountain of evidence behind it, while the other refuses to let science tell them something that they cannot perceive with their own senses. This debate, I think, provides a very relevant comparison to evidence-based analysis in hockey.
Consider the following explanation of weather versus climate from Neil deGrasse Tyson on Cosmos: A Spacetime Odyssey. In the video, Tyson walks a dog along a beach. The dog veers back and forth, testing the limits of its leash, but all the while heading in the same general direction. The dog’s path is weather, Tyson reveals, while his own path is climate. This is how scientists can predict the climate of the planet, even if they struggle to predict the weather on a day-to-day basis.


The same can be said of events in hockey. Our ability to predict individual games is measured in likelihoods, not assurances. Shots in hockey are tiered by their level of success. Some are blocked before coming close to the net; some miss the net; some are stopped by the goalie. Only a select few land in the back of the net, and only those count toward a victory.
Only one in every 20 or so attempted shots becomes a goal, on average, and it’s difficult to pinpoint which shots those are going to be, even with what we’ve nailed down regarding shot quality. But the fact that some games produce three goals for every 20 shot attempts while others produce none doesn’t change the reality that shooting percentage will balance out toward the average over the long term. This is how regression to the mean works.
It’s also why shots are a more reliable indicator of future success than goals are. By and large, we know that eventually a specific percentage of them are likely to go in the net, so their value can still be quantified, and there are roughly 20 times as many of them. So we can feel comfortable making some light conclusions based on shots after 20 or 30 games, while goals (or wins, for that matter) can be misleading even on a season scale. That’s how analysts used statistical evidence to accurately predict the downfall of Patrick Roy’s Avalanche and Bob Hartley’s Flames after their pop-up playoff runs in 2013-14 and 2014-15 respectively.
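To see why the shooting-percentage math shakes out this way, here is a minimal simulation sketch in Python. The 5% conversion rate and the roughly 55 attempts per game are illustrative assumptions, not measured league figures; the point is only to show how wildly an observed shooting percentage swings over a few games and how tightly it hugs the true rate over a full season.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions, not measured league figures:
TRUE_SH_PCT = 0.05       # roughly 1 goal per 20 shot attempts
ATTEMPTS_PER_GAME = 55   # rough per-team shot-attempt rate

def observed_sh_pct(n_games, trials=10_000):
    """Simulate the shooting percentage a team would observe over n_games,
    repeated across many hypothetical 'seasons'."""
    attempts = n_games * ATTEMPTS_PER_GAME
    goals = rng.binomial(attempts, TRUE_SH_PCT, size=trials)
    return goals / attempts

for games in (1, 5, 20, 82):
    sample = observed_sh_pct(games)
    lo, hi = np.percentile(sample, [5, 95])
    print(f"{games:>2} games: observed shooting % typically between {lo:.3f} and {hi:.3f}")
```

Over a single game the observed rate can easily land anywhere from 0% to 10%; over 82 games it rarely strays far from 5%. That narrowing spread is the regression to the mean described above.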

Parsing Meaning from Randomness

Despite a history of predictability at large scales, many assume that hockey is simply too complicated for numbers to grasp. I think the following points from Garret Hohl and Felix Sicard sum up convincing rebuttals to this proposition.
Firstly, in spite of hockey’s obvious inherent randomness, predictive models have been able to predict single-game results at success rates well above 50%, which is what you would expect if the results were completely random. At season scales, predictions have an even greater rate of success.
Secondly, there is either a sense of extreme self-importance or extreme naivety associated with the idea that the sport of hockey is too difficult for math to comprehend. We, the race that is predicting changes in the climate of the planet, that uses mathematics to determine with accuracy how many asteroids will pass between the Earth and the moon in the next hundred years, that has cured a myriad of fatal diseases, landed astronauts on the surface of the moon, and split atoms apart, cannot possibly devise a way to explain 10 grown men using sticks to whack a disc of vulcanized rubber around a sheet of ice.
I’m forced to assume that this logic is paired with a lack of understanding or appreciation of those other accomplishments. It seems like there would be a large amount of overlap between these doubters and people that deny climate change, think vaccines cause autism, believe the moon landing was faked, and are more familiar with the Big Bang Theory TV show than the theory that explains all of existence.

Missing the Forest for the Trees

The thing is, hockey analytics is an evidence-based endeavour, and by definition, that means that there is plenty of evidence out there to back up its claims. How often do you see people that denounce the predictability of hockey analytics back up their claims with evidence?
Rather, I think the case is somewhere between a lack of interest and a lack of understanding. Opponents of analytics tend to miss the forest for the trees: the forest, in this case, being evidence-based analysis and the trees being singular entities like Corsi and regression. Many seem to get so focused on their dissatisfaction with a single metric that they can’t be bothered to understand the movement as a whole. They associate Corsi directly with analytics, and when this one basic metric fails to line up with their expectations, they lump the entire idea of analytics together and thoroughly dismiss it.
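For what it’s worth, the metric itself is simple. Corsi is just a count of shot attempts (goals, saves, misses, and blocks) for and against while a player is on the ice. The sketch below shows the idea in Python; the event structure and field names are hypothetical, made up purely for illustration rather than taken from any real data feed.

```python
from dataclasses import dataclass

@dataclass
class ShotAttempt:
    attacking_team: str    # team that took the attempt (goal, save, miss, or block)
    players_on_ice: set    # names of every skater on the ice at the time

def corsi_for_pct(player, player_team, events):
    """CF% = attempts for / (attempts for + attempts against),
    counted only while the given player is on the ice."""
    cf = ca = 0
    for event in events:
        if player not in event.players_on_ice:
            continue
        if event.attacking_team == player_team:
            cf += 1
        else:
            ca += 1
    total = cf + ca
    return cf / total if total else 0.0
```

A CF% below 50% simply means more attempts went toward your own net than the opponent’s while that player was on the ice; it is a basic counting measure, not the whole of analytics.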
The truth is, there is no one measure or one technique that defines hockey analytics. Analytics is not a stat, or a collection of stats. Analytics is hockey’s representation of the scientific method. To cast it aside is to deny one of our species’ greatest assets because of a case of cognitive dissonance: the stats didn’t tell you what you wanted them to, and therefore they are flawed.
There is so much more information out there, if you are simply willing to look. It might seem daunting for the uninitiated, but aggregators like NHL Explainers on Twitter, or the website MetaHockey.com have put in a lot of effort to ensure that the wealth of research is easily searchable and accessible.
As I said, there is no Stats versus Eye-Test. There is no dichotomy. What there is is a continuum along which your trust in and comprehension of evidence-based analysis lies. Two fans can both have faith in the measures, and one will still be more comfortable than the other with letting the stats tell them something they haven’t seen with their own eyes.
At the other end of the spectrum, you have the group of people who are certain that analytics will never be able to explain hockey. Right next to them, you have people like Andrew Walker, who claim that they see value in analytics, but that it just isn’t quite developed enough yet. This is generally an excuse for a lack of understanding or investigation into what these people are dismissing. Do you think Walker has spent much time on Hockey Graphs studying the efficacy of predictive models? Researching why some skills are overrated, and others are underrated? I’d be shocked if he had done so and could still make the claims he puts forth on Twitter, radio, and television. Mix that with the pompous attitude required to tell others what they do or do not understand, without even bothering to understand the evidence behind the other’s position, and what you get is a stubborn refusal to accept advancement.

Progressivism versus Traditionalism

I get that this is a difficult time for some members of traditional media, with bloggers and fans sitting at home fact-checking their every claim. Gone are the days when radio personalities can be taken at their word.
Gone too are the days when you could simply declare that a player has been “good.” What counts as good is not the same in every person’s mind. While some are still swayed by production, like Derek Dorsett’s surprising seven goals so far this year, others take into account data that is known to predict future outcomes, such as his being horrifically outshot every time he’s on the ice. Are seven goals really “good” if even the staunchest defenders acknowledge that the pace won’t continue, or is it simply lucky? And if it were luck, would that really be so bad? Luck is why the games are played, after all.
I don’t expect people like Andrew Walker to ever fully accept or understand the value of analytics. If anything, he’s headed steadfast in the opposite direction, declaring on Saturday and again on Monday that the advanced stats community has “lost him.”
And that’s fine. People who are wired a certain way aren’t ever likely to change. It took the Catholic church hundreds of years to accept that it was wrong about the location of the Earth relative to the sun and stars. Donald Trump and his cronies aren’t ever likely to fully grasp the concept of climate change.
The difference here is that the Catholic church and the Republican government have the power to stand in the way of systemic change in favour of evidence-based reasoning. Traditional media members like Walker have no such power. They will sit on the sidelines and express their doubts, ever falling onto deafer ears. When is the last time a radio host was hired by an NHL team to consult or even head up a department?
The advanced stats community, as it were, is certainly not infallible, and it doesn’t always get things right. However, those who accept the use of logic, reason, trial and error, empiricism, and science are going to be the ones pushing progress and making a difference when all is said and done. History has repeated itself in this regard many times over, and it will continue to do so in the future. I’d say “pick a side,” but as I’ve established, there are no sides here. There is only the acceptance or denial of evidence-based analysis. If Walker and his ilk choose to remain in the realm of denial, I’d suggest they enjoy their relevance in the field while they still have it. That era is coming to an end.
