According to Evolving-Hockey.com, the Minnesota Wild’s skaters led the NHL in goals above replacement (GAR) in the 2019–20 regular season. In other words, their calculations, which isolate team performance by adjusting for contextual factors such as strength of schedule and back-to-backs, and which separate the performance of a team’s goaltenders from that of their skaters, led to the conclusion that if you replaced every single skater on the Minnesota Wild with a replacement-level player (an easily replaced player who falls below the 13th forward, 7th defenseman, or 2nd goaltender on their team’s depth chart), then their goal differential would drop by a larger number than it would if you did the same thing to any other team in the NHL. To put this more concisely: by this model, outside of goaltending, the Minnesota Wild were the NHL’s best team in the 2019–20 regular season.
If you’re unfamiliar with goals above replacement, here’s some quick groundwork: it’s a highly effective model created by EvolvingWild that improves player evaluation by isolating a player’s individual contributions from external factors such as competition, teammates, and usage, then comparing those contributions to what a typical replacement-level player would provide in order to determine how much value the player has added. The model is built from the ground up at the individual player level, and team values are merely aggregated player values, but the statistic is also fine for use at the team level. Since 2007–08, teams with more regular season goals above replacement have won 60.5% of their playoff series, while teams with higher regular season standings rankings have won only 55.9% of theirs, which shows the model has solid predictive power even though it is descriptive in nature. I won’t go into further detail here, but it’s a great model that is arguably a better "catch-all" snapshot of a player’s value than any other hockey stat that currently exists. You can read about it in more detail in this 3-part series authored by the model’s creators:
Wins Above Replacement: History, Philosophy, and Objectives (Part 1)
Now, back to the Wild. It may seem a bit surprising that their skaters ranked first in goals above replacement, considering they also ranked 19th in goal differential and 21st in points percentage. If we assume that the model is valid and accurate – which I’m comfortable doing – then the discrepancy here between model output and real-life standings results can only be explained by two factors:
- The Wild had far-and-away the toughest schedule in the league, playing back-to-backs every other night against exclusively the league’s top teams.
- The Wild’s goaltenders were so terrible that they sank a bona fide Stanley Cup contender down to 21st place.
According to the model itself, it’s mostly the latter. Their goaltenders ranked dead last with 6.8 goals below replacement. This means that if you took a typical 3rd string goalie and put them in net for the Wild, their goal differential would’ve improved by about seven goals. This is far worse than the second-worst set of goaltenders, which belonged to the Detroit Red Wings and contributed 1.2 goals below replacement. These numbers aren’t exactly easy to reconcile by save percentage; Minnesota’s goaltenders had a combined save percentage of 90.25%, while Detroit’s goaltenders had a combined save percentage of 89.36%. So, how is it that Evolving Hockey calculated that Detroit had significantly better goaltenders?
The answer lies in shot quality adjustments. Evolving Hockey calculates the probability of each shot on goal or missed shot becoming a goal based on various factors such as reported distance from the net (the most important factor by far), reported angle from the net, and game strength, and assigns each shot an "expected goal" value equal to that probability. (To give an example, a dangerous shot with a 35% chance of becoming a goal would have an expected goal value of 0.35.) Their calculations found that the average shot faced by Minnesota’s goaltenders was significantly less likely to be a goal than the average shot faced by Detroit’s goaltenders – so much so that when you adjust for the quality of these shots, Detroit’s goaltenders come out looking far better despite their inferior save percentages. Intuitively, this makes sense. We know that the biggest factor determining the probability of a shot becoming a goal is distance from the net. We also know that the average reported distance of shots on goal faced by Detroit’s goaltenders was 32.2 feet, while the average reported distance of shots on goal faced by Minnesota’s goaltenders was 36.69 feet. So, there you have it: the Wild just need better goaltending, and they’ll be the best team in the league. Right?
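To make the mechanics concrete, here is a toy sketch of how a distance-and-angle based expected goal model assigns values. The logistic functional form and every coefficient below are purely illustrative assumptions for this example; they are not Evolving Hockey's actual model, which uses more features and fitted weights:

```python
import math

def toy_expected_goal(distance_ft: float, angle_deg: float) -> float:
    """Toy logistic model of P(goal) from reported shot location.

    The coefficients are made up for illustration only; they are NOT
    the weights of any real expected goal model.
    """
    z = 0.5 - 0.08 * distance_ft - 0.02 * angle_deg
    return 1.0 / (1.0 + math.exp(-z))

# The same shot logged a few feet further out receives a lower expected
# goal value -- exactly the mechanism discussed above.
xg_closer = toy_expected_goal(32.2, 30.0)    # Detroit's average reported distance
xg_further = toy_expected_goal(36.69, 30.0)  # Minnesota's average reported distance
```

Under any model of this shape, pushing reported distances outward mechanically deflates expected goal values, which is the crux of everything that follows.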
Well, not exactly. While it’s true that Minnesota’s goaltenders faced, on average, easier shots than Detroit’s goaltenders did, it’s also true that the stats I just listed may not describe what actually happened. If you closely re-read the prior paragraph, you’ll see that I was careful to specify that this was the reported distance of each shot on goal. I make this distinction because I have ample reason to believe that the scorekeepers at Xcel Energy Center (Minnesota’s home arena) reported those shots on goal as being taken slightly further from the net than they actually were, and that they do the same for shots that miss the net.
Note: For the remainder of this article, I will use the term "shot" to refer not only to shots on goal, but also to shots that miss the net. These are known in the analytics community as Fenwicks (named after analyst Matt Fenwick), and I use them instead of shots on goal because I believe that forcing shots wide is a skill goaltenders possess and hitting the net with shots is a skill skaters possess, and I wish to credit those who display proficiency in these skills and penalize those who do not.
Now, back to the Wild: It may seem harsh to accuse Minnesota’s scorekeepers of doing their job poorly, but there’s simply too much evidence to just brush it aside. Over every single one of the past thirteen NHL seasons, the average reported shot distance in Minnesota home games at both ends of the ice has been significantly further from the net than the average reported distance in their away games. Here are the numbers for every one of those seasons displayed alongside league average to give you an idea of the distribution:

On the road, Minnesota’s style of play has wavered back and forth between playing a bit closer to the perimeter than league average and playing a bit closer to the net than league average, but there’s never been a huge discrepancy. Throughout the entirety of this time, the average of reported shot distances for Minnesota’s away games is 34.69 feet, while the average of reported shot distances in all NHL games is…34.69 feet. By comparison, the average of reported shot distances in Minnesota’s home games is 37.27 feet.
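A split like the one described above is straightforward to compute from play-by-play data. A minimal sketch, using hypothetical shot records rather than real NHL play-by-play:

```python
from statistics import mean

# Hypothetical (venue, reported_distance_ft) shot records for one team.
# Real NHL play-by-play would supply thousands of rows per season.
shots = [
    ("home", 38.0), ("home", 36.5), ("home", 37.4),
    ("away", 34.1), ("away", 35.0), ("away", 34.8),
]

home_avg = mean(d for venue, d in shots if venue == "home")
away_avg = mean(d for venue, d in shots if venue == "away")
gap_ft = home_avg - away_avg  # a persistently positive gap is the red flag
```

A one-off gap means nothing; it's the persistence of the same signed gap across thirteen seasons that makes the scorekeeping explanation compelling.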

This isn’t just some one-off occurrence; this phenomenon has persisted year after year over a sample size of over 500 home games, over 500 away games, and over 15,000 total NHL games.
One valid counter-argument to the erroneous scorekeeping hypothesis is that the Wild just play a different style at home than they do on the road. Perhaps they clamp down and take fewer risks at home where they feel they’re able to control the flow of the game, but they play into their opponent’s more typical style on the road. This is possible, but it’s also very unlikely that this trend would persist across thirteen straight seasons. It becomes even less likely when you consider that the Wild have had four general managers and six different head coaches in charge over this sample, with each of them likely deploying different strategies. In every one of these seasons, regardless of who has been in charge of the team, the average shot distance in their away games has never deviated more than 1.5 feet from league average in either direction, while the average shot distance in their home games has always been over 1.5 feet above league average. So, while we can’t be 100% certain that Minnesota’s scorekeepers track shot data incorrectly, it is significantly more plausible and requires significantly fewer assumptions than any other conclusion. With that in mind, I will be moving forward with the assumption that Minnesota’s scorekeepers do exhibit a pattern of erroneously tracking shots further from the net than they actually are, and I’m going to refer to this pattern of behavior as "scorekeeper bias."
Now that we’ve laid out the groundwork for scorekeeper bias, and agreed to work forward under the assumption that it does exist, the next question is how exactly this plays into Evolving Hockey’s goals above replacement ranking of the Minnesota Wild. The answer to that question is defense. At the skater level, the defensive components of this metric are mostly quantified through expected goals against, and as we’ve covered, the most important component of that stat is the reported distance of the shots that are allowed. If Minnesota’s scorekeepers report that shots are further from the net than they actually are, then the expected goal values of the shots their skaters allow will be lower than the "true" probability of those shots becoming goals, and it will therefore appear that these skaters are doing a better job of suppressing expected goals than they actually are.
Here’s another way of visualizing it: imagine an opposing player takes a shot in the slot. If this shot is properly recorded, an expected goal model may calculate that it has a 25% probability of scoring, and the shot would be assigned a corresponding expected goal value of 0.25. But if Minnesota’s scorekeepers register this as a shot from outside the circle, the expected goal model may use the reported distance to calculate that it only has a 10% chance of scoring, and the shot would only be assigned an expected goal value of 0.1. This would inaccurately flatter the defensive performance of Minnesota’s skaters by suggesting that they allowed a less dangerous shot than they actually did. And if Minnesota’s goaltender allowed a goal on that shot, they would erroneously be credited with 0.9 goals below expected, rather than 0.75 goals below expected. This doesn’t sound like much, but over the course of a season, or of many seasons, it adds up to a very significant number:
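The arithmetic in that example generalizes: a goaltender's goals saved above expected is simply total expected goals against minus actual goals against. A minimal sketch, using the single slot shot from the example above (the xG values are the hypothetical ones from that example):

```python
def goals_saved_above_expected(shots):
    """shots: iterable of (xg_value, was_goal) pairs for shots faced.

    Positive result: the goalie saved more than the model expected.
    Negative result: the goalie allowed more than the model expected.
    """
    total_xg = sum(xg for xg, _ in shots)
    goals_allowed = sum(1 for _, was_goal in shots if was_goal)
    return total_xg - goals_allowed

# The slot shot from the example, scored as a goal:
accurate_view = goals_saved_above_expected([(0.25, True)])  # -0.75
biased_view = goals_saved_above_expected([(0.10, True)])    # -0.90
```

One mis-recorded goal against costs the goaltender 0.15 goals of credit here; over the hundreds of shots a starter faces at home each season, those slivers accumulate.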

As you can see, Minnesota has absolutely been plagued by poor goaltending. But this issue is being compounded by scorekeeper bias, which isn’t present when they play on the road.
What about offense? Minnesota’s home & away splits in the tables above display shots for both teams, which means Minnesota’s scorekeepers also report that shots taken by the Wild at home are further from the net than they actually are. So, it follows logically that the expected goal values of Minnesota’s own shots would also be inaccurately low, and therefore their offensive ability to generate dangerous shots should be underrated to roughly the same degree that their defensive ability to suppress dangerous shots is overrated, right? The same visualization from above, looking at goals scored (rather than saved) above expected at home and away, supports this:

As you can see, Minnesota’s skaters look like elite shooters at home, but slightly below average shooters on the road. Logically speaking, one may suspect that these issues essentially balance one another out. And they sort of do; the deflation of expected goals on both sides of the ice roughly cancels. But unlike defense, goals above replacement measures offensive contribution using actual goals, not expected goals. And since the NHL reviews every single goal to ensure it entered the net, scorekeeper bias is not at play for actual goals. Theoretically speaking, if the Wild display average goal-scoring proficiency, it doesn’t matter whether they did so through a high volume of low-quality shots and excellent shooting or the opposite; their offensive goals above replacement will be that of an average team.
Minnesota wasn’t actually an average offensive team this season, though. They were significantly above average. This may surprise you (it surprised me), but Minnesota actually ranked fifth in 5-on-5 goals per hour and tenth in power play goals per hour. This is right in line with their rankings in goals above replacement: fifth in the even strength offense component and ninth in the power play offense component. Given this information, I’m comfortable saying that the model is giving their offense a fair shake, and that defense is the only area where scorekeeper bias throws this particular model off.
Before we move on, let’s remember where we started: Evolving Hockey’s goals above replacement is a highly effective evaluation metric which states that in the 2019–20 NHL regular season, the Minnesota Wild had the best skaters in the league and the worst goaltenders in the league. But we have very strong reason to believe that the scorekeepers at Minnesota’s home arena consistently report that shots are slightly further from the net than they actually are, and we know that if they are doing this, then goals above replacement will overrate the defensive performance of Minnesota’s skaters at the cost of their goaltenders, with no penalty to their offense.
While I have focused exclusively on the scorekeepers at Xcel Energy Center, they are not the only ones who exhibit scorekeeper bias. They aren’t even the worst offenders. Historically, the worst offenders by far have been the scorekeepers at Madison Square Garden, who do the opposite of Minnesota’s scorekeepers, reporting that shots are closer to the net than they actually are, and to an even greater degree of inaccuracy. Additionally, while Madison Square Garden has improved over the past two seasons, Minnesota still doesn’t even have the largest discrepancy between home and away shot distances over that span.

As you can see, the absolute value of the difference between average shot distance at home and away is actually larger for the Philadelphia Flyers who play their home games at Wells Fargo Center than it is for the Wild. Dallas, Anaheim, and Chicago are also all close. Could it be that most rinks are prone to some degree of scorekeeper bias? The following visualization suggests this may be the case:

As you can see, teams show far more variance in average shot distance when they’re playing at home than they do on the road. On the road, the average shot distance for every team falls within a tight band between 32.5 feet and 35.5 feet, while at home, teams span a much wider range, from 30.5 feet to 37.5 feet. In addition, the variance of the home distances is 2.06, while the variance of the away distances is only 0.31!
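That home/road spread comparison can be sketched as follows, using hypothetical per-team averages rather than the real league data:

```python
from statistics import pvariance

# Hypothetical per-team average reported shot distances (feet).
home_avgs = [30.8, 33.0, 34.7, 36.1, 37.3]  # wide spread across home arenas
away_avgs = [34.0, 34.5, 34.7, 34.9, 35.3]  # tight band on the road

home_var = pvariance(home_avgs)
away_var = pvariance(away_avgs)
# home_var comes out many times larger than away_var, mirroring the
# pattern described above: venue-specific recording drives the spread,
# because on the road each team samples many different scorekeepers.
```

The road numbers act as a natural control group: every team's road shots are recorded by a rotating mix of 30-odd scorekeeping crews, so venue quirks average out.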
So, it’s fair to say that Xcel Energy Center in Minnesota probably isn’t the only rink that suffers from scorekeeper bias. I think it makes sense, logically, that human error is at play here, and that most scorekeepers do things slightly differently from one another. In a perfect world, we would have scorekeepers travel between arenas in order to randomize this issue, but this world is not a perfect one, and doing so would incur major travel costs, so we must make do with the data that we have.
I should also point out that I’m not the only person who has identified this, and I’m far from the first. I was ten years old in 2007 when Alan Ryder released "Product Recall Notice for ‘Shot Quality’" where he introduced this issue and stated:
we clearly have an issue of RTSS scorer bias. The problem appears to be brutal at Madison Square Garden but is clearly non-trivial elsewhere. The NHL needs a serious look at the consistency of this process.
Many other hockey analysts have identified that this is an issue, and provided additional evidence to support their position. It is quite clear that scorekeeper bias in the NHL is a true phenomenon.
Now that we’ve established that scorekeeper bias exists in Xcel Energy Center, explained how it leads us to inaccurately evaluate the Minnesota Wild’s skaters and goaltenders, and established that Xcel Energy Center isn’t the only arena where scorekeeper bias exists, it logically follows that many other skaters and goaltenders may be overrated or underrated due to scorekeeper bias. For example, the average shot in Anaheim Ducks games at Honda Center over the past two years is reportedly 2.56 feet closer to the net than the average shot in an Anaheim Ducks away game over this sample. If this figure is actually the result of shot distance being inaccurately reported, then it logically follows that Anaheim’s defensive performance will be underrated, and that their goaltending will be overrated. The same is true for Dallas, and the reverse is true for Philadelphia. The bottom line here is that scorekeeper bias leads us to some inaccurate skater and goaltender evaluations, especially in the case of goals above replacement.
How do we fix this? Well, I don’t. I didn’t make the model, and it’s not for me to decide how to fix something that I consider an issue. It’s not even up to me to decide that it is an issue or that it needs to be fixed. What I’m aiming to do here is simply provide an estimate as to how much the effects of scorekeeper bias may be misrepresenting what skaters and goaltenders actually do on the ice, and allow us to keep those estimated effects in mind when analyzing Evolving Hockey’s goals above replacement and other advanced hockey statistics that use expected goals and do not account for the effects of scorekeeper bias.
In order to do this, I’m going to do three things:
- Build an expected goal model that performs well enough in testing that I am comfortable using it as a descriptive model for evaluating the quality of shots that have already occurred.
- Determine an adjustment for scorekeeper bias that does not harm the performance of my expected goal model when I replace the reported shot distances that I’ve obtained with the adjusted shot distances that I’ve calculated.
- Build a Regularized Adjusted Plus-Minus (RAPM) model that allows me to provide a point estimate of a player’s isolated offensive, defensive, and net impact on expected goals. I will do this because goals above replacement uses a player’s isolated impact on expected goals against from Evolving Hockey’s RAPM model as one of the main components for evaluating skater defense, and most of the other components which they use correlate very closely with this measure. Thus, RAPM will provide a solid estimate of what a skater’s contribution would be to the defensive component of goals above replacement. (Additionally, RAPM is also used by many fans and analysts on its own as a tool for skater evaluation, and the point estimate for a player’s offensive impact on expected goals carries more weight there than it does in goals above replacement.)
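The core of the third step is a ridge regression of on-ice expected goal rates on player indicator variables. Here is a deliberately tiny sketch with just two hypothetical players, solved in closed form; the real model involves thousands of indicator columns plus controls for score state, zone starts, and so on, and this is not Evolving Hockey's implementation:

```python
def ridge_two_features(X, y, lam):
    """Closed-form ridge regression, beta = (X'X + lam*I)^-1 X'y,
    hand-solved for exactly two features (a 2x2 linear system)."""
    a = sum(x[0] * x[0] for x in X) + lam
    b = sum(x[0] * x[1] for x in X)
    d = sum(x[1] * x[1] for x in X) + lam
    g0 = sum(x[0] * yi for x, yi in zip(X, y))
    g1 = sum(x[1] * yi for x, yi in zip(X, y))
    det = a * d - b * b
    return ((d * g0 - b * g1) / det, (a * g1 - b * g0) / det)

# Each row is a shift: [player_A_on_ice, player_B_on_ice];
# y is the observed on-ice xG rate for that shift (hypothetical numbers).
X = [(1, 0), (0, 1), (1, 1)]
y = [1.0, 2.0, 3.0]

unregularized = ridge_two_features(X, y, lam=0.0)  # plain least squares
regularized = ridge_two_features(X, y, lam=1.0)    # coefficients shrink toward 0
```

The regularization term (lam) pulls estimates toward zero, which is what keeps RAPM stable for players with limited ice time and heavily shared shifts.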
The end goal of this process will be to provide one point estimate of a skater’s isolated offensive, defensive, and net impact using reported shot distance, one point estimate of a player’s isolated offensive, defensive, and net impact using adjusted shot distance, and then display them side-by-side in order to provide an estimate of how adjusting for scorekeeper bias may affect a player or team’s isolated impact according to one of these models. I also wish to do the same for goaltenders, providing one point estimate of a goaltender’s isolated performance using reported shot distance and one point estimate of a goaltender’s isolated performance using adjusted shot distance. It is my hypothesis that goaltenders such as Minnesota’s Alex Stalock and Devan Dubnyk, pictured in this article’s featured image, will look significantly better after adjusting for scorekeeper bias.
In part 2, I will go over this process. Stay tuned.