“Last year we had 13 points out of 10 games and these fans have seen now we have 20 points out of 10 games, we are improving a lot but when you think it can be done within one year…” As you might have noticed, I am quoting Van Gaal after he was criticised for the early exit from the Capital One Cup at home to Middlesbrough.
While this ‘improvement’ seems nice at first glance, I had some doubts about it. First of all, if you took only 1.3 points per game over the first 10 games last year, improving on that is not that spectacular. Secondly, are those 20 points a fair reflection of the ‘real’ quality of the team? And finally, which opponents did they face, and how tough was their schedule?
If you want to make a comparison between seasons with such a small number of games played, I believe it makes far more sense to compare the points achieved in the equivalent matches of the previous season. This is exactly what Simon Gleave’s SCoRe (@SimonGleave on Twitter) measures. However, as he unfortunately no longer publishes those graphs on a regular basis, I started making my own.
SCoRe Methodology:
Seasonal Comparative Results sounds great, but how does it actually work? You start by comparing the results achieved in the current season to those of the equivalent matches in the previous season. As this doesn’t work for promoted teams, they take over the points of the relegated teams: the best promoted team gets the points of the best relegated team, the second-best promoted team those of the second-best relegated team, and so on. In the Premier League: 18th place maps to the Championship winners, 19th to the runners-up and 20th to the play-off winners.
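The substitution can be sketched as a simple position-ordered mapping. The team names below are the actual 2014/15 promotions and relegations mentioned later in this piece; the code itself is just an illustration:

```python
# Relegated from the Premier League in 2014/15, in finishing order
# (18th, 19th, 20th).
relegated = ["Hull", "Burnley", "QPR"]

# Promoted from the Championship, in order: champions, runners-up,
# play-off winners.
promoted = ["Bournemouth", "Watford", "Norwich"]

# Each promoted team inherits the equivalent-fixture points of the
# relegated team in the corresponding slot.
inherits = dict(zip(promoted, relegated))
print(inherits)  # {'Bournemouth': 'Hull', 'Watford': 'Burnley', 'Norwich': 'QPR'}
```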
In every game there are seven possible outcomes, or SCoRes, for each team: -3, -2, -1, 0, 1, 2 and 3. For example, if Team A beats Team B at home whilst the same game ended in a draw in the previous season, Team A receives a SCoRe of 2 and Team B a SCoRe of -1. All of a team’s SCoRes are added up and used as the input for the projected final table: you take the final table from the previous season and adjust each team’s points up or down by their current SCoRe to get that team’s projected points for this season.
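These per-match outcomes can be enumerated with a short sketch. The `match_points` helper and variable names are mine; the logic is simply "points earned this season minus points earned in the equivalent fixture last season":

```python
from itertools import product

def match_points(result):
    """League points for a result: win = 3, draw = 1, loss = 0."""
    return {"W": 3, "D": 1, "L": 0}[result]

# Per-match SCoRe for each of the nine expected/actual combinations,
# where 'expected' is the result of the equivalent fixture last season.
deltas = {
    (expected, actual): match_points(actual) - match_points(expected)
    for expected, actual in product("WDL", repeat=2)
}

# Team A wins a game that ended in a draw last season: SCoRe of +2.
print(deltas[("D", "W")])  # 2
# Team B loses a game it drew last season: SCoRe of -1.
print(deltas[("D", "L")])  # -1
```

Nine combinations, but only the seven distinct values -3 to 3, since the three "same result as last season" cases all give 0.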
Okay, let’s go back to Van Gaal’s statement that ‘his team has improved a lot’ and have a look at their SCoRe after 10 games.
As you can see in the table, according to SCoRe, Manchester United didn’t ‘improve a lot’ in the first ten games. Last season they took 22 points in these ten fixtures compared to 20 points this season, which gives them a SCoRe of -2. As they ended up with 70 points in the final table, they were projected to take 68 points this season.
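The United calculation can be reproduced with a minimal sketch. The individual result sequences below are illustrative placeholders; only the totals of 20 and 22 points and the 70-point final tally come from the text:

```python
def points(results):
    """Total league points for a sequence of 'W'/'D'/'L' results."""
    return sum({"W": 3, "D": 1, "L": 0}[r] for r in results)

# Illustrative sequences worth 20 points this season and 22 points in
# the equivalent fixtures last season (the actual results are not given here).
utd_now = list("WWWWWWDDLL")   # 6 wins, 2 draws, 2 losses = 20 points
utd_prev = list("WWWWWWWDLL")  # 7 wins, 1 draw, 2 losses = 22 points

score = points(utd_now) - points(utd_prev)
projection = 70 + score        # last season's final total + SCoRe
print(score, projection)       # -2 68
```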
In the following seven matches they faced some easier opponents, at least in terms of Elo. They were expected to improve their SCoRe, as they grabbed just 11 points in these fixtures last year, but they ruined their title chances with some awful results and are now projected to end up with 66 points.
Promoted/relegated pairings: Bournemouth ↔ Hull, Watford ↔ Burnley and Norwich ↔ QPR.
What I like most about SCoRe is its simplicity, while it is still reasonably accurate in predicting the final table. A team’s SCoRe is easy to calculate, and you get a nice view of the difficulty of the schedule a team has faced. However, SCoRe is quite slow in picking up surprising teams. For example, Chelsea are still predicted to finish in the top four with 69 points. Besides them there are multiple other examples, like Watford, Leicester, Inter Milan, Hellas Verona, Dortmund and Celta de Vigo, which all seem pretty over- or underrated by SCoRe.
Last season, Chelsea took 36 points in the equivalent fixtures they have played this season. At the moment they have only 18 points, which gives them a SCoRe of -18. However, as they became champions last year with 87 points, they are still forecast to end with 69 points. This implies they would have to pick up 51 points in the remaining 21 matches to match this projection, an average of 2.43 points per game (PPG). Of course, it is very unlikely they will even come close to that. So I started thinking about how the accuracy of the projected table could be improved whilst keeping the simplicity.
That led me to the Adjusted SCoRe (A-SCoRe). The first part of this method is exactly the same as SCoRe, but a calculation is added. From each team’s projected points, the points already achieved are subtracted. This splits the season into two parts, and a PPG is calculated for each: the PPG achieved so far and the PPG needed to reach the projection.
A weighted average of the achieved PPG and the needed PPG is then calculated, with weights equal to the share of matches already played and the share still to play. This number is used as an estimate of the PPG for the rest of the season and is multiplied by the number of games remaining. Finally, the current points total is added to complete the calculation.
(Current PPG × Matches played / Total matches + Needed PPG × Matches remaining / Total matches) × Matches remaining + Current points
As mentioned earlier, according to SCoRe, Chelsea are still forecast to end up with 69 points despite having picked up just 18. Their current PPG is 1.06, while they would need 2.43 PPG over the remaining 21 matches to reach the projected 69 points. Clearly the model needs an adjustment, and that is exactly what A-SCoRe provides.
To calculate the weighted average, their current PPG is multiplied by 17/38 and added to their needed PPG multiplied by 21/38. This gives Chelsea a forecast of 1.82 PPG for the rest of the season, which seems plausible given their squad. The 1.82 PPG is multiplied by the number of games to play (21), which leads to an expectation of 38 more points. Adding their current 18 points gives a projection of 56 points.
((1.06 × 17/38) + (2.43 × 21/38)) × 21 + 18 ≈ 56 points
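Putting the adjustment into code, as a sketch; the function name and signature are mine, not from any published implementation:

```python
def a_score_projection(current_points, played, total, score_projection):
    """Adjusted SCoRe projection: blend the PPG achieved so far with
    the PPG needed to hit the plain SCoRe projection, weighting each
    by its share of the season, then extrapolate over the remaining games."""
    remaining = total - played
    current_ppg = current_points / played
    needed_ppg = (score_projection - current_points) / remaining
    rest_of_season_ppg = (current_ppg * played / total
                          + needed_ppg * remaining / total)
    return rest_of_season_ppg * remaining + current_points

# Chelsea: 18 points from 17 of 38 games, plain SCoRe projection of 69.
print(round(a_score_projection(18, 17, 38, 69)))  # 56
```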
To check the validity of this projection (and the model in general), I have calculated the impact of Chelsea’s next game on their A-SCoRe. For each game there are nine possible scenarios, which are stated in the following table. An important remark: the expectation is the result of the equivalent match in the previous season. So the match between Chelsea and Watford is expected to end in a draw, while in terms of Elo Chelsea would be expected to win. Last season: Chelsea 1-1 Burnley, with Watford inheriting Burnley’s results as a promoted team.
As you can see in the table, Chelsea can improve their A-SCoRe significantly by winning the game, whilst a loss or a draw would hurt their top-six ambitions. The changes in A-SCoRe seem reasonable: winning when expected to lose > winning when expected to draw > winning when expected to win. The same logic applies to drawing and losing, both of which are least damaging to the A-SCoRe when the team was expected to lose and most damaging when expected to win.
Another observation is that the model is pretty harsh, maybe a bit too harsh, after a loss. In the model’s defence, it includes a strong regression-to-last-season factor (as most Expected Goals models do), since last season’s results are the basis of the computation. But after this many games, last season’s performances should no longer have such a high impact on the prediction.
I want to finish this piece with the projected Premier League table using adjusted seasonal comparative results relative to 2014/15. I will post these graphs regularly on my Twitter timeline, for the Bundesliga, La Liga, Serie A and Eredivisie as well. I hope you enjoyed reading and I look forward to your comments, views, recommendations, etc.