**Euro Club Index & ClubElo**

The Euro Club Index and ClubElo both use historical and current sporting results as input for each team's ELO rating. The impact of more recent games is, of course, higher than that of older games.

Before every match, the probabilities of the three possible results (home win, draw and away win) are calculated from the current difference in ELO ratings between the two teams, adjusted for home-field advantage. The number of ELO points a team wins or loses from a game depends on the actual outcome compared with these probabilities. So if Team A is the clear favourite against Team B because of its much higher ELO rating, it will gain only a small number of ELO points by winning, whilst Team B would have gained a lot of ELO points by winning, because it was not expected to. Teams always exchange ELO points: the number of points won by Team A equals the number of points lost by Team B.
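The exchange mechanism can be sketched with the standard Elo formula. The exact K-factor, home-advantage offset and draw handling used by the Euro Club Index and ClubElo are not given here, so the constants below are illustrative assumptions only:

```python
def elo_expected(home, away, home_advantage=100):
    """Home team's score expectancy under the standard Elo formula."""
    diff = (home + home_advantage) - away
    return 1.0 / (1.0 + 10 ** (-diff / 400))

def elo_update(home, away, outcome, k=20, home_advantage=100):
    """Exchange ELO points after a match.

    outcome: 1.0 for a home win, 0.5 for a draw, 0.0 for an away win.
    """
    delta = k * (outcome - elo_expected(home, away, home_advantage))
    # Zero-sum exchange: whatever the home side gains, the away side loses.
    return home + delta, away - delta
```

With these assumed constants, a heavy favourite (say 1700 vs 1400) gains under 2 points from a win, while the underdog would have gained over 18 by winning, which matches the asymmetry described above.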

The ELO ratings are then used to calculate the expected number of points a team will gain in the remaining games of the season. This expectation is added to the points already achieved to produce the projected final table.
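As a sketch, taking the three match probabilities as given (how the models turn an ELO gap into a concrete draw probability is not specified here):

```python
def projected_points(current_points, remaining_fixtures):
    """Projected final total: current points plus the expected points from
    each remaining fixture, given (p_win, p_draw, p_loss) for this team."""
    expected = sum(3 * p_win + 1 * p_draw
                   for p_win, p_draw, _p_loss in remaining_fixtures)
    return current_points + expected
```

For example, a team on 30 points facing two fixtures with probabilities (0.5, 0.3, 0.2) and (0.2, 0.3, 0.5) is projected 30 + 1.8 + 0.9 = 32.7 points.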

**11tegen11’s Expected Goals Model**

The above-mentioned models only consider results and completely ignore how the game went. However, in football many games are decided by efficient finishing, and quite often the best team does not win. Because of that, I am quite happy that 11tegen11 is willing to share his projections as well. His Expected Goals Model measures the number of goals a team 'should have' scored and conceded, based on the quantity and quality of the chances it created and faced. The model uses expected goals, expected goals against, shots, passes, goals and the past season to generate team ratings. These ratings are used to calculate the expected number of points a team will gain in the remaining games of the season, which is added to the points already achieved to produce the projected final table.

**SCoRe & A-SCoRe**

SCoRe, which stands for Seasonal Comparative Results, is an idea of Simon Gleave (@SimonGleave on Twitter) and measures the difference between the points achieved this season and the points achieved in the equivalent matches of the previous season. For example, if Team A wins at home against Team B whilst the same fixture ended in a draw last season, Team A receives a SCoRe of 2 and Team B a SCoRe of -1. As this method does not work for promoted teams, they take over the points of the relegated teams: the best promoted team gets the points of the best relegated team, the second-best promoted team gets the points of the second-best relegated team, and so on. Unlike the previous models, SCoRe does not use team ratings in its projections. It simply takes each team's final points total from the previous season and adds its current SCoRe to predict the final table.
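The per-fixture bookkeeping is tiny. A minimal sketch of the example above (the function name and W/D/L encoding are mine, not part of the method):

```python
# Standard football points per result.
POINTS = {"W": 3, "D": 1, "L": 0}

def match_score(result_now, result_last):
    """SCoRe of one fixture for one team: points taken this season minus
    points taken in the equivalent fixture last season."""
    return POINTS[result_now] - POINTS[result_last]
```

Team A winning a fixture that was a draw last season gives `match_score("W", "D") == 2`; Team B gets `match_score("L", "D") == -1`.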

As I thought the predictive value of SCoRe could be improved with an adjustment, I came up with A-SCoRe. From each team's predicted points (using SCoRe), the points already achieved are subtracted. This gives two parts: the points achieved so far and the points still needed to reach the SCoRe projection. For both parts the points per game (PPG) is calculated. Then a weighted average of the achieved PPG and the needed PPG is taken, weighted by the number of games played and the number of games still to play. This number serves as an estimate of the PPG in the remaining games and is multiplied by the number of remaining games to obtain the expected points still to come, which are added to the points already achieved to produce the projected final table.
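A sketch of that calculation (function and variable names are mine, not part of the method):

```python
def a_score_projection(current_points, score_projection, played, total=38):
    """A-SCoRe: blend the achieved PPG and the PPG needed to reach the SCoRe
    projection, weighted by games played vs games left, then extrapolate."""
    left = total - played
    current_ppg = current_points / played
    needed_ppg = (score_projection - current_points) / left
    est_ppg = (current_ppg * played + needed_ppg * left) / total
    return current_points + est_ppg * left
```

A side on 18 points after 17 of 38 games with a SCoRe projection of 69 comes out at roughly 56 points. Note that the weighted average algebraically reduces to `score_projection / total`, so the result is a games-weighted compromise between the points already banked and the SCoRe projection.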

**Summary**

As you can see, the models differ in how they calculate the expected points a team will take in the remaining games of the season. The Euro Club Index and ClubElo use ELO ratings, 11tegen11's Expected Goals Model uses team ratings, SCoRe uses the number of points a team achieved last season (in the equivalent matches of the remaining games) and A-SCoRe uses a weighted average of the PPG achieved this season and the PPG needed to reach the points projected by SCoRe.

**Projections**

We will compare the Premier League, Bundesliga, La Liga, Serie A and Eredivisie projections of the Euro Club Index, ClubElo, 11tegen11's Expected Goals Model, SCoRe and A-SCoRe. The predictions will be compared on two elements:

- Average difference between prediction and actual outcome
- Standard deviation

First, we will calculate the average difference between predicted and actual position, and the corresponding standard deviation. After that, we will do the same for predicted and actual points. The average difference between prediction and actual outcome is calculated by dividing the total sum of differences by the number of teams.
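A sketch of the comparison, assuming the per-team differences are taken as absolute values (the text does not spell this out):

```python
import statistics

def compare(predicted, actual):
    """Mean absolute difference between prediction and outcome, plus the
    (population) standard deviation of those per-team differences."""
    diffs = [abs(p - a) for p, a in zip(predicted, actual)]
    return sum(diffs) / len(diffs), statistics.pstdev(diffs)
```

The same function works for both comparisons: pass final positions for the position metric, points totals for the points metric.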

**Premier League**

Unfortunately, I was a bit too late collecting the projections, so these are the projections after matchday 20. The first thing to notice is that all five models project the same teams, in the same order, to finish in the top six. They all expect Arsenal to become champions, but the gap with Manchester City is small. An important remark is that the models predict a team's points total by summing expected points. As a result, they expect Arsenal to end up with fewer than 80 points, while they would probably all agree that the winner of the 2015/2016 title will most likely pick up more than 80 points. The explanation is that one of the teams will probably win more games than expected through skill and luck, but the models do not know which team this will be. Furthermore, all five models project the same teams to be relegated, although the number of points these teams are projected to achieve differs.

**Bundesliga**

All five models project the same teams in the upper half of the Bundesliga table. However, there are some differences: for example, SCoRe predicts quite a low points total for Dortmund, whilst A-SCoRe seems to overrate Hertha Berlin. In the lower half of the table the projected differences are small, and it looks like the relegation battle will be exciting until the very last minute.

**La Liga**

Real Madrid have dropped quite a lot of points, and as a result the title race already seems decided. However, with the announcement of the transfer bans, both Madrid teams may be keen to improve their squads in the current transfer window. According to the five models, Villarreal, who are enjoying a wonderful season, are favourites to grab the fourth Champions League ticket, although A-SCoRe seems to overestimate their true level. Based on the projections, five teams are likely to end the season in a relegation spot: Levante, Rayo Vallecano, Gijón, Granada and Las Palmas.

**Serie A**

Juventus started the season with some terrible results, taking only five points from the first six games. However, their underlying statistics were actually quite good, and eventually they started winning their matches. At the moment they are projected to win the title, but they face some strong competition. It has been quite some time since Napoli last won the Scudetto, but it may actually happen this year, once again with a brilliant Argentinean striker up front. At the bottom of the table, Carpi, Frosinone and especially Verona are in serious danger of being relegated.

**Eredivisie**

Unfortunately, yet for understandable reasons, we only have four Eredivisie projections, as 11tegen11 did not want to share his Eredivisie projection because he will use it soon in an article. Looking at the projections, it seems to be a two-horse race between Ajax and PSV, with Feyenoord some way behind. Newly promoted NEC are having an amazing season and are still in the race for a play-off ticket. At the bottom, quite a lot of teams could end up in a relegation play-off spot, whilst De Graafschap already look doomed.

**Conclusion**

My first hypothesis is that A-SCoRe predicts the final tables more accurately than SCoRe does, as it probably uses a better estimator for the points per game in the rest of the season. Furthermore, I hope that A-SCoRe is able to compete with the other three models, but I expect those models to predict the final table a bit better. Special thanks to 11tegen11 (@11tegen11 on Twitter) for sharing his projections! That's it for now; it's time to sit back and watch!

If you want to read more about the five models:

While this 'improvement' seems nice at first glance, I had some doubts about it. First of all, if you achieved only 1.3 points per game in the first 10 games last year, improving on that is not that spectacular. Secondly, are those 20 points a fair reflection of the 'real' quality of the team? And finally, which opponents did they face, and how tough was their schedule?

If you want to make a comparison between seasons, with such a small amount of games played, I believe it makes way more sense to compare the number of points achieved in the equivalent matches in the previous season. This is exactly what Simon Gleave’s SCoRe (@SimonGleave on Twitter) measures. However, as he unfortunately doesn’t publish those graphs on a regular basis anymore, I started making my own.

**SCoRe Methodology**

Seasonal Comparative Results sounds great, but how does it actually work? Well, you start by comparing the results achieved in the current season with the equivalent matches of the previous season. As this method doesn't work for promoted teams, they take over the points of the relegated teams: the best promoted team gets the points of the best relegated team, the second-best promoted team gets the points of the second-best relegated team, and so on. For the Premier League: 18th ↔ champions, 19th ↔ runners-up and 20th ↔ play-off winners.

In every game there are seven possible SCoRes for each team: -3, -2, -1, 0, 1, 2 and 3. For example, if Team A wins at home against Team B whilst the same fixture ended in a draw in the previous season, Team A receives a SCoRe of 2 and Team B a SCoRe of -1. All of a team's SCoRes are added up and used as input for the projected final table: you take the final table of the previous season and raise or lower each team's points by their current SCoRe to get their projected points for this season.
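Putting those steps together as a sketch (the W/D/L encoding and function names are mine):

```python
# Standard football points per result.
POINTS = {"W": 3, "D": 1, "L": 0}

def season_score(results_now, results_last):
    """Accumulated SCoRe: points this season minus points in the equivalent
    fixtures last season, summed over all games played so far."""
    return sum(POINTS[now] - POINTS[last]
               for now, last in zip(results_now, results_last))

def score_projection(last_season_total, results_now, results_last):
    """Projected final points: last season's total raised or lowered by SCoRe."""
    return last_season_total + season_score(results_now, results_last)
```

For a side whose fixtures yield a SCoRe of -2 after ten games, a 70-point previous season projects to 68 points, the Manchester United case below.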

**Manchester United:**

Okay, let’s go back to Van Gaal’s statement that *‘his team has improved a lot’* and have a look at their SCoRe after 10 games.

As you can see in the table, according to SCoRe, Manchester United didn’t ‘improve a lot’ in the first ten games. Last season they took 22 points in these ten fixtures compared to 20 points this season, which gives them a SCoRe of -2. As they ended up with 70 points in the final table, they were projected to take 68 points this season.

In the following seven matches they faced some easier opponents, at least in terms of ELO. They were expected to improve their SCoRe, as they grabbed just 11 points in these fixtures last year, but they ruined their title chances with some awful results and are now projected to end up with 66 points.

(Promoted ↔ relegated equivalents: Bournemouth ↔ Hull, Watford ↔ Burnley and Norwich ↔ QPR.)

What I like most about SCoRe is its simplicity while still being reasonably accurate in predicting the final table. A team's SCoRe is easy to calculate, and it gives a nice view of the difficulty of the schedule a team has faced. However, SCoRe is quite slow to pick up on surprising teams. For example, Chelsea are still predicted to finish in the top four with 69 points. Besides them, there are multiple other examples, like Watford, Leicester, Inter Milan, Hellas Verona, Dortmund and Celta de Vigo, which all seem rather over- or underrated by SCoRe.

Last season, Chelsea took 36 points in the fixtures equivalent to those they have played this season. At the moment they have only 18 points, which gives them a SCoRe of -18. However, as they became champions last year with 87 points, they are still forecast to end with 69 points. This implies that they have to pick up 51 points in the remaining 21 matches to match this projection, an average of 2.43 points per game (PPG). Of course, it is very unlikely they will even come close to that. Therefore, I started thinking about how the accuracy of the projected table could be improved whilst keeping the simplicity.

**A-SCoRe Methodology**

That is how I came up with the Adjusted SCoRe (A-SCoRe). The first part of this method is exactly the same as SCoRe, but a calculation is added. From each team's predicted points, the points already achieved are subtracted. This gives two parts, for each of which the PPG is calculated.

Then a weighted average of the achieved PPG and the needed PPG is calculated. This number is used as an estimate of the PPG in the rest of the season and is multiplied by the number of games still to play. After that, the current number of points is added to complete the calculation.

(Current PPG * (Matches played / Total matches) + Needed PPG * (Matches still to play / Total matches)) * Matches still to play + Current Points

**Chelsea**

As mentioned earlier, according to SCoRe, Chelsea are still forecast to end up with 69 points, despite having picked up just 18. Their current points per game is 1.06, while they would have to take 2.43 points per game in the remaining 21 matches to reach the projected 69 points. As you can see, the model needs an adjustment, and that is exactly what A-SCoRe does.

To calculate the weighted average, their current PPG is multiplied by 17/38 and added to their needed PPG multiplied by 21/38. This gives Chelsea a forecast 1.82 PPG for the rest of the season, which seems plausible given their squad. The forecast 1.82 PPG is multiplied by the number of games still to play (21), which leads to an expectation of 38 points. Their current 18 points are added, which gives a projection of 56 points.

((1.06 * 17/38) + (2.43 * 21/38)) * 21 + 18 ≈ 56 points

To check the validity of this projection (and the model in general), I calculated the impact of Chelsea's next game on their A-SCoRe. For each game there are nine possible scenarios, which are stated in the following table. An important remark: the expectation is taken to be the result of the equivalent match in the previous season. So the match between Chelsea and Watford is expected to end in a draw, while in terms of ELO Chelsea would be expected to win. Last season: Chelsea 1-1 Burnley (Burnley ↔ Watford).
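That sensitivity check can be reproduced directly. The sketch below re-implements the A-SCoRe arithmetic with Chelsea's pre-match numbers (18 points from 17 games, a SCoRe projection of 69); after the game, both the points total and the SCoRe projection shift with the actual result versus the expected (last season's) result:

```python
POINTS = {"W": 3, "D": 1, "L": 0}

def a_score(points, projection, played, total=38):
    """A-SCoRe projection: blend achieved and needed PPG, then extrapolate."""
    left = total - played
    current_ppg = points / played
    needed_ppg = (projection - points) / left
    est_ppg = (current_ppg * played + needed_ppg * left) / total
    return points + est_ppg * left

before = a_score(18, 69, 17)  # Chelsea's pre-match projection, ~56 points
for result in "WDL":
    for expected in "WDL":
        after = a_score(18 + POINTS[result],                     # new points total
                        69 + POINTS[result] - POINTS[expected],  # new SCoRe projection
                        18)                                      # one more game played
        print(f"{result} when {expected} was expected: {after - before:+.2f}")
```

The output reproduces the ordering discussed in the next paragraph: a win when a loss was expected moves the projection most (about +2.8 points), a win when a win was expected least (about +1.2), with draws and losses mirrored.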

As you can see in the table, Chelsea can improve their A-SCoRe significantly by winning the game, whilst a loss or a draw would hurt their top-six ambitions. The changes in A-SCoRe seem reasonable: winning when expected to lose > winning when expected to draw > winning when expected to win. The same logic applies to drawing and losing, which are both least damaging to the A-SCoRe when the team was expected to lose and most damaging when it was expected to win.

Another observation is that the model is pretty harsh, maybe a bit too harsh, after a loss. In defence of the model, it (like most Expected Goals models) includes a strong regression-to-last-season factor, as last season's results are the basis of the computation. But this far into the season, last season's performances should no longer have such a high impact on the prediction.

I want to finish this piece with the projected Premier League table with adjusted seasonal comparative results relative to 2014/2015. I will post these graphs, for the Bundesliga, La Liga, Serie A and Eredivisie as well, on a regular basis on my Twitter timeline. I hope you all enjoyed reading, and I look forward to your comments, views, recommendations, etc.
