Explaining level changes

David Brierley v David Moore (Thu 10 Aug 2023)
Match won by David Brierley. Result: 11-7, 11-6, 5-11.

Starting level for David Brierley: 2,281, level confidence: 79%. Set manually.
Starting level for David Moore: 3,325, level confidence: 62%.

David Moore was expected to win, as he was playing 46% better than David Brierley going into the match.

David Brierley won 67% of the games and 53% of the points. The games result would be expected if he were better by around 20%; the points result would be expected if he were better by around 13% (PAR scoring). These two figures are weighted and combined to calculate that David Brierley played 16% better than David Moore in this match.

Assuming that any level change is shared between both players, this result suggests that David Brierley actually played at a level of 2,969 and David Moore at a level of 2,554. Without any damping, both players would need to be adjusted by 30% to match this result.

Allowing for the difference in level between the players, the adjustments have been reduced to 24% and 24% respectively. As this was a best-of-3 match rather than best-of-5, they have been further reduced to 18% and 18%.

Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for David Brierley becomes +14% and for David Moore -18%. After applying standard match damping, these become +6.1% and -6.5% respectively.

Given David Brierley's level and the type of match played, an additional damping of 5.5% has been applied to his level change. Given David Moore's level and the type of match played, an additional damping of 24% has been applied to his level change. It looks like he wasn't taking the match too seriously...

Applying the match/event weighting of 50% for 'Fair Oak Boxes', the adjustment for David Brierley is +2.9% and for David Moore is -2.5%.
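SquashLevels does not publish its exact formulas, but the "shared change" step above can be sketched under one assumption: levels behave as ratios, and the implied change is split geometrically so both players receive the same multiplicative adjustment. That assumption reproduces the quoted 30% undamped adjustment and (to within rounding of the 16% figure) the "actually played at" levels:

```python
import math

# Sketch only: assumes the level change is shared as an equal
# multiplicative adjustment k, so that
#   (winner_level * k) / (loser_level / k) == performance_ratio.
def shared_adjustment(winner_level, loser_level, performance_ratio):
    start_ratio = winner_level / loser_level
    return math.sqrt(performance_ratio / start_ratio)

k = shared_adjustment(2281, 3325, 1.16)    # Brierley played 16% better
print(f"{(k - 1) * 100:.0f}%")             # undamped adjustment, ~30%
print(round(2281 * k))                     # Brierley's "played at" level, ~2,966
print(round(3325 / k))                     # Moore's "played at" level, ~2,557
```

The small differences from the quoted 2,969 and 2,554 come from the 16% performance figure itself being rounded in the report.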
Level confidence is increased because one more match has been played: David Brierley 89%, David Moore 79%. It is then reduced based on how unexpected the result was: David Brierley 68%, David Moore 60%.

A final adjustment of -0.1% has been made to both players as part of the automatic calibration performed after each match. All players in this pool are adjusted equally so that it remains equivalent to other player pools.

Final level for David Brierley: 2,343, level confidence: 68%.
Final level for David Moore: 3,238, level confidence: 60%.
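The remaining steps, from the match-damped adjustment to the final level, appear to compose multiplicatively. A sketch under that assumption (the exact order and rounding SquashLevels uses is not published) reproduces the quoted final figures to within a couple of points:

```python
# Sketch only: assumes each remaining step scales the percentage change
# multiplicatively — additional damping, then event weighting, then the
# post-match pool calibration (-0.1% here).
def final_level(start_level, damped_pct, extra_damping_pct,
                event_weight, calibration_pct):
    pct = damped_pct * (1 - extra_damping_pct / 100) * event_weight
    level = start_level * (1 + pct / 100)   # apply the weighted adjustment
    level *= 1 + calibration_pct / 100      # pool calibration
    return pct, level

pct_b, level_b = final_level(2281, +6.1, 5.5, 0.50, -0.1)
pct_m, level_m = final_level(3325, -6.5, 24.0, 0.50, -0.1)
print(f"{pct_b:+.1f}% -> {level_b:.0f}")   # ~+2.9% -> ~2,344 (quoted: 2,343)
print(f"{pct_m:+.1f}% -> {level_m:.0f}")   # ~-2.5% -> ~3,240 (quoted: 3,238)
```

The one-or-two-point gap against the quoted final levels is consistent with the report rounding each intermediate percentage before the next step.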