Explaining level changes

Phil Lovell v David Brierley (Mon 17 Jul 2023)
Match won by David Brierley. Result: 8-11, 7-11, 5-11.

Starting level for Phil Lovell: 1,173 (level confidence: 86%, set manually). Starting level for David Brierley: 2,270 (level confidence: 72%, set manually). David Brierley was predicted to win, as he is currently playing 94% better than Phil Lovell.

David Brierley won all of the games and 62% of the points. A games result like this would be expected if he were better by around 55% or more; a points result like this would be expected if he were better by around 65% (PAR scoring). These are weighted and combined to calculate that David Brierley played 65% better than Phil Lovell in this match.

Due to the difference between the players' levels, an allowance of up to 11% is made for the likelihood that David Brierley was taking it easy. This gives him an allowed level range for this match of 1,683 to 2,270 without affecting his level. David Brierley played at level 1,974, within his allowed range, so his level will not be adjusted.

On the assumption that David Brierley would normally have been playing at level 2,014 (based on typical behaviour), Phil Lovell played better than expected and therefore gains a pre-damping level increase of 2%. Allowing for the difference in level between the players, the adjustments are reduced to 0% and 1.4% respectively. Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for David Brierley stays at 0% and Phil Lovell's changes to +1.2%. After applying standard match damping, the adjustment for David Brierley remains 0% and for Phil Lovell becomes +1.1%. Applying the match/event weighting of 50% for 'Fair Oak Boxes', the final match adjustment for David Brierley is 0% and for Phil Lovell is +0.6%.

Level confidence increases due to one more match played: David Brierley 85%, Phil Lovell 93%.
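The two headline figures above, "playing 94% better" and "62% of the points", follow directly from the starting levels and the game scores. A minimal sketch of that arithmetic is below; the function names are illustrative, not the rating engine's own API, and the engine's internal weighting of games versus points is not reproduced here.

```python
def percent_better(level_a, level_b):
    # "A is X% better than B" is the level ratio minus one,
    # expressed as a percentage (an assumption consistent with
    # the numbers in this report).
    return (level_a / level_b - 1) * 100

def points_share(winner_scores, loser_scores):
    # Fraction of all points in the match taken by the winner.
    won, lost = sum(winner_scores), sum(loser_scores)
    return won / (won + lost) * 100

# Pre-match prediction: David (2,270) vs Phil (1,173)
print(round(percent_better(2270, 1173)))  # → 94

# David's game scores against Phil's: 11-8, 11-7, 11-5
print(round(points_share([11, 11, 11], [8, 7, 5])))  # → 62
```

Both rounded values match the report: 2,270 / 1,173 ≈ 1.94 (94% better), and 33 of the 53 points played is 62%.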
Level confidence is then reduced based on how unexpected the result is: David Brierley 78%, Phil Lovell 86%. A final adjustment of -0.2% has been made to both players as part of the automatic calibration performed after each match; all players in this pool are adjusted equally in order to remain equivalent to other player pools.

Final level for Phil Lovell: 1,178 (level confidence: 86%). Final level for David Brierley: 2,267 (level confidence: 78%).
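The final levels can be approximated by compounding the percentage adjustments multiplicatively. This is a sketch under that assumption: it reproduces Phil Lovell's published final level exactly, but the engine's precise rounding and ordering are not documented here, so small discrepancies (as with David Brierley's final figure) are possible.

```python
def apply_adjustments(level, adjustments_pct):
    # Compound each percentage adjustment multiplicatively
    # (assumed behaviour, not a documented formula).
    for pct in adjustments_pct:
        level *= 1 + pct / 100
    return round(level)

# Phil: +0.6% weighted match adjustment, then -0.2% pool calibration
print(apply_adjustments(1173, [0.6, -0.2]))  # → 1178
```

1,173 × 1.006 × 0.998 ≈ 1,177.7, which rounds to the reported final level of 1,178.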