Explaining level changes

Simon Burgess v Howard Parker (Tue 16 Jul 2024)
Match won by Simon Burgess. Result: 11-6, 11-8, 11-7.

Starting level for Simon Burgess: 645, level confidence: 71%. Set manually.
Starting level for Howard Parker: 321, level confidence: 73%. Set manually.

Simon Burgess was expected to win as he was playing 101% better than Howard Parker going into the match.

Simon Burgess won all of the games and 61% of the points. The games result would be expected if he were better by around 55% or more; the points result would be expected if he were better by around 57% (PAR scoring). These two figures are weighted and combined to calculate that Simon Burgess played 56% better than Howard Parker in this match.

Because of the difference between the players' levels, an allowance of up to 13% is made for the likelihood that Simon Burgess was taking it easy. This gives him an allowed level range for this match of 462 to 645 within which his level is not affected. In this case Simon Burgess played at level 532, which is inside his allowed range, so his level will not be adjusted.

On the assumption that Simon Burgess would normally have been playing at level 564 (based on typical behaviour), Howard Parker played better than expected and therefore gains a pre-damping level increase of 6%.

Allowing for the difference in level between the players, including some additional protection for the better player, the adjustments are reduced to 0% and 4.2% respectively.

Factoring in the relative levels of confidence, which allows players with low confidence in their levels to change more quickly, the adjustment for Simon Burgess stays at 0% and for Howard Parker becomes +4.1%.

After applying standard match damping, the adjustment for Simon Burgess remains 0% and for Howard Parker becomes +3.5%.

Applying the match/event weighting of 50% for 'Grove Park Squash57 Boxes', the adjustment for Simon Burgess is 0% and for Howard Parker is +1.8%.

Level confidence is first increased for one more match played (Simon Burgess: 84%, Howard Parker: 85%) and then reduced according to how unexpected the result was (Simon Burgess: 74%, Howard Parker: 75%).

A final adjustment of -0.2% has been made to both players as part of the automatic calibration performed after each match. All players in this pool are adjusted equally so that the pool remains equivalent to other player pools.

Final level for Simon Burgess: 644, level confidence: 74%.
Final level for Howard Parker: 326, level confidence: 75%.
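As a rough sketch of how the games-based and points-based margins might be combined into the single "played 56% better" figure, the snippet below takes the two implied margins and applies a weighted average. The even 50/50 split and the function name are assumptions for illustration; the actual weighting is not stated in this report.

```python
def combined_margin(games_margin: float, points_margin: float,
                    games_weight: float = 0.5) -> float:
    """Weighted combination of the margin implied by the games result and
    the margin implied by the points result, both as fractions (0.55 = 55%).
    The even 50/50 split is an assumed weighting for illustration."""
    return games_weight * games_margin + (1.0 - games_weight) * points_margin

# Figures from this match: 55% (games) and 57% (points) combine to ~56%.
print(f"{combined_margin(0.55, 0.57):.0%}")  # 56%
```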
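The series of reductions applied to Howard Parker's pre-damping increase (6% to 4.2% to 4.1% to 3.5% to 1.8%) behaves like a chain of multiplicative scale-downs. The sketch below models it that way; the individual factors are back-calculated from the figures quoted above, and the multiplicative form itself is an assumption rather than a published formula.

```python
def scale_adjustment(pre_damping: float, factors: list[float]) -> float:
    """Apply a chain of damping/weighting factors to a pre-damping level
    adjustment. Each factor scales the running adjustment down."""
    adj = pre_damping
    for f in factors:
        adj *= f
    return adj

# Howard Parker's chain, with factors back-calculated from the quoted steps:
# better-player protection (6% -> 4.2%), relative confidence (4.2% -> 4.1%),
# standard match damping (4.1% -> 3.5%), event weighting of 50% (3.5% -> 1.8%).
howard = scale_adjustment(0.060, [4.2 / 6.0, 4.1 / 4.2, 3.5 / 4.1, 0.5])
print(round(howard, 4))  # ~0.0175, i.e. roughly the quoted +1.8%
```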
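Level confidence first rises for playing another match and then drops according to how unexpected the result was. One possible shape for that mechanism is shown below: a move toward 100% followed by a penalty proportional to the surprise. The gain and penalty parameters, the linear forms, and the surprise value are assumptions chosen only to illustrate the idea, not the real formula.

```python
def update_confidence(confidence: float, surprise: float,
                      gain: float = 0.45, penalty: float = 0.2) -> float:
    """Nudge confidence toward 100% for one more match played, then reduce
    it in proportion to how unexpected the result was (0 = fully expected,
    1 = maximally surprising). All parameters are illustrative assumptions."""
    confidence += gain * (1.0 - confidence)   # one more match played
    confidence -= penalty * surprise          # unexpected result
    return min(max(confidence, 0.0), 1.0)

# Simon Burgess started at 71% confidence; the report quotes 84% after the
# match-played increase and 74% after the unexpected-result reduction.
print(f"{update_confidence(0.71, surprise=0.5):.0%}")  # ~74% with these assumed parameters
```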
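Finally, the quoted final levels can be obtained by applying the match adjustment and the -0.2% pool calibration to the starting levels as successive percentage changes. Treating the two as multiplicative is an assumption inferred from the quoted figures rather than a published specification, but it does reproduce both finals.

```python
def final_level(start_level: float, match_adjustment: float,
                calibration: float = -0.002) -> int:
    """Apply the match adjustment and the pool-wide calibration shift as
    successive percentage changes. Treating them as multiplicative is an
    assumption that happens to match the quoted final levels."""
    return round(start_level * (1.0 + match_adjustment) * (1.0 + calibration))

print(final_level(645, 0.000))  # 644 (Simon Burgess)
print(final_level(321, 0.018))  # 326 (Howard Parker)
```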