Explaining level changes

Richard Cobb v Steve McClelland (Fri 05 Apr 2024)
Match won by Steve McClelland. Result: 7-11, 6-11.

Starting level for Richard Cobb: 1,268 (level confidence: 38%). Starting level for Steve McClelland: 6,101 (level confidence: 25%). Steve McClelland was expected to win, as he was playing 381% better than Richard Cobb going into the match.

Steve McClelland won all of the games and 63% of the points. A games result like this would be expected if he were better by around 55% or more; a points result like this would be expected if he were better by around 69% (PAR scoring). These are weighted and combined to calculate that Steve McClelland played 69% better than Richard Cobb in this match.

Because of the difference between the players' levels, allowance is made for the likelihood that Steve McClelland was taking it easy, by anything up to 41%. This gives him an allowed level range for this match of 2,129 to 6,101 without affecting his level. In this case Steve McClelland played at level 2,781, which is within his allowed range, so his level will not be adjusted.

On the assumption that Steve McClelland would normally have been playing at level 3,604 (based on typical behaviour), Richard Cobb played better than expected and therefore gains a pre-damping level increase of 30%.

Allowing for the difference in level between the players, including some additional protection for the better player, the adjustments are reduced to 0% and 13% respectively. As this was a best-of-3 match rather than best-of-5, these are further reduced to 0% and 9.3%. Factoring in the relative levels of confidence, which lets players with low confidence in their levels change more quickly, the adjustment for Steve McClelland becomes 0% and for Richard Cobb +6.1%. After applying standard match damping, the adjustments become 0% for Steve McClelland and +4.1% for Richard Cobb.
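The headline figures above follow from simple arithmetic on the starting levels and the score line. A minimal sketch (variable names are mine, not the system's):

```python
start_richard = 1268
start_steve = 6101

# "381% better": the ratio of the two starting levels, minus one.
relative_better = start_steve / start_richard - 1

# Points won, per game, as (Richard, Steve): 7-11, 6-11.
games = [(7, 11), (6, 11)]
steve_points = sum(s for _, s in games)
total_points = sum(r + s for r, s in games)
points_share = steve_points / total_points

print(round(relative_better * 100))  # → 381
print(round(points_share * 100))     # → 63
```

How the games result and points result are weighted together into the 69% match performance figure is internal to the system and not reproduced here.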
A match/event weighting of 100% applies for 'UK Series North East Open 2024', so the adjustments are unchanged.

Level confidence increases because one more match has been played: Steve McClelland 50%, Richard Cobb 62%. It is then reduced according to how unexpected the result was: Steve McClelland 30%, Richard Cobb 37%.

A final adjustment of -0.1% has been applied to both players as part of the automatic calibration performed after each match. All players in this pool are adjusted equally so that the pool remains equivalent to other player pools.

Final level for Richard Cobb: 1,319 (level confidence: 37%). Final level for Steve McClelland: 6,095 (level confidence: 30%).
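Assuming the damped adjustment, the event weighting and the -0.1% calibration compound multiplicatively, and that the result is rounded to the nearest point (an assumption; the system's exact rounding rule isn't stated), the final levels can be reproduced as:

```python
def final_level(start, adjustment, weighting=1.0, calibration=-0.001):
    """Sketch of the last step: apply the weighted adjustment,
    then the pool-wide calibration, then round to a whole level."""
    return round(start * (1 + adjustment * weighting) * (1 + calibration))

print(final_level(1268, 0.041))  # Richard Cobb → 1319
print(final_level(6101, 0.0))    # Steve McClelland → 6095
```

Both outputs match the final levels reported above, which supports (but does not prove) the multiplicative reading of the steps.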