A mismatch between the training exertion intended by a coach and the exertion perceived by players is well established in sports. However, it is unknown whether coaches can accurately observe the exertion of individual players during training. Furthermore, the discrepancy between coaches’ and players’ perceptions has not been explained.
The purpose of this study was to determine the relationship between intended and observed training exertion, as rated by the coach, and perceived training exertion, as rated by the players, and to establish whether on-field training characteristics, intermittent endurance capacity, and maturity status explain the mismatch.
During 2 mesocycles of 4 wk (in November and March), rating of intended exertion (RIE), rating of observed exertion (ROE), and rating of perceived exertion (RPE) were monitored in 31 elite young soccer players. External and internal training loads were objectively quantified with accelerometers (PlayerLoad) and heart-rate monitors (TRIMPmod). Results of an interval shuttle-run test (ISRT) and age at peak height velocity (APHV) were determined for all players.
RIE, ROE, and RPE were monitored in 977 training sessions. The correlations between RIE and RPE (r = .58; P < .01) and between ROE and RPE (r = .64; P < .01) were moderate. The mean difference between RIE and RPE was –0.31 ± 1.99 and between ROE and RPE was –0.37 ± 1.87. Multilevel analyses showed that PlayerLoad and ISRT predicted RIE and ROE.
Coaches base their intended and observed exertion on what they expect players to do and on what players actually did on the field. In doing so, they take the intermittent endurance capacity of individual players into account.