Decision-support systems are used in team sport for a variety of purposes, including evaluating individual performance and informing athlete selection. A particularly common form of decision support is the traffic-light system, in which color coding indicates an athlete's status with respect to performance or training availability. However, despite relatively widespread use, there remains a lack of standardization in how traffic-light systems are operationalized. This paper addresses a range of pertinent issues for practitioners relating to traffic-light monitoring in team sports. Specifically, the types and formats of data incorporated in such systems are discussed, along with the various analysis approaches available. Considerations relating to the visualization and communication of results to key stakeholders in the team-sport environment are also presented. To improve the efficacy of traffic-light systems, future iterations should incorporate the recommendations made here.
Samuel Robertson, Jonathan D. Bartlett, and Paul B. Gastin
Jonathan D. Bartlett, Fergus O’Connor, Nathan Pitchford, Lorena Torres-Ronda, and Samuel J. Robertson
The aim of this study was to quantify and predict relationships between rating of perceived exertion (RPE) and GPS training-load (TL) variables in professional Australian football (AF) players using group and individualized modeling approaches.
TL data (GPS and RPE) for 41 professional AF players were obtained over a period of 27 wk. A total of 2711 training observations were analyzed, with 66 ± 13 sessions per player (range 39–89). Separate generalized estimating equations (GEEs) and artificial-neural-network analyses (ANNs) were conducted to determine the ability to predict RPE from TL variables (ie, session distance, high-speed running [HSR], HSR %, m/min) on a group and individual basis.
Prediction error for the individualized ANN (root-mean-square error [RMSE] 1.24 ± 0.41) was lower than for the group ANN (RMSE 1.42 ± 0.44), individualized GEE (RMSE 1.58 ± 0.41), and group GEE (RMSE 1.85 ± 0.49). Both the GEE and ANN models identified session distance as the most important predictor of RPE. Furthermore, importance plots generated from the ANN revealed session distance as most predictive of RPE in 36 of the 41 players, whereas HSR and m/min were most predictive in just 3 and 2 players, respectively.
This study demonstrates that machine learning approaches may outperform more traditional methodologies with respect to predicting athlete responses to TL. These approaches enable further individualization of load monitoring, leading to more accurate training prescription and evaluation.
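The model comparison reported above rests on a per-player RMSE summarized as mean ± SD across the squad. A minimal sketch of that evaluation step is below; the data values are fabricated for illustration and the helper names are my own, and the study's actual GEE and ANN pipelines are not reproduced here:

```python
import math
from statistics import mean, stdev

def rmse(observed, predicted):
    """Root-mean-square error between observed and model-predicted RPE."""
    return math.sqrt(mean((o - p) ** 2 for o, p in zip(observed, predicted)))

def per_player_rmse(sessions):
    """sessions: {player_id: (observed_rpes, predicted_rpes)}.
    Returns the mean and SD of per-player RMSEs, the form in which
    prediction error is reported in the abstract."""
    errors = [rmse(obs, pred) for obs, pred in sessions.values()]
    return mean(errors), stdev(errors)

# Illustrative (fabricated) observed/predicted RPE for two players
sessions = {
    1: ([6, 7, 5, 8], [6.5, 6.8, 5.4, 7.2]),
    2: ([4, 5, 6, 7], [4.3, 5.9, 5.1, 7.6]),
}
m, s = per_player_rmse(sessions)
```

A lower mean per-player RMSE for the individualized model, as found here, indicates that player-specific fitting captures responses a pooled model averages away.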
Joel T. Fuller, Clint R. Bellenger, Dominic Thewlis, John Arnold, Rebecca L. Thomson, Margarita D. Tsiros, Eileen Y. Robertson, and Jonathan D. Buckley
Stride-to-stride fluctuations in running-stride interval display long-range correlations that break down in the presence of fatigue accumulated during an exhaustive run. The purpose of the study was to investigate whether long-range correlations in running-stride interval were reduced by fatigue accumulated during prolonged exposure to a high training load (functional overreaching) and were associated with decrements in performance caused by functional overreaching.
Ten trained male runners completed 7 d of light training (LT7), 14 d of heavy training (HT14) designed to induce a state of functional overreaching, and 10 d of light training (LT10) in a fixed order. Running-stride intervals and 5-km time-trial (5TT) performance were assessed after each training phase. The strength of long-range correlations in running-stride interval was assessed at 3 speeds (8, 10.5, and 13 km/h) using detrended fluctuation analysis.
Relative to performance post-LT7, time to complete the 5TT was increased after HT14 (+18 s; P < .05) and decreased after LT10 (–20 s; P = .03), but stride-interval long-range correlations remained unchanged at HT14 and LT10 (P > .50). Changes in stride-interval long-range correlations measured at a 10.5-km/h running speed were negatively associated with changes in 5TT performance (r = –.46; P = .03).
Runners who were most affected by the prolonged exposure to high training load (as evidenced by greater reductions in 5TT performance) experienced the greatest reductions in stride-interval long-range correlations. Measurement of stride-interval long-range correlations may be useful for monitoring the effect of high training loads on athlete performance.
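Detrended fluctuation analysis, the method used above to quantify long-range correlations, integrates the mean-subtracted stride-interval series, linearly detrends it within windows of increasing size, and takes the slope of log fluctuation versus log window size as the scaling exponent α. A minimal first-order DFA sketch follows; it is not the authors' implementation, and the window sizes are illustrative:

```python
import numpy as np

def dfa_alpha(series, window_sizes):
    """First-order detrended fluctuation analysis.
    Returns the scaling exponent alpha: ~0.5 for uncorrelated noise,
    >0.5 when long-range correlations are present."""
    x = np.asarray(series, dtype=float)
    y = np.cumsum(x - x.mean())              # integrated (profile) series
    fluctuations = []
    for n in window_sizes:
        n_windows = len(y) // n
        f2 = []
        for i in range(n_windows):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            coeffs = np.polyfit(t, seg, 1)   # linear detrend per window
            resid = seg - np.polyval(coeffs, t)
            f2.append(np.mean(resid ** 2))
        fluctuations.append(np.sqrt(np.mean(f2)))
    # slope of log F(n) vs log n is the scaling exponent
    alpha, _ = np.polyfit(np.log(window_sizes), np.log(fluctuations), 1)
    return alpha
```

A reduction in α toward 0.5, as associated here with performance decrements, indicates the stride-interval series becoming less correlated and more noise-like.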
Clint R. Bellenger, Laura Karavirta, Rebecca L. Thomson, Eileen Y. Robertson, Kade Davison, and Jonathan D. Buckley
Heart-rate variability (HRV) as a measure of autonomic function may increase in response to training interventions leading to increases or decreases in performance, making HRV interpretation difficult in isolation. This study aimed to contextualize changes in HRV with subjective measures of training tolerance.
Supine and standing measures of vagally mediated HRV (root-mean-square difference of successive normal RR intervals [RMSSD]) and measures of training tolerance (Daily Analysis of Life Demands for Athletes questionnaire, perception of energy levels, fatigue, and muscle soreness) were recorded daily during 1 wk of light training (LT), 2 wk of heavy training (HT), and 10 d of tapering (T) in 15 male runners/triathletes. HRV and training tolerance were analyzed as rolling 7-d averages at LT, HT, and T. Performance was assessed after LT, HT, and T with a 5-km treadmill time trial (5TTT).
Time to complete the 5TTT likely increased after HT (effect size [ES] ± 90% confidence interval = 0.16 ± 0.06) and then almost certainly decreased after T (ES = −0.34 ± 0.08). Training tolerance worsened after HT (ES ≥ 1.30 ± 0.41) and improved after T (ES ≥ 1.27 ± 0.49). Standing RMSSD very likely increased after HT (ES = 0.62 ± 0.26) and likely remained higher than LT at the completion of T (ES = 0.38 ± 0.21). Changes in supine RMSSD were possibly or likely trivial.
Vagally mediated HRV during standing increased in response to functional overreaching (indicating potential parasympathetic hyperactivity) and also to improvements in performance. Thus, additional measures such as training tolerance are required to interpret changes in vagally mediated HRV.
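The RMSSD index used above has a simple closed form: the square root of the mean squared difference between successive normal RR intervals. A minimal sketch with illustrative RR values (not data from the study):

```python
import math

def rmssd(rr_intervals_ms):
    """Root-mean-square of successive differences of normal RR
    intervals (ms): the vagally mediated HRV index in the abstract."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Illustrative RR series (ms); greater beat-to-beat variability
# yields a higher RMSSD
rr = [820, 850, 810, 870, 830]
```

Because RMSSD reflects beat-to-beat variability rather than mean heart rate, it can rise with both overreaching and improved fitness, which is why the abstract argues it cannot be interpreted in isolation.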
Nathan W. Pitchford, Sam J. Robertson, Charli Sargent, Justin Cordy, David J. Bishop, and Jonathan D. Bartlett
To assess the effects of a change in training environment on the sleep characteristics of elite Australian Rules football (AF) players.
In an observational crossover trial, 19 elite AF players had time in bed (TIB), total sleep time (TST), sleep efficiency (SE), and wake after sleep onset (WASO) assessed using wristwatch activity devices and subjective sleep diaries across 8-d home and camp periods. Repeated-measures ANOVA determined mean differences in sleep, training load (session rating of perceived exertion [s-RPE]), and environment. Pearson product–moment correlations, controlling for repeated observations on individuals, were used to assess the relationship between changes in sleep characteristics at home and camp. Cohen effect sizes (d) were calculated using individual means.
On camp, TIB (+34 min) and WASO (+26 min) increased compared with home. However, TST was similar between home and camp, significantly reducing SE on camp (–5.82%). Individually, there were strong negative correlations for TIB and WASO (r = –.75 and r = –.72, respectively) and a moderate negative correlation for SE (r = –.46) between home values and relative changes on camp. Camp strengthened the relationship between individual s-RPE variation and TST variation compared with home (increased load: r = –.367 vs .051; reduced load: r = .319 vs –.033; camp vs home, respectively).
Camp compromised sleep quality through significantly increased TIB without a concomitant increase in TST. Individually, AF players with higher home SE experienced greater reductions in SE on camp. Together, these findings emphasize the importance of individualized interventions for elite team-sport athletes when traveling and/or changing environments.
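Sleep efficiency is conventionally TST expressed as a percentage of TIB, which is why longer TIB with unchanged TST mechanically lowers SE. A minimal sketch with illustrative numbers (not the study's raw data; the simple WASO decomposition below is an assumption for illustration):

```python
def sleep_efficiency(total_sleep_min, time_in_bed_min):
    """Sleep efficiency (%): total sleep time as a share of time in bed."""
    return 100.0 * total_sleep_min / time_in_bed_min

def wake_after_sleep_onset(time_in_bed_min, onset_latency_min, total_sleep_min):
    """WASO (min) under a simple decomposition: TIB = latency + TST + WASO."""
    return time_in_bed_min - onset_latency_min - total_sleep_min

# Illustrative: identical TST, but TIB extended by ~34 min on camp
se_home = sleep_efficiency(420, 480)   # 87.5%
se_camp = sleep_efficiency(420, 514)   # lower, despite equal sleep
```

Holding TST fixed while extending TIB drops SE by several percentage points, mirroring the pattern the abstract reports.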