Decision-support systems are used in team sport for a variety of purposes, including evaluating individual performance and informing athlete selection. A particularly common form of decision support is the traffic-light system, in which color coding indicates an athlete's status with respect to performance or training availability. However, despite relatively widespread use, there remains a lack of standardization in how traffic-light systems are operationalized. This paper addresses a range of issues pertinent to practitioners relating to traffic-light monitoring in team sports. Specifically, the types and formats of data incorporated in such systems are discussed, along with the various analysis approaches available. Considerations relating to the visualization and communication of results to key stakeholders in the team-sport environment are also presented. To improve the efficacy of traffic-light systems, future iterations should incorporate the recommendations made here.
Samuel Robertson, Jonathan D. Bartlett, and Paul B. Gastin
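The abstract above surveys analysis approaches for traffic-light systems without prescribing one. A common operationalization (an illustrative assumption on our part, not the paper's recommendation) flags an athlete's daily monitoring value by its z-score against that athlete's own baseline; the thresholds below are arbitrary examples:

```python
import statistics

def traffic_light(value, baseline, green=1.0, amber=2.0):
    """Color-code a monitoring value by z-score against an athlete's
    own baseline observations (illustrative thresholds only)."""
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    z = (value - mean) / sd
    if abs(z) < green:
        return "green"   # within normal variation
    if abs(z) < amber:
        return "amber"   # monitor / modify training
    return "red"         # flag for intervention

# Example: a wellness score of 4 against a stable baseline near 7-8
baseline = [7, 8, 7, 7, 8, 7, 8, 7]
print(traffic_light(4, baseline))  # red
```

Keying thresholds to each athlete's own variability, rather than squad-wide cutoffs, is one way to address the standardization issues the paper raises.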
Dean Ritchie, Justin Keogh, Steven Stern, Peter Reaburn, Fergus O’Connor, and Jonathan D. Bartlett
Context: Little is known about the effect of preceding endurance-exercise bouts on subsequent resistance-training (RT) performance in team-sport players. Purpose: To examine the effect of prior skills/endurance training and different recovery periods on subsequent same-day RT performance in professional Australian football players. Methods: Sport-specific endurance-running loads (duration [in minutes], total distance [in meters], mean speed [in meters per minute], high-speed running >15 km·h−1, and relative high-speed running [>75% and >85% of maximal velocity]) were obtained for 46 professional Australian football players for each training session across an entire competitive season. RT was prescribed in 3 weekly mesocycles, with tonnage (in kilograms) lifted recorded as RT performance. Endurance and RT sessions were interspersed by different recovery durations: ∼20 min and 1, 2, and 3 h. Fixed- and mixed-effect linear models assessed the influence of skills/endurance-running loads on RT performance. Models also accounted for season period (preseason vs in-season) and recovery duration between concurrent training bouts. Results: Increases in high-speed running and in distance covered at >75% and >85% of maximal velocity were associated with the greatest reductions in RT performance. In-season, total distance covered displayed greater negative effects on subsequent RT performance than during preseason, while ∼20-min recovery between skills/endurance training and RT was associated with greater reductions in RT performance than 1-, 2-, or 3-h recovery. Conclusions: Sport-specific endurance-running loads negatively affect subsequent same-day RT performance, and this effect is greater in-season and with shorter recovery durations between bouts.
Nathan W. Pitchford, Sam J. Robertson, Charli Sargent, Justin Cordy, David J. Bishop, and Jonathan D. Bartlett
Purpose: To assess the effects of a change in training environment on the sleep characteristics of elite Australian Rules football (AF) players.
Methods: In an observational crossover trial, 19 elite AF players had time in bed (TIB), total sleep time (TST), sleep efficiency (SE), and wake after sleep onset (WASO) assessed using wristwatch activity devices and subjective sleep diaries across 8-d home and camp periods. Repeated-measures ANOVA determined mean differences in sleep, training load (session rating of perceived exertion [sRPE]), and environment. Pearson product–moment correlations, controlling for repeated observations on individuals, were used to assess the relationship between changes in sleep characteristics at home and on camp. Cohen effect sizes (d) were calculated using individual means.
Results: On camp, TIB (+34 min) and WASO (+26 min) increased compared with home. However, TST was similar between home and camp, significantly reducing camp SE (−5.82%). Individually, there were strong negative correlations for TIB and WASO (r = −.75 and r = −.72, respectively) and a moderate negative correlation for SE (r = −.46) between home values and relative changes on camp. Camp strengthened the relationship between individual sRPE variation and TST variation compared with home (increased load r = −.367 vs .051; reduced load r = .319 vs −.033; camp vs home, respectively).
Conclusions: Camp compromised sleep quality through significantly increased TIB without increased TST. Individually, AF players with higher home SE experienced greater reductions in SE on camp. Together, these findings emphasize the importance of individualized interventions for elite team-sport athletes when traveling and/or changing environments.
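The sleep metrics above relate simply: sleep efficiency is total sleep time expressed as a percentage of time in bed, so extra time in bed without extra sleep (as on camp) necessarily lowers SE. A minimal sketch of that arithmetic, using invented illustrative numbers rather than the study's data:

```python
def sleep_efficiency(tib_min, tst_min):
    """Sleep efficiency (%) = total sleep time / time in bed x 100."""
    return 100.0 * tst_min / tib_min

# Illustrative: identical TST, but 34 min more time in bed on camp
home = sleep_efficiency(tib_min=480, tst_min=420)
camp = sleep_efficiency(tib_min=480 + 34, tst_min=420)
print(round(home - camp, 2))  # 5.79 (percentage-point drop in SE)
```

With these made-up values, the 34-min TIB increase alone produces an SE drop of a similar order to the −5.82% the study reports.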
Jonathan D. Bartlett, Fergus O’Connor, Nathan Pitchford, Lorena Torres-Ronda, and Samuel J. Robertson
Purpose: To quantify and predict relationships between rating of perceived exertion (RPE) and GPS training-load (TL) variables in professional Australian football (AF) players using group and individualized modeling approaches.
Methods: TL data (GPS and RPE) for 41 professional AF players were obtained over a period of 27 wk. A total of 2711 training observations were analyzed, with 66 ± 13 sessions per player (range 39–89). Separate generalized estimating equations (GEEs) and artificial-neural-network (ANN) analyses were conducted to determine the ability to predict RPE from TL variables (ie, session distance, high-speed running [HSR], HSR %, m/min) on a group and individual basis.
Results: Prediction error for the individualized ANN (root-mean-square error [RMSE] 1.24 ± 0.41) was lower than for the group ANN (RMSE 1.42 ± 0.44), individualized GEE (RMSE 1.58 ± 0.41), and group GEE (RMSE 1.85 ± 0.49). Both the GEE and ANN models identified session distance as the most important predictor of RPE. Furthermore, importance plots generated from the ANN revealed session distance as most predictive of RPE in 36 of the 41 players, whereas HSR was most predictive in just 3 players and m/min in just 2.
Conclusions: This study demonstrates that machine-learning approaches may outperform more traditional methodologies for predicting athlete responses to TL. These approaches enable further individualization of load monitoring, leading to more accurate training prescription and evaluation.
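The group-versus-individualized comparison above can be sketched with any learner: fit one model to pooled data and one model per athlete, then compare RMSE. The study used GEEs and neural networks; the stdlib sketch below substitutes simple per-player least-squares lines on synthetic data (all values invented, and evaluated in-sample for brevity) purely to show the evaluation logic:

```python
import math
import random

random.seed(1)

def fit_slope(xs, ys):
    """Least-squares line y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    b = num / den
    return my - b * mx, b

def rmse(model, data):
    a, b = model
    return math.sqrt(sum((y - (a + b * x)) ** 2 for x, y in data) / len(data))

# Synthetic players: each has an individual distance -> RPE sensitivity
players = []
for _ in range(10):
    slope = random.uniform(0.5, 1.5)  # individual sensitivity to load
    obs = [(d, slope * d + random.gauss(0, 0.5))
           for d in [random.uniform(4, 12) for _ in range(40)]]
    players.append(obs)

# Group model: one fit pooled across all players
pooled = [o for p in players for o in p]
group_model = fit_slope([x for x, _ in pooled], [y for _, y in pooled])
group_err = sum(rmse(group_model, p) for p in players) / len(players)

# Individualized models: one fit per player
indiv_err = sum(rmse(fit_slope([x for x, _ in p], [y for _, y in p]), p)
                for p in players) / len(players)

print(indiv_err < group_err)  # individualized fit wins when players differ
```

Because each synthetic player has a different sensitivity, the single group model carries systematic error that the per-player fits do not, mirroring the direction of the study's RMSE comparison.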
Dean Ritchie, Will G. Hopkins, Martin Buchheit, Justin Cordy, and Jonathan D. Bartlett
Context: Training volume, intensity, and distribution are important factors during periods of return to play.
Purpose: To quantify the effect of injury on training load (TL) before and after return to play (RTP) in professional Australian Rules football.
Methods: Perceived training load (RPE-TL) for 44 players was obtained for all indoor and outdoor training sessions, while field-based training was monitored via GPS (total distance, high-speed running, mean speed). When a player sustained a competition time-loss injury, weekly TL was quantified for 3 wk before and after RTP. General linear mixed models, with inference about magnitudes standardized by between-players SDs, were used to quantify the effects of lower- and upper-body injury on TL compared with the team.
Results: While total RPE-TL was similar to the team's 2 wk before RTP, training distribution differed: skills RPE-TL was likely and most likely lower for upper- and lower-body injury, respectively, and was most likely replaced with small to very large increases in running and other conditioning load. Weekly total distance and high-speed running were most likely moderately to largely reduced for lower- and upper-body injury until after RTP, at which point total RPE-TL, training distribution, total distance, and high-speed running were similar to the team's. Mean speed of field-based training was similar to the team's before and after RTP.
Conclusions: Although injured athletes attained TLs comparable to those of uninjured players, training distribution differed until after RTP, indicating the importance of monitoring all types of training that athletes complete.
Dean Ritchie, Will G. Hopkins, Martin Buchheit, Justin Cordy, and Jonathan D. Bartlett
Context: Load monitoring in Australian football (AF) has been widely adopted, yet team-sport periodization strategies remain relatively unknown. Purpose: To quantify training and competition load across a season in an elite AF team, using rating of perceived exertion (RPE) and GPS tracking.
Methods: Weekly totals for RPE and GPS loads (including accelerometer data; PlayerLoad) were obtained for 44 players across a full season for each training modality and for competition. General linear mixed models compared mean weekly load between 3 preseason and 4 in-season blocks. Effects were assessed with inferences about magnitudes standardized by between-players SD.
Results: Total RPE load was most likely greater during preseason, where the majority of load was obtained via skills and conditioning. There was a large reduction in RPE load in the last preseason block. During the in-season, half the total load came from games and the remaining half from training, predominantly skills and upper-body weights. Total distance, high-intensity running, and PlayerLoad showed large to very large reductions from preseason to in-season, whereas changes in mean speed were trivial across all blocks. All these effects were clear at the 99% level.
Conclusions: These data provide useful information about targeted periods of loading and unloading across different stages of a season. The study also provides a framework for further investigation of training periodization in AF teams.
Farhan Juhari, Dean Ritchie, Fergus O’Connor, Nathan Pitchford, Matthew Weston, Heidi R. Thornton, and Jonathan D. Bartlett
Context: Team-sport training requires the daily manipulation of intensity, duration, and frequency, with preseason training focused on meeting the demands of in-season competition and in-season training on maintaining fitness. Purpose: To provide information about daily training in Australian football (AF), this study aimed to quantify session intensity, duration, and intensity distribution across different stages of an entire season. Methods: Intensity (session rating of perceived exertion; CR-10 scale) and duration were collected from 45 professional male AF players for every training session and game. Each session's RPE was assigned to a corresponding intensity zone: low (<4.0 arbitrary units), moderate (≥4.0 and <7.0), or high (≥7.0). Linear mixed models were constructed to estimate session duration, intensity, and distribution between the 3 preseason and 4 in-season periods, with effects assessed using magnitude-based inferences. Results: The distribution of the mean session intensity across the season was 29% low intensity, 57% moderate intensity, and 14% high intensity. While 96% of games were high intensity, 44% and 49% of skills training sessions were low intensity and moderate intensity, respectively. Running had the highest proportion of high-intensity training sessions (27%). Preseason displayed higher training-session intensity (effect size [ES] = 0.29–0.91) and duration (ES = 0.33–1.44), while in-season game intensity (ES = 0.31–0.51) and duration (ES = 0.51–0.82) were higher. Conclusions: By using a cost-effective monitoring tool, this study provides information about the intensity, duration, and intensity distribution of all training types across different phases of a season, thus allowing a greater understanding of the training and competition demands of Australian footballers.
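The zone boundaries in the abstract above map directly to a categorization rule: a CR-10 session RPE below 4.0 is low, 4.0 up to (but not including) 7.0 is moderate, and 7.0 or above is high. A one-function sketch of that rule:

```python
def intensity_zone(session_rpe):
    """Assign a CR-10 session RPE (arbitrary units) to the study's
    three intensity zones: <4.0 low, 4.0-6.9 moderate, >=7.0 high."""
    if session_rpe < 4.0:
        return "low"
    if session_rpe < 7.0:
        return "moderate"
    return "high"

# Boundary behavior: 4.0 and 6.9 are moderate, 7.0 is high
print([intensity_zone(r) for r in (3.5, 4.0, 6.9, 7.0)])
# ['low', 'moderate', 'moderate', 'high']
```

Tallying these labels per training type over a season reproduces the kind of intensity-distribution percentages the study reports.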
Fergus K. O’Connor, Steven E. Stern, Thomas M. Doering, Geoffrey M. Minett, Peter R. Reaburn, Jonathan D. Bartlett, and Vernon G. Coffey
Context: Exercise in hot environments increases body temperature and thermoregulatory strain. However, little is known regarding the magnitude of effect that ambient temperature (Ta), relative humidity (RH), and solar radiation individually have on team-sport athletes. Purpose: To determine the effect of these individual heat-stress variables on team-sport training performance and recovery. Methods: Professional Australian Rules football players (N = 45) undertook 8-wk preseason training, producing a total of 579 outdoor field-based observations with Ta, RH, and solar radiation recorded at every training session. External load (distance covered, in m/min; percentage high-speed running [%HSR] >14.4 km/h) was collected via a global positioning system. Internal load (ratings of perceived exertion and heart rate) and recovery (subjective ratings of well-being and heart-rate variability [root mean square of successive differences; rMSSD]) were monitored throughout the training period. Mixed-effects linear models analyzed relationships between variables using standardized regression coefficients. Results: Increased solar-radiation exposure was associated with reduced distance covered (−19.7 m/min, P < .001) and %HSR (−10%, P < .001) during training and reduced rMSSD 48 h posttraining (−16.9 ms, P = .019). Greater RH was associated with decreased %HSR (−3.4%, P = .010) but increased percentage duration >85% HRmax (3.9%, P < .001), ratings of perceived exertion (1.8 AU, P < .001), and self-reported stress 24 h posttraining (−0.11 AU, P = .002). In contrast, higher Ta was associated with increased distance covered (19.7 m/min, P < .001) and %HSR (3.5%, P = .005). Conclusions: The authors show the importance of considering in isolation the individual factors contributing to thermal load for team-sport athletes: solar radiation and RH reduce work capacity during team-sport training and have the potential to slow recovery between sessions.
Fergus O’Connor, Heidi R. Thornton, Dean Ritchie, Jay Anderson, Lindsay Bull, Alex Rigby, Zane Leonard, Steven Stern, and Jonathan D. Bartlett
Sprint capacity is an important attribute for team-sport athletes, yet the most appropriate method to analyze it is unclear. Purpose: To examine the relationship between sprint workloads using relative versus absolute thresholds and lower-body soft-tissue and bone-stress injury incidence in professional Australian rules football. Methods: Fifty-three professional Australian rules football athletes' noncontact soft-tissue and bone-stress lower-body injuries (N = 62) were recorded, and sprint workloads were quantified over ∼18 months using the global positioning system. Sprint volume (m) and exposures (n) were determined using 2 methods: absolute (>24.9 km·h−1) and relative (≥75%, ≥80%, ≥85%, ≥90%, ≥95% of maximal velocity). Relationships between threshold methods and injury incidence were assessed using logistic generalized additive models. Incidence rate ratios and model performance (area under the curve) were reported. Results: Mean (SD) maximal velocity for the group was 31.5 (1.4) km·h−1, range 28.6 to 34.9 km·h−1. In comparing relative and absolute thresholds, 75% maximal velocity equated to ~1.5 km·h−1 below the absolute speed threshold, while 80% and 85% maximal velocity were 0.1 and 1.7 km·h−1 above the absolute speed threshold, respectively. Model area under the curve ranged from 0.48 to 0.61. Very low and very high cumulative sprint loads ≥80% across a 4-week period, when measured relatively, resulted in higher incidence rate ratios (2.54–3.29) than absolute thresholds (1.18–1.58). Discussion: Monitoring sprinting volume relative to an athlete's maximal velocity should be incorporated into athlete monitoring systems. Specifically, quantifying the distance covered at >80% maximal velocity will ensure greater accuracy in determining sprint workloads and associated injury risk.
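The relative-versus-absolute comparison above comes down to how each athlete's sprint threshold is set. The 24.9 km·h−1 absolute cutoff, the 80% fraction, and the 28.6–34.9 km·h−1 range of maximal velocities come from the abstract; the sketch below simply computes both thresholds for athletes at the ends of that range:

```python
ABSOLUTE_KMH = 24.9  # fixed sprint threshold used in the study

def sprint_thresholds(v_max_kmh, fraction=0.80):
    """Return (relative, absolute) sprint thresholds in km/h for an
    athlete, where relative = fraction of individual maximal velocity."""
    return fraction * v_max_kmh, ABSOLUTE_KMH

# Athletes at the ends of the reported range of maximal velocity
for v_max in (28.6, 34.9):
    rel, abso = sprint_thresholds(v_max)
    print(f"v_max {v_max}: relative {rel:.1f} vs absolute {abso} km/h")
```

The slowest athlete's 80% threshold (22.9 km/h) sits below the absolute cutoff, while the fastest athlete's (27.9 km/h) sits well above it, so an absolute threshold undercounts sprinting for faster athletes and overcounts it for slower ones, which is the core of the abstract's argument for relative monitoring.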