This study examined the anticipation and visual behavior of elite rugby league players during two different evasion maneuvers (side- and split-steps). Participants (N = 48) included elite rugby league players (n = 38) and controls (n = 10). Each participant watched videos consisting of side- and split-steps, and anticipation of movement and eye behavior were measured. No significant differences were found between the groups; however, the split-step was significantly harder to predict than the side-step. Elite players appeared to spend more time viewing the torso and mid-region of the body compared with the controls.
Jonathan D. Connor, Robert G. Crowther and Wade H. Sinclair
Jonathan D. Bartlett, Fergus O’Connor, Nathan Pitchford, Lorena Torres-Ronda and Samuel J. Robertson
The aim of this study was to quantify and predict relationships between rating of perceived exertion (RPE) and GPS training-load (TL) variables in professional Australian football (AF) players using group and individualized modeling approaches.
TL data (GPS and RPE) for 41 professional AF players were obtained over a period of 27 wk. A total of 2711 training observations were analyzed with a total of 66 ± 13 sessions/player (range 39–89). Separate generalized estimating equations (GEEs) and artificial-neural-network analyses (ANNs) were conducted to determine the ability to predict RPE from TL variables (ie, session distance, high-speed running [HSR], HSR %, m/min) on a group and individual basis.
Prediction error for the individualized ANN (root-mean-square error [RMSE] 1.24 ± 0.41) was lower than the group ANN (RMSE 1.42 ± 0.44), individualized GEE (RMSE 1.58 ± 0.41), and group GEE (RMSE 1.85 ± 0.49). Both the GEE and ANN models determined session distance as the most important predictor of RPE. Furthermore, importance plots generated from the ANN revealed session distance as most predictive of RPE in 36 of the 41 players, whereas HSR was predictive of RPE in just 3 players and m/min was predictive of RPE in just 2 players.
This study demonstrates that machine learning approaches may outperform more traditional methodologies with respect to predicting athlete responses to TL. These approaches enable further individualization of load monitoring, leading to more accurate training prescription and evaluation.
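The RMSE comparison reported above can be illustrated with a minimal sketch. The RPE values and model predictions below are hypothetical, invented purely to show why an individualized model (fit to one player's load–RPE relationship) tends to produce a lower RMSE than a group model whose predictions are pulled toward the squad mean; they are not the study's data.

```python
import math

def rmse(observed, predicted):
    """Root-mean-square error between observed and predicted session RPE."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / len(observed))

# Hypothetical session RPEs (CR-10 scale) for one player
observed = [5.0, 7.0, 4.0, 8.0, 6.0]

# Hypothetical predictions: the group model regresses toward the squad mean,
# while the individualized model tracks this player's own responses
group_model = [6.2, 6.0, 5.5, 6.5, 6.1]
individual_model = [5.3, 6.8, 4.4, 7.6, 6.2]

print(f"group RMSE:      {rmse(observed, group_model):.2f}")
print(f"individual RMSE: {rmse(observed, individual_model):.2f}")
```

On these illustrative numbers the individualized predictions yield the smaller RMSE, mirroring the ordering the study reports (individualized ANN 1.24 vs group GEE 1.85).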
Dean Ritchie, Justin Keogh, Steven Stern, Peter Reaburn, Fergus O’Connor and Jonathan D. Bartlett
Little is known about the effect of preceding endurance-exercise bouts on subsequent resistance-training (RT) performance in team-sport players. Purpose: To examine the effect of prior skills/endurance training and different recovery time periods on subsequent same-day RT performance in professional Australian football players. Methods: Sport-specific endurance-running loads (duration [in minutes], total distance [in meters], mean speed [in meters per minute], high-speed running >15 km·h−1, and relative high-speed running [>75% and >85% of maximal velocity]) were obtained for 46 professional Australian football players for each training session across an entire competitive season. RT was prescribed in 3 weekly mesocycles with tonnage (in kilograms) lifted recorded as RT performance. Endurance and RT sessions were separated by different recovery durations: ∼20 min and 1, 2, and 3 h. Fixed- and mixed-effect linear models assessed the influence of skills/endurance-running loads on RT performance. Models also accounted for season period (preseason vs in-season) and recovery duration between concurrent training bouts. Results: Increases in high-speed running and in distance covered >75% and >85% of maximal velocity were associated with the greatest reductions in RT performance. In-season total distance covered displayed greater negative effects on subsequent RT performance compared with preseason, while ∼20-min recovery between skills/endurance and RT was associated with greater reductions in RT performance compared with 1-, 2-, and 3-h recovery. Conclusions: Sport-specific endurance-running loads negatively affect subsequent same-day RT performance, and this effect is greater in-season and with shorter recovery durations between bouts.
Farhan Juhari, Dean Ritchie, Fergus O’Connor, Nathan Pitchford, Matthew Weston, Heidi R. Thornton and Jonathan D. Bartlett
Context: Team-sport training requires the daily manipulation of intensity, duration, and frequency, with preseason training focused on preparing players for the demands of in-season competition and in-season training focused on maintaining fitness. Purpose: To provide information about daily training in Australian football (AF), this study aimed to quantify session intensity, duration, and intensity distribution across different stages of an entire season. Methods: Intensity (session ratings of perceived exertion; CR-10 scale) and duration were collected from 45 professional male AF players for every training session and game. Each session’s rating of perceived exertion was categorized into a corresponding intensity zone: low (<4.0 arbitrary units), moderate (≥4.0 and <7.0), or high (≥7.0). Linear mixed models estimated session duration, intensity, and intensity distribution across the 3 preseason and 4 in-season periods, with effects assessed using magnitude-based inferences. Results: The distribution of the mean session intensity across the season was 29% low intensity, 57% moderate intensity, and 14% high intensity. While 96% of games were high intensity, 44% and 49% of skills training sessions were low intensity and moderate intensity, respectively. Running had the highest proportion of high-intensity training sessions (27%). Preseason displayed higher training-session intensity (effect size [ES] = 0.29–0.91) and duration (ES = 0.33–1.44), while in-season game intensity (ES = 0.31–0.51) and duration (ES = 0.51–0.82) were higher. Conclusions: By using a cost-effective monitoring tool, this study provides information about the intensity, duration, and intensity distribution of all training types across different phases of a season, thus allowing a greater understanding of the training and competition demands of Australian footballers.
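The CR-10 zone cutoffs described above can be sketched directly. The zone boundaries (low <4.0 AU, moderate ≥4.0 and <7.0, high ≥7.0) come from the abstract; the sample session RPEs are hypothetical values for illustration only.

```python
from collections import Counter

def intensity_zone(session_rpe):
    """Map a CR-10 session RPE to the zones used in the study:
    low (<4.0 AU), moderate (>=4.0 and <7.0 AU), high (>=7.0 AU)."""
    if session_rpe < 4.0:
        return "low"
    if session_rpe < 7.0:
        return "moderate"
    return "high"

# Hypothetical week of session RPEs for one player
sessions = [3.0, 5.5, 8.0, 4.0, 6.9, 7.0]
distribution = Counter(intensity_zone(r) for r in sessions)
print(distribution)
```

Tallying zones with a `Counter` like this yields the kind of intensity-distribution percentages the study reports (e.g., 29% low, 57% moderate, 14% high across the season).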
Fergus O’Connor, Heidi R. Thornton, Dean Ritchie, Jay Anderson, Lindsay Bull, Alex Rigby, Zane Leonard, Steven Stern and Jonathan D. Bartlett
Sprint capacity is an important attribute for team-sport athletes, yet the most appropriate method to analyze it is unclear. Purpose: To examine the relationship between sprint workloads using relative versus absolute thresholds and lower-body soft-tissue and bone-stress injury incidence in professional Australian rules football. Methods: Fifty-three professional Australian rules football athletes’ noncontact soft-tissue and bone-stress lower-body injuries (N = 62) were recorded, and sprint workloads were quantified over ∼18 months using the global positioning system. Sprint volume (m) and exposures (n) were determined using 2 methods: absolute (>24.9 km·h−1) and relative (≥75%, ≥80%, ≥85%, ≥90%, ≥95% of maximal velocity). Relationships between threshold methods and injury incidence were assessed using logistic generalized additive models. Incidence rate ratios and model performance (area under the curve) were reported. Results: Mean (SD) maximal velocity for the group was 31.5 (1.4) km·h−1 (range 28.6–34.9 km·h−1). In comparing relative and absolute thresholds, 75% maximal velocity equated to ~1.5 km·h−1 below the absolute speed threshold, while 80% and 85% maximal velocity were 0.1 and 1.7 km·h−1 above the absolute speed threshold, respectively. Model area under the curve ranged from 0.48 to 0.61. Very low and very high cumulative sprint loads ≥80% across a 4-week period, when measured relatively, resulted in higher incidence rate ratios (2.54–3.29) than absolute thresholds (1.18–1.58). Discussion: Monitoring sprinting volume relative to an athlete’s maximal velocity should be incorporated into athlete monitoring systems. Specifically, quantifying the distance covered at >80% maximal velocity will ensure greater accuracy in determining sprint workloads and associated injury risk.
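The relative-versus-absolute comparison above can be sketched numerically. The 24.9 km·h−1 absolute cutoff and the maximal-velocity range (28.6–34.9 km·h−1, group mean 31.5) are taken from the abstract; which athletes fall where is illustrative, not the study's per-athlete data.

```python
ABSOLUTE_THRESHOLD = 24.9  # km/h, the study's absolute sprint cutoff

def relative_threshold(max_velocity_kmh, fraction):
    """Speed (km/h) corresponding to a given fraction of an athlete's maximal velocity."""
    return max_velocity_kmh * fraction

# Slowest, mean, and fastest maximal velocities reported for the group
for vmax in (28.6, 31.5, 34.9):
    for frac in (0.75, 0.80, 0.85):
        speed = relative_threshold(vmax, frac)
        side = "above" if speed > ABSOLUTE_THRESHOLD else "below"
        print(f"vmax {vmax} km/h: {int(frac * 100)}% = {speed:.1f} km/h ({side} absolute cutoff)")
```

The sketch makes the abstract's point concrete: for a slower athlete even 85% of maximal velocity can sit below the absolute cutoff, while for a faster athlete 75% already approaches it, so a single absolute threshold misclassifies sprint exposure at both ends of the range.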
Fergus K. O’Connor, Steven E. Stern, Thomas M. Doering, Geoffrey M. Minett, Peter R. Reaburn, Jonathan D. Bartlett and Vernon G. Coffey
Context: Exercise in hot environments increases body temperature and thermoregulatory strain. However, little is known regarding the magnitude of effect that ambient temperature (Ta), relative humidity (RH), and solar radiation individually have on team-sport athletes. Purpose: To determine the effect of these individual heat-stress variables on team-sport training performance and recovery. Methods: Professional Australian Rules Football players (N = 45) undertook 8-wk preseason training producing a total of 579 outdoor field-based observations with Ta, RH, and solar radiation recorded at every training session. External load (distance covered, in m/min; percentage high-speed running [%HSR] >14.4 km/h) was collected via a global positioning system. Internal load (ratings of perceived exertion and heart rate) and recovery (subjective ratings of well-being and heart-rate variability [root mean square of the successive differences (rMSSD)]) were monitored throughout the training period. Mixed-effects linear models analyzed relationships between variables using standardized regression coefficients. Results: Increased solar-radiation exposure was associated with reduced distance covered (−19.7 m/min, P < .001), %HSR (−10%, P < .001) during training and rMSSD 48 h posttraining (−16.9 ms, P = .019). Greater RH was associated with decreased %HSR (−3.4%, P = .010) but increased percentage duration >85% HRmax (3.9%, P < .001), ratings of perceived exertion (1.8 AU, P < .001), and self-reported stress 24 h posttraining (−0.11 AU, P = .002). In contrast, higher Ta was associated with increased distance covered (19.7 m/min, P < .001) and %HSR (3.5%, P = .005). Conclusions: The authors show the importance of considering the individual factors contributing to thermal load in isolation for team-sport athletes and that solar radiation and RH reduce work capacity during team-sport training and have the potential to slow recovery between sessions.