Search Results

You are looking at 1–7 of 7 items for

  • Author: Jonathan D. Bartlett
Open access

Samuel Robertson, Jonathan D. Bartlett and Paul B. Gastin

Decision-support systems are used in team sport for a variety of purposes, including evaluating individual performance and informing athlete selection. A particularly common form of decision support is the traffic-light system, where color coding is used to indicate a given status of an athlete with respect to performance or training availability. However, despite relatively widespread use, there remains a lack of standardization with respect to how traffic-light systems are operationalized. This paper addresses a range of pertinent issues for practitioners relating to the practice of traffic-light monitoring in team sports. Specifically, the types and formats of data incorporated in such systems are discussed, along with the various analysis approaches available. Considerations relating to the visualization and communication of results to key stakeholders in the team-sport environment are also presented. To improve the efficacy of traffic-light systems, future iterations should incorporate the recommendations made here.
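The core of such a system is a mapping from a monitored metric to a color-coded flag. A minimal sketch is below; the standardized-score input and the amber/red cutoffs are illustrative assumptions, not thresholds from the paper, and in practice would be tailored to the metric, squad, and athlete:

```python
def traffic_light(z_score: float, amber: float = 1.0, red: float = 1.5) -> str:
    """Map an athlete's standardized monitoring score to a traffic-light flag.

    Cutoffs are hypothetical; deviations in either direction are flagged.
    """
    if abs(z_score) >= red:
        return "red"      # large deviation: review before training
    if abs(z_score) >= amber:
        return "amber"    # moderate deviation: monitor closely
    return "green"        # within normal range: train as planned
```

Operationalizing the cutoffs explicitly like this is one way to address the standardization gap the paper identifies.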

Restricted access

Dean Ritchie, Will G. Hopkins, Martin Buchheit, Justin Cordy and Jonathan D. Bartlett

Purpose:

Load monitoring in Australian football (AF) has been widely adopted, yet team-sport periodization strategies are relatively unknown. The authors aimed to quantify training and competition load across a season in an elite AF team, using rating of perceived exertion (RPE) and GPS tracking.

Methods:

Weekly totals for RPE and GPS loads (including accelerometer data; PlayerLoad) were obtained for 44 players across a full season for each training modality and for competition. General linear mixed models compared mean weekly load between 3 preseason and 4 in-season blocks. Effects were assessed with inferences about magnitudes standardized with between-players SD.

Results:

Total RPE load was most likely greater during preseason, where the majority of load was obtained via skills and conditioning. There was a large reduction in RPE load in the last preseason block. During in-season, half the total load came from games and the remaining half from training, predominantly skills and upper-body weights. Total distance, high-intensity running, and PlayerLoad showed large to very large reductions from preseason to in-season, whereas changes in mean speed were trivial across all blocks. All these effects were clear at the 99% level.

Conclusions:

These data provide useful information about targeted periods of loading and unloading across different stages of a season. The study also provides a framework for further investigation of training periodization in AF teams.
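The study's "inferences about magnitudes standardized with between-players SD" can be sketched as a Cohen's-d-style calculation. The weekly load values below are hypothetical, and this simple two-block comparison stands in for the study's general linear mixed models:

```python
from statistics import mean, stdev

# Hypothetical per-player mean weekly RPE load (AU) in two season blocks.
preseason = [3200, 2900, 3400, 3100, 3000]
in_season = [2500, 2400, 2700, 2600, 2300]

# Standardize the between-block difference by the between-players SD
# of the reference block, mirroring the study's approach in spirit.
between_sd = stdev(preseason)
effect = (mean(preseason) - mean(in_season)) / between_sd
print(f"standardized effect: {effect:.2f}")
```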

Restricted access

Jonathan D. Bartlett, Fergus O’Connor, Nathan Pitchford, Lorena Torres-Ronda and Samuel J. Robertson

Purpose:

The aim of this study was to quantify and predict relationships between rating of perceived exertion (RPE) and GPS training-load (TL) variables in professional Australian football (AF) players using group and individualized modeling approaches.

Methods:

TL data (GPS and RPE) for 41 professional AF players were obtained over a period of 27 wk. A total of 2711 training observations were analyzed with a total of 66 ± 13 sessions/player (range 39–89). Separate generalized estimating equations (GEEs) and artificial-neural-network analyses (ANNs) were conducted to determine the ability to predict RPE from TL variables (ie, session distance, high-speed running [HSR], HSR %, m/min) on a group and individual basis.

Results:

Prediction error for the individualized ANN (root-mean-square error [RMSE] 1.24 ± 0.41) was lower than the group ANN (RMSE 1.42 ± 0.44), individualized GEE (RMSE 1.58 ± 0.41), and group GEE (RMSE 1.85 ± 0.49). Both the GEE and ANN models determined session distance as the most important predictor of RPE. Furthermore, importance plots generated from the ANN revealed session distance as most predictive of RPE in 36 of the 41 players, whereas HSR was predictive of RPE in just 3 players and m/min was predictive of RPE in just 2 players.

Conclusions:

This study demonstrates that machine learning approaches may outperform more traditional methodologies with respect to predicting athlete responses to TL. These approaches enable further individualization of load monitoring, leading to more accurate training prescription and evaluation.
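The group-versus-individualized comparison can be illustrated with a toy version of the analysis. The sketch below uses simple least-squares fits on synthetic data in place of the study's GEE and ANN models, and all player slopes and session counts are hypothetical; it shows why per-player models reduce prediction error when athletes respond differently to the same load:

```python
import numpy as np

rng = np.random.default_rng(0)

def rmse(y, yhat):
    return float(np.sqrt(np.mean((y - yhat) ** 2)))

# Synthetic stand-in for the study's data: each player maps session
# distance to RPE with their own slope, so one group model is a
# compromise across players.
players = []
for slope in (0.6, 0.9, 1.3):           # hypothetical individual responses
    x = rng.uniform(2, 12, 60)          # session distance, km
    y = slope * x + rng.normal(0, 0.5, 60)
    players.append((x, y))

# Group model: one fit pooled over everyone.
X = np.concatenate([x for x, _ in players])
Y = np.concatenate([y for _, y in players])
b_group = np.polyfit(X, Y, 1)
group_rmse = rmse(Y, np.polyval(b_group, X))

# Individualized models: one fit per player.
ind_rmse = float(np.mean(
    [rmse(y, np.polyval(np.polyfit(x, y, 1), x)) for x, y in players]
))

print(f"group RMSE {group_rmse:.2f} vs individualized RMSE {ind_rmse:.2f}")
```

The individualized fits recover each player's own load–response relationship, so their pooled error is lower than the group model's, the same ordering the study reports for its ANN and GEE variants.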

Restricted access

Dean Ritchie, Will G. Hopkins, Martin Buchheit, Justin Cordy and Jonathan D. Bartlett

Context:

Training volume, intensity, and distribution are important factors during periods of return to play.

Purpose:

To quantify the effect of injury on training load (TL) before and after return to play (RTP) in professional Australian Rules football.

Methods:

Perceived training load (RPE-TL) for 44 players was obtained for all indoor and outdoor training sessions, while field-based training was monitored via GPS (total distance, high-speed running, mean speed). When a player sustained a competition time-loss injury, weekly TL was quantified for 3 wk before and after RTP. General linear mixed models, with inference about magnitudes standardized by between-players SDs, were used to quantify effects of lower- and upper-body injury on TL compared with the team.

Results:

While total RPE-TL was similar to the team 2 wk before RTP, training distribution was different, whereby skills RPE-TL was likely and most likely lower for upper- and lower-body injury, respectively, and most likely replaced with small to very large increases in running and other conditioning load. Weekly total distance and high-speed running were most likely moderately to largely reduced for lower- and upper-body injury until after RTP, at which point total RPE-TL, training distribution, total distance, and high-speed running were similar to the team. Mean speed of field-based training was similar before and after RTP compared with the team.

Conclusions:

Although injured athletes obtain training loads comparable to those of uninjured players, their training distribution differs until after RTP, indicating the importance of monitoring all types of training that athletes complete.

Restricted access

Nathan W. Pitchford, Sam J. Robertson, Charli Sargent, Justin Cordy, David J. Bishop and Jonathan D. Bartlett

Purpose:

To assess the effects of a change in training environment on the sleep characteristics of elite Australian Rules football (AF) players.

Methods:

In an observational crossover trial, 19 elite AF players had time in bed (TIB), total sleep time (TST), sleep efficiency (SE), and wake after sleep onset (WASO) assessed using wristwatch activity devices and subjective sleep diaries across 8-d home and camp periods. Repeated-measures ANOVA determined mean differences in sleep, training load (session rating of perceived exertion [RPE]), and environment. Pearson product–moment correlations, controlling for repeated observations on individuals, were used to assess the relationship between changes in sleep characteristics at home and camp. Cohen effect sizes (d) were calculated using individual means.

Results:

On camp, TIB (+34 min) and WASO (+26 min) increased compared with home. However, TST was similar between home and camp, significantly reducing camp SE (−5.82%). Individually, there were strong negative correlations for TIB and WASO (r = −.75 and r = −.72, respectively) and a moderate negative correlation for SE (r = −.46) between home values and relative changes on camp. Camp increased the relationship between individual session-RPE variation and TST variation compared with home (increased load: r = −.367 vs .051; reduced load: r = .319 vs −.033; camp vs home, respectively).

Conclusions:

Camp compromised sleep quality due to significantly increased TIB without increased TST. Individually, AF players with higher home SE experienced greater reductions in SE on camp. Together, this emphasizes the importance of individualized interventions for elite team-sport athletes when traveling and/or changing environments.
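The mechanism in the conclusion follows directly from the definition of sleep efficiency, SE = TST / TIB × 100: holding total sleep time fixed while time in bed grows necessarily lowers SE. The numbers below are illustrative (not the study's raw data), chosen to mirror the reported +34 min TIB and ~5.8% SE drop:

```python
def sleep_efficiency(tst_min: float, tib_min: float) -> float:
    """Sleep efficiency (%): total sleep time as a share of time in bed."""
    return 100.0 * tst_min / tib_min

# Same total sleep time spread over a longer time in bed lowers SE.
home = sleep_efficiency(tst_min=420, tib_min=480)        # 87.5%
camp = sleep_efficiency(tst_min=420, tib_min=480 + 34)   # ~81.7%
print(f"home {home:.1f}% vs camp {camp:.1f}% (delta {camp - home:.1f}%)")
```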

Restricted access

Farhan Juhari, Dean Ritchie, Fergus O’Connor, Nathan Pitchford, Matthew Weston, Heidi R. Thornton and Jonathan D. Bartlett

Context:

Team-sport training requires the daily manipulation of intensity, duration, and frequency, with preseason training focusing on meeting the demands of in-season competition and in-season training on maintaining fitness.

Purpose:

To provide information about daily training in Australian football (AF), this study aimed to quantify session intensity, duration, and intensity distribution across different stages of an entire season.

Methods:

Intensity (session ratings of perceived exertion; CR-10 scale) and duration were collected from 45 professional male AF players for every training session and game. Each session's rating of perceived exertion was assigned to a corresponding intensity zone: low (<4.0 arbitrary units), moderate (≥4.0 and <7.0), or high (≥7.0). Linear mixed models were constructed to estimate session duration, intensity, and distribution between the 3 preseason and 4 in-season periods, and effects were assessed using magnitude-based inferences.

Results:

The distribution of the mean session intensity across the season was 29% low intensity, 57% moderate intensity, and 14% high intensity. While 96% of games were high intensity, 44% and 49% of skills training sessions were low intensity and moderate intensity, respectively. Running had the highest proportion of high-intensity training sessions (27%). Preseason displayed higher training-session intensity (effect size [ES] = 0.29–0.91) and duration (ES = 0.33–1.44), while in-season game intensity (ES = 0.31–0.51) and duration (ES = 0.51–0.82) were higher.

Conclusions:

By using a cost-effective monitoring tool, this study provides information about the intensity, duration, and intensity distribution of all training types across different phases of a season, thus allowing a greater understanding of the training and competition demands of Australian footballers.
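The zone assignment described in the methods is a straightforward banding of the CR-10 score; the cutoffs below are taken directly from the abstract (low <4.0, moderate ≥4.0 and <7.0, high ≥7.0):

```python
def rpe_zone(session_rpe: float) -> str:
    """Classify a CR-10 session RPE into the study's intensity zones."""
    if session_rpe < 4.0:
        return "low"
    if session_rpe < 7.0:
        return "moderate"   # >= 4.0 and < 7.0
    return "high"           # >= 7.0
```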

Restricted access

Fergus O’Connor, Heidi R. Thornton, Dean Ritchie, Jay Anderson, Lindsay Bull, Alex Rigby, Zane Leonard, Steven Stern and Jonathan D. Bartlett

Sprint capacity is an important attribute for team-sport athletes, yet the most appropriate method to analyze it is unclear.

Purpose:

To examine the relationship between sprint workloads using relative versus absolute thresholds and lower-body soft-tissue and bone-stress injury incidence in professional Australian rules football.

Methods:

Noncontact soft-tissue and bone-stress lower-body injuries (N = 62) were recorded for 53 professional Australian rules football athletes, and sprint workloads were quantified over ∼18 months using the global positioning system. Sprint volume (m) and exposures (n) were determined using 2 methods: absolute (>24.9 km·h−1) and relative (≥75%, ≥80%, ≥85%, ≥90%, and ≥95% of maximal velocity). Relationships between threshold methods and injury incidence were assessed using logistic generalized additive models, and incidence rate ratios and model performance (area under the curve) were reported.

Results:

Mean (SD) maximal velocity for the group was 31.5 (1.4) km·h−1 (range 28.6–34.9 km·h−1). In comparing relative and absolute thresholds, 75% maximal velocity equated to ~1.5 km·h−1 below the absolute speed threshold, while 80% and 85% maximal velocity were 0.1 and 1.7 km·h−1 above the absolute speed threshold, respectively. Model area under the curve ranged from 0.48 to 0.61. Very low and very high cumulative sprint loads ≥80% across a 4-week period, when measured relatively, resulted in higher incidence rate ratios (2.54–3.29) than absolute thresholds (1.18–1.58).

Discussion:

Monitoring sprinting volume relative to an athlete's maximal velocity should be incorporated into athlete monitoring systems. Specifically, quantifying the distance covered at >80% maximal velocity will ensure greater accuracy in determining sprint workloads and associated injury risk.
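The relative-versus-absolute distinction can be made concrete with simple arithmetic. The sketch below uses the absolute threshold (24.9 km·h−1) and the maximal-velocity range (28.6–34.9 km·h−1) reported in the abstract; the two example athletes at the extremes of that range are hypothetical:

```python
ABSOLUTE_KMH = 24.9  # absolute sprint threshold reported in the abstract

def relative_threshold(max_velocity_kmh: float, fraction: float) -> float:
    """Speed (km/h) corresponding to a fraction of an athlete's max velocity."""
    return fraction * max_velocity_kmh

# Hypothetical athletes at the extremes of the reported velocity range:
# the slower athlete's 80% threshold falls below the absolute cutoff,
# while the faster athlete's sits well above it.
for vmax in (28.6, 34.9):
    rel80 = relative_threshold(vmax, 0.80)
    print(f"vmax {vmax} km/h: 80% threshold {rel80:.1f} km/h "
          f"({rel80 - ABSOLUTE_KMH:+.1f} vs absolute)")
```

This is the practical point of the discussion: a single absolute cutoff over- or under-counts sprint exposure depending on where an athlete sits in the squad's velocity range.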