Purpose: To compare relative and absolute speed and metabolic thresholds for quantifying match output in elite rugby league. Methods: Twenty-six professional players competing in the National Rugby League were monitored with global positioning systems (GPS) across a rugby-league season. Absolute speed (moderate-intensity running [MIRTh > 3.6 m/s] and high-intensity running [HIRTh > 5.2 m/s]) and metabolic (>20 W/kg) thresholds were compared with individualized ventilatory (first [VT1IFT] and second [VT2IFT]) thresholds estimated from the 30-15 Intermittent Fitness Test (30-15IFT), as well as the metabolic threshold associated with VT2IFT (HPmetVT2), to examine differences in match-play demands. Results: VT2IFT mean values represented increases in the HIR dose of 146%, 138%, 167%, and 144% across adjustables, edge forwards, middle forwards, and outside backs, respectively. Distance covered above VT2IFT was almost certainly greater (ES range = 0.79–1.03) than that covered above the absolute thresholds across all positions. Trivial to small differences were observed between VT1IFT and MIRTh, while small to moderate differences were reported between HPmetVT2 and HPmetTh. Conclusions: These results reveal that the speed at which players begin to run at higher intensities depends on individual capacities and attributes. As such, using absolute HIR speed thresholds underestimates the physical HIR load. Moreover, absolute MIR and high metabolic thresholds may over- or underestimate the work undertaken above these thresholds, depending on the fitness of the individual. Therefore, using relative thresholds enables better prescription and monitoring of external training loads based on measured individual physical capacities.
Tannath J. Scott, Heidi R. Thornton, Macfarlane T.U. Scott, Ben J. Dascombe, and Grant M. Duthie
Daniel J. Plews, Ben Scott, Marco Altini, Matt Wood, Andrew E. Kilding, and Paul B. Laursen
Purpose: To establish the validity of smartphone photoplethysmography (PPG) and a heart-rate sensor for the measurement of heart-rate variability (HRV). Methods: 29 healthy subjects were measured at rest during 5 min of guided breathing and normal breathing using smartphone PPG, a heart-rate chest strap, and electrocardiography (ECG). The root mean square of successive differences between adjacent R–R intervals (rMSSD) was determined from each device. Results: Compared with ECG, the technical error of estimate (TEE) was acceptable for all conditions (average TEE CV% [90% CI] = 6.35 [5.13; 8.5]). When assessed as a standardized difference, all differences were deemed “trivial” (average standard difference [90% CI] = 0.10 [0.08; 0.13]). Both PPG- and heart-rate-sensor-derived measures had almost perfect correlations with ECG (R = 1.00 [0.99; 1.00]). Conclusion: Both PPG and heart-rate sensors provide acceptable agreement for the measurement of rMSSD when compared with ECG. Smartphone PPG technology may be a preferred method of HRV data collection for athletes because of its practicality and ease of use in the field.
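The rMSSD statistic used above is straightforward to compute from a series of R–R intervals. A minimal sketch (the sample interval values below are invented for illustration, not study data):

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between adjacent R-R intervals (ms)."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical 6-beat R-R series in milliseconds
rr = [812, 830, 805, 841, 822, 815]
print(round(rmssd(rr), 1))  # → 23.0
```

Because rMSSD is built from beat-to-beat differences, it is sensitive to the short-term (parasympathetically mediated) variability that field HRV monitoring targets.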
Jace A. Delaney, Grant M. Duthie, Heidi R. Thornton, Tannath J. Scott, David Gay, and Ben J. Dascombe
Rugby league involves frequent periods of high-intensity running including acceleration and deceleration efforts, often occurring at low speeds.
To quantify the energetic cost of running and acceleration efforts during rugby league competition to aid in prescription and monitoring of training.
Global positioning system (GPS) data were collected from 37 professional rugby league players across 2 seasons. Peak values for relative distance, average acceleration/deceleration, and metabolic power (Pmet) were calculated for 10 different moving-average durations (1–10 min) for each position. A mixed-effects model was used to assess the effect of position for each duration, and individual comparisons were made using a magnitude-based-inference network.
There were almost certainly large differences in relative distance and Pmet between the 10-min window and all moving averages <5 min in duration (ES = 1.21–1.88). Fullbacks, halves, and hookers covered greater relative distances than outside backs, edge forwards, and middle forwards for moving averages lasting 2–10 min. Acceleration/deceleration demands were greatest in hookers and halves compared with fullbacks, middle forwards, and outside backs. Pmet was greatest in hookers, halves, and fullbacks compared with middle forwards and outside backs.
Competition running intensities varied by both position and moving-average duration. Hookers exhibited the greatest Pmet of all positions, due to high involvement in both attack and defense. Fullbacks also reached high Pmet, possibly due to a greater absolute volume of running. This study provides coaches with match data that can be used for the prescription and monitoring of specific training drills.
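Metabolic power (Pmet) in GPS analyses of this kind is commonly estimated from speed and acceleration using the equivalent-slope model of di Prampero and Osgnach; the sketch below assumes that formulation (the polynomial coefficients are from the published model; the sample inputs are invented):

```python
import math

G = 9.81  # gravitational acceleration (m/s^2)

def metabolic_power(speed, accel):
    """Estimate Pmet (W/kg) from speed (m/s) and acceleration (m/s^2)
    via the equivalent-slope approach (di Prampero/Osgnach model)."""
    es = accel / G                    # equivalent slope
    em = math.sqrt(es * es + 1.0)     # equivalent mass factor
    # Energy cost of accelerated running (J/kg/m); 3.6 is the cost of
    # constant-speed running on flat terrain in this model.
    ec = (155.4 * es**5 - 30.4 * es**4 - 43.3 * es**3
          + 46.3 * es**2 + 19.5 * es + 3.6) * em
    return ec * speed

# Constant-speed running at 5 m/s: Pmet = 3.6 J/kg/m x 5 m/s
print(round(metabolic_power(5.0, 0.0), 1))  # → 18.0
```

This is why accelerations at low speeds can carry a high energetic cost: the acceleration term inflates the energy cost per metre even when velocity-based thresholds register little activity.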
Heidi R. Thornton, Jace A. Delaney, Grant M. Duthie, Brendan R. Scott, William J. Chivers, Colin E. Sanctuary, and Ben J. Dascombe
To identify factors contributing to the incidence of illness in professional team-sport athletes, using training load (TL), self-reported illness, and well-being data.
Thirty-two professional rugby league players (26.0 ± 4.8 y, 99.1 ± 9.6 kg, 1.84 ± 0.06 m) were recruited from the same club. Players participated in prescribed training and responded to a series of questionnaires to determine the presence of self-reported illness and markers of well-being. Internal TL was determined using the session rating of perceived exertion. These data were collected over 29 wk, across the preparatory and competition macrocycles.
The predictive models identified increases in internal TL (strain >2282 AU, weekly TL >2786 AU, and monotony >0.78 AU) as the best predictors of an increased risk of self-reported illness. In addition, a reduction in overall well-being (<7.25 AU) in the presence of increased internal TL was highlighted as a contributor to self-reported-illness occurrence.
These results indicate that self-report data can be successfully used to provide a novel understanding of the interactions between competition-associated stressors experienced by professional team-sport athletes and their susceptibility to illness. This may help coaching staff more effectively monitor players during the season and potentially implement preventive measures to reduce the likelihood of illnesses occurring.
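The TL metrics reported above (weekly load, monotony, strain) follow Foster's session-RPE approach: session load = RPE × duration, monotony = mean daily load ÷ SD of daily load, and strain = weekly load × monotony. A minimal sketch with invented session data:

```python
import statistics

def srpe_load(rpe, minutes):
    # Session load (AU) = session RPE x duration (Foster's method)
    return rpe * minutes

def weekly_metrics(daily_loads):
    """Weekly TL, monotony (mean/SD of daily loads), and strain (TL x monotony)."""
    weekly_tl = sum(daily_loads)
    monotony = statistics.mean(daily_loads) / statistics.stdev(daily_loads)
    strain = weekly_tl * monotony
    return weekly_tl, monotony, strain

# One hypothetical training week: (RPE, minutes), with two rest days
sessions = [(7, 60), (5, 45), (0, 0), (8, 75), (6, 50), (4, 30), (0, 0)]
loads = [srpe_load(r, m) for r, m in sessions]
tl, mono, strain = weekly_metrics(loads)
```

Note that monotony rewards evenly distributed loading: the same weekly TL spread over fewer, harder days yields a lower SD, higher monotony, and therefore higher strain.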
Jace A. Delaney, Heidi R. Thornton, Tannath J. Scott, David A. Ballard, Grant M. Duthie, Lisa G. Wood, and Ben J. Dascombe
High levels of lean mass are important in collision-based sports for the development of strength and power, which may also assist during contact situations. While skinfold-based measures have been shown to be appropriate for cross-sectional assessments of body composition, their utility in tracking changes in lean mass is less clear.
To determine the most effective method of quantifying changes in lean mass in rugby league athletes.
Body composition of 21 professional rugby league players was assessed on 2 or 3 occasions separated by ≥6 wk, using bioelectrical impedance analysis (BIA), the lean-mass index (LMI), and a skinfold-based prediction equation (SkF). Dual-energy X-ray absorptiometry provided a criterion measure of fat-free mass (FFM). Correlation coefficients (r) and standard errors of the estimate (SEE) were used as measures of validity for the estimates.
All 3 practical estimates exhibited strong validity for cross-sectional assessments of FFM (r > .9, P < .001). The correlation between change scores was stronger for the LMI (r = .69, SEE = 1.3 kg) and the SkF method (r = .66, SEE = 1.4 kg) than for BIA (r = .50, SEE = 1.6 kg).
The LMI is probably as accurate in predicting changes in FFM as SkF and very likely more appropriate than BIA. The LMI therefore offers an adequate, practical alternative for assessing changes in FFM among rugby league athletes.
Jace A. Delaney, Tannath J. Scott, Heidi R. Thornton, Kyle J.M. Bennett, David Gay, Grant M. Duthie, and Ben J. Dascombe
Rugby league coaches often prescribe training to replicate the demands of competition. The intensities of running drills are often monitored in comparison with absolute match-play measures. Such measures may not be sensitive enough to detect fluctuations in intensity across a match or to differentiate between positions.
To determine the position- and duration-specific running intensities of rugby league competition, using a moving-average method, for the prescription and monitoring of training.
Data from a 15-Hz global positioning system (GPS) were collected from 32 professional rugby league players across a season. The velocity–time curve was analyzed using a rolling-average method, where maximum values were calculated for 10 different durations (1–10 min) for each player across each match.
There were large differences between the 1- and 2-min rolling averages and all other rolling-average durations. Smaller differences were observed between rolling averages of longer durations. Fullbacks maintained a greater velocity than outside backs and middle and edge forwards over the 1- and 2-min rolling averages (ES = 0.8–1.2, P < .05). For rolling averages of 3 min and longer, the running demands of fullbacks were greater than those of middle forwards and outside backs (ES = 1.1–1.4, P < .05).
These findings suggest that the running demands of rugby league fluctuate substantially across a match. Fullbacks were the only position to exhibit greater running intensities than all other positions, and training prescription should reflect this.
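The rolling-average method described in these abstracts can be sketched as follows; a running-sum sliding window keeps each duration O(n). The sampling rate and sample data here are illustrative assumptions, not the study's processing pipeline:

```python
def peak_moving_averages(speeds, hz=15, durations_min=(1, 2, 5, 10)):
    """Peak rolling-average speed (m/s) for each window duration.

    speeds: instantaneous GPS speed samples (m/s); hz: sampling rate.
    """
    peaks = {}
    for mins in durations_min:
        w = mins * 60 * hz                  # window length in samples
        if w > len(speeds):
            continue                        # match segment shorter than window
        window_sum = sum(speeds[:w])
        best = window_sum / w
        # Slide the window one sample at a time, updating the running sum
        for i in range(w, len(speeds)):
            window_sum += speeds[i] - speeds[i - w]
            best = max(best, window_sum / w)
        peaks[mins] = best
    return peaks
```

Because peak averages necessarily decline as the window lengthens, comparing drill intensities against a whole-match average (rather than the duration-matched peak) will understate the worst-case demands of short periods.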