Search Results

You are looking at 1–10 of 17 items for:

  • Author: Ben J. Dascombe
  • International Journal of Sports Physiology and Performance
  • Physical Education and Coaching

The Match Demands of Australian Rules Football Umpires in a State-Based Competition

Nathan Elsworthy and Ben J. Dascombe

Purpose:

To quantify the match running demands and physiological intensities of Australian football (AF) field and boundary umpires during match play.

Methods:

Thirty-five AF umpires [20 field (age: 24.7 ± 7.7 y, body mass: 74.3 ± 7.1 kg, Σ7 skinfolds: 67.8 ± 18.8 mm); 15 boundary (age: 29.6 ± 13.6 y, body mass: 71.9 ± 3.1 kg, Σ7 skinfolds: 65.6 ± 8.8 mm)] volunteered to participate in the study. Movement characteristics [total distance (TD), average running speed, high-intensity activity (HIA; >14.4 km·h–1) distance] and physiological measures [heart rate, blood lactate concentration ([BLa–]), and rating of perceived exertion] were collected during 20 state-based AF matches.
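
To make the movement-characteristic definitions above concrete, the sketch below shows one way total distance, average running speed, and HIA distance (>14.4 km·h–1) could be derived from a GPS speed trace. The 10 Hz sampling rate, function names, and synthetic data are illustrative assumptions, not the study's processing pipeline.

```python
# Minimal sketch: deriving total distance, average running speed, and
# high-intensity activity (HIA; >14.4 km/h) distance from a GPS speed trace.
# The 10 Hz sampling rate and synthetic speeds are illustrative assumptions.
import numpy as np

HIA_THRESHOLD_KMH = 14.4      # high-intensity activity threshold used in the study
SAMPLE_RATE_HZ = 10           # assumed GPS sampling rate

def movement_summary(speed_kmh: np.ndarray) -> dict:
    """Summarise a speed trace (km/h, sampled at SAMPLE_RATE_HZ)."""
    dt_s = 1.0 / SAMPLE_RATE_HZ
    speed_ms = speed_kmh / 3.6                      # convert to m/s
    distance_m = speed_ms * dt_s                    # distance per sample
    total_distance_m = distance_m.sum()
    duration_min = len(speed_kmh) * dt_s / 60.0
    hia_distance_m = distance_m[speed_kmh > HIA_THRESHOLD_KMH].sum()
    return {
        "total_distance_m": round(total_distance_m, 1),
        "avg_speed_m_per_min": round(total_distance_m / duration_min, 1),
        "hia_distance_m": round(hia_distance_m, 1),
    }

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # ~30 min of synthetic speeds between 0 and 20 km/h
    fake_trace = rng.uniform(0, 20, size=30 * 60 * SAMPLE_RATE_HZ)
    print(movement_summary(fake_trace))
```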

Results:

The mean (± SD) TD covered by field umpires was 11,492 ± 1,729 m, with boundary umpires covering 15,061 ± 1,749 m. The average running speed was 103 ± 14 m·min–1 for field umpires and 134 ± 14 m·min–1 for boundary umpires. Field and boundary umpires covered 3,095 ± 752 m and 5,875 ± 1,590 m during HIA, respectively. In the first quarter, HIA distance (field: P = .004, η2 = 0.071; boundary: P < .001, η2 = 0.180) and average running speed (field: P = .002, η2 = 0.078; boundary: P < .001, η2 = 0.191) were significantly greater than in subsequent quarters.

Conclusions:

The results demonstrate that both AF field and boundary umpires complete running demands similar to those of elite AF players and are subject to physical fatigue. Further research is warranted to determine whether this physical fatigue affects the cognitive function of AF umpires during match play.


Physiological Characteristics of Well-Trained Junior Sprint Kayak Athletes

Thiago Oliveira Borges, Ben Dascombe, Nicola Bullock, and Aaron J. Coutts

This study aimed to profile the physiological characteristics of junior sprint kayak athletes (n = 21, VO2max 4.1 ± 0.7 L/min, training experience 2.7 ± 1.2 y) and to establish the relationship between physiological variables (VO2max, VO2 kinetics, muscle-oxygen kinetics, paddling efficiency) and sprint kayak performance. VO2max, power at VO2max, power:weight ratio, paddling efficiency, VO2 at lactate threshold, and whole-body and muscle oxygen kinetics were determined on a kayak ergometer in the laboratory. Separately, on-water time trials (TT) were completed over 200 m and 1000 m. Large to nearly perfect (−.5 to −.9) inverse relationships were found between the physiological variables and on-water TT performance across both distances. Paddling efficiency and lactate threshold shared moderate to very large correlations (−.4 to −.7) with 200- and 1000-m performance. In addition, trivial to large correlations (−.11 to −.5) were observed between muscle-oxygenation parameters, muscle and whole-body oxygen kinetics, and performance. Multiple regression showed that 88% of the unadjusted variance for the 200-m TT performance was explained by VO2max, peripheral muscle deoxygenation, and maximal aerobic power (P < .001), whereas 85% of the unadjusted variance in 1000-m TT performance was explained by VO2max and deoxyhemoglobin (P < .001). The current findings show that well-trained junior sprint kayak athletes possess a high level of relative aerobic fitness and highlight the importance of the peripheral muscle metabolism for sprint kayak performance, particularly in 200-m races, where finalists and nonfinalists are separated by very small margins. Such data highlight the relative aerobic-fitness variables that can be used as benchmarks for talent-identification programs or monitoring longitudinal athlete development. However, such approaches need further investigation.
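
As a rough illustration of the multiple-regression step reported above (unadjusted variance in time-trial performance explained by a set of physiological predictors), the sketch below fits an ordinary least-squares model and reports R². The predictor names, units, and synthetic data are assumptions for demonstration only and do not reproduce the study's dataset.

```python
# Minimal sketch: unadjusted variance explained (R^2) for a time trial
# predicted from physiological variables, in the spirit of the multiple
# regression described above. All data here are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 21  # same order of magnitude as the study's sample; value assumed

vo2max = rng.normal(4.1, 0.7, n)             # L/min
deoxygenation = rng.normal(30.0, 5.0, n)     # arbitrary units
max_aerobic_power = rng.normal(250, 30, n)   # W

# Fabricate a 200-m TT time loosely related to the predictors (illustration only)
tt_200m_s = 45 - 2.0 * vo2max - 0.02 * max_aerobic_power + rng.normal(0, 0.5, n)

X = np.column_stack([vo2max, deoxygenation, max_aerobic_power])
model = LinearRegression().fit(X, tt_200m_s)
print(f"Unadjusted R^2: {model.score(X, tt_200m_s):.2f}")
```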


Time–Motion Analysis of a 2-Hour Surfing Training Session

Josh L. Secomb, Jeremy M. Sheppard, and Ben J. Dascombe

Purpose:

To provide a descriptive and quantitative time–motion analysis of surfing training with the use of global positioning system (GPS) and heart-rate (HR) technology.

Methods:

Fifteen male surfing athletes (22.1 ± 3.9 y, 175.4 ± 6.4 cm, 72.5 ± 7.7 kg) performed a 2-h surfing training session, wearing both a GPS unit and an HR monitor. An individual digital video recording was taken of the entire surfing duration. Repeated-measures ANOVAs were used to determine any effects of time on the physical and physiological measures.

Results:

Participants covered 6293.2 ± 1826.1 m during the 2-h surfing training session and recorded measures of average speed, HRaverage, and HRpeak as 52.4 ± 15.2 m/min, 128 ± 13 beats/min, and 171 ± 12 beats/min, respectively. Furthermore, the relative mean times spent performing paddling, sprint paddling to catch waves, stationary, wave riding, and recovery of the surfboard were 42.6% ± 9.9%, 4.1% ± 1.2%, 52.8% ± 12.4%, 2.5% ± 1.9%, and 2.1% ± 1.7%, respectively.
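
The relative time percentages above come from coding video into movement categories and expressing each category's accumulated duration relative to the session length. A minimal sketch of that bookkeeping follows; the category labels and segment durations are assumptions for illustration.

```python
# Minimal sketch: converting coded video segments into the relative time
# (% of session) spent in each activity category, as reported above.
# Category labels and segment durations are illustrative assumptions.
from collections import defaultdict

# (category, duration in seconds) pairs as they might come from video coding
segments = [
    ("paddling", 310), ("stationary", 420), ("sprint_paddling", 25),
    ("wave_riding", 18), ("board_recovery", 15), ("paddling", 280),
    ("stationary", 390),
]

totals = defaultdict(float)
for category, duration_s in segments:
    totals[category] += duration_s

session_s = sum(totals.values())
for category, duration_s in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{category:>15}: {100 * duration_s / session_s:5.1f}%")
```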

Conclusion:

The results demonstrate that a 2-h surfing training session is performed at a lower intensity than competitive heats. This is likely due to the onset of fatigue and a pacing strategy used by participants. Furthermore, surfing training sessions do not appear to appropriately condition surfers for competitive events. As a result, coaches working with surfing athletes should consider altering training sessions to incorporate repeated-effort sprint paddling to more effectively physically prepare surfers for competitive events.


Effects of Preseason Training on the Sleep Characteristics of Professional Rugby League Players

Heidi R. Thornton, Jace A. Delaney, Grant M. Duthie, and Ben J. Dascombe

Purpose:

To investigate the influence of daily and exponentially weighted moving training loads on subsequent nighttime sleep.

Methods:

Sleep of 14 professional rugby league athletes competing in the National Rugby League was recorded using wristwatch actigraphy. Physical demands were quantified using GPS technology, including total distance, high-speed distance, acceleration/deceleration load (SumAccDec; AU), and session rating of perceived exertion (AU). Linear mixed models determined effects of acute (daily) and subacute (3- and 7-d) exponentially weighted moving averages (EWMA) on sleep.

Results:

Higher daily SumAccDec was associated with increased sleep efficiency (effect-size correlation; ES = 0.15; ±0.09) and sleep duration (ES = 0.12; ±0.09). Greater 3-d EWMA SumAccDec was associated with increased sleep efficiency (ES = 0.14; ±0.09) and an earlier bedtime (ES = 0.14; ±0.09). An increase in 7-d EWMA SumAccDec was associated with heightened sleep efficiency (ES = 0.15; ±0.09) and earlier bedtimes (ES = 0.15; ±0.09).

Conclusions:

The direction of the associations between training loads and sleep varied, but the strongest relationships showed that higher training loads increased various measures of sleep. Practitioners should be aware of the increased requirement for sleep during intensified training periods, using this information in the planning and implementation of training and individualized recovery modalities.
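
A minimal sketch of the acute (daily) and 3- and 7-day EWMA load calculations described in the Methods is shown below. It assumes the common decay formulation lambda = 2/(N + 1) (pandas' span parameter) and uses synthetic daily acceleration/deceleration loads; the study's exact smoothing constants are not reproduced here.

```python
# Minimal sketch: acute (daily) load plus 3- and 7-day exponentially weighted
# moving averages (EWMA) of a daily training-load series. The decay
# lambda = 2/(N + 1) is one common formulation and is assumed here; the
# load values are synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
daily_load = pd.Series(rng.integers(0, 900, size=28), name="sum_acc_dec_au")

ewma_3d = daily_load.ewm(span=3, adjust=False).mean()
ewma_7d = daily_load.ewm(span=7, adjust=False).mean()

summary = pd.DataFrame({
    "daily": daily_load,
    "ewma_3d": ewma_3d.round(1),
    "ewma_7d": ewma_7d.round(1),
})
print(summary.tail())
```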


Gender-Specific Activity Demands Experienced During Semiprofessional Basketball Game Play

Aaron T. Scanlan, Ben J. Dascombe, Andrew P. Kidcaff, Jessica L. Peucker, and Vincent J. Dalbo

Purpose:

To compare game activity demands between female and male semiprofessional basketball players.

Methods:

Female (n = 12) and male (n = 12) semiprofessional basketball players were monitored across 3 competitive games. Time–motion-analysis procedures quantified player activity into predefined movement categories across backcourt (BC) and frontcourt (FC) positions. Activity frequencies, durations, and distances were calculated relative to live playing time (min). Work:rest ratios were also calculated using the video data. Game activity was compared between genders for each playing position and all players.
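
Work:rest ratios of the kind described above are typically derived by classifying each coded movement category as work or rest and summing the corresponding live-time durations. The sketch below illustrates that calculation; the category labels, work/rest groupings, and durations are assumptions for illustration.

```python
# Minimal sketch: computing a work:rest ratio from time-motion data by
# classifying movement categories as "work" or "rest". The category
# labels and durations below are illustrative assumptions.
WORK_CATEGORIES = {"sprint", "run", "jump", "low_shuffle", "high_shuffle", "dribble"}
REST_CATEGORIES = {"stand", "walk"}

# (category, live-time duration in seconds) pairs from video coding
events = [("run", 6.2), ("stand", 4.0), ("sprint", 2.1), ("walk", 7.5),
          ("low_shuffle", 3.3), ("jump", 0.8), ("stand", 5.1)]

work_s = sum(d for c, d in events if c in WORK_CATEGORIES)
rest_s = sum(d for c, d in events if c in REST_CATEGORIES)
print(f"work:rest = 1:{rest_s / work_s:.2f}")
```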

Results:

Female players performed at greater running work-rates than male players (45.7 ± 1.4 vs. 42.1 ± 1.7 m/min, P = .05), while male players performed more dribbling than female players (2.5 ± 0.3 vs. 3.0 ± 0.2 s/min; 8.4 ± 0.3 vs. 9.7 ± 0.7 m/min, P = .05). Positional analyses revealed that female BC players performed more low-intensity shuffling (P = .04) and jumping (P = .05), as well as longer jogging durations (P = .04), than male BC players. Female FC players executed more upper-body activity (P = .03) and recorded larger work:rest ratios (P < .001) than male FC players. No significant gender differences were observed in the overall intermittent demands, distance traveled, high-intensity shuffling activity, or sprinting requirements during game play.

Conclusions:

These findings indicate that gender-specific running and dribbling differences might exist in semiprofessional basketball. Furthermore, position-specific variations between female and male basketball players should be considered. These data may prove useful in the development of gender-specific conditioning plans relative to playing position in basketball.


Cumulative Training Dose’s Effects on Interrelationships Between Common Training-Load Models During Basketball Activity

Aaron T. Scanlan, Jordan L. Fox, Nattai R. Borges, Ben J. Dascombe, and Vincent J. Dalbo

Purpose:

The influence of various factors on training-load (TL) responses in basketball has received limited attention. This study aimed to examine the temporal changes and influence of cumulative training dose on TL responses and interrelationships during basketball activity.

Methods:

Ten state-level Australian male junior basketball players completed 4 × 10-min standardized bouts of simulated basketball activity using a circuit-based protocol. Internal TL was quantified using the session rating of perceived exertion (sRPE), summated heart-rate zones (SHRZ), Banister training impulse (TRIMP), and Lucia TRIMP models. External TL was assessed via measurement of mean sprint and circuit speeds. Temporal TL comparisons were performed between 10-min bouts, while Pearson correlation analyses were conducted across cumulative training doses (0–10, 0–20, 0–30, and 0–40 min).
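
For context on the four internal-TL models named above, the sketch below implements their commonly published formulations (session-RPE load, Edwards' summated heart-rate zones, Banister TRIMP with the male weighting factor, and Lucia TRIMP). The constants and zone definitions are the textbook versions and may differ slightly from those applied in the study.

```python
# Minimal sketch of the four internal training-load models named above,
# using their commonly published formulations (constants and zone
# definitions may differ slightly from those applied in the study).
import numpy as np

def srpe_load(rpe: float, duration_min: float) -> float:
    """Session-RPE load: RPE (CR-10) multiplied by session duration (min)."""
    return rpe * duration_min

def shrz_load(minutes_in_zone: list[float]) -> float:
    """Summated-heart-rate-zones (Edwards) load: 5 HR zones weighted 1-5."""
    return sum(w * m for w, m in zip(range(1, 6), minutes_in_zone))

def banister_trimp(duration_min: float, hr_ex: float, hr_rest: float,
                   hr_max: float) -> float:
    """Banister TRIMP using the male weighting factor."""
    dhr = (hr_ex - hr_rest) / (hr_max - hr_rest)
    return duration_min * dhr * 0.64 * np.exp(1.92 * dhr)

def lucia_trimp(min_below_vt1: float, min_vt1_vt2: float,
                min_above_vt2: float) -> float:
    """Lucia TRIMP: time in three ventilatory-threshold zones weighted 1-3."""
    return 1 * min_below_vt1 + 2 * min_vt1_vt2 + 3 * min_above_vt2

if __name__ == "__main__":
    print("sRPE TL:", srpe_load(rpe=7, duration_min=40))
    print("SHRZ:", shrz_load([5, 10, 15, 8, 2]))
    print("Banister TRIMP:", round(banister_trimp(40, 165, 60, 200), 1))
    print("Lucia TRIMP:", lucia_trimp(15, 18, 7))
```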

Results:

sRPE TL increased (P < .05) after the first 10-min bout of basketball activity. sRPE TL was only significantly related to Lucia TRIMP (r = .66–.69; P < .05) across 0–10 and 0–20 min. Similarly, mean sprint and circuit speed were significantly correlated across 0–20 min (r = .67; P < .05). In contrast, SHRZ and Banister TRIMP were significantly related across all training doses (r = .84–.89; P < .05).

Conclusions:

Limited convergence exists between common TL approaches across basketball training doses lasting beyond 20 min. Thus, the interchangeability of commonly used internal and external TL approaches appears dose-dependent during basketball activity, with various psychophysiological mediators likely underpinning temporal changes.


Importance of Various Training-Load Measures in Injury Incidence of Professional Rugby League Athletes

Heidi R. Thornton, Jace A. Delaney, Grant M. Duthie, and Ben J. Dascombe

Purpose:

To investigate the ability of various internal and external training-load (TL) monitoring measures to predict injury incidence among positional groups in professional rugby league athletes.

Methods:

TL and injury data were collected across 3 seasons (2013–2015) from 25 players competing in National Rugby League competition. Daily TL data were included in the analysis, including session rating of perceived exertion (sRPE-TL), total distance (TD), high-speed-running distance (>5 m/s), and high-metabolic-power distance (HPD; >20 W/kg). Rolling sums were calculated, nontraining days were removed, and athletes’ corresponding injury status was marked as “available” or “unavailable.” Linear (generalized estimating equations) and nonlinear (random forest; RF) statistical methods were adopted.
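
A minimal sketch of the load-feature and random-forest steps described in the Methods follows: rolling sums of daily loads over 7-, 14-, 21-, and 28-day windows and a classifier of daily availability. The feature names, window handling, and synthetic data are assumptions; the generalized-estimating-equation analysis is not reproduced.

```python
# Minimal sketch: rolling-sum load features and a random-forest classifier
# for daily injury availability, in the spirit of the approach described
# above. The data are synthetic and the feature names are assumptions.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
days = 300
df = pd.DataFrame({
    "srpe_tl": rng.integers(0, 1000, days),
    "total_distance": rng.integers(0, 9000, days),
    "high_speed_distance": rng.integers(0, 600, days),
})

# Rolling sums over 7-, 14-, 21-, and 28-day windows, as in the abstract
for window in (7, 14, 21, 28):
    for col in ("srpe_tl", "total_distance", "high_speed_distance"):
        df[f"{col}_{window}d"] = df[col].rolling(window, min_periods=1).sum()

# Synthetic availability label ("available" = 1, "unavailable" = 0)
y = (rng.random(days) > 0.1).astype(int)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(df, y)
importance = pd.Series(model.feature_importances_, index=df.columns)
print(importance.sort_values(ascending=False).head(5))
```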

Results:

Injury risk factors varied according to positional group. For adjustables, the TL variables associated most highly with injury were 7-d TD and 7-d HPD, whereas for hit-up forwards they were sRPE-TL ratio and 14-d TD. For outside backs, 21- and 28-d sRPE-TL were identified, and for wide-running forwards, sRPE-TL ratio. The individual RF models showed that the importance of the TL variables in injury incidence varied between athletes.

Conclusions:

Differences in risk factors were recognized between positional groups and individual athletes, likely due to varied physiological capacities and physical demands. Furthermore, these results suggest that robust machine-learning techniques can appropriately monitor injury risk in professional team-sport athletes.


Factors That Influence Running Intensity in Interchange Players in Professional Rugby League

Jace A. Delaney, Heidi R. Thornton, Grant M. Duthie, and Ben J. Dascombe

Background:

Rugby league coaches adopt replacement strategies for their interchange players to maximize running intensity; however, it is important to understand the factors that may influence match performance.

Purpose:

To assess the independent factors affecting running intensity sustained by interchange players during professional rugby league.

Methods:

Global positioning system (GPS) data were collected from all interchanged players (starters and nonstarters) in a professional rugby league squad across 24 matches of a National Rugby League season. A multilevel mixed-model approach was employed to establish the effect of various technical (attacking and defensive involvements), temporal (bout duration, time in possession, etc.), and situational (season phase, recovery cycle, etc.) factors on the relative distance covered and average metabolic power (Pmet) during competition. Significant effects were standardized using correlation coefficients, and the likelihood of the effect was described using magnitude-based inferences.
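
To illustrate the modelling approach described above, the sketch below fits a mixed-effects model of relative distance with a random intercept for each player using statsmodels. The predictors, coefficients, and synthetic data are illustrative assumptions only, not the study's model specification.

```python
# Minimal sketch: a multilevel (mixed-effects) model of relative distance
# with a random intercept per player, loosely mirroring the modelling
# approach described above. Predictors and data are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n_bouts = 400
df = pd.DataFrame({
    "player_id": rng.integers(0, 17, n_bouts),
    "bout_duration_min": rng.uniform(5, 40, n_bouts),
    "collisions": rng.integers(0, 10, n_bouts),
    "short_turnaround": rng.integers(0, 2, n_bouts),
})
# Fabricated outcome loosely related to the predictors (illustration only)
df["rel_distance_m_min"] = (
    95 - 0.4 * df["bout_duration_min"] + 1.2 * df["collisions"]
    + 2.0 * df["short_turnaround"] + rng.normal(0, 6, n_bouts)
)

model = smf.mixedlm(
    "rel_distance_m_min ~ bout_duration_min + collisions + short_turnaround",
    data=df, groups=df["player_id"],
).fit()
print(model.summary())
```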

Results:

Superior intermittent running ability resulted in very likely large increases in both relative distance and Pmet. As the length of a bout increased, both measures of running intensity exhibited a small decrease. There were at least likely small increases in running intensity for matches played after short recovery cycles and against strong opposition. During a bout, the number of collision-based involvements increased running intensity, whereas time in possession and ball time out of play decreased demands.

Conclusions:

These data demonstrate a complex interaction of individual- and match-based factors that require consideration when developing interchange strategies. Manipulating training loads during shorter recovery periods and against stronger opponents may also be beneficial.


The Effects of Wearing Undersized Lower-Body Compression Garments on Endurance Running Performance

Ben J. Dascombe, Trent K. Hoare, Joshua A. Sear, Peter R. Reaburn, and Aaron T. Scanlan

Purpose:

To examine whether wearing various sizes of lower-body compression garments (LBCG) improves physiological and performance parameters related to endurance running in well-trained athletes.

Methods:

Eleven well-trained middle-distance runners and triathletes (age: 28.4 ± 10.0 y; height: 177.3 ± 4.7 cm; body mass: 72.6 ± 8.0 kg; VO2max: 59.0 ± 6.7 mL·kg–1·min–1) completed repeat progressive maximal tests (PMT) and time-to-exhaustion (TTE) tests at 90% VO2max wearing either manufacturer-recommended LBCG (rLBCG), undersized LBCG (uLBCG), or loose running shorts (CONT). During all exercise testing, several systemic and peripheral physiological measures were taken.

Results:

The results indicated similar effects of wearing rLBCG and uLBCG compared with the control. Across the PMT, wearing either LBCG resulted in significantly (P < .05) increased oxygen consumption, O2 pulse, and deoxyhemoglobin (HHb) and decreased running economy, oxyhemoglobin, and tissue oxygenation index (TOI) at low-intensity speeds (8–10 km·h–1). At higher speeds (12–18 km·h–1), wearing LBCG increased regional blood flow (nTHI) and HHb values but significantly lowered heart rate and TOI. During the TTE, wearing either LBCG significantly (P < .05) increased HHb concentration, whereas wearing uLBCG also significantly (P < .05) increased nTHI. No improvement in endurance running performance was observed in either compression condition.

Conclusion:

The results suggest that wearing LBCG facilitated a small number of cardiorespiratory and peripheral physiological benefits that appeared mostly related to improvements in venous flow. However, these improvements appear trivial to athletes, as they did not correspond to any improvement in endurance running performance.


Differences Between Relative and Absolute Speed and Metabolic Thresholds in Rugby League

Tannath J. Scott, Heidi R. Thornton, Macfarlane T.U. Scott, Ben J. Dascombe, and Grant M. Duthie

Purpose:

To compare relative and absolute speed and metabolic thresholds for quantifying match output in elite rugby league.

Methods:

Twenty-six professional players competing in the National Rugby League were monitored with global positioning systems (GPS) across a rugby-league season. Absolute speed (moderate-intensity running [MIRTh > 3.6 m/s] and high-intensity running [HIRTh > 5.2 m/s]) and metabolic (HPmetTh > 20 W/kg) thresholds were compared with individualized ventilatory (first [VT1IFT] and second [VT2IFT]) thresholds estimated from the 30-15 Intermittent Fitness Test (30-15IFT), as well as the metabolic threshold associated with VT2IFT (HPmetVT2), to examine differences in match-play demands.

Results:

VT2IFT mean values represent 146%, 138%, 167%, and 144% increases in the HIR dose across adjustables, edge forwards, middle forwards, and outside backs, respectively. Distance covered above VT2IFT was almost certainly greater (ES range = 0.79–1.03) than that above the absolute threshold across all positions. Trivial to small differences were observed between VT1IFT and MIRTh, while small to moderate differences were reported between HPmetVT2 and HPmetTh.

Conclusions:

These results reveal that the speed at which players begin to run at higher intensities depends on individual capacities and attributes. As such, using absolute HIR speed thresholds underestimates the physical HIR load. Moreover, absolute MIR and high metabolic thresholds may over- or underestimate the work undertaken above these thresholds depending on the respective fitness of the individual. Therefore, using relative thresholds enables better prescription and monitoring of external training loads based on measured individual physical capacities.
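
As an illustration of the relative-versus-absolute comparison above, the sketch below accumulates distance covered above the absolute high-intensity threshold (5.2 m/s) and above an assumed individualized threshold for each player. The individualized values, sampling rate, and speed traces are synthetic; the estimation of ventilatory thresholds from the 30-15IFT is not reproduced.

```python
# Minimal sketch: distance covered above the absolute high-intensity running
# threshold (5.2 m/s) versus an individualized threshold per player.
# Individual threshold values, the sampling rate, and the speed traces are
# synthetic assumptions; deriving VT2 from the 30-15 IFT is not shown.
import numpy as np

ABSOLUTE_HIR_MS = 5.2   # absolute high-intensity running threshold (m/s)
SAMPLE_RATE_HZ = 10     # assumed GPS sampling rate

def distance_above(speed_ms: np.ndarray, threshold_ms: float) -> float:
    """Distance (m) accumulated while moving faster than threshold_ms."""
    dt_s = 1.0 / SAMPLE_RATE_HZ
    return float(speed_ms[speed_ms > threshold_ms].sum() * dt_s)

rng = np.random.default_rng(5)
# Assumed individualized (VT2-based) thresholds in m/s, keyed by position
individual_thresholds = {"adjustable": 4.6, "middle_forward": 4.3}

for position, threshold in individual_thresholds.items():
    trace = rng.uniform(0, 8, size=40 * 60 * SAMPLE_RATE_HZ)  # ~40 min of synthetic speeds
    print(f"{position}: absolute HIR {distance_above(trace, ABSOLUTE_HIR_MS):.0f} m, "
          f"individualized HIR {distance_above(trace, threshold):.0f} m")
```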