Purpose: To characterize the weekly training load (TL) and well-being of college basketball players during the in-season phase. Methods: Ten male basketball players (6 guards and 4 forwards; age 20.9 [0.9] y, stature 195.0 [8.2] cm, and body mass 91.3 [11.3] kg) from the same Division I National Collegiate Athletic Association team were recruited to participate in this study. Individualized training and game loads were assessed using the session rating of perceived exertion at the end of each training and game session, and well-being status was collected before each session. Weekly changes (%) in TL, acute-to-chronic workload ratio, and well-being were determined. Differences in TL and well-being between starting and bench players and between 1-game and 2-game weeks were calculated using magnitude-based statistics. Results: Total weekly TL and acute-to-chronic workload ratio demonstrated high week-to-week variation, with spikes up to 226% and 220%, respectively. Starting players experienced a higher (most likely negative) total weekly TL and similar (unclear) well-being status compared with bench players. Game scheduling influenced TL, with 1-game weeks demonstrating a higher (likely negative) total weekly TL and similar (most likely trivial) well-being status compared with 2-game weeks. Conclusions: These findings provide college basketball coaches with information to optimize training strategies during the in-season phase. Basketball coaches should concurrently consider the number of weekly games and player status (starting vs bench player) when creating individualized periodization plans, with increases in TL potentially needed for bench players, especially in 2-game weeks.
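The TL metrics in this abstract follow the standard session-RPE conventions. The sketch below illustrates those definitions with made-up numbers; the 1-week acute / 4-week chronic window for the acute-to-chronic workload ratio is a common convention assumed here, not a detail stated in the study.

```python
# Minimal sketch of standard session-RPE load metrics; all values are
# illustrative, not the study's data.

def session_load(rpe, duration_min):
    """Session TL (arbitrary units) = session RPE x session duration (min)."""
    return rpe * duration_min

def weekly_change_pct(this_week, last_week):
    """Week-to-week change in total weekly TL, as a percentage."""
    return (this_week - last_week) / last_week * 100

def acwr(weekly_loads):
    """Coupled acute:chronic workload ratio: latest week vs 4-week mean
    (assumed window; the abstract does not specify one)."""
    chronic_window = weekly_loads[-4:]
    return weekly_loads[-1] / (sum(chronic_window) / len(chronic_window))

weekly_totals = [1800, 2100, 1500, 3400]  # illustrative weekly TL totals (AU)
spike = acwr(weekly_totals)               # 3400 / 2200 = ~1.55
```

In practice, practitioners flag weeks in which the ratio leaves a chosen reference range, which is how spikes like those reported above would be detected.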
Daniele Conte, Nicholas Kolb, Aaron T. Scanlan and Fabrizio Santolamazza
Jordan L. Fox, Cody J. O’Grady and Aaron T. Scanlan
Purpose: To compare the concurrent validity of session rating of perceived exertion (sRPE) workload determined face-to-face and via an online application in basketball players. Methods: Sixteen semiprofessional, male basketball players (21.8 [4.3] y, 191.2 [9.2] cm, 85.0 [15.7] kg) were monitored during all training sessions across the 2018 (8 players) and 2019 (11 players) seasons in a state-level Australian league. Workload was reported as accumulated PlayerLoad (PL), summated-heart-rate-zones (SHRZ) workload, and sRPE. During the 2018 season, rating of perceived exertion (RPE) was determined following each session via individualized face-to-face reporting. During the 2019 season, RPE was obtained following each session via a phone-based, online application. Repeated-measures correlations with 95% confidence intervals were used to determine the relationships between sRPE collected using each method and other workload measures (PL and SHRZ) as indicators of concurrent validity. Results: Although all correlations were significant (P < .05), sRPE obtained using face-to-face reporting demonstrated stronger relationships with PL (r = .69 [.07], large) and SHRZ (r = .74 [.06], very large) compared with the online application (r = .29 [.25], small [PL] and r = .34 [.22], moderate [SHRZ]). Conclusions: Concurrent validity of sRPE workload was stronger when players reported RPE in an individualized, face-to-face manner compared with using a phone-based online application. Given the weaker relationships with other workload measures, basketball practitioners should be cautious when using player training workloads predicated on RPE obtained via online applications.
Jordan L. Fox, Cody J. O’Grady and Aaron T. Scanlan
Purpose: To investigate the relationships between external and internal workloads using a comprehensive selection of variables during basketball training and games. Methods: Eight semiprofessional, male basketball players were monitored during training and games for an entire season. External workload was determined as PlayerLoad™; total and high-intensity accelerations, decelerations, changes of direction, and jumps; and total low-intensity, medium-intensity, high-intensity, and overall inertial movement analysis events. Internal workload was determined using the summated-heart-rate zones and session rating of perceived exertion models. The relationships between external and internal workload variables were separately calculated for training and games using repeated-measures correlations with 95% confidence intervals. Results: PlayerLoad was more strongly related to summated-heart-rate zones (r = .88 ± .03, very large [training]; r = .69 ± .09, large [games]) and session rating of perceived exertion (r = .74 ± .06, very large [training]; r = .53 ± .12, large [games]) than other external workload variables (P < .05). Correlations between the remaining external workload variables (accelerations, decelerations, changes of direction, jumps, and inertial movement analysis events) and internal workloads were stronger during training (r = .44–.88) than during games (r = .15–.69). Conclusions: PlayerLoad and summated-heart-rate zones possess the strongest dose–response relationship among a comprehensive selection of external and internal workload variables in basketball, particularly during training sessions compared with games. Basketball practitioners may therefore be able to best anticipate player responses when prescribing training drills using these variables for optimal workload management across the season.
Aaron T. Scanlan, Jordan L. Fox, Nattai R. Borges and Vincent J. Dalbo
Declines in high-intensity activity during game play (in-game approach) and performance tests measured pre- and postgame (across-game approach) have been used to assess player fatigue in basketball. However, a direct comparison of these approaches is not available. Consequently, this study examined the commonality between in- and across-game jump fatigue during simulated basketball game play.
Australian, state-level, junior male basketball players (n = 10; 16.6 ± 1.1 y, 182.4 ± 4.3 cm, 68.3 ± 10.2 kg) completed 4 × 10-min standardized quarters of simulated basketball game play. In-game jump height during game play was measured using video analysis, while across-game jump height was determined pre-, mid-, and postgame play using an in-ground force platform. Jump height was determined using the flight-time method, with jump decrement calculated for each approach across the first half, second half, and entire game.
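The flight-time method named in this protocol converts flight time to jump height under the standard projectile-motion assumption h = g·t²/8. A minimal sketch with illustrative values (not the study's data):

```python
# Flight-time method for jump height: h = g * t^2 / 8, which assumes
# takeoff and landing posture are identical. Values are illustrative.
G = 9.81  # gravitational acceleration, m/s^2

def jump_height_m(flight_time_s):
    """Jump height (m) from flight time (s) via projectile motion."""
    return G * flight_time_s ** 2 / 8

def jump_decrement_pct(baseline_height, later_height):
    """Percentage decline in jump height relative to a baseline jump."""
    return (baseline_height - later_height) / baseline_height * 100

h0 = jump_height_m(0.50)  # 0.50 s of flight -> ~0.31 m
```

Jump decrement, as used in both the in-game and across-game approaches, is then simply the percentage drop from a reference jump.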
A greater jump decrement was apparent for the in-game approach than for the across-game approach in the first half (37.1% ± 11.6% vs 1.7% ± 6.2%; P = .005; d = 3.81, large), while nonsignificant, large differences were evident between approaches in the second half (d = 1.14) and entire game (d = 1.83). Nonsignificant associations were evident between in-game and across-game jump decrement, with shared variances of 3–26%.
Large differences and a low commonality were observed between in- and across-game jump fatigue during basketball game play, suggesting that these approaches measure different constructs. Based on our findings, it is not recommended that basketball coaches use these approaches interchangeably to monitor player fatigue across the season.
Jordan L. Fox, Robert Stanton, Charli Sargent, Cody J. O’Grady and Aaron T. Scanlan
Purpose: To quantify and compare external and internal game workloads according to contextual factors (game outcome, game location, and score-line). Methods: Starting semiprofessional, male basketball players were monitored during 19 games. External (PlayerLoad™ and inertial movement analysis variables) and internal (summated-heart-rate-zones and rating of perceived exertion [RPE]) workload variables were collected for all games. Linear mixed-effect models and effect sizes were used to compare workload variables based on each of the contextual variables assessed. Results: The number of jumps, absolute and relative (in min−1) high-intensity accelerations and decelerations, and relative changes-of-direction were higher during losses, whereas session RPE was higher during wins. PlayerLoad™, the number of absolute and relative jumps, high-intensity accelerations, absolute and relative total decelerations, total changes-of-direction, summated-heart-rate-zones, session RPE, and RPE were higher during away games, whereas the number of relative high-intensity jumps was higher during home games. PlayerLoad™, the number of high-intensity accelerations, total accelerations, absolute and relative decelerations, absolute and relative changes-of-direction, summated-heart-rate-zones, session RPE, and RPE were higher during balanced games, whereas the relative number of total and high-intensity jumps was higher during unbalanced games. Conclusions: Due to increased intensity, starting players may need additional recovery following losses. Given the increased external and internal workload volumes encountered during away games and balanced games, practitioners should closely monitor playing times during games. Monitoring playing times may help identify when players require additional recovery or reduced training volumes to avoid maladaptive responses across the in-season.
Aaron T. Scanlan, Jordan L. Fox, Nattai R. Borges, Ben J. Dascombe and Vincent J. Dalbo
The influence of various factors on training-load (TL) responses in basketball has received limited attention. This study aimed to examine the temporal changes and influence of cumulative training dose on TL responses and interrelationships during basketball activity.
Ten state-level Australian male junior basketball players completed 4 × 10-min standardized bouts of simulated basketball activity using a circuit-based protocol. Internal TL was quantified using the session rating of perceived exertion (sRPE), summated heart-rate zones (SHRZ), Banister training impulse (TRIMP), and Lucia TRIMP models. External TL was assessed via measurement of mean sprint and circuit speeds. Temporal TL comparisons were performed between 10-min bouts, while Pearson correlation analyses were conducted across cumulative training doses (0–10, 0–20, 0–30, and 0–40 min).
sRPE TL increased (P < .05) after the first 10-min bout of basketball activity. sRPE TL was only significantly related to Lucia TRIMP (r = .66–.69; P < .05) across 0–10 and 0–20 min. Similarly, mean sprint and circuit speed were significantly correlated across 0–20 min (r = .67; P < .05). In contrast, SHRZ and Banister TRIMP were significantly related across all training doses (r = .84–.89; P < .05).
Limited convergence exists between common TL approaches across basketball training doses lasting beyond 20 min. Thus, the interchangeability of commonly used internal and external TL approaches appears dose-dependent during basketball activity, with various psychophysiological mediators likely underpinning temporal changes.
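The internal-TL models compared in this study have well-known published forms. The sketch below uses the commonly cited coefficients (e.g., the male Banister weighting of 0.64·e^(1.92·ΔHR ratio)); these are assumptions drawn from the general literature, and all numeric inputs are illustrative rather than the study's data.

```python
import math

# Hedged sketches of three internal training-load models named above,
# in their commonly published forms; values are illustrative only.

def banister_trimp(minutes, hr_ex, hr_rest, hr_max, male=True):
    """Banister TRIMP: duration x delta-HR ratio x exponential weighting."""
    dhr = (hr_ex - hr_rest) / (hr_max - hr_rest)
    b = 1.92 if male else 1.67  # commonly cited sex-specific constants
    return minutes * dhr * 0.64 * math.exp(b * dhr)

def shrz(minutes_in_zone):
    """Summated-HR-zones: minutes in zones 1-5 (50-100% HRmax in 10%
    bands) multiplied by the zone weight (1-5) and summed."""
    return sum(m * w for w, m in enumerate(minutes_in_zone, start=1))

def lucia_trimp(min_below_vt1, min_vt1_to_vt2, min_above_vt2):
    """Lucia TRIMP: minutes in three ventilatory-threshold zones,
    weighted 1, 2, and 3."""
    return min_below_vt1 * 1 + min_vt1_to_vt2 * 2 + min_above_vt2 * 3
```

Because each model weights heart-rate (or perceptual) data differently, their outputs can diverge as a session lengthens, which is consistent with the dose-dependent convergence reported above.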
Aaron T. Scanlan, Ben J. Dascombe, Andrew P. Kidcaff, Jessica L. Peucker and Vincent J. Dalbo
To compare game activity demands between female and male semiprofessional basketball players.
Female (n = 12) and male (n = 12) semiprofessional basketball players were monitored across 3 competitive games. Time–motion-analysis procedures quantified player activity into predefined movement categories across backcourt (BC) and frontcourt (FC) positions. Activity frequencies, durations, and distances were calculated relative to live playing time (min). Work:rest ratios were also calculated using the video data. Game activity was compared between genders for each playing position and all players.
Female players performed at greater running work-rates than male players (45.7 ± 1.4 vs. 42.1 ± 1.7 m/min, P = .05), while male players performed more dribbling than female players (2.5 ± 0.3 vs. 3.0 ± 0.2 s/min; 8.4 ± 0.3 vs. 9.7 ± 0.7 m/min, P = .05). Positional analyses revealed that female BC players performed more low-intensity shuffling (P = .04) and jumping (P = .05), as well as longer (P = .04) jogging durations, than male BC players. Female FC players executed more upper-body activity (P = .03) and larger work:rest ratios (P < .001) than male FC players. No significant gender differences were observed in the overall intermittent demands, distance traveled, high-intensity shuffling activity, and sprinting requirements during game play.
These findings indicate that gender-specific running and dribbling differences might exist in semiprofessional basketball. Furthermore, position-specific variations between female and male basketball players should be considered. These data may prove useful in the development of gender-specific conditioning plans relative to playing position in basketball.
Aaron T. Scanlan, Daniel M. Berkelmans, William M. Vickery and Crystal O. Kean
Cricket is a popular international team sport with various game formats ranging from long-duration multiday tests to short-duration Twenty20 game play. The role of batsmen is critical to all game formats, with differing physiological demands imposed during each format. Investigation of the physiological demands imposed during cricket batting has historically been neglected, with much of the research focusing on bowling responses and batting technique. A greater understanding of the physiological demands of the batting role in cricket is required to assist strength and conditioning professionals and coaches with the design of training plans, recovery protocols, and player-management strategies. This brief review provides an updated synthesis of the literature examining the internal (eg, metabolic demands and heart rate) and external (eg, activity work rates) physiological responses to batting in the various game formats, as well as simulated play and small-sided-games training. Although few studies have been done in this area, the summary of data provides important insight regarding physiological responses to batting and highlights that more research on this topic is required. Future research is recommended to combine internal and external measures during actual game play, as well as comparing different game formats and playing levels. In addition, understanding the relationship between batting technique and physiological responses is warranted to gain a more holistic understanding of batting in cricket, as well as to develop appropriate coaching and training strategies.
Ben J. Dascombe, Trent K. Hoare, Joshua A. Sear, Peter R. Reaburn and Aaron T. Scanlan
To examine whether wearing various-sized lower-body compression garments (LBCG) improves physiological and performance parameters related to endurance running in well-trained athletes.
Eleven well-trained middle-distance runners and triathletes (age: 28.4 ± 10.0 y; height: 177.3 ± 4.7 cm; body mass: 72.6 ± 8.0 kg; VO2max: 59.0 ± 6.7 mL·kg–1·min–1) completed repeated progressive maximal tests (PMT) and time-to-exhaustion (TTE) tests at 90% VO2max while wearing either manufacturer-recommended LBCG (rLBCG), undersized LBCG (uLBCG), or loose running shorts (CONT). During all exercise testing, several systemic and peripheral physiological measures were taken.
The results indicated similar effects of wearing rLBCG and uLBCG compared with the control. Across the PMT, wearing either LBCG resulted in significantly (P < .05) increased oxygen consumption, O2 pulse, and deoxyhemoglobin (HHb) and decreased running economy, oxyhemoglobin, and tissue oxygenation index (TOI) at low-intensity speeds (8–10 km·h–1). At higher speeds (12–18 km·h–1), wearing LBCG increased regional blood flow (nTHI) and HHb values, but significantly lowered heart rate and TOI. During the TTE, wearing either LBCG significantly (P < .05) increased HHb concentration, whereas wearing uLBCG also significantly (P < .05) increased nTHI. No improvement in endurance running performance was observed in either compression condition.
The results suggest that wearing LBCG facilitated a small number of cardiorespiratory and peripheral physiological benefits that appeared mostly related to improvements in venous flow. However, these improvements appear trivial to athletes, as they did not correspond to any improvement in endurance running performance.
Nattai R. Borges, Aaron T. Scanlan, Peter R. Reaburn and Thomas M. Doering
Purpose: Due to age-related changes in the psychobiological state of masters athletes, this brief report aimed to compare training load responses using heart rate (HR) and ratings of perceived exertion (RPE) during standardized training sessions between masters and young cyclists. Methods: Masters (n = 10; 55.6 [5.0] y) and young (n = 8; 25.9 [3.0] y) cyclists performed separate endurance and high-intensity interval training sessions. Endurance intensity was set at 95% of ventilatory threshold 2 for 1 hour. High-intensity interval training consisted of 6 × 30-second intervals at 175% peak power output with 4.5-minute rest between intervals. HR was monitored continuously and RPE collected at standardized time periods during each session. Banister training impulse and summated-HR-zones training loads were also calculated. Results: Despite a significantly lower mean HR in masters cyclists during endurance (P = .04; d = 1.06 [±0.8], moderate) and high-intensity interval training (P = .01; d = 1.34 [±0.8], large), no significant differences were noted (P > .05) when responses were determined relative to maximum HR or converted to training impulse and summated-HR-zone loads. Furthermore, no interaction or between-group differences were evident for RPE across either session (P > .05). Conclusions: HR and RPE values were comparable between masters and young cyclists when relative HR responses and HR training load models are used. This finding suggests HR and RPE methods used to monitor or prescribe training load can be used interchangeably between masters and young athletes irrespective of chronological age.