Purpose: To investigate the relationships between external and internal workloads using a comprehensive selection of variables during basketball training and games. Methods: Eight semiprofessional, male basketball players were monitored during training and games for an entire season. External workload was determined as PlayerLoad™; total and high-intensity accelerations, decelerations, changes of direction, and jumps; and total low-intensity, medium-intensity, high-intensity, and overall inertial movement analysis events. Internal workload was determined using the summated-heart-rate zones and session rating of perceived exertion models. The relationships between external and internal workload variables were separately calculated for training and games using repeated-measures correlations with 95% confidence intervals. Results: PlayerLoad was more strongly related to summated-heart-rate zones (r = .88 ± .03, very large [training]; r = .69 ± .09, large [games]) and session rating of perceived exertion (r = .74 ± .06, very large [training]; r = .53 ± .12, large [games]) than the other external workload variables (P < .05). Correlations between the remaining external workload variables (accelerations, decelerations, changes of direction, jumps, and inertial movement analysis events) and internal workloads were stronger during training (r = .44–.88) than during games (r = .15–.69). Conclusions: PlayerLoad and summated-heart-rate zones possess the strongest dose–response relationship among a comprehensive selection of external and internal workload variables in basketball, particularly during training sessions compared with games. Basketball practitioners may therefore be able to best anticipate player responses when prescribing training drills using these variables for optimal workload management across the season.
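The two internal workload models named above rest on simple arithmetic: session RPE multiplies the CR-10 rating by session duration, and summated-heart-rate zones weight time in each of five heart-rate zones by the zone number. A minimal illustrative sketch (function names are ours, not the authors' code):

```python
def srpe_workload(rpe: float, duration_min: float) -> float:
    """Session-RPE workload (AU): CR-10 rating x session duration (min)."""
    return rpe * duration_min

def shrz_workload(minutes_in_zone: list[float]) -> float:
    """Summated-heart-rate-zones workload (AU): minutes spent in each of
    five HR zones (50-60% ... 90-100% HRmax) weighted by zone number 1-5."""
    return sum(weight * minutes
               for weight, minutes in enumerate(minutes_in_zone, start=1))

# Example: a 90-min session rated 7 on the CR-10 scale
print(srpe_workload(7, 90))                 # 630 AU
print(shrz_workload([10, 20, 30, 20, 10]))  # 10+40+90+80+50 = 270 AU
```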
Jordan L. Fox, Cody J. O’Grady and Aaron T. Scanlan
Daniele Conte, Nicholas Kolb, Aaron T. Scanlan and Fabrizio Santolamazza
Purpose: To characterize the weekly training load (TL) and well-being of college basketball players during the in-season phase. Methods: Ten (6 guards and 4 forwards) male basketball players (age 20.9 [0.9] y, stature 195.0 [8.2] cm, and body mass 91.3 [11.3] kg) from the same Division I National Collegiate Athletic Association team were recruited to participate in this study. Individualized training and game loads were assessed using the session rating of perceived exertion at the end of each training and game session, and well-being status was collected before each session. Weekly changes (%) in TL, acute-to-chronic workload ratio, and well-being were determined. Differences in TL and well-being between starting and bench players and between 1-game and 2-game weeks were calculated using magnitude-based statistics. Results: Total weekly TL and acute-to-chronic workload ratio demonstrated high week-to-week variation, with spikes up to 226% and 220%, respectively. Starting players experienced a higher (most likely negative) total weekly TL and similar (unclear) well-being status compared with bench players. Game scheduling influenced TL, with 1-game weeks demonstrating a higher (likely negative) total weekly TL and similar (most likely trivial) well-being status compared with 2-game weeks. Conclusions: These findings provide college basketball coaches information to optimize training strategies during the in-season phase. Basketball coaches should concurrently consider the number of weekly games and player status (starting vs bench player) when creating individualized periodization plans, with increases in TL potentially needed in bench players, especially in 2-game weeks.
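The weekly monitoring arithmetic described above can be sketched as follows. The abstract does not specify how the acute-to-chronic workload ratio was computed; a rolling 4-week chronic window is one common convention and is assumed here:

```python
def weekly_change_pct(weekly_tl: list[float]) -> list[float]:
    """Percentage change in training load from one week to the next."""
    return [100 * (b - a) / a for a, b in zip(weekly_tl, weekly_tl[1:])]

def acwr(weekly_tl: list[float], chronic_weeks: int = 4) -> list[float]:
    """Acute:chronic workload ratio, once a full chronic window exists:
    current week's TL divided by the rolling mean of the last 4 weeks."""
    out = []
    for i in range(chronic_weeks - 1, len(weekly_tl)):
        chronic = sum(weekly_tl[i - chronic_weeks + 1:i + 1]) / chronic_weeks
        out.append(weekly_tl[i] / chronic)
    return out

tl = [1500, 1800, 1200, 2000, 3000]   # hypothetical weekly sRPE totals (AU)
print(weekly_change_pct(tl))           # [20.0, -33.3..., 66.6..., 50.0]
print(acwr(tl))                        # [~1.23, 1.5]
```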
Jordan L. Fox, Cody J. O’Grady and Aaron T. Scanlan
Purpose: To compare the concurrent validity of session-rating of perceived exertion (sRPE) workload determined face-to-face and via an online application in basketball players. Methods: Sixteen semiprofessional, male basketball players (21.8 [4.3] y, 191.2 [9.2] cm, 85.0 [15.7] kg) were monitored during all training sessions across the 2018 (8 players) and 2019 (11 players) seasons in a state-level Australian league. Workload was reported as accumulated PlayerLoad (PL), summated-heart-rate-zones (SHRZ) workload, and sRPE. During the 2018 season, rating of perceived exertion (RPE) was determined following each session via individualized face-to-face reporting. During the 2019 season, RPE was obtained following each session via a phone-based, online application. Repeated-measures correlations with 95% confidence intervals were used to determine the relationships between sRPE collected using each method and other workload measures (PL and SHRZ) as indicators of concurrent validity. Results: Although all correlations were significant (P < .05), sRPE obtained using face-to-face reporting demonstrated stronger relationships with PL (r = .69 [.07], large) and SHRZ (r = .74 [.06], very large) compared with the online application (r = .29 [.25], small [PL] and r = .34 [.22], moderate [SHRZ]). Conclusions: Concurrent validity of sRPE workload was stronger when players reported RPE in an individualized, face-to-face manner compared with using a phone-based online application. Given the weaker relationships with other workload measures, basketball practitioners should be cautious when using player training workloads predicated on RPE obtained via online applications.
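The repeated-measures correlations used above pool the within-player association between two workload measures across repeated sessions. The point estimate can be obtained by centering each player's observations about that player's mean and correlating the pooled residuals; this is a sketch of the statistic, not the authors' analysis code:

```python
import math
from collections import defaultdict

def rm_corr(subjects, x, y):
    """Repeated-measures correlation: remove each subject's mean from x and y,
    then compute a Pearson correlation on the pooled residuals."""
    sums = defaultdict(lambda: [0.0, 0.0, 0])  # per-subject: sum_x, sum_y, n
    for s, xi, yi in zip(subjects, x, y):
        sums[s][0] += xi
        sums[s][1] += yi
        sums[s][2] += 1
    dx = [xi - sums[s][0] / sums[s][2] for s, xi in zip(subjects, x)]
    dy = [yi - sums[s][1] / sums[s][2] for s, yi in zip(subjects, y)]
    num = sum(a * b for a, b in zip(dx, dy))
    den = math.sqrt(sum(a * a for a in dx) * sum(b * b for b in dy))
    return num / den

# Two players, different baselines, identical within-player trend -> r = 1.0
print(rm_corr([1, 1, 2, 2], [1, 2, 1, 2], [3, 4, 10, 11]))
```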
Aaron T. Scanlan, Jordan L. Fox, Nattai R. Borges and Vincent J. Dalbo
Declines in high-intensity activity during game play (in-game approach) and performance tests measured pre- and postgame (across-game approach) have been used to assess player fatigue in basketball. However, a direct comparison of these approaches is not available. Consequently, this study examined the commonality between in- and across-game jump fatigue during simulated basketball game play.
Australian, state-level, junior male basketball players (n = 10; 16.6 ± 1.1 y, 182.4 ± 4.3 cm, 68.3 ± 10.2 kg) completed 4 × 10-min standardized quarters of simulated basketball game play. In-game jump height during game play was measured using video analysis, while across-game jump height was determined pre-, mid-, and postgame play using an in-ground force platform. Jump height was determined using the flight-time method, with jump decrement calculated for each approach across the first half, second half, and entire game.
A greater jump decrement was apparent for the in-game approach than for the across-game approach in the first half (37.1% ± 11.6% vs 1.7% ± 6.2%; P = .005; d = 3.81, large), while nonsignificant, large differences were evident between approaches in the second half (d = 1.14) and entire game (d = 1.83). Nonsignificant associations were evident between in-game and across-game jump decrement, with shared variances of 3–26%.
Large differences and a low commonality were observed between in- and across-game jump fatigue during basketball game play, suggesting that these approaches measure different constructs. Based on our findings, it is not recommended that basketball coaches use these approaches interchangeably to monitor player fatigue across the season.
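The flight-time method referenced above estimates jump height from time in the air, assuming takeoff and landing occur at the same body position, via h = g·t²/8. A minimal sketch of that calculation and of a percentage jump decrement (illustrative only):

```python
G = 9.81  # gravitational acceleration, m/s^2

def jump_height_m(flight_time_s: float) -> float:
    """Jump height (m) from flight time (s) via h = g * t^2 / 8."""
    return G * flight_time_s ** 2 / 8

def jump_decrement_pct(baseline_cm: float, current_cm: float) -> float:
    """Percentage decline in jump height relative to baseline."""
    return 100 * (baseline_cm - current_cm) / baseline_cm

# Example: a 0.50-s flight corresponds to ~0.31 m of rise
print(round(jump_height_m(0.50), 3))       # 0.307
print(jump_decrement_pct(40.0, 30.0))      # 25.0
```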
Aaron T. Scanlan, Benjamin J. Dascombe, Peter R.J. Reaburn and Mark Osborne
The present investigation examined the physiological and performance effects of lower-body compression garments (LBCG) during a one-hour cycling time-trial in well-trained cyclists.
Twelve well-trained male cyclists ([mean ± SD] age: 20.5 ± 3.6 years; height: 177.5 ± 4.9 cm; body mass: 70.5 ± 7.5 kg; VO2max: 55.2 ± 6.8 mL·kg−1·min−1) volunteered for the study. Each subject completed two randomly ordered stepwise incremental tests and two randomly ordered one-hour time trials (1HTT) wearing either full-length SportSkins Classic LBCG or underwear briefs (control). Blood lactate concentration ([BLa−]), heart rate (HR), oxygen consumption (VO2), and muscle oxygenation (mOxy) were recorded throughout each test. Indicators of cycling endurance performance were anaerobic threshold (AnT) and VO2max values from the incremental test, and mean power (W), peak power (W), and total work (kJ) from the 1HTT. Magnitude-based inferences were used to determine if LBCG demonstrated any performance and/or physiological benefits.
A likely practically significant increase (86%:12%:2%; η2 = 0.6) in power output at AnT was observed in the LBCG condition (CONT: 245.9 ± 55.7 W; LBCG: 259.8 ± 44.6 W). Further, a possible practically significant improvement (78%:19%:3%; η2 = 0.6) was reported in muscle oxygenation economy (W·%mOxy−1) across the 1HTT (mOxy: CONT: 52.2 ± 12.2%; LBCG: 57.3 ± 8.2%).
The present results demonstrated limited physiological benefits and no performance enhancement through wearing LBCG during a cycling time trial.
Ben J. Dascombe, Trent K. Hoare, Joshua A. Sear, Peter R. Reaburn and Aaron T. Scanlan
To examine whether wearing various-sized lower-body compression garments (LBCG) improves physiological and performance parameters related to endurance running in well-trained athletes.
Eleven well-trained middle-distance runners and triathletes (age: 28.4 ± 10.0 y; height: 177.3 ± 4.7 cm; body mass: 72.6 ± 8.0 kg; VO2max: 59.0 ± 6.7 mL·kg–1·min–1) completed repeat progressive maximal tests (PMT) and time-to-exhaustion (TTE) tests at 90% VO2max wearing either manufacturer-recommended LBCG (rLBCG), undersized LBCG (uLBCG), or loose running shorts (CONT). During all exercise testing, several systemic and peripheral physiological measures were taken.
The results indicated similar effects of wearing rLBCG and uLBCG compared with the control. Across the PMT, wearing either LBCG resulted in significantly (P < .05) increased oxygen consumption, O2 pulse, and deoxyhemoglobin (HHb) and decreased running economy, oxyhemoglobin, and tissue oxygenation index (TOI) at low-intensity speeds (8–10 km·h–1). At higher speeds (12–18 km·h–1), wearing LBCG increased regional blood flow (nTHI) and HHb values, but significantly lowered heart rate and TOI. During the TTE, wearing either LBCG significantly (P < .05) increased HHb concentration, whereas wearing uLBCG also significantly (P < .05) increased nTHI. No improvement in endurance running performance was observed in either compression condition.
The results suggest that wearing LBCG facilitated a small number of cardiorespiratory and peripheral physiological benefits that appeared mostly related to improvements in venous flow. However, these improvements appear trivial to athletes, as they did not correspond to any improvement in endurance running performance.
Jordan L. Fox, Aaron T. Scanlan, Robert Stanton, Cody J. O’Grady and Charli Sargent
Purpose: To examine the impact of workload volume during training sessions and games on subsequent sleep duration and sleep quality in basketball players. Methods: Seven semiprofessional male basketball players were monitored across preseason and in-season phases to determine training session and game workloads, sleep duration, and sleep quality. Training and game data were collected via accelerometers, heart-rate monitors, and rating of perceived exertion (RPE) and reported as PlayerLoad™ (PL), summated heart-rate zones, and session RPE (sRPE). Sleep duration and sleep quality were measured using wrist-worn activity monitors in conjunction with self-report sleep diaries. For daily training sessions and games, all workload data were independently sorted into tertiles representing low, medium, and high workload volumes. Sleep measures following low, medium, and high workloads and control nights (no training/games) were compared using linear mixed models. Results: Sleep onset time was significantly later following medium and high PL and sRPE game workloads compared with control nights (P < .05). Sleep onset time was significantly later following low, medium, and high summated heart-rate-zones game workloads, compared with control nights (P < .05). Time in bed and sleep duration were significantly shorter following high PL and sRPE game workloads compared with control nights (P < .05). Following low, medium, and high training workloads, sleep duration and quality were similar to control nights (P > .05). Conclusions: Following high PL and sRPE game workloads, basketball practitioners should consider strategies that facilitate longer time in bed, such as napping and/or adjusting travel or training schedules the following day.
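The tertile split described in the methods above can be done by rank, assigning the lowest third of sessions to "low" and so on. A hypothetical helper (the authors' exact binning procedure is not stated in the abstract):

```python
def tertile_labels(values):
    """Label each value 'low', 'medium', or 'high' by rank tertile."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    n = len(values)
    labels = [None] * n
    for rank, i in enumerate(order):
        labels[i] = ("low", "medium", "high")[min(rank * 3 // n, 2)]
    return labels

# Example: six session workloads (AU) split into equal tertiles
print(tertile_labels([310, 450, 520, 610, 700, 820]))
```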
Aaron T. Scanlan, Neal Wen, Patrick S. Tucker, Nattai R. Borges and Vincent J. Dalbo
To compare perceptual and physiological training-load responses during various basketball training modes.
Eight semiprofessional male basketball players (age 26.3 ± 6.7 y, height 188.1 ± 6.2 cm, body mass 92.0 ± 13.8 kg) were monitored across a 10-wk period in the preparatory phase of their training plan. Player session ratings of perceived exertion (sRPE) and heart-rate (HR) responses were gathered across base, specific, and tactical/game-play training modes. Pearson correlations were used to determine the relationships between the sRPE model and 2 HR-based models: the training impulse (TRIMP) and summated HR zones (SHRZ). One-way ANOVAs were used to compare training loads between training modes for each model.
Stronger relationships between perceptual and physiological models were evident during base (sRPE-TRIMP r = .53, P < .05; sRPE-SHRZ r = .75, P < .05) and tactical/game-play conditioning (sRPE-TRIMP r = .60, P < .05; sRPE-SHRZ r = .63, P < .05) than during specific conditioning (sRPE-TRIMP r = .38, P < .05; sRPE-SHRZ r = .52, P < .05). Furthermore, the sRPE model detected greater increases (126–429 AU) in training load than the TRIMP (15–65 AU) and SHRZ models (27–170 AU) transitioning between training modes.
While the training-load models were significantly correlated during each training mode, weaker relationships were observed during specific conditioning. Comparisons suggest that the HR-based models were less effective in detecting periodized increases in training load, particularly during court-based, intermittent, multidirectional drills. The practical benefits and sensitivity of the sRPE model support its use across different basketball training modes.
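The two heart-rate-based models compared above can be sketched as follows. Banister's TRIMP with the male weighting factor (0.64·e^(1.92x), where x is the fraction of heart-rate reserve) is the standard formulation; whether the authors used this exact variant is an assumption, and all parameter values below are illustrative:

```python
import math

def trimp(duration_min, hr_ex, hr_rest, hr_max):
    """Banister's training impulse (TRIMP), male weighting factor."""
    x = (hr_ex - hr_rest) / (hr_max - hr_rest)  # fraction of HR reserve
    return duration_min * x * 0.64 * math.exp(1.92 * x)

def shrz(minutes_in_zone):
    """Summated HR zones: minutes in zones 1-5 weighted by zone number."""
    return sum(w * m for w, m in enumerate(minutes_in_zone, start=1))

# Example: 60-min session, mean exercise HR 150, resting HR 60, max HR 190
print(round(trimp(60, 150, 60, 190), 1))   # roughly 100 AU
print(shrz([10, 20, 30, 20, 10]))          # 270 AU
```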
Michele Lastella, Gregory D. Roach, Grace E. Vincent, Aaron T. Scanlan, Shona L. Halson and Charli Sargent
Purpose: To quantify the sleep/wake behaviors of adolescent, female basketball players and to examine the impact of daily training load on sleep/wake behaviors during a 14-day training camp. Methods: Elite, adolescent, female basketball players (N = 11) had their sleep/wake behaviors monitored using self-report sleep diaries and wrist-worn activity monitors during a 14-day training camp. Each day, players completed 1 to 5 training sessions (session duration: 114 min). Training load was determined using the session rating of perceived exertion model in arbitrary units. Daily training loads were summated across sessions on each day and split into tertiles corresponding to low, moderate, and high training load categories, with rest days included as a separate category. Separate linear mixed models and effect size analyses were conducted to assess differences in sleep/wake behaviors among daily training load categories. Results: Sleep onset and offset times were delayed (P < .05) on rest days compared with training days. Time in bed and total sleep time were longer (P < .05) on rest days compared with training days. Players did not obtain the recommended 8 to 10 hours of sleep per night on training days. A moderate increase in sleep efficiency was evident during days with high training loads compared with low. Conclusions: Elite, adolescent, female basketball players did not consistently meet the sleep duration recommendations of 8 to 10 hours per night during a 14-day training camp. Rest days delayed sleep onset and offset times, resulting in longer sleep durations compared with training days. Sleep/wake behaviors were not impacted by variations in the training load administered to players.
Aaron T. Scanlan, Robert Stanton, Charli Sargent, Cody O’Grady, Michele Lastella and Jordan L. Fox
Purpose: To quantify and compare internal and external workloads in regular and overtime games and examine changes in relative workloads during overtime compared with other periods in overtime games in male basketball players. Methods: Starting players for a semiprofessional male basketball team were monitored during 2 overtime games and 2 regular games (nonovertime) with similar contextual factors. Internal (rating of perceived exertion and heart-rate variables) and external (PlayerLoad and inertial movement analysis variables) workloads were quantified across games. Separate linear mixed models and effect-size analyses were used to quantify differences in variables between regular and overtime games and between game periods in overtime games. Results: Session rating-of-perceived-exertion workload (P = .002, effect size 2.36, very large), heart-rate workload (P = .12, 1.13, moderate), low-intensity change-of-direction events to the left (P = .19, 0.95, moderate), medium-intensity accelerations (P = .12, 1.01, moderate), and medium-intensity change-of-direction events to the left (P = .10, 1.06, moderate) were higher during overtime games than during regular games. Overtime periods also exhibited reductions in relative PlayerLoad (first quarter P = .03, −1.46, large), low-intensity accelerations (first quarter P = .01, −1.45, large; second quarter P = .15, −1.22, large), and medium-intensity accelerations (first quarter P = .09, −1.32, large) compared with earlier periods. Conclusions: Overtime games disproportionately elevate perceptual, physiological, and acceleration workloads compared with regular games in starting basketball players. Players also perform at lower external intensities during overtime periods than earlier quarters during basketball games.