Daniele Conte, Nicholas Kolb, Aaron T. Scanlan and Fabrizio Santolamazza

Purpose: To characterize the weekly training load (TL) and well-being of college basketball players during the in-season phase. Methods: Ten male basketball players (6 guards and 4 forwards; age 20.9 [0.9] y, stature 195.0 [8.2] cm, and body mass 91.3 [11.3] kg) from the same Division I National Collegiate Athletic Association team were recruited to participate in this study. Individualized training and game loads were assessed using the session rating of perceived exertion at the end of each training and game session, and well-being status was collected before each session. Weekly changes (%) in TL, acute-to-chronic workload ratio, and well-being were determined. Differences in TL and well-being between starting and bench players and between 1-game and 2-game weeks were calculated using magnitude-based statistics. Results: Total weekly TL and acute-to-chronic workload ratio demonstrated high week-to-week variation, with spikes up to 226% and 220%, respectively. Starting players experienced a higher (most likely negative) total weekly TL and similar (unclear) well-being status compared with bench players. Game scheduling influenced TL, with 1-game weeks demonstrating a higher (likely negative) total weekly TL and similar (most likely trivial) well-being status compared with 2-game weeks. Conclusions: These findings provide college basketball coaches with information to optimize training strategies during the in-season phase. Basketball coaches should concurrently consider the number of weekly games and player status (starting vs bench player) when creating individualized periodization plans, with increases in TL potentially needed in bench players, especially in 2-game weeks.
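The session-RPE training-load and acute-to-chronic workload ratio metrics used in the study above can be sketched in code. This is a minimal illustration, assuming a CR-10 RPE scale and a 4-week rolling-average chronic window; the abstract does not specify the exact ACWR formulation, and all function names and numbers are illustrative.

```python
# Sketch of the session-RPE workload model and acute:chronic workload ratio.
# Assumptions (not stated in the abstract): CR-10 RPE scale, 4-week
# rolling-average chronic window.

def session_load(rpe, duration_min):
    """Session training load in arbitrary units (AU): RPE x minutes."""
    return rpe * duration_min

def weekly_load(session_loads):
    """Total weekly TL: sum of all session loads in that week."""
    return sum(session_loads)

def acwr(weekly_loads):
    """Acute:chronic workload ratio for the most recent week.

    Acute = latest weekly TL; chronic = mean of the last 4 weekly TLs.
    """
    recent = weekly_loads[-4:]
    chronic = sum(recent) / len(recent)
    return weekly_loads[-1] / chronic

# Example: four training weeks with a load spike in the final week
weeks = [
    weekly_load([session_load(6, 90), session_load(7, 60)]),   # 960 AU
    weekly_load([session_load(5, 90), session_load(6, 75)]),   # 900 AU
    weekly_load([session_load(6, 80), session_load(5, 60)]),   # 780 AU
    weekly_load([session_load(8, 100), session_load(7, 90)]),  # 1430 AU
]
print(round(acwr(weeks), 2))  # → 1.41
```

A ratio well above 1.0, as here, flags an acute spike relative to the chronic baseline, mirroring the week-to-week spikes reported in the abstract.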
Jordan L. Fox, Cody J. O’Grady and Aaron T. Scanlan
Purpose: To compare the concurrent validity of session rating of perceived exertion (sRPE) workload determined face-to-face and via an online application in basketball players. Methods: Sixteen semiprofessional, male basketball players (21.8 [4.3] y, 191.2 [9.2] cm, 85.0 [15.7] kg) were monitored during all training sessions across the 2018 (8 players) and 2019 (11 players) seasons in a state-level Australian league. Workload was reported as accumulated PlayerLoad (PL), summated-heart-rate-zones (SHRZ) workload, and sRPE. During the 2018 season, rating of perceived exertion (RPE) was determined following each session via individualized face-to-face reporting. During the 2019 season, RPE was obtained following each session via a phone-based, online application. Repeated-measures correlations with 95% confidence intervals were used to determine the relationships between sRPE collected using each method and other workload measures (PL and SHRZ) as indicators of concurrent validity. Results: Although all correlations were significant (P < .05), sRPE obtained using face-to-face reporting demonstrated stronger relationships with PL (r = .69 [.07], large) and SHRZ (r = .74 [.06], very large) compared with the online application (r = .29 [.25], small [PL] and r = .34 [.22], moderate [SHRZ]). Conclusions: Concurrent validity of sRPE workload was stronger when players reported RPE in an individualized, face-to-face manner compared with using a phone-based online application. Given the weaker relationships with other workload measures, basketball practitioners should be cautious when using player training workloads predicated on RPE obtained via online applications.
Jordan L. Fox, Cody J. O’Grady and Aaron T. Scanlan
Purpose: To investigate the relationships between external and internal workloads using a comprehensive selection of variables during basketball training and games. Methods: Eight semiprofessional, male basketball players were monitored during training and games for an entire season. External workload was determined using PlayerLoad™; total and high-intensity accelerations, decelerations, changes of direction, and jumps; and total low-intensity, medium-intensity, high-intensity, and overall inertial movement analysis events. Internal workload was determined using the summated-heart-rate zones and session rating of perceived exertion models. The relationships between external and internal workload variables were separately calculated for training and games using repeated-measures correlations with 95% confidence intervals. Results: PlayerLoad was more strongly related to summated-heart-rate zones (r = .88 ± .03, very large [training]; r = .69 ± .09, large [games]) and session rating of perceived exertion (r = .74 ± .06, very large [training]; r = .53 ± .12, large [games]) than other external workload variables (P < .05). Correlations between the remaining external workload variables (total and high-intensity accelerations, decelerations, changes of direction, and jumps, and total low-intensity, medium-intensity, high-intensity, and overall inertial movement analysis events) and internal workloads were stronger during training (r = .44–.88) than during games (r = .15–.69). Conclusions: PlayerLoad and summated-heart-rate zones possess the strongest dose–response relationship among a comprehensive selection of external and internal workload variables in basketball, particularly during training sessions compared with games. Basketball practitioners may therefore be able to best anticipate player responses when prescribing training drills using these variables for optimal workload management across the season.
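The accumulated PlayerLoad metric central to the abstract above can be sketched as follows. This assumes the commonly published accelerometer-based formulation (square root of the summed squared changes in tri-axial acceleration, scaled by 100); vendor implementations may differ in sampling rate, filtering, and scaling, so treat this as illustrative only.

```python
import math

# Sketch of accumulated PlayerLoad from tri-axial accelerometer traces.
# Assumption: the common published formulation; commercial devices may
# apply additional filtering and different scaling.

def player_load(ax, ay, az):
    """Accumulate PlayerLoad from three equal-length acceleration traces (g).

    Sums the instantaneous rate of change of acceleration across the
    forward, sideways, and vertical axes, scaled by 100.
    """
    pl = 0.0
    for i in range(1, len(ax)):
        dx = ax[i] - ax[i - 1]
        dy = ay[i] - ay[i - 1]
        dz = az[i] - az[i - 1]
        pl += math.sqrt((dx * dx + dy * dy + dz * dz) / 100.0)
    return pl
```

Because the metric accumulates changes in acceleration rather than speed, it captures the frequent accelerations, decelerations, and jumps characteristic of court sports.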
Aaron T. Scanlan, Jordan L. Fox, Nattai R. Borges and Vincent J. Dalbo
Declines in high-intensity activity during game play (in-game approach) and performance tests measured pre- and postgame (across-game approach) have been used to assess player fatigue in basketball. However, a direct comparison of these approaches is not available. Consequently, this study examined the commonality between in- and across-game jump fatigue during simulated basketball game play.
Australian, state-level, junior male basketball players (n = 10; 16.6 ± 1.1 y, 182.4 ± 4.3 cm, 68.3 ± 10.2 kg) completed 4 × 10-min standardized quarters of simulated basketball game play. In-game jump height during game play was measured using video analysis, while across-game jump height was determined pre-, mid-, and postgame play using an in-ground force platform. Jump height was determined using the flight-time method, with jump decrement calculated for each approach across the first half, second half, and entire game.
A greater jump decrement was apparent for the in-game approach than for the across-game approach in the first half (37.1% ± 11.6% vs 1.7% ± 6.2%; P = .005; d = 3.81, large), while nonsignificant, large differences were evident between approaches in the second half (d = 1.14) and entire game (d = 1.83). Nonsignificant associations were evident between in-game and across-game jump decrement, with shared variances of 3–26%.
Large differences and a low commonality were observed between in- and across-game jump fatigue during basketball game play, suggesting that these approaches measure different constructs. Based on our findings, it is not recommended that basketball coaches use these approaches interchangeably to monitor player fatigue across the season.
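The flight-time jump-height method and a jump-decrement score of the kind used in the study above can be sketched in code. The height formula (h = g·t²/8) is the standard flight-time calculation; the decrement index here (mean drop relative to the best jump) is one common formulation and is assumed for illustration, as the abstract does not specify the exact index.

```python
# Sketch of the flight-time jump-height method and a jump-decrement score.
# Assumptions: h = g * t^2 / 8 from flight time; decrement expressed as the
# mean percentage drop relative to the best jump (indices vary by study).

G = 9.81  # gravitational acceleration, m/s^2

def jump_height(flight_time_s):
    """Jump height (m) from flight time (s) via h = g * t^2 / 8."""
    return G * flight_time_s ** 2 / 8.0

def jump_decrement(heights):
    """Percentage decrement of mean jump height relative to the best jump."""
    best = max(heights)
    mean = sum(heights) / len(heights)
    return 100.0 * (1.0 - mean / best)
```

For example, flight times from video (in-game) and a force platform (across-game) can both be converted to heights this way, but the two approaches sample different jumps under different constraints, consistent with the low commonality reported above.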
Ben J. Dascombe, Trent K. Hoare, Joshua A. Sear, Peter R. Reaburn and Aaron T. Scanlan
To examine whether wearing various sizes of lower-body compression garments (LBCG) improves physiological and performance parameters related to endurance running in well-trained athletes.
Eleven well-trained middle-distance runners and triathletes (age: 28.4 ± 10.0 y; height: 177.3 ± 4.7 cm; body mass: 72.6 ± 8.0 kg; VO2max: 59.0 ± 6.7 mL·kg–1·min–1) completed repeat progressive maximal tests (PMT) and time-to-exhaustion (TTE) tests at 90% VO2max wearing either manufacturer-recommended LBCG (rLBCG), undersized LBCG (uLBCG), or loose running shorts (CONT). During all exercise testing, several systemic and peripheral physiological measures were taken.
The results indicated similar effects of wearing rLBCG and uLBCG compared with the control. Across the PMT, wearing either LBCG resulted in significantly (P < .05) increased oxygen consumption, O2 pulse, and deoxyhemoglobin (HHb) and decreased running economy, oxyhemoglobin, and tissue oxygenation index (TOI) at low-intensity speeds (8–10 km·h–1). At higher speeds (12–18 km·h–1), wearing LBCG increased regional blood flow (nTHI) and HHb values, but significantly lowered heart rate and TOI. During the TTE, wearing either LBCG significantly (P < .05) increased HHb concentration, whereas wearing uLBCG also significantly (P < .05) increased nTHI. No improvement in endurance running performance was observed in either compression condition.
The results suggest that wearing LBCG facilitated a small number of cardiorespiratory and peripheral physiological benefits that appeared mostly related to improvements in venous flow. However, these improvements appear trivial to athletes, as they did not correspond to any improvement in endurance running performance.
Aaron T. Scanlan, Robert Stanton, Charli Sargent, Cody O’Grady, Michele Lastella and Jordan L. Fox
Purpose: To quantify and compare internal and external workloads in regular and overtime games and examine changes in relative workloads during overtime compared with other periods in overtime games in male basketball players. Methods: Starting players for a semiprofessional male basketball team were monitored during 2 overtime games and 2 regular games (nonovertime) with similar contextual factors. Internal (rating of perceived exertion and heart-rate variables) and external (PlayerLoad and inertial movement analysis variables) workloads were quantified across games. Separate linear mixed-models and effect-size analyses were used to quantify differences in variables between regular and overtime games and between game periods in overtime games. Results: Session rating-of-perceived-exertion workload (P = .002, effect size 2.36, very large), heart-rate workload (P = .12, 1.13, moderate), low-intensity change-of-direction events to the left (P = .19, 0.95, moderate), medium-intensity accelerations (P = .12, 1.01, moderate), and medium-intensity change-of-direction events to the left (P = .10, 1.06, moderate) were higher during overtime games than during regular games. Overtime periods also exhibited reductions in relative PlayerLoad (first quarter P = .03, −1.46, large), low-intensity accelerations (first quarter P = .01, −1.45, large; second quarter P = .15, −1.22, large), and medium-intensity accelerations (first quarter P = .09, −1.32, large) compared with earlier periods. Conclusions: Overtime games disproportionately elevate perceptual, physiological, and acceleration workloads compared with regular games in starting basketball players. Players also perform at lower external intensities during overtime periods than earlier quarters during basketball games.
Jordan L. Fox, Robert Stanton, Charli Sargent, Cody J. O’Grady and Aaron T. Scanlan
Purpose: To quantify and compare external and internal game workloads according to contextual factors (game outcome, game location, and score-line). Methods: Starting semiprofessional, male basketball players were monitored during 19 games. External (PlayerLoad™ and inertial movement analysis variables) and internal (summated-heart-rate-zones and rating of perceived exertion [RPE]) workload variables were collected for all games. Linear mixed-effect models and effect sizes were used to compare workload variables based on each of the contextual variables assessed. Results: The number of jumps, absolute and relative (in min−1) high-intensity accelerations and decelerations, and relative changes-of-direction were higher during losses, whereas session RPE was higher during wins. PlayerLoad™, the number of absolute and relative jumps, high-intensity accelerations, absolute and relative total decelerations, total changes-of-direction, summated-heart-rate-zones, session RPE, and RPE were higher during away games, whereas the number of relative high-intensity jumps was higher during home games. PlayerLoad™, the number of high-intensity accelerations, total accelerations, absolute and relative decelerations, absolute and relative changes-of-direction, summated-heart-rate-zones, session RPE, and RPE were higher during balanced games, whereas the relative numbers of total and high-intensity jumps were higher during unbalanced games. Conclusions: Due to increased intensity, starting players may need additional recovery following losses. Given the increased external and internal workload volumes encountered during away games and balanced games, practitioners should closely monitor playing times during games. Monitoring playing times may help identify when players require additional recovery or reduced training volumes to avoid maladaptive responses across the in-season.
Aaron T. Scanlan, Jordan L. Fox, Nattai R. Borges, Ben J. Dascombe and Vincent J. Dalbo
The influence of various factors on training-load (TL) responses in basketball has received limited attention. This study aimed to examine the temporal changes and influence of cumulative training dose on TL responses and interrelationships during basketball activity.
Ten state-level Australian male junior basketball players completed 4 × 10-min standardized bouts of simulated basketball activity using a circuit-based protocol. Internal TL was quantified using the session rating of perceived exertion (sRPE), summated heart-rate zones (SHRZ), Banister training impulse (TRIMP), and Lucia TRIMP models. External TL was assessed via measurement of mean sprint and circuit speeds. Temporal TL comparisons were performed between 10-min bouts, while Pearson correlation analyses were conducted across cumulative training doses (0–10, 0–20, 0–30, and 0–40 min).
sRPE TL increased (P < .05) after the first 10-min bout of basketball activity. sRPE TL was only significantly related to Lucia TRIMP (r = .66–.69; P < .05) across 0–10 and 0–20 min. Similarly, mean sprint and circuit speed were significantly correlated across 0–20 min (r = .67; P < .05). In contrast, SHRZ and Banister TRIMP were significantly related across all training doses (r = .84–.89; P < .05).
Limited convergence exists between common TL approaches across basketball training doses lasting beyond 20 min. Thus, the interchangeability of commonly used internal and external TL approaches appears dose-dependent during basketball activity, with various psychophysiological mediators likely underpinning temporal changes.
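The HR-based internal-load models compared in the study above can be sketched as follows. This assumes an Edwards-style SHRZ with five HRmax zones weighted 1 to 5 and the male exponent (1.92) for Banister's TRIMP; exact zone cutoffs and constants vary between studies, so this is a sketch rather than a definitive implementation.

```python
import math

# Sketch of two HR-based internal-load models named in the abstract above.
# Assumptions: Edwards-style SHRZ (five zones, weights 1-5) and the male
# weighting exponent (1.92) in Banister's TRIMP; details vary by study.

def shrz(minutes_in_zone):
    """Summated-heart-rate-zones load: minutes in zones 1-5, weighted 1-5."""
    return sum(w * m for w, m in enumerate(minutes_in_zone, start=1))

def banister_trimp(duration_min, hr_ex, hr_rest, hr_max):
    """Banister TRIMP (males): duration x dHR x 0.64 x e^(1.92 x dHR),
    where dHR = (exercise HR - resting HR) / (max HR - resting HR)."""
    d = (hr_ex - hr_rest) / (hr_max - hr_rest)
    return duration_min * d * 0.64 * math.exp(1.92 * d)
```

Both models weight time spent at higher heart rates more heavily, which helps explain why SHRZ and Banister TRIMP tracked each other closely across all training doses in the results above.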
Michele Lastella, Gregory D. Roach, Grace E. Vincent, Aaron T. Scanlan, Shona L. Halson and Charli Sargent
Purpose: To quantify the sleep/wake behaviors of adolescent, female basketball players and to examine the impact of daily training load on sleep/wake behaviors during a 14-day training camp. Methods: Elite, adolescent, female basketball players (N = 11) had their sleep/wake behaviors monitored using self-report sleep diaries and wrist-worn activity monitors during a 14-day training camp. Each day, players completed 1 to 5 training sessions (session duration: 114 min). Training load was determined using the session rating of perceived exertion model in arbitrary units. Daily training loads were summated across sessions on each day and split into tertiles corresponding to low, moderate, and high training load categories, with rest days included as a separate category. Separate linear mixed models and effect size analyses were conducted to assess differences in sleep/wake behaviors among daily training load categories. Results: Sleep onset and offset times were delayed (P < .05) on rest days compared with training days. Time in bed and total sleep time were longer (P < .05) on rest days compared with training days. Players did not obtain the recommended 8 to 10 hours of sleep per night on training days. A moderate increase in sleep efficiency was evident during days with high training loads compared with low. Conclusions: Elite, adolescent, female basketball players did not consistently meet the sleep duration recommendations of 8 to 10 hours per night during a 14-day training camp. Rest days delayed sleep onset and offset times, resulting in longer sleep durations compared with training days. Sleep/wake behaviors were not impacted by variations in the training load administered to players.
Aaron T. Scanlan, Neal Wen, Patrick S. Tucker, Nattai R. Borges and Vincent J. Dalbo
To compare perceptual and physiological training-load responses during various basketball training modes.
Eight semiprofessional male basketball players (age 26.3 ± 6.7 y, height 188.1 ± 6.2 cm, body mass 92.0 ± 13.8 kg) were monitored across a 10-wk period in the preparatory phase of their training plan. Player session ratings of perceived exertion (sRPE) and heart-rate (HR) responses were gathered across base, specific, and tactical/game-play training modes. Pearson correlations were used to determine the relationships between the sRPE model and 2 HR-based models: the training impulse (TRIMP) and summated HR zones (SHRZ). One-way ANOVAs were used to compare training loads between training modes for each model.
Stronger relationships between perceptual and physiological models were evident during base (sRPE-TRIMP r = .53, P < .05; sRPE-SHRZ r = .75, P < .05) and tactical/game-play conditioning (sRPE-TRIMP r = .60, P < .05; sRPE-SHRZ r = .63, P < .05) than during specific conditioning (sRPE-TRIMP r = .38, P < .05; sRPE-SHRZ r = .52, P < .05). Furthermore, the sRPE model detected greater increases (126–429 AU) in training load than the TRIMP (15–65 AU) and SHRZ models (27–170 AU) transitioning between training modes.
While the training-load models were significantly correlated during each training mode, weaker relationships were observed during specific conditioning. Comparisons suggest that the HR-based models were less effective in detecting periodized increases in training load, particularly during court-based, intermittent, multidirectional drills. The practical benefits and sensitivity of the sRPE model support its use across different basketball training modes.