Purpose: To quantify and compare external and internal game workloads according to contextual factors (game outcome, game location, and score-line). Methods: Starting, semiprofessional, male basketball players were monitored during 19 games. External (PlayerLoad™ and inertial movement analysis variables) and internal (summated heart-rate zones and rating of perceived exertion [RPE]) workload variables were collected for all games. Linear mixed-effect models and effect sizes were used to compare workload variables based on each of the contextual variables assessed. Results: The numbers of jumps, absolute and relative (in min−1) high-intensity accelerations and decelerations, and relative changes-of-direction were higher during losses, whereas session RPE (sRPE) was higher during wins. PlayerLoad™, the numbers of absolute and relative jumps, high-intensity accelerations, absolute and relative total decelerations, total changes-of-direction, summated heart-rate zones, sRPE, and RPE were higher during away games, whereas the number of relative high-intensity jumps was higher during home games. PlayerLoad™, the numbers of high-intensity accelerations, total accelerations, absolute and relative decelerations, absolute and relative changes-of-direction, summated heart-rate zones, sRPE, and RPE were higher during balanced games, whereas the relative numbers of total and high-intensity jumps were higher during unbalanced games. Conclusions: Given the increased intensity during losses, starting players may need additional recovery following these games. Given the increased external and internal workload volumes encountered during away games and balanced games, practitioners should closely monitor playing time during games. Monitoring playing time may help identify when players require additional recovery or reduced training volumes to avoid maladaptive responses across the in-season.
Jordan L. Fox, Robert Stanton, Charli Sargent, Cody J. O’Grady and Aaron T. Scanlan
Aaron T. Scanlan, Robert Stanton, Charli Sargent, Cody O’Grady, Michele Lastella and Jordan L. Fox
Purpose: To quantify and compare internal and external workloads in regular and overtime games and to examine changes in relative workloads during overtime compared with other periods in overtime games in male basketball players. Methods: Starting players for a semiprofessional male basketball team were monitored during 2 overtime games and 2 regular (nonovertime) games with similar contextual factors. Internal (rating of perceived exertion and heart-rate variables) and external (PlayerLoad and inertial movement analysis variables) workloads were quantified across games. Separate linear mixed models and effect-size analyses were used to quantify differences in variables between regular and overtime games and between game periods in overtime games. Results: Session rating-of-perceived-exertion workload (P = .002, effect size 2.36, very large), heart-rate workload (P = .12, 1.13, moderate), low-intensity change-of-direction events to the left (P = .19, 0.95, moderate), medium-intensity accelerations (P = .12, 1.01, moderate), and medium-intensity change-of-direction events to the left (P = .10, 1.06, moderate) were higher during overtime games than during regular games. Overtime periods also exhibited reductions in relative PlayerLoad (first quarter P = .03, −1.46, large), low-intensity accelerations (first quarter P = .01, −1.45, large; second quarter P = .15, −1.22, large), and medium-intensity accelerations (first quarter P = .09, −1.32, large) compared with earlier periods. Conclusions: Overtime games disproportionately elevate perceptual, physiological, and acceleration workloads compared with regular games in starting basketball players. Players also perform at lower external intensities during overtime periods than during earlier quarters in basketball games.
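The effect sizes reported in these abstracts follow the standardized-mean-difference form (Cohen's d): the difference between group means divided by the pooled standard deviation. A minimal sketch of that calculation, using hypothetical sRPE workload values rather than the study's data:

```python
from statistics import mean, stdev

def cohens_d(a, b):
    """Cohen's d: mean difference divided by the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_sd = (((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2)
                 / (na + nb - 2)) ** 0.5
    return (mean(a) - mean(b)) / pooled_sd

# Hypothetical sRPE workloads (AU) for overtime vs regular games
overtime = [520, 560, 540, 585]
regular = [430, 455, 440, 460]
print(round(cohens_d(overtime, regular), 2))  # positive d = higher in overtime
```

A positive d here indicates a higher workload in overtime games; magnitudes are then interpreted against thresholds (e.g., moderate, large, very large) as in the abstracts above.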
Aaron T. Scanlan, Jordan L. Fox, Nattai R. Borges and Vincent J. Dalbo
Declines in high-intensity activity during game play (in-game approach) and performance tests measured pre- and postgame (across-game approach) have been used to assess player fatigue in basketball. However, a direct comparison of these approaches is not available. Consequently, this study examined the commonality between in- and across-game jump fatigue during simulated basketball game play.
Australian, state-level, junior male basketball players (n = 10; 16.6 ± 1.1 y, 182.4 ± 4.3 cm, 68.3 ± 10.2 kg) completed 4 × 10-min standardized quarters of simulated basketball game play. In-game jump height during game play was measured using video analysis, while across-game jump height was determined pre-, mid-, and postgame play using an in-ground force platform. Jump height was determined using the flight-time method, with jump decrement calculated for each approach across the first half, second half, and entire game.
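The flight-time method mentioned above derives jump height from projectile motion, h = g·t_f²/8, where t_f is the measured flight time. A minimal sketch of that calculation and of a percentage jump decrement relative to a baseline jump, with illustrative flight times rather than the study's data:

```python
G = 9.81  # gravitational acceleration (m/s^2)

def jump_height(flight_time_s):
    """Flight-time method: h = g * t_f^2 / 8 (assumes symmetric rise and fall)."""
    return G * flight_time_s ** 2 / 8

def jump_decrement(baseline, later):
    """Percentage drop in jump height relative to a baseline jump."""
    return (baseline - later) / baseline * 100

# Hypothetical flight times (s) measured pre- and postgame
pre = jump_height(0.56) * 100   # convert m to cm
post = jump_height(0.52) * 100
print(f"pre {pre:.1f} cm, post {post:.1f} cm, decrement {jump_decrement(pre, post):.1f}%")
```

The same decrement calculation can be applied to either approach: in-game (video-derived heights across play) or across-game (force-platform heights pre-, mid-, and postgame).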
A greater jump decrement was apparent for the in-game approach than for the across-game approach in the first half (37.1% ± 11.6% vs 1.7% ± 6.2%; P = .005; d = 3.81, large), while nonsignificant, large differences were evident between approaches in the second half (d = 1.14) and entire game (d = 1.83). Nonsignificant associations were evident between in-game and across-game jump decrement, with shared variances of 3–26%.
Large differences and a low commonality were observed between in- and across-game jump fatigue during basketball game play, suggesting that these approaches measure different constructs. Based on our findings, it is not recommended that basketball coaches use these approaches interchangeably to monitor player fatigue across the season.
Aaron T. Scanlan, Jordan L. Fox, Nattai R. Borges, Ben J. Dascombe and Vincent J. Dalbo
The influence of various factors on training-load (TL) responses in basketball has received limited attention. This study aimed to examine the temporal changes and influence of cumulative training dose on TL responses and interrelationships during basketball activity.
Ten state-level Australian male junior basketball players completed 4 × 10-min standardized bouts of simulated basketball activity using a circuit-based protocol. Internal TL was quantified using the session rating of perceived exertion (sRPE), summated heart-rate zones (SHRZ), Banister training impulse (TRIMP), and Lucia TRIMP models. External TL was assessed via measurement of mean sprint and circuit speeds. Temporal TL comparisons were performed between 10-min bouts, while Pearson correlation analyses were conducted across cumulative training doses (0–10, 0–20, 0–30, and 0–40 min).
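The internal TL models named above are commonly computed as follows: sRPE TL multiplies the session RPE by session duration; SHRZ (Edwards) weights minutes in five heart-rate zones by 1–5; Banister TRIMP weights duration by an exponential function of the heart-rate reserve ratio (male weighting shown); and Lucia TRIMP weights minutes below, between, and above the ventilatory thresholds by 1–3. A minimal sketch under those standard definitions, with illustrative values rather than the study's data:

```python
import math

def srpe_tl(rpe, duration_min):
    """Session-RPE training load: RPE (CR-10) x session duration (min)."""
    return rpe * duration_min

def shrz(zone_minutes):
    """Summated heart-rate zones (Edwards): minutes in zones 1-5 weighted 1-5."""
    return sum(w * m for w, m in zip(range(1, 6), zone_minutes))

def banister_trimp(duration_min, hr_mean, hr_rest, hr_max):
    """Banister TRIMP (male weighting): D x dHR x 0.64 x e^(1.92 x dHR)."""
    dhr = (hr_mean - hr_rest) / (hr_max - hr_rest)
    return duration_min * dhr * 0.64 * math.exp(1.92 * dhr)

def lucia_trimp(min_below_vt1, min_vt1_vt2, min_above_vt2):
    """Lucia TRIMP: minutes in three ventilatory-threshold phases weighted 1-3."""
    return 1 * min_below_vt1 + 2 * min_vt1_vt2 + 3 * min_above_vt2

# Hypothetical 40-min simulated game
print(srpe_tl(7, 40))            # -> 280 AU
print(shrz([5, 8, 10, 12, 5]))   # -> 124 AU
print(round(banister_trimp(40, 165, 60, 200), 1))
print(lucia_trimp(10, 18, 12))   # -> 82 AU
```

Because sRPE is perceptual while the three TRIMP-style models are heart-rate based, their agreement can diverge as a session lengthens, which is the dose-dependency examined in this study.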
sRPE TL increased (P &lt; .05) after the first 10-min bout of basketball activity. sRPE TL was significantly related only to Lucia TRIMP (r = .66–.69; P &lt; .05) across 0–10 and 0–20 min. Similarly, mean sprint and circuit speeds were significantly correlated across 0–20 min (r = .67; P &lt; .05). In contrast, SHRZ and Banister TRIMP were significantly related across all training doses (r = .84–.89; P &lt; .05).
Limited convergence exists between common TL approaches across basketball training doses lasting beyond 20 min. Thus, the interchangeability of commonly used internal and external TL approaches appears dose-dependent during basketball activity, with various psychophysiological mediators likely underpinning temporal changes.