Purpose: To systematically quantify the external and internal workloads reported during games-based drills in basketball and identify the effects of different modifiable factors on the workloads encountered. Methods: PubMed, Scopus, MEDLINE, and SPORTDiscus databases were searched for original research published up until January 2, 2019. The search included terms relevant to workload, games-based drills, and basketball. Studies were screened using predefined selection criteria, and methodological quality was assessed prior to data extraction. Results: The electronic search yielded 8,284 studies with 3,411 duplicates. A total of 17 studies met the inclusion criteria for this review, with quality scores ranging from 9 to 10 out of 11. Factors regularly modified during games-based drills among the included studies were team size, playing area, playing and rest time, and game alterations. Games-based drills containing smaller team sizes elicited greater external and internal workloads compared to larger team sizes. Furthermore, full-court games-based drills elicited greater external and internal workloads compared to half-court drills, while continuous games-based drills elicited greater internal workloads compared to intermittent drills. Conclusions: This review provides a comprehensive collation of data indicating the external and internal workloads reported during different games-based drills in various samples of basketball players. Furthermore, evidence is provided for basketball coaches to consider when prescribing games-based drills and modifying factors during drills across the season. Current literature suggests that smaller team sizes and full-court playing areas elicit greater external and internal workloads than larger team sizes and half-court drills, respectively. Furthermore, continuous games-based drills elicit greater internal workloads than intermittent drills.
Cody J. O’Grady, Jordan L. Fox, Vincent J. Dalbo, and Aaron T. Scanlan
Aaron T. Scanlan, Ben J. Dascombe, Andrew P. Kidcaff, Jessica L. Peucker, and Vincent J. Dalbo
To compare game activity demands between female and male semiprofessional basketball players.
Female (n = 12) and male (n = 12) semiprofessional basketball players were monitored across 3 competitive games. Time–motion-analysis procedures quantified player activity into predefined movement categories across backcourt (BC) and frontcourt (FC) positions. Activity frequencies, durations, and distances were calculated relative to live playing time (min). Work:rest ratios were also calculated using the video data. Game activity was compared between genders for each playing position and all players.
Female players performed at greater running work-rates than male players (45.7 ± 1.4 vs. 42.1 ± 1.7 m/min, P = .05), while male players performed more dribbling than female players (2.5 ± 0.3 vs. 3.0 ± 0.2 s/min; 8.4 ± 0.3 vs. 9.7 ± 0.7 m/min, P = .05). Positional analyses revealed that female BC players performed more low-intensity shuffling (P = .04) and jumping (P = .05), as well as longer jogging durations (P = .04), than male BC players. Female FC players executed more upper-body activity (P = .03) and had larger work:rest ratios (P < .001) than male FC players. No significant gender differences were observed in overall intermittent demands, distance traveled, high-intensity shuffling activity, or sprinting requirements during game play.
These findings indicate that gender-specific running and dribbling differences might exist in semiprofessional basketball. Furthermore, position-specific variations between female and male basketball players should be considered. These data may prove useful in the development of gender-specific conditioning plans relative to playing position in basketball.
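The time–motion normalizations used in this study (activity totals expressed per live minute, plus work:rest ratios) reduce to simple arithmetic. A minimal sketch in Python; the function names and the numbers in the example are illustrative, not taken from the study:

```python
def per_live_minute(activity_total, live_minutes):
    """Express an activity frequency, duration (s), or distance (m)
    relative to live playing time (per minute)."""
    return activity_total / live_minutes

def work_rest_ratio(work_time_s, rest_time_s):
    """Work:rest ratio from total time coded into work vs rest
    movement categories in the time-motion analysis."""
    return work_time_s / rest_time_s

# Illustrative example: 1,830 m covered across 40 live minutes
# gives a running work-rate of 45.75 m/min, in the range reported above.
print(per_live_minute(1830, 40))  # 45.75
```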
Aaron T. Scanlan, Jordan L. Fox, Nattai R. Borges, Ben J. Dascombe, and Vincent J. Dalbo
The influence of various factors on training-load (TL) responses in basketball has received limited attention. This study aimed to examine the temporal changes and influence of cumulative training dose on TL responses and interrelationships during basketball activity.
Ten state-level Australian male junior basketball players completed 4 × 10-min standardized bouts of simulated basketball activity using a circuit-based protocol. Internal TL was quantified using the session rating of perceived exertion (sRPE), summated heart-rate zones (SHRZ), Banister training impulse (TRIMP), and Lucia TRIMP models. External TL was assessed via measurement of mean sprint and circuit speeds. Temporal TL comparisons were performed between 10-min bouts, while Pearson correlation analyses were conducted across cumulative training doses (0–10, 0–20, 0–30, and 0–40 min).
sRPE TL increased (P < .05) after the first 10-min bout of basketball activity. sRPE TL was only significantly related to Lucia TRIMP (r = .66–.69; P < .05) across 0–10 and 0–20 min. Similarly, mean sprint and circuit speed were significantly correlated across 0–20 min (r = .67; P < .05). In contrast, SHRZ and Banister TRIMP were significantly related across all training doses (r = .84–.89; P < .05).
Limited convergence exists between common TL approaches across basketball training doses lasting beyond 20 min. Thus, the interchangeability of commonly used internal and external TL approaches appears dose-dependent during basketball activity, with various psychophysiological mediators likely underpinning temporal changes.
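The internal TL models compared in this study follow well-established published formulas: sRPE multiplies the CR-10 rating by session duration, Edwards' SHRZ and Lucia's TRIMP weight time in heart-rate zones by zone number, and Banister's TRIMP weights duration by an exponential function of fractional heart-rate reserve. A minimal sketch in Python (all parameter values in the test of the Banister model are illustrative, not from the study):

```python
import math

def srpe_tl(rpe, duration_min):
    """Session-RPE training load (Foster): CR-10 rating x session duration (min)."""
    return rpe * duration_min

def weighted_zone_tl(zone_minutes):
    """Zone-weighted TL: minutes in each zone multiplied by the zone number.
    Edwards' SHRZ uses 5 HR zones (50-100% HRmax); Lucia's TRIMP uses 3 zones
    anchored to the ventilatory thresholds."""
    return sum((i + 1) * t for i, t in enumerate(zone_minutes))

def banister_trimp(duration_min, hr_avg, hr_rest, hr_max, male=True):
    """Banister training impulse: duration x fractional HR reserve x an
    exponential weighting (0.64e^1.92x for males, 0.86e^1.67x for females)."""
    delta = (hr_avg - hr_rest) / (hr_max - hr_rest)
    return (duration_min * delta * 0.64 * math.exp(1.92 * delta) if male
            else duration_min * delta * 0.86 * math.exp(1.67 * delta))
```

For example, a 40-min session rated 7 on the CR-10 scale yields an sRPE TL of 280 AU.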
Vincent J. Dalbo, Michael D. Roberts, Scott E. Hassell, Jordan R. Moon, and Chad M. Kerksick
This investigation examined the safety and efficacy of a silica-based mineral antioxidant complex (MAC) that has been suggested to influence body water and buffer lactate.
In a double-blind, randomized crossover design, male participants completed testing for 3 conditions: water only (baseline), rice flour (placebo), and MAC supplementation. Participants visited the laboratory on 5 occasions: familiarization, baseline, Testing Day 1, washout, and Testing Day 2. Baseline and Testing Days 1 and 2 consisted of fasting blood sampling, pre- to postexercise body-water assessment, and determination of VO2peak on a cycle ergometer. The supplementation protocols were separated by 1 wk and balanced to minimize an order effect.
No differences between conditions were found for heart rate, blood pressure, or serum-safety markers (p > .05). Before exercise there were no differences between conditions for total body water (TBW), intracellular water (ICW), or extracellular water (ECW). No significant interactive effects for supplementation and exercise were found for TBW, ICW, or ECW (p > .05). A time effect for TBW (p < .01) and ICW (p < .001) was present. Within-group changes in TBW occurred in the MAC condition, and within-group changes for ICW occurred in the MAC and placebo conditions. Ratings of perceived exertion and blood lactate increased (p < .05) with exercise. No significant effects were found for performance variables.
MAC supplementation had no impact on aerobic exercise performance and lactate response. Increases in TBW and ICW occurred after MAC consumption, but these changes appeared to have minimal physiological impact.
Aaron T. Scanlan, Neal Wen, Patrick S. Tucker, Nattai R. Borges, and Vincent J. Dalbo
To compare perceptual and physiological training-load responses during various basketball training modes.
Eight semiprofessional male basketball players (age 26.3 ± 6.7 y, height 188.1 ± 6.2 cm, body mass 92.0 ± 13.8 kg) were monitored across a 10-wk period in the preparatory phase of their training plan. Player session ratings of perceived exertion (sRPE) and heart-rate (HR) responses were gathered across base, specific, and tactical/game-play training modes. Pearson correlations were used to determine the relationships between the sRPE model and 2 HR-based models: the training impulse (TRIMP) and summated HR zones (SHRZ). One-way ANOVAs were used to compare training loads between training modes for each model.
Stronger relationships between perceptual and physiological models were evident during base (sRPE-TRIMP r = .53, P < .05; sRPE-SHRZ r = .75, P < .05) and tactical/game-play conditioning (sRPE-TRIMP r = .60, P < .05; sRPE-SHRZ r = .63, P < .05) than during specific conditioning (sRPE-TRIMP r = .38, P < .05; sRPE-SHRZ r = .52, P < .05). Furthermore, the sRPE model detected greater increases (126–429 AU) in training load than the TRIMP (15–65 AU) and SHRZ (27–170 AU) models when transitioning between training modes.
While the training-load models were significantly correlated during each training mode, weaker relationships were observed during specific conditioning. Comparisons suggest that the HR-based models were less effective in detecting periodized increases in training load, particularly during court-based, intermittent, multidirectional drills. The practical benefits and sensitivity of the sRPE model support its use across different basketball training modes.
Aaron T. Scanlan, Emilija Stojanović, Zoran Milanović, Masaru Teramoto, Mario Jeličić, and Vincent J. Dalbo
Purpose: To compare the aerobic capacity of elite female basketball players between playing roles and positions determined using maximal laboratory and field tests. Methods: Elite female basketball players from the National Croatian League were grouped according to playing role (starter: n = 8; bench: n = 12) and position (backcourt: n = 11; frontcourt: n = 9). All 20 players completed 2 maximal exercise tests in a crossover fashion 7 days apart. First, the players underwent a laboratory-based continuous running treadmill test with metabolic measurement to determine their peak oxygen uptake (VO2peak). The players then completed a maximal field-based 30-15 Intermittent Fitness Test (30-15 IFT) to estimate VO2peak. The VO2peak was compared using multiple linear regression analysis with bootstrap standard errors and playing role and position as predictors. Results: During both tests, starters attained a significantly higher VO2peak than bench players (continuous running treadmill: 47.4 [5.2] vs 44.7 [3.5] mL·kg−1·min−1, P = .05, moderate; 30-15 IFT: 44.9 [2.1] vs 41.9 [1.7] mL·kg−1·min−1, P < .001, large), and backcourt players attained a significantly higher VO2peak than frontcourt players (continuous running treadmill: 48.1 [3.8] vs 43.0 [3.3] mL·kg−1·min−1, P < .001, large; 30-15 IFT: 44.2 [2.2] vs 41.8 [2.0] mL·kg−1·min−1, P < .001, moderate). Conclusions: Starters (vs bench players) and guards (vs forwards and centers) possess a higher VO2peak irrespective of using laboratory or field tests. These data highlight the role- and position-specific importance of aerobic fitness to inform testing, training, and recovery practices in elite female basketball.
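The field-based VO2peak estimates above derive from the final 30-15 IFT running speed (VIFT). A sketch of this estimation, assuming the commonly cited Buchheit regression coefficients (gender coded 1 for male, 2 for female); the example inputs are illustrative, not from the study:

```python
def vo2max_30_15_ift(vift_kmh, age_y, weight_kg, gender):
    """Estimate VO2max (mL/kg/min) from the final 30-15 IFT speed (km/h)
    using Buchheit's regression; gender: 1 = male, 2 = female.
    Coefficients as commonly published - treat as an assumption, verify
    against the original source before use."""
    return (28.3 - 2.15 * gender - 0.741 * age_y - 0.0357 * weight_kg
            + 0.0586 * age_y * vift_kmh + 1.03 * vift_kmh)

# Illustrative: a 22-y-old, 70-kg female finishing at 18 km/h.
estimate = vo2max_30_15_ift(18, 22, 70, 2)
```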
Aaron T. Scanlan, Jordan L. Fox, Nattai R. Borges, and Vincent J. Dalbo
Declines in high-intensity activity during game play (in-game approach) and performance tests measured pre- and postgame (across-game approach) have been used to assess player fatigue in basketball. However, a direct comparison of these approaches is not available. Consequently, this study examined the commonality between in- and across-game jump fatigue during simulated basketball game play.
Australian, state-level, junior male basketball players (n = 10; 16.6 ± 1.1 y, 182.4 ± 4.3 cm, 68.3 ± 10.2 kg) completed 4 × 10-min standardized quarters of simulated basketball game play. In-game jump height during game play was measured using video analysis, while across-game jump height was determined pre-, mid-, and postgame play using an in-ground force platform. Jump height was determined using the flight-time method, with jump decrement calculated for each approach across the first half, second half, and entire game.
A greater jump decrement was apparent for the in-game approach than for the across-game approach in the first half (37.1% ± 11.6% vs 1.7% ± 6.2%; P = .005; d = 3.81, large), while nonsignificant, large differences were evident between approaches in the second half (d = 1.14) and entire game (d = 1.83). Nonsignificant associations were evident between in-game and across-game jump decrement, with shared variances of 3–26%.
Large differences and a low commonality were observed between in- and across-game jump fatigue during basketball game play, suggesting that these approaches measure different constructs. Based on our findings, it is not recommended that basketball coaches use these approaches interchangeably to monitor player fatigue across the season.
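The flight-time method referenced above estimates jump height from flight time alone via h = gt²/8, which follows from projectile motion when the rise and fall phases are assumed equal. A minimal sketch, with a hypothetical decrement calculation alongside (values illustrative, not from the study):

```python
G = 9.81  # gravitational acceleration (m/s^2)

def jump_height_from_flight_time(flight_time_s):
    """Flight-time method: h = g * t^2 / 8, assuming takeoff and landing
    occur in the same body configuration."""
    return G * flight_time_s ** 2 / 8.0

def jump_decrement_pct(reference_height_m, current_height_m):
    """Percentage decline in jump height relative to a reference
    (e.g., pregame) jump."""
    return 100.0 * (reference_height_m - current_height_m) / reference_height_m
```

For example, a flight time of 0.50 s corresponds to a jump height of about 0.31 m.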
Markus N.C. Williams, Vincent J. Dalbo, Jordan L. Fox, Cody J. O’Grady, and Aaron T. Scanlan
Purpose: To compare weekly training and game demands according to playing position in basketball players. Methods: A longitudinal, observational study was adopted. Semiprofessional, male basketball players categorized as backcourt (guards; n = 4) and frontcourt players (forwards/centers; n = 4) had their weekly workloads monitored across an entire season. External workload was determined using microsensors and included PlayerLoad™ (PL) and inertial movement analysis variables. Internal workload was determined using heart rate to calculate absolute and relative summated-heart-rate-zones workload and rating of perceived exertion (RPE) to calculate session-RPE workload. Comparisons between weekly training and game demands were made using linear mixed models and effect sizes in each positional group. Results: In backcourt players, higher relative PL (P = .04, very large) and relative summated-heart-rate-zones workload (P = .007, very large) were evident during training, while greater session-RPE workload (P = .001, very large) was apparent during games. In frontcourt players, greater PL (P < .001, very large), relative PL (P = .019, very large), peak PL intensities (P < .001, moderate), high-intensity inertial movement analysis events (P = .002, very large), total inertial movement analysis events (P < .001, very large), summated-heart-rate-zones workload (P < .001, very large), RPE (P < .001, very large), and session-RPE workload (P < .001, very large) were evident during games. Conclusions: Backcourt players experienced similar demands between training and games across several variables, with higher average workload intensities during training. Frontcourt players experienced greater demands across all variables during games than training. These findings emphasize the need for position-specific preparation strategies leading into games in basketball teams.
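The accelerometer-derived external workload variables above are accumulated metrics; PlayerLoad-style measures are commonly computed as the square root of the scaled sum of squared rates of change in triaxial acceleration, summed over samples. A sketch under that assumption (the exact vendor implementation, sample rate, and scaling may differ):

```python
import math

def player_load(accel_samples, scale=100.0):
    """Accumulated PlayerLoad-style metric from triaxial accelerometer data.
    accel_samples: sequence of (x, y, z) acceleration tuples at the device
    sample rate; the divisor of 100 is the commonly cited scaling."""
    total = 0.0
    for (x0, y0, z0), (x1, y1, z1) in zip(accel_samples, accel_samples[1:]):
        total += math.sqrt(
            ((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2) / scale
        )
    return total

def relative_player_load(pl, duration_min):
    """Relative PL: accumulated load normalized to time on court (AU/min)."""
    return pl / duration_min
```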
Cody J. O’Grady, Jordan L. Fox, Daniele Conte, Davide Ferioli, Aaron T. Scanlan, and Vincent J. Dalbo
Purpose: Games-based drills are the predominant form of training adopted during basketball practice. As such, researchers have begun to quantify the physical, physiological, and perceptual demands of different games-based drill formats. However, study methodology has not been systematically reported across studies, limiting the ability to form conclusions from existing research. The authors developed this call to action to draw attention to the current standard of methodological reporting in basketball games-based drill research and establish a systematic reporting standard the authors hope will be utilized in future research. The Basketball Games-Based Drill Methodical Reporting Checklist (BGBDMRC) was developed to encourage the systematic reporting of games-based drill methodology. The authors used the BGBDMRC to evaluate the current methodological reporting standard of studies included in their review published in the International Journal of Sports Physiology and Performance, “A Systematic Review of the External and Internal Workloads Experienced During Games-Based Drills in Basketball Players” (2020), which highlighted this issue. Of the 17 studies included in their review, only 38% (±18%) of applicable checklist items were addressed across included studies, which is problematic as checklist items are essential for study replication. Conclusions: The current standard of methodological reporting in basketball games-based drill research is insufficient to allow for replication of examined drills in future research or the application of research outcomes to practice. The authors implore researchers to adopt the BGBDMRC to improve the quality and reproducibility of games-based drill research and increase the translation of research findings to practice.
Markus N.C. Williams, Jordan L. Fox, Cody J. O’Grady, Samuel Gardner, Vincent J. Dalbo, and Aaron T. Scanlan
Purpose: To compare weekly training, game, and overall (training and games) demands across phases of the regular season in basketball. Methods: Seven semiprofessional, male basketball players were monitored during all on-court team-based training sessions and games during the regular season. External monitoring variables included PlayerLoad™ and inertial movement analysis events per minute. Internal monitoring variables included a modified summated heart rate zones model calculated per minute and rating of perceived exertion. Linear mixed models were used to compare training, game, and overall demands between 5-week phases (early, middle, and late) of the regular season with significance set at P ≤ .05. Effect sizes were calculated between phases and interpreted as: trivial, <0.20; small, 0.20 to 0.59; moderate, 0.60 to 1.19; large, 1.20 to 1.99; very large, ≥2.00. Results: Greater (P > .05) overall inertial movement analysis events (moderate–very large) and rating of perceived exertion (moderate) were evident in the late phase compared with earlier phases. During training, more accelerations were evident in the middle (P = .01, moderate) and late (P = .05, moderate) phases compared with the early phase, while higher rating of perceived exertion (P = .04, moderate) was evident in the late phase compared with earlier phases. During games, nonsignificant, trivial–small differences in demands were apparent between phases. Conclusions: Training and game demands should be interpreted both in isolation and in combination, given that overall player demands increased as the season progressed, predominantly due to modifications in training demands alongside stable game demands. Periodization strategies administered by coaching staff may have enabled players to train at greater intensities late in the season without compromising game intensity.