Purpose: To investigate the relationships between external and internal workloads using a comprehensive selection of variables during basketball training and games. Methods: Eight semiprofessional, male basketball players were monitored during training and games for an entire season. External workload was determined as PlayerLoad™; total and high-intensity accelerations, decelerations, changes of direction, and jumps; and total low-intensity, medium-intensity, high-intensity, and overall inertial movement analysis events. Internal workload was determined using the summated-heart-rate-zones and session rating of perceived exertion models. The relationships between external and internal workload variables were separately calculated for training and games using repeated-measures correlations with 95% confidence intervals. Results: PlayerLoad was more strongly related to summated-heart-rate-zones workload (r = .88 ± .03, very large [training]; r = .69 ± .09, large [games]) and session rating of perceived exertion (r = .74 ± .06, very large [training]; r = .53 ± .12, large [games]) than other external workload variables (P < .05). Correlations between the remaining external workload variables (accelerations, decelerations, changes of direction, jumps, and inertial movement analysis events) and internal workloads were stronger during training (r = .44–.88) than during games (r = .15–.69). Conclusions: PlayerLoad and summated-heart-rate-zones workload possess the strongest dose–response relationship among a comprehensive selection of external and internal workload variables in basketball, particularly during training sessions compared with games. Basketball practitioners may therefore be best able to anticipate player responses when prescribing training drills using these variables for optimal workload management across the season.
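The two internal workload models named in this abstract are simple arithmetic and can be sketched as follows. The 1–5 zone weighting shown is the commonly used Edwards summated-heart-rate-zones scheme, and the example values are illustrative assumptions, not data from the study.

```python
# Minimal sketch of the two internal workload models (assumed formulations):
# session RPE workload and Edwards-style summated-heart-rate-zones (SHRZ).

def session_rpe(rpe, duration_min):
    """Session RPE workload: CR-10 rating multiplied by session duration (min)."""
    return rpe * duration_min

def summated_hr_zones(minutes_in_zone):
    """SHRZ workload: minutes spent in each of five heart-rate zones,
    each weighted by its zone number (zone 1 = lowest intensity, zone 5 = highest)."""
    return sum(weight * minutes
               for weight, minutes in enumerate(minutes_in_zone, start=1))

# Example: a 90-min practice rated 7 on the CR-10 scale
print(session_rpe(7, 90))                       # 630 arbitrary units
# Example: 10, 25, 30, 20, and 5 min in zones 1-5
print(summated_hr_zones([10, 25, 30, 20, 5]))   # 255 arbitrary units
```

Both models yield a single arbitrary-unit value per session, which is what the repeated-measures correlations in the study relate to the external workload variables.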
Jordan L. Fox, Cody J. O’Grady, and Aaron T. Scanlan
Purpose: To compare the concurrent validity of session rating of perceived exertion (sRPE) workload determined face-to-face and via an online application in basketball players. Methods: Sixteen semiprofessional, male basketball players (21.8 [4.3] y, 191.2 [9.2] cm, 85.0 [15.7] kg) were monitored during all training sessions across the 2018 (8 players) and 2019 (11 players) seasons in a state-level Australian league. Workload was reported as accumulated PlayerLoad (PL), summated-heart-rate-zones (SHRZ) workload, and sRPE. During the 2018 season, rating of perceived exertion (RPE) was determined following each session via individualized face-to-face reporting. During the 2019 season, RPE was obtained following each session via a phone-based, online application. Repeated-measures correlations with 95% confidence intervals were used to determine the relationships between sRPE collected using each method and other workload measures (PL and SHRZ) as indicators of concurrent validity. Results: Although all correlations were significant (P < .05), sRPE obtained using face-to-face reporting demonstrated stronger relationships with PL (r = .69 [.07], large) and SHRZ (r = .74 [.06], very large) compared with the online application (r = .29 [.25], small [PL] and r = .34 [.22], moderate [SHRZ]). Conclusions: Concurrent validity of sRPE workload was stronger when players reported RPE in an individualized, face-to-face manner compared with using a phone-based online application. Given the weaker relationships with other workload measures, basketball practitioners should be cautious when using player training workloads predicated on RPE obtained via online applications.
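The validity statistic used here, the repeated-measures correlation (Bland and Altman's rmcorr), can be sketched in a few lines: each player's scores are centred on that player's own mean, so the coefficient reflects the common within-player association rather than between-player differences. The data below are illustrative, and the significance test (which adjusts degrees of freedom for the number of players) is omitted.

```python
# Minimal sketch of a repeated-measures correlation coefficient:
# Pearson r computed on within-subject centred values.
from statistics import mean

def rm_corr(subjects, x, y):
    """Correlation of x and y after removing each subject's mean (rmcorr r)."""
    def centre(values):
        # subtract each subject's own mean from that subject's observations
        subj_means = {s: mean(v for s2, v in zip(subjects, values) if s2 == s)
                      for s in set(subjects)}
        return [v - subj_means[s] for s, v in zip(subjects, values)]
    cx, cy = centre(x), centre(y)
    sxx = sum(v * v for v in cx)
    syy = sum(v * v for v in cy)
    sxy = sum(a * b for a, b in zip(cx, cy))
    return sxy / (sxx * syy) ** 0.5

# Two players with identical within-player trends but very different means
print(rm_corr([1, 1, 1, 2, 2, 2],
              [1, 2, 3, 11, 12, 13],
              [2, 4, 6, 1, 3, 5]))   # 1.0
```

An ordinary pooled Pearson correlation on the same data would be diluted by the between-player offsets, which is exactly what rmcorr is designed to avoid in repeated-measures monitoring data.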
Jordan L. Fox, Jesse Green, and Aaron T. Scanlan
Purpose: To compare peak and average intensities encountered during winning and losing game quarters in basketball players. Methods: Eight semiprofessional male basketball players (age = 23.1 [3.8] y) were monitored during all games (N = 18) over 1 competitive season. The average intensities attained in each quarter were determined using microsensors and heart-rate monitors to derive relative values (per minute) for the following variables: PlayerLoad, frequency of high-intensity and total accelerations, decelerations, changes of direction, jumps, and total inertial movement analysis events combined, as well as modified summated-heart-rate-zones workload. The peak intensities reached in each quarter were determined using microsensors and reported as PlayerLoad per minute over 15-second, 30-second, 1-minute, 2-minute, 3-minute, 4-minute, and 5-minute sample durations. Linear mixed models and effect sizes were used to compare intensity variables between winning and losing game quarters. Results: Nonsignificant (P > .05), unclear–small differences were evident between winning and losing game quarters in all variables. Conclusions: During winning and losing game quarters, peak and average intensities were similar. Consequently, factors other than the intensity of effort applied during games may underpin team success in individual game quarters and therefore warrant further investigation.
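The peak-intensity values described above come from rolling windows over the microsensor trace: PlayerLoad is summed over every window of the chosen duration, the largest window is kept, and the result is expressed per minute. A minimal sketch, assuming 15-second accumulation epochs and illustrative values:

```python
# Sketch of deriving peak intensity as the maximum rolling-window
# PlayerLoad, normalised to per-minute units (epoch length is an assumption).

def peak_playerload_per_min(pl_per_epoch, epoch_s, window_s):
    """Maximum rolling-window PlayerLoad sum, expressed per minute."""
    n = window_s // epoch_s                       # epochs per window
    window_sums = [sum(pl_per_epoch[i:i + n])
                   for i in range(len(pl_per_epoch) - n + 1)]
    return max(window_sums) * 60 / window_s       # AU per minute

# Ten 15-s epochs of accumulated PlayerLoad (arbitrary units)
epochs = [1.2, 1.5, 2.0, 1.8, 0.9, 1.1, 2.2, 2.4, 1.0, 0.8]
print(round(peak_playerload_per_min(epochs, 15, 30), 2))  # 9.2
```

Because shorter windows isolate the most intense bursts, per-minute peaks fall as the sample duration grows from 15 seconds toward 5 minutes, which is why the study reports each duration separately.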
Daniele Conte, Nicholas Kolb, Aaron T. Scanlan, and Fabrizio Santolamazza
Purpose: To characterize the weekly training load (TL) and well-being of college basketball players during the in-season phase. Methods: Ten (6 guards and 4 forwards) male basketball players (age 20.9 [0.9] y, stature 195.0 [8.2] cm, and body mass 91.3 [11.3] kg) from the same Division I National Collegiate Athletic Association team were recruited to participate in this study. Individualized training and game loads were assessed using the session rating of perceived exertion at the end of each training and game session, and well-being status was collected before each session. Weekly changes (%) in TL, acute-to-chronic workload ratio, and well-being were determined. Differences in TL and well-being between starting and bench players and between 1-game and 2-game weeks were calculated using magnitude-based statistics. Results: Total weekly TL and acute-to-chronic workload ratio demonstrated high week-to-week variation, with spikes up to 226% and 220%, respectively. Starting players experienced a higher (most likely negative) total weekly TL and similar (unclear) well-being status compared with bench players. Game scheduling influenced TL, with 1-game weeks demonstrating a higher (likely negative) total weekly TL and similar (most likely trivial) well-being status compared with 2-game weeks. Conclusions: These findings provide college basketball coaches information to optimize training strategies during the in-season phase. Basketball coaches should concurrently consider the number of weekly games and player status (starting vs bench player) when creating individualized periodization plans, with increases in TL potentially needed in bench players, especially in 2-game weeks.
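The acute-to-chronic workload ratio reported above can be sketched from weekly sRPE totals as follows. A coupled 1:4 acute-to-chronic structure (the current week against the rolling mean of the last 4 weeks, current week included) is assumed here, as are the example loads.

```python
# Hedged sketch of the acute-to-chronic workload ratio (ACWR) from
# weekly sRPE totals, assuming a coupled 1:4 week structure.

def acwr(weekly_loads, chronic_weeks=4):
    """Acute load (latest week) divided by chronic load (rolling mean)."""
    acute = weekly_loads[-1]
    window = weekly_loads[-chronic_weeks:]
    chronic = sum(window) / len(window)
    return acute / chronic

# Four weekly sRPE totals in arbitrary units; the spike in week 4
# pushes the ratio above 1.0
print(round(acwr([2000, 2200, 1800, 2600]), 2))  # 1.21
```

Week-to-week TL changes of the magnitude reported here (spikes above 200%) would drive the ratio well outside the commonly cited 0.8 to 1.3 band, which is the practical reason the authors flag them.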
Aaron T. Scanlan, Jordan L. Fox, Nattai R. Borges, Ben J. Dascombe, and Vincent J. Dalbo
The influence of various factors on training-load (TL) responses in basketball has received limited attention. This study aimed to examine the temporal changes and influence of cumulative training dose on TL responses and interrelationships during basketball activity.
Ten state-level Australian male junior basketball players completed 4 × 10-min standardized bouts of simulated basketball activity using a circuit-based protocol. Internal TL was quantified using the session rating of perceived exertion (sRPE), summated heart-rate zones (SHRZ), Banister training impulse (TRIMP), and Lucia TRIMP models. External TL was assessed via measurement of mean sprint and circuit speeds. Temporal TL comparisons were performed between 10-min bouts, while Pearson correlation analyses were conducted across cumulative training doses (0–10, 0–20, 0–30, and 0–40 min).
sRPE TL increased (P < .05) after the first 10-min bout of basketball activity. sRPE TL was only significantly related to Lucia TRIMP (r = .66–.69; P < .05) across 0–10 and 0–20 min. Similarly, mean sprint and circuit speed were significantly correlated across 0–20 min (r = .67; P < .05). In contrast, SHRZ and Banister TRIMP were significantly related across all training doses (r = .84–.89; P < .05).
Limited convergence exists between common TL approaches across basketball training doses lasting beyond 20 min. Thus, the interchangeability of commonly used internal and external TL approaches appears dose-dependent during basketball activity, with various psychophysiological mediators likely underpinning temporal changes.
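Two of the internal TL models compared in this study are closed-form and can be sketched directly. The male weighting constants (0.64 and 1.92) in the Banister model and the 1–3 ventilatory-zone weighting in the Lucia model are the commonly published values; the heart-rate figures are illustrative assumptions.

```python
import math

# Sketches of the Banister TRIMP and Lucia TRIMP models (assumed constants).

def banister_trimp(duration_min, hr_ex, hr_rest, hr_max):
    """Banister training impulse for a male athlete."""
    delta = (hr_ex - hr_rest) / (hr_max - hr_rest)  # fractional HR reserve
    return duration_min * delta * 0.64 * math.exp(1.92 * delta)

def lucia_trimp(minutes_in_zone):
    """Lucia TRIMP: minutes below VT1, between VT1 and VT2, and above VT2,
    weighted 1, 2, and 3, respectively."""
    return sum(w * m for w, m in enumerate(minutes_in_zone, start=1))

# 40 min at a mean HR of 165 beats/min (rest 60, max 200)
print(round(banister_trimp(40, 165, 60, 200), 1))
# 20, 15, and 5 min below VT1, between thresholds, and above VT2
print(lucia_trimp([20, 15, 5]))  # 65
```

The exponential weighting in the Banister model and the threshold-anchored zones in the Lucia model treat high-intensity time very differently, which is one plausible source of the limited convergence observed beyond 20 minutes.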
Aaron T. Scanlan, Jordan L. Fox, Nattai R. Borges, and Vincent J. Dalbo
Declines in high-intensity activity during game play (in-game approach) and performance tests measured pre- and postgame (across-game approach) have been used to assess player fatigue in basketball. However, a direct comparison of these approaches is not available. Consequently, this study examined the commonality between in- and across-game jump fatigue during simulated basketball game play.
Australian, state-level, junior male basketball players (n = 10; 16.6 ± 1.1 y, 182.4 ± 4.3 cm, 68.3 ± 10.2 kg) completed 4 × 10-min standardized quarters of simulated basketball game play. In-game jump height during game play was measured using video analysis, while across-game jump height was determined pre-, mid-, and postgame play using an in-ground force platform. Jump height was determined using the flight-time method, with jump decrement calculated for each approach across the first half, second half, and entire game.
A greater jump decrement was apparent for the in-game approach than for the across-game approach in the first half (37.1% ± 11.6% vs 1.7% ± 6.2%; P = .005; d = 3.81, large), while nonsignificant, large differences were evident between approaches in the second half (d = 1.14) and entire game (d = 1.83). Nonsignificant associations were evident between in-game and across-game jump decrement, with shared variances of 3–26%.
Large differences and a low commonality were observed between in- and across-game jump fatigue during basketball game play, suggesting that these approaches measure different constructs. Based on our findings, it is not recommended that basketball coaches use these approaches interchangeably to monitor player fatigue across the season.
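The flight-time method named in this study derives jump height from the time the athlete spends airborne as h = g·t²/8, and jump decrement is then expressed relative to the best jump. A minimal sketch with illustrative values (the decrement formulation here, mean jump relative to best jump, is one common variant and is an assumption):

```python
# Sketch of the flight-time method and a jump-decrement calculation.

G = 9.81  # gravitational acceleration, m/s^2

def jump_height_cm(flight_time_s):
    """Jump height (cm) from flight time via h = g * t^2 / 8."""
    return 100 * G * flight_time_s ** 2 / 8

def jump_decrement_pct(heights):
    """Percentage decline of the mean jump relative to the best jump."""
    best = max(heights)
    return 100 * (1 - (sum(heights) / len(heights)) / best)

print(round(jump_height_cm(0.50), 1))            # 30.7 cm for a 0.50-s flight
print(round(jump_decrement_pct([40, 38, 36]), 1))  # 5.0 %
```

Applying the same decrement formula to in-game video-derived jumps and to force-platform jumps makes the two approaches numerically comparable, even though, as the study shows, they appear to capture different constructs.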
Nattai R. Borges, Aaron T. Scanlan, Peter R. Reaburn, and Thomas M. Doering
Purpose: Due to age-related changes in the psychobiological state of masters athletes, this brief report aimed to compare training load responses using heart rate (HR) and ratings of perceived exertion (RPE) during standardized training sessions between masters and young cyclists. Methods: Masters (n = 10; 55.6 [5.0] y) and young (n = 8; 25.9 [3.0] y) cyclists performed separate endurance and high-intensity interval training sessions. Endurance intensity was set at 95% of ventilatory threshold 2 for 1 hour. High-intensity interval training consisted of 6 × 30-second intervals at 175% peak power output with 4.5-minute rest between intervals. HR was monitored continuously and RPE collected at standardized time periods during each session. Banister training impulse and summated-HR-zones training loads were also calculated. Results: Despite a significantly lower mean HR in masters cyclists during endurance (P = .04; d = 1.06 [±0.8], moderate) and high-intensity interval training (P = .01; d = 1.34 [±0.8], large), no significant differences were noted (P > .05) when responses were determined relative to maximum HR or converted to training impulse and summated-HR-zone loads. Furthermore, no interaction or between-group differences were evident for RPE across either session (P > .05). Conclusions: HR and RPE values were comparable between masters and young cyclists when relative HR responses and HR training load models are used. This finding suggests HR and RPE methods used to monitor or prescribe training load can be used interchangeably between masters and young athletes irrespective of chronological age.
Jordan L. Fox, Robert Stanton, Charli Sargent, Cody J. O’Grady, and Aaron T. Scanlan
Purpose: To quantify and compare external and internal game workloads according to contextual factors (game outcome, game location, and score-line). Methods: Starting semiprofessional, male basketball players were monitored during 19 games. External (PlayerLoad™ and inertial movement analysis variables) and internal (summated-heart-rate-zones and rating of perceived exertion [RPE]) workload variables were collected for all games. Linear mixed-effect models and effect sizes were used to compare workload variables based on each of the contextual variables assessed. Results: The number of jumps, absolute and relative (in min−1) high-intensity accelerations and decelerations, and relative changes of direction were higher during losses, whereas session RPE was higher during wins. PlayerLoad™, the number of absolute and relative jumps, high-intensity accelerations, absolute and relative total decelerations, total changes of direction, summated-heart-rate-zones workload, session RPE, and RPE were higher during away games, whereas the number of relative high-intensity jumps was higher during home games. PlayerLoad™, the number of high-intensity accelerations, total accelerations, absolute and relative decelerations, absolute and relative changes of direction, summated-heart-rate-zones workload, session RPE, and RPE were higher during balanced games, whereas the relative number of total and high-intensity jumps was higher during unbalanced games. Conclusions: Due to increased intensity, starting players may need additional recovery following losses. Given the increased external and internal workload volumes encountered during away games and balanced games, practitioners should closely monitor playing times during games. Monitoring playing times may help identify when players require additional recovery or reduced training volumes to avoid maladaptive responses across the in-season.
Javier Raya-González, Aaron T. Scanlan, María Soto-Célix, Alejandro Rodríguez-Fernández, and Daniel Castillo
Purpose: To examine the effects of acute caffeine supplementation on physical performance during fitness testing and activity during simulated games in basketball players. Methods: A double-blind, counterbalanced, randomized, crossover study design was followed. A total of 14 professional male basketball players ingested a placebo (sucrose) and caffeine (6 mg·kg−1 of body mass) in liquid form prior to completing 2 separate testing sessions. Each testing session involved completion of a standardized 15-minute warm-up followed by various fitness tests including 20-m sprints, countermovement jumps, Lane Agility Drill trials, and a repeated-sprint-ability test. Following a 20-minute recovery, players completed 3 × 7-minute 5-vs-5 simulated periods of full-court basketball games, each separated by 2 minutes of recovery. Local positioning system technology was used to measure player activity during games. Players completed a side-effects questionnaire 12 to 14 hours after testing. Results: Players experienced significant (P < .05), moderate–very large (effect size = −2.19 to 0.89) improvements in 20-m sprint, countermovement jump, Lane Agility Drill, and repeated-sprint-ability performance with caffeine supplementation. However, external workloads completed during simulated games demonstrated nonsignificant, trivial–small (effect size = −0.23 to 0.12) changes between conditions. In addition, players reported greater (P < .05) insomnia and urine output after caffeine ingestion. Conclusions: Acute caffeine supplementation could be effective to improve physical performance during tests stressing fitness elements important in basketball. However, acute caffeine supplementation appears to exert no meaningful effects on the activity completed during simulated basketball games and may promote sleep disturbances and exert a diuretic effect when taken at 6 mg·kg−1 of body mass in professional players.
Aaron T. Scanlan, Daniel M. Berkelmans, William M. Vickery, and Crystal O. Kean
Cricket is a popular international team sport with various game formats ranging from long-duration multiday tests to short-duration Twenty20 game play. The role of batsmen is critical to all game formats, with differing physiological demands imposed during each format. Investigation of the physiological demands imposed during cricket batting has historically been neglected, with much of the research focusing on bowling responses and batting technique. A greater understanding of the physiological demands of the batting role in cricket is required to assist strength and conditioning professionals and coaches with the design of training plans, recovery protocols, and player-management strategies. This brief review provides an updated synthesis of the literature examining the internal (eg, metabolic demands and heart rate) and external (eg, activity work rates) physiological responses to batting in the various game formats, as well as simulated play and small-sided-games training. Although few studies have been done in this area, the summary of data provides important insight regarding physiological responses to batting and highlights that more research on this topic is required. Future research is recommended to combine internal and external measures during actual game play, as well as comparing different game formats and playing levels. In addition, understanding the relationship between batting technique and physiological responses is warranted to gain a more holistic understanding of batting in cricket, as well as to develop appropriate coaching and training strategies.