Search Results

You are looking at 1–10 of 15 items for

  • Author: Aaron T. Scanlan

Restricted access

Daniele Conte, Nicholas Kolb, Aaron T. Scanlan and Fabrizio Santolamazza

Purpose: To characterize the weekly training load (TL) and well-being of college basketball players during the in-season phase. Methods: Ten (6 guards and 4 forwards) male basketball players (age 20.9 [0.9] y, stature 195.0 [8.2] cm, and body mass 91.3 [11.3] kg) from the same Division I National Collegiate Athletic Association team were recruited to participate in this study. Individualized training and game loads were assessed using the session rating of perceived exertion at the end of each training and game session, and well-being status was collected before each session. Weekly changes (%) in TL, acute-to-chronic workload ratio, and well-being were determined. Differences in TL and well-being between starting and bench players and between 1-game and 2-game weeks were calculated using magnitude-based statistics. Results: Total weekly TL and acute-to-chronic workload ratio demonstrated high week-to-week variation, with spikes up to 226% and 220%, respectively. Starting players experienced a higher (most likely negative) total weekly TL and similar (unclear) well-being status compared with bench players. Game scheduling influenced TL, with 1-game weeks demonstrating a higher (likely negative) total weekly TL and similar (most likely trivial) well-being status compared with 2-game weeks. Conclusions: These findings provide college basketball coaches information to optimize training strategies during the in-season phase. Basketball coaches should concurrently consider the number of weekly games and player status (starting vs bench player) when creating individualized periodization plans, with increases in TL potentially needed in bench players, especially in 2-game weeks.
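
The session-RPE training load and acute-to-chronic workload ratio used in this study are conventionally computed as RPE × session duration and as the most recent week's load divided by a rolling multi-week average, respectively. Below is a minimal sketch under those conventional definitions, assuming a 1-week acute and 4-week chronic window; the abstract does not state the exact windows used.

```python
# Sketch: session-RPE training load (sRPE-TL) and acute:chronic workload ratio (ACWR).
# Assumes sRPE-TL = RPE (CR-10) x session duration (min), a 1-week acute window,
# and a 4-week rolling-average chronic window; these windows are assumptions.

def srpe_load(rpe: float, duration_min: float) -> float:
    """Session training load in arbitrary units (AU)."""
    return rpe * duration_min

def weekly_load(sessions: list[tuple[float, float]]) -> float:
    """Total weekly TL from (RPE, duration) pairs."""
    return sum(srpe_load(rpe, dur) for rpe, dur in sessions)

def acwr(weekly_tl: list[float]) -> float:
    """Acute:chronic workload ratio: last week's load over the 4-week mean."""
    acute = weekly_tl[-1]
    chronic = sum(weekly_tl[-4:]) / len(weekly_tl[-4:])
    return acute / chronic

# One week of sessions as (RPE, duration in minutes), plus three prior weekly totals (AU)
week = [(7, 90), (5, 60), (8, 95), (6, 75)]
history = [2400.0, 2900.0, 1800.0, weekly_load(week)]
print(f"weekly TL = {history[-1]:.0f} AU, ACWR = {acwr(history):.2f}")
```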

Restricted access

Aaron T. Scanlan, Daniel M. Berkelmans, William M. Vickery and Crystal O. Kean

Cricket is a popular international team sport with various game formats ranging from long-duration multiday tests to short-duration Twenty20 game play. The role of batsmen is critical to all game formats, with differing physiological demands imposed during each format. Investigation of the physiological demands imposed during cricket batting has historically been neglected, with much of the research focusing on bowling responses and batting technique. A greater understanding of the physiological demands of the batting role in cricket is required to assist strength and conditioning professionals and coaches with the design of training plans, recovery protocols, and player-management strategies. This brief review provides an updated synthesis of the literature examining the internal (eg, metabolic demands and heart rate) and external (eg, activity work rates) physiological responses to batting in the various game formats, as well as simulated play and small-sided-games training. Although few studies have been done in this area, the summary of data provides important insight regarding physiological responses to batting and highlights that more research on this topic is required. Future research is recommended to combine internal and external measures during actual game play, as well as comparing different game formats and playing levels. In addition, understanding the relationship between batting technique and physiological responses is warranted to gain a more holistic understanding of batting in cricket, as well as to develop appropriate coaching and training strategies.

Restricted access

Aaron T. Scanlan, Jordan L. Fox, Nattai R. Borges, Ben J. Dascombe and Vincent J. Dalbo

Purpose:

The influence of various factors on training-load (TL) responses in basketball has received limited attention. This study aimed to examine the temporal changes and influence of cumulative training dose on TL responses and interrelationships during basketball activity.

Methods:

Ten state-level Australian male junior basketball players completed 4 × 10-min standardized bouts of simulated basketball activity using a circuit-based protocol. Internal TL was quantified using the session rating of perceived exertion (sRPE), summated heart-rate zones (SHRZ), Banister training impulse (TRIMP), and Lucia TRIMP models. External TL was assessed via measurement of mean sprint and circuit speeds. Temporal TL comparisons were performed between 10-min bouts, while Pearson correlation analyses were conducted across cumulative training doses (0–10, 0–20, 0–30, and 0–40 min).

Results:

sRPE TL increased (P < .05) after the first 10-min bout of basketball activity. sRPE TL was only significantly related to Lucia TRIMP (r = .66–.69; P < .05) across 0–10 and 0–20 min. Similarly, mean sprint and circuit speed were significantly correlated across 0–20 min (r = .67; P < .05). In contrast, SHRZ and Banister TRIMP were significantly related across all training doses (r = .84–.89; P < .05).

Conclusions:

Limited convergence exists between common TL approaches across basketball training doses lasting beyond 20 min. Thus, the interchangeability of commonly used internal and external TL approaches appears dose-dependent during basketball activity, with various psychophysiological mediators likely underpinning temporal changes.
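
The heart-rate-based internal-load models named in the Methods are usually computed as follows: Banister TRIMP weights session duration by fractional heart-rate reserve with an exponential multiplier, while the summated-heart-rate-zones (SHRZ) model sums time in five heart-rate zones weighted 1–5. The sketch below uses those common formulations; the zone boundaries and the male weighting constants are standard textbook values, not taken from this paper.

```python
import math

# Sketch of two common HR-based internal-load models: Banister TRIMP (male constants)
# and the summated-heart-rate-zones (SHRZ) model. Constants and zone boundaries are
# standard published values, not the protocol of this specific study.

def banister_trimp(duration_min: float, hr_avg: float, hr_rest: float, hr_max: float) -> float:
    """TRIMP = duration x dHR x 0.64 x e^(1.92 x dHR), where dHR is fractional HR reserve."""
    dhr = (hr_avg - hr_rest) / (hr_max - hr_rest)
    return duration_min * dhr * 0.64 * math.exp(1.92 * dhr)

def shrz(minutes_in_zone: list[float]) -> float:
    """SHRZ: minutes in zones 50-60%, 60-70%, 70-80%, 80-90%, 90-100% HRmax, weighted 1-5."""
    weights = [1, 2, 3, 4, 5]
    return sum(w * t for w, t in zip(weights, minutes_in_zone))

# Example: one 10-min bout of simulated basketball activity
print(banister_trimp(duration_min=10, hr_avg=165, hr_rest=60, hr_max=195))
print(shrz([0.0, 1.0, 3.0, 4.0, 2.0]))
```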

Restricted access

Aaron T. Scanlan, Jordan L. Fox, Nattai R. Borges and Vincent J. Dalbo

Purpose:

Declines in high-intensity activity during game play (in-game approach) and performance tests measured pre- and postgame (across-game approach) have been used to assess player fatigue in basketball. However, a direct comparison of these approaches is not available. Consequently, this study examined the commonality between in- and across-game jump fatigue during simulated basketball game play.

Methods:

Australian, state-level, junior male basketball players (n = 10; 16.6 ± 1.1 y, 182.4 ± 4.3 cm, 68.3 ± 10.2 kg) completed 4 × 10-min standardized quarters of simulated basketball game play. In-game jump height during game play was measured using video analysis, while across-game jump height was determined pre-, mid-, and postgame play using an in-ground force platform. Jump height was determined using the flight-time method, with jump decrement calculated for each approach across the first half, second half, and entire game.

Results:

A greater jump decrement was apparent for the in-game approach than for the across-game approach in the first half (37.1% ± 11.6% vs 1.7% ± 6.2%; P = .005; d = 3.81, large), while nonsignificant, large differences were evident between approaches in the second half (d = 1.14) and entire game (d = 1.83). Nonsignificant associations were evident between in-game and across-game jump decrement, with shared variances of 3–26%.

Conclusions:

Large differences and a low commonality were observed between in- and across-game jump fatigue during basketball game play, suggesting that these approaches measure different constructs. Based on our findings, it is not recommended that basketball coaches use these approaches interchangeably to monitor player fatigue across the season.
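
The flight-time method mentioned in the Methods estimates jump height from flight time as h = g·t²/8, and jump decrement is typically expressed as the percentage drop relative to a baseline or best jump. A minimal sketch of both calculations follows; the decrement formulation is a generic one, not necessarily the exact formula used in the study.

```python
# Sketch: flight-time jump height and a generic jump-decrement calculation.
# h = g * t^2 / 8 (flight-time method); decrement expressed relative to the best jump.

G = 9.81  # gravitational acceleration, m/s^2

def jump_height_from_flight_time(flight_time_s: float) -> float:
    """Jump height (m) from flight time (s) measured by force platform or video."""
    return G * flight_time_s ** 2 / 8

def jump_decrement_pct(jump_heights: list[float]) -> float:
    """Percentage decline of the mean jump relative to the best jump in the series."""
    best = max(jump_heights)
    mean = sum(jump_heights) / len(jump_heights)
    return (1 - mean / best) * 100

heights = [jump_height_from_flight_time(t) for t in (0.52, 0.50, 0.46, 0.44)]
print(f"Heights (m): {[round(h, 3) for h in heights]}")
print(f"Jump decrement: {jump_decrement_pct(heights):.1f}%")
```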

Restricted access

Aaron T. Scanlan, Ben J. Dascombe, Andrew P. Kidcaff, Jessica L. Peucker and Vincent J. Dalbo

Purpose:

To compare game activity demands between female and male semiprofessional basketball players.

Methods:

Female (n = 12) and male (n = 12) semiprofessional basketball players were monitored across 3 competitive games. Time–motion-analysis procedures quantified player activity into predefined movement categories across backcourt (BC) and frontcourt (FC) positions. Activity frequencies, durations, and distances were calculated relative to live playing time (min). Work:rest ratios were also calculated using the video data. Game activity was compared between genders for each playing position and all players.

Results:

Female players performed at greater running work-rates than male players (45.7 ± 1.4 vs. 42.1 ± 1.7 m/min, P = .05), while male players performed more dribbling than female players (2.5 ± 0.3 vs. 3.0 ± 0.2 s/min; 8.4 ± 0.3 vs. 9.7 ± 0.7 m/min, P = .05). Positional analyses revealed that female BC players performed more low-intensity shuffling (P = .04) and jumping (P = .05), as well as longer (P = .04) jogging durations, than male BC players. Female FC players executed more upper-body activity (P = .03) and had larger work:rest ratios (P < .001) than male FC players. No significant gender differences were observed in the overall intermittent demands, distance traveled, high-intensity shuffling activity, and sprinting requirements during game play.

Conclusions:

These findings indicate that gender-specific running and dribbling differences might exist in semiprofessional basketball. Furthermore, position-specific variations between female and male basketball players should be considered. These data may prove useful in the development of gender-specific conditioning plans relative to playing position in basketball.
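
The relative activity rates and work:rest ratios described in the Methods are typically derived by normalising each movement category to live playing time and by dividing time spent in work categories by time spent in rest categories. The sketch below follows that common approach; the category names and work/rest groupings are illustrative, not the study's exact definitions.

```python
# Sketch: relative activity rates and work:rest ratio from time-motion categories.
# Category names and work/rest groupings are illustrative assumptions.

def relative_rate(total: float, live_time_min: float) -> float:
    """Express a frequency, duration (s), or distance (m) per minute of live play."""
    return total / live_time_min

def work_rest_ratio(seconds_by_category: dict[str, float], work_categories: set[str]) -> float:
    """Total time in work categories divided by total time in the remaining (rest) categories."""
    work = sum(t for cat, t in seconds_by_category.items() if cat in work_categories)
    rest = sum(t for cat, t in seconds_by_category.items() if cat not in work_categories)
    return work / rest

activity = {"sprint": 90.0, "high_shuffle": 150.0, "jog": 420.0, "walk": 900.0, "stand": 600.0}
print(relative_rate(activity["sprint"], live_time_min=36))           # sprint seconds per live minute
print(work_rest_ratio(activity, {"sprint", "high_shuffle", "jog"}))  # work:rest ratio
```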

Restricted access

Aaron T. Scanlan, Neal Wen, Patrick S. Tucker, Nattai R. Borges and Vincent J. Dalbo

Purpose:

To compare perceptual and physiological training-load responses during various basketball training modes.

Methods:

Eight semiprofessional male basketball players (age 26.3 ± 6.7 y, height 188.1 ± 6.2 cm, body mass 92.0 ± 13.8 kg) were monitored across a 10-wk period in the preparatory phase of their training plan. Player session ratings of perceived exertion (sRPE) and heart-rate (HR) responses were gathered across base, specific, and tactical/game-play training modes. Pearson correlations were used to determine the relationships between the sRPE model and 2 HR-based models: the training impulse (TRIMP) and summated HR zones (SHRZ). One-way ANOVAs were used to compare training loads between training modes for each model.

Results:

Stronger relationships between perceptual and physiological models were evident during base (sRPE-TRIMP r = .53, P < .05; sRPE-SHRZ r = .75, P < .05) and tactical/game-play conditioning (sRPE-TRIMP r = .60, P < .05; sRPE-SHRZ r = .63, P < .05) than during specific conditioning (sRPE-TRIMP r = .38, P < .05; sRPE-SHRZ r = .52, P < .05). Furthermore, the sRPE model detected greater increases (126–429 AU) in training load than the TRIMP (15–65 AU) and SHRZ models (27–170 AU) when transitioning between training modes.

Conclusions:

While the training-load models were significantly correlated during each training mode, weaker relationships were observed during specific conditioning. Comparisons suggest that the HR-based models were less effective in detecting periodized increases in training load, particularly during court-based, intermittent, multidirectional drills. The practical benefits and sensitivity of the sRPE model support its use across different basketball training modes.

Restricted access

Jordan L. Fox, Robert Stanton, Charli Sargent, Cody J. O’Grady and Aaron T. Scanlan

Purpose: To quantify and compare external and internal game workloads according to contextual factors (game outcome, game location, and score-line). Methods: Starting semiprofessional male basketball players were monitored during 19 games. External (PlayerLoad and inertial movement analysis variables) and internal (summated-heart-rate-zones and rating of perceived exertion [RPE]) workload variables were collected for all games. Linear mixed-effect models and effect sizes were used to compare workload variables based on each of the contextual variables assessed. Results: The number of jumps, absolute and relative (in min−1) high-intensity accelerations and decelerations, and relative changes-of-direction were higher during losses, whereas session RPE was higher during wins. PlayerLoad, the number of absolute and relative jumps, high-intensity accelerations, absolute and relative total decelerations, total changes-of-direction, summated-heart-rate-zones, session RPE, and RPE were higher during away games, whereas the number of relative high-intensity jumps was higher during home games. PlayerLoad, the number of high-intensity accelerations, total accelerations, absolute and relative decelerations, absolute and relative changes-of-direction, summated-heart-rate-zones, session RPE, and RPE were higher during balanced games, whereas the relative number of total and high-intensity jumps was higher during unbalanced games. Conclusions: Due to increased intensity, starting players may need additional recovery following losses. Given the increased external and internal workload volumes encountered during away games and balanced games, practitioners should closely monitor playing times during games. Monitoring playing times may help identify when players require additional recovery or reduced training volumes to avoid maladaptive responses across the in-season.
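
The linear mixed-effect comparisons described in the Methods can be outlined with statsmodels: a workload variable modelled against a contextual factor as a fixed effect, with player as a random (grouping) factor. The sketch below is a generic formulation with hypothetical column names (playerload, outcome, player) and fabricated illustrative values, not the authors' actual model specification or data.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Sketch: mixed linear model comparing an external workload variable by game outcome,
# with player as a random effect. Column names and values are hypothetical.
data = pd.DataFrame({
    "player":     ["A"] * 4 + ["B"] * 4 + ["C"] * 4 + ["D"] * 4,
    "outcome":    ["win", "loss", "win", "loss"] * 4,
    "playerload": [410, 452, 398, 447, 388, 430, 375, 426,
                   505, 540, 512, 551, 365, 401, 372, 409],
})

model = smf.mixedlm("playerload ~ outcome", data, groups=data["player"])
result = model.fit()
print(result.summary())
```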

Restricted access

Ben J. Dascombe, Trent K. Hoare, Joshua A. Sear, Peter R. Reaburn and Aaron T. Scanlan

Purpose:

To examine whether wearing various-size lower-body compression garments (LBCG) improves physiological and performance parameters related to endurance running in well-trained athletes.

Methods:

Eleven well-trained middle-distance runners and triathletes (age: 28.4 ± 10.0 y; height: 177.3 ± 4.7 cm; body mass: 72.6 ± 8.0 kg; VO2max: 59.0 ± 6.7 mL·kg–1·min–1) completed repeat progressive maximal tests (PMT) and time-to-exhaustion (TTE) tests at 90% VO2max wearing either manufacturer-recommended LBCG (rLBCG), undersized LBCG (uLBCG), or loose running shorts (CONT). During all exercise testing, several systemic and peripheral physiological measures were taken.

Results:

The results indicated similar effects of wearing rLBCG and uLBCG compared with the control. Across the PMT, wearing either LBCG resulted in significantly (P < .05) increased oxygen consumption, O2 pulse, and deoxyhemoglobin (HHb) and decreased running economy, oxyhemoglobin, and tissue oxygenation index (TOI) at low-intensity speeds (8–10 km·h–1). At higher speeds (12–18 km·h–1), wearing LBCG increased regional blood flow (nTHI) and HHb values, but significantly lowered heart rate and TOI. During the TTE, wearing either LBCG significantly (P < .05) increased HHb concentration, whereas wearing uLBCG also significantly (P < .05) increased nTHI. No improvement in endurance running performance was observed in either compression condition.

Conclusion:

The results suggest that wearing LBCG facilitated a small number of cardiorespiratory and peripheral physiological benefits that appeared mostly related to improvements in venous flow. However, these improvements appear trivial to athletes, as they did not correspond to any improvement in endurance running performance.

Restricted access

Aaron T. Scanlan, Benjamin J. Dascombe, Peter R.J. Reaburn and Mark Osborne

Purpose:

The present investigation examined the physiological and performance effects of lower-body compression garments (LBCG) during a one-hour cycling time-trial in well-trained cyclists.

Methods:

Twelve well-trained male cyclists ([mean ± SD] age: 20.5 ± 3.6 years; height: 177.5 ± 4.9 cm; body mass: 70.5 ± 7.5 kg; VO2max: 55.2 ± 6.8 mL·kg−1·min−1) volunteered for the study. Each subject completed two randomly ordered stepwise incremental tests and two randomly ordered one-hour time trials (1HTT) wearing either full-length SportSkins Classic LBCG or underwear briefs (control). Blood lactate concentration ([BLa]), heart rate (HR), oxygen consumption (VO2), and muscle oxygenation (mOxy) were recorded throughout each test. Indicators of cycling endurance performance were anaerobic threshold (AnT) and VO2max values from the incremental test, and mean power (W), peak power (W), and total work (kJ) from the 1HTT. Magnitude-based inferences were used to determine whether LBCG demonstrated any performance and/or physiological benefits.

Results:

A likely practically significant increase (86%:12%:2%; η2 = 0.6) in power output at AnT was observed in the LBCG condition (CONT: 245.9 ± 55.7 W; LBCG: 259.8 ± 44.6 W). Further, a possible practically significant improvement (78%:19%:3%; η2 = 0.6) was reported in muscle oxygenation economy (W·%mOxy−1) across the 1HTT (mOxy: CONT: 52.2 ± 12.2%; LBCG: 57.3 ± 8.2%).

Conclusions:

The present results demonstrated limited physiological benefits and no performance enhancement through wearing LBCG during a cycling time trial.
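
The probability triplets reported above (e.g., 86%:12%:2%) follow the magnitude-based inference convention of expressing the likelihood that the true effect is beneficial, trivial, or harmful relative to a smallest worthwhile change. Below is a minimal sketch of that style of calculation, assuming a t-distributed effect estimate; the standard error and smallest worthwhile change shown are illustrative assumptions, and this is not the authors' analysis code.

```python
from scipy import stats

# Sketch: magnitude-based-inference-style probabilities that an effect is
# beneficial / trivial / harmful relative to a smallest worthwhile change (SWC).
# Generic illustration assuming a t-distributed effect estimate.

def mbi_probabilities(effect: float, standard_error: float, swc: float, df: int):
    dist = stats.t(df)
    p_beneficial = 1 - dist.cdf((swc - effect) / standard_error)
    p_harmful = dist.cdf((-swc - effect) / standard_error)
    p_trivial = 1 - p_beneficial - p_harmful
    return p_beneficial, p_trivial, p_harmful

# Example: +13.9 W difference in power at AnT (from the abstract), with an
# assumed standard error and SWC for illustration only.
b, t, h = mbi_probabilities(effect=13.9, standard_error=9.0, swc=5.0, df=11)
print(f"beneficial {b:.0%} : trivial {t:.0%} : harmful {h:.0%}")
```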

Restricted access

Henrikas Paulauskas, Rasa Kreivyte, Aaron T. Scanlan, Alexandre Moreira, Laimonas Siupsinskas and Daniele Conte

Purpose: To assess the weekly fluctuations in workload and differences in workload according to playing time in elite female basketball players. Methods: A total of 29 female basketball players (mean [SD] age 21 [5] y, stature 181 [7] cm, body mass 71 [7] kg, playing experience 12 [5] y) belonging to the 7 women’s basketball teams competing in the first-division Lithuanian Women’s Basketball League were recruited. Individualized training loads (TLs) and game loads (GLs) were assessed using the session rating of perceived exertion after each training session and game during the entire in-season phase (24 wk). Percentage changes in total weekly TL (weekly TL + GL), weekly TL, weekly GL, chronic workload, acute:chronic workload ratio, training monotony, and training strain were calculated. Mixed linear models were used to assess differences for each dependent variable, with playing time (low vs high) used as a fixed factor and subject, week, and team as random factors. Results: The highest changes in total weekly TL, weekly TL, and acute:chronic workload ratio were evident in week 13 (47%, 120%, and 49%, respectively). Chronic workload showed weekly changes ≤10%, whereas monotony and training strain registered the highest fluctuations in weeks 17 (34%) and 15 (59%), respectively. A statistically significant difference in GL was evident between players completing low and high playing times (P = .026, moderate), whereas no significant differences (P > .05) were found for any of the other dependent variables. Conclusions: Coaches of elite women’s basketball teams should monitor weekly changes in workload during the in-season phase to identify weeks that may predispose players to unwanted spikes and adjust player workload according to playing time.
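
Training monotony and training strain, listed among the weekly metrics in the Methods, are conventionally calculated (after Foster) as the mean daily load divided by the standard deviation of daily load across the week, and as the weekly load multiplied by monotony. The sketch below uses those conventional definitions; it is an assumption that the study applied these exact formulations.

```python
import statistics

# Sketch: Foster-style training monotony and strain from daily sRPE loads (AU).
# Monotony = mean daily load / SD of daily load; strain = weekly load x monotony.
# Conventional definitions assumed; not reproduced from this paper.

def monotony(daily_loads: list[float]) -> float:
    return statistics.mean(daily_loads) / statistics.stdev(daily_loads)

def strain(daily_loads: list[float]) -> float:
    return sum(daily_loads) * monotony(daily_loads)

week = [450.0, 600.0, 0.0, 520.0, 480.0, 700.0, 0.0]  # daily loads incl. two rest days
print(f"monotony = {monotony(week):.2f}")
print(f"strain   = {strain(week):.0f} AU")
```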