Following exercise-induced muscle damage (EIMD), masters athletes take longer to recover than younger athletes. The purpose of this study was to determine the effect of higher-than-recommended postexercise protein feedings on the recovery of knee extensor peak isometric torque (PIT), perceptions of recovery, and cycling time trial (TT) performance following EIMD in masters triathletes. Eight masters triathletes (52 ± 2 y; V̇O2max 51.8 ± 4.2 ml·kg⁻¹·min⁻¹) completed two trials separated by seven days in a randomized, double-blind, crossover study. Trials consisted of morning PIT testing and a 30-min downhill run followed by an eight-hour recovery. During recovery, a moderate (MPI; 0.3 g·kg⁻¹·bolus⁻¹) or high (HPI; 0.6 g·kg⁻¹·bolus⁻¹) protein intake was consumed in three bolus feedings at two-hour intervals commencing immediately postexercise. PIT testing and a 7 kJ·kg⁻¹ cycling TT were completed postintervention. Perceptions of recovery were assessed pre- and postexercise. The HPI did not significantly improve recovery compared with the MPI (p > .05). However, comparison of within-treatment changes showed the HPI provided a moderate beneficial effect (d = 0.66), attenuating the loss of afternoon PIT (-3.6%, d = 0.09) compared with the MPI (-8.6%, d = 0.24). The HPI also provided a large beneficial effect (d = 0.83), reducing perceived fatigue over the eight-hour recovery (d = 1.25) compared with the MPI (d = 0.22). Despite these effects, cycling performance was unchanged (HPI, 2395 ± 297 s vs. MPI, 2369 ± 278 s; d = 0.09). In conclusion, doubling the recommended postexercise protein intake did not significantly improve recovery in masters athletes; however, the HPI provided moderate to large beneficial effects on recovery that may be meaningful following EIMD.
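The dosing arithmetic behind the two conditions (three boluses at 0.3 or 0.6 g·kg⁻¹ each, two hours apart, starting immediately postexercise) can be sketched in Python. The 75 kg body mass below is illustrative only and is not a value from the study:

```python
def bolus_protein_g(body_mass_kg: float, dose_g_per_kg: float) -> float:
    """Grams of protein in one bolus for a given relative dose."""
    return body_mass_kg * dose_g_per_kg

def feeding_schedule(body_mass_kg: float, dose_g_per_kg: float,
                     n_boluses: int = 3, interval_h: int = 2):
    """(hours postexercise, grams) for each bolus, starting at 0 h."""
    grams = bolus_protein_g(body_mass_kg, dose_g_per_kg)
    return [(i * interval_h, grams) for i in range(n_boluses)]

# Illustrative 75 kg athlete
mpi = feeding_schedule(75, 0.3)  # [(0, 22.5), (2, 22.5), (4, 22.5)]
hpi = feeding_schedule(75, 0.6)  # [(0, 45.0), (2, 45.0), (4, 45.0)]
```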
Thomas M. Doering, Peter R. Reaburn, Nattai R. Borges, Gregory R. Cox, and David G. Jenkins
Nattai R. Borges, Aaron T. Scanlan, Peter R. Reaburn, and Thomas M. Doering
Purpose: Due to age-related changes in the psychobiological state of masters athletes, this brief report aimed to compare training load responses using heart rate (HR) and ratings of perceived exertion (RPE) during standardized training sessions between masters and young cyclists. Methods: Masters (n = 10; 55.6 [5.0] y) and young (n = 8; 25.9 [3.0] y) cyclists performed separate endurance and high-intensity interval training sessions. Endurance intensity was set at 95% of ventilatory threshold 2 for 1 hour. High-intensity interval training consisted of 6 × 30-second intervals at 175% peak power output with 4.5-minute rest between intervals. HR was monitored continuously, and RPE was collected at standardized time points during each session. Banister training impulse and summated-HR-zones training loads were also calculated. Results: Despite a significantly lower mean HR in masters cyclists during endurance (P = .04; d = 1.06 [±0.8], moderate) and high-intensity interval training (P = .01; d = 1.34 [±0.8], large), no significant differences were noted (P > .05) when responses were determined relative to maximum HR or converted to training impulse and summated-HR-zone loads. Furthermore, no interaction or between-group differences were evident for RPE across either session (P > .05). Conclusions: HR and RPE values were comparable between masters and young cyclists when relative HR responses and HR training load models are used. This finding suggests HR and RPE methods used to monitor or prescribe training load can be used interchangeably between masters and young athletes irrespective of chronological age.
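The two HR-based load models named above are conventionally computed as follows. This is a minimal sketch of the standard Banister training impulse (with the usual male exponent) and a summated-HR-zones load; the HR values in the usage note are illustrative, not data from the study:

```python
import math

def banister_trimp(duration_min: float, hr_ex: float,
                   hr_rest: float, hr_max: float, b: float = 1.92) -> float:
    """Banister training impulse: duration x fractional HR reserve x an
    exponential weighting (b = 1.92 is the conventional male constant)."""
    dhr = (hr_ex - hr_rest) / (hr_max - hr_rest)  # fractional HR reserve
    return duration_min * dhr * 0.64 * math.exp(b * dhr)

def summated_hr_zones(minutes_in_zones: list) -> float:
    """Summated-HR-zones load: minutes spent in five ascending %HRmax
    zones (50-60% ... 90-100%), weighted 1-5 and summed (AU)."""
    return sum(w * m for w, m in enumerate(minutes_in_zones, start=1))
```

For example, 60 min at a mean HR of 150 beats·min⁻¹ (resting 60, maximum 190) yields a TRIMP of roughly 100 AU, and 10 min in each of the five zones gives a summated-HR-zones load of 150 AU.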
Aaron T. Scanlan, Jordan L. Fox, Nattai R. Borges, Ben J. Dascombe, and Vincent J. Dalbo
The influence of various factors on training-load (TL) responses in basketball has received limited attention. This study aimed to examine the temporal changes and influence of cumulative training dose on TL responses and interrelationships during basketball activity.
Ten state-level Australian male junior basketball players completed 4 × 10-min standardized bouts of simulated basketball activity using a circuit-based protocol. Internal TL was quantified using the session rating of perceived exertion (sRPE), summated heart-rate zones (SHRZ), Banister training impulse (TRIMP), and Lucia TRIMP models. External TL was assessed via measurement of mean sprint and circuit speeds. Temporal TL comparisons were performed between 10-min bouts, while Pearson correlation analyses were conducted across cumulative training doses (0–10, 0–20, 0–30, and 0–40 min).
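Of the internal-load models listed, the session-RPE and Lucia TRIMP calculations are simple enough to sketch directly; these are the standard formulations, not code taken from the study:

```python
def session_rpe_load(rpe_cr10: float, duration_min: float) -> float:
    """Foster session-RPE load: CR-10 exertion rating x duration (AU)."""
    return rpe_cr10 * duration_min

def lucia_trimp(min_zone1: float, min_zone2: float, min_zone3: float) -> float:
    """Lucia TRIMP: minutes below VT1, between VT1 and VT2, and above
    VT2, weighted 1, 2, and 3 respectively, then summed (AU)."""
    return 1 * min_zone1 + 2 * min_zone2 + 3 * min_zone3
```

For instance, a 40-min bout rated 7 on the CR-10 scale gives an sRPE load of 280 AU.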
sRPE TL increased (P < .05) after the first 10-min bout of basketball activity. sRPE TL was only significantly related to Lucia TRIMP (r = .66–.69; P < .05) across 0–10 and 0–20 min. Similarly, mean sprint and circuit speed were significantly correlated across 0–20 min (r = .67; P < .05). In contrast, SHRZ and Banister TRIMP were significantly related across all training doses (r = .84–.89; P < .05).
Limited convergence exists between common TL approaches across basketball training doses lasting beyond 20 min. Thus, the interchangeability of commonly used internal and external TL approaches appears dose-dependent during basketball activity, with various psychophysiological mediators likely underpinning temporal changes.
Aaron T. Scanlan, Jordan L. Fox, Nattai R. Borges, and Vincent J. Dalbo
Declines in high-intensity activity during game play (in-game approach) and performance tests measured pre- and postgame (across-game approach) have been used to assess player fatigue in basketball. However, a direct comparison of these approaches is not available. Consequently, this study examined the commonality between in- and across-game jump fatigue during simulated basketball game play.
Australian, state-level, junior male basketball players (n = 10; 16.6 ± 1.1 y, 182.4 ± 4.3 cm, 68.3 ± 10.2 kg) completed 4 × 10-min standardized quarters of simulated basketball game play. In-game jump height during game play was measured using video analysis, while across-game jump height was determined pre-, mid-, and postgame play using an in-ground force platform. Jump height was determined using the flight-time method, with jump decrement calculated for each approach across the first half, second half, and entire game.
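The flight-time method referenced above estimates jump height from flight time alone, assuming symmetric rise and fall phases (h = g·t²/8); jump decrement is then the percentage loss relative to baseline. A minimal sketch with an illustrative 0.6-s flight:

```python
G = 9.81  # gravitational acceleration, m/s^2

def jump_height_m(flight_time_s: float) -> float:
    """Flight-time method: h = g * t^2 / 8 (symmetric flight assumed)."""
    return G * flight_time_s ** 2 / 8

def jump_decrement_pct(baseline_height: float, current_height: float) -> float:
    """Percentage loss in jump height relative to baseline."""
    return (1 - current_height / baseline_height) * 100
```

A 0.6-s flight corresponds to a jump height of about 0.44 m; a drop from a 40 cm baseline to 30 cm is a 25% decrement.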
A greater jump decrement was apparent for the in-game approach than for the across-game approach in the first half (37.1% ± 11.6% vs 1.7% ± 6.2%; P = .005; d = 3.81, large), while nonsignificant, large differences were evident between approaches in the second half (d = 1.14) and entire game (d = 1.83). Nonsignificant associations were evident between in-game and across-game jump decrement, with shared variances of 3–26%.
Large differences and a low commonality were observed between in- and across-game jump fatigue during basketball game play, suggesting that these approaches measure different constructs. Based on our findings, it is not recommended that basketball coaches use these approaches interchangeably to monitor player fatigue across the season.
Aaron T. Scanlan, Neal Wen, Patrick S. Tucker, Nattai R. Borges, and Vincent J. Dalbo
To compare perceptual and physiological training-load responses during various basketball training modes.
Eight semiprofessional male basketball players (age 26.3 ± 6.7 y, height 188.1 ± 6.2 cm, body mass 92.0 ± 13.8 kg) were monitored across a 10-wk period in the preparatory phase of their training plan. Player session ratings of perceived exertion (sRPE) and heart-rate (HR) responses were gathered across base, specific, and tactical/game-play training modes. Pearson correlations were used to determine the relationships between the sRPE model and 2 HR-based models: the training impulse (TRIMP) and summated HR zones (SHRZ). One-way ANOVAs were used to compare training loads between training modes for each model.
Stronger relationships between perceptual and physiological models were evident during base (sRPE-TRIMP r = .53, P < .05; sRPE-SHRZ r = .75, P < .05) and tactical/game-play conditioning (sRPE-TRIMP r = .60, P < .05; sRPE-SHRZ r = .63, P < .05) than during specific conditioning (sRPE-TRIMP r = .38, P < .05; sRPE-SHRZ r = .52, P < .05). Furthermore, the sRPE model detected greater increases (126–429 AU) in training load than the TRIMP (15–65 AU) and SHRZ models (27–170 AU) transitioning between training modes.
While the training-load models were significantly correlated during each training mode, weaker relationships were observed during specific conditioning. Comparisons suggest that the HR-based models were less effective in detecting periodized increases in training load, particularly during court-based, intermittent, multidirectional drills. The practical benefits and sensitivity of the sRPE model support its use across different basketball training modes.
John F.T. Fernandes, Kevin L. Lamb, Jonathan P. Norris, Jason Moran, Benjamin Drury, Nattai R. Borges, and Craig Twist
Aging is anecdotally associated with a prolonged recovery from resistance training, though the current literature remains equivocal. This brief review considers the effects of resistance training on indirect markers of muscle damage and recovery (i.e., muscle soreness, blood markers, and muscle strength) in older males. With no date restrictions, four databases were searched for articles relating to aging, muscle damage, and recovery. Data from 11 studies were extracted for review. Of these, four reported worse symptoms in older compared with younger populations, two observed the opposite, and the remaining six proposed no differences between age groups. It appears that resistance training can be practiced in older populations without concern for impaired recovery. To improve current knowledge, researchers are urged to utilize more ecologically valid muscle-damaging bouts and to investigate the mechanisms that underpin the recovery of muscle soreness and strength after exercise in older populations.