Isabella Russo, Paul A. Della Gatta, Andrew Garnham, Judi Porter, Louise M. Burke, and Ricardo J.S. Costa
Purpose: This study aimed to determine the effects of an acute “train-low” nutritional protocol on markers of recovery optimization compared with a standard recovery nutrition protocol. Methods: After completing a 2-hour high-intensity interval running protocol, 8 male endurance athletes consumed a standard dairy milk recovery beverage (CHO; 1.2 g/kg body mass [BM] of carbohydrate and 0.4 g/kg BM of protein) and a low-carbohydrate dairy milk beverage (L-CHO; isovolumetric, with 0.35 g/kg BM of carbohydrate and 0.5 g/kg BM of protein) in a double-blind randomized crossover design. Venous blood and breath samples, nude BM, body water, and gastrointestinal symptom measurements were collected preexercise and during recovery. Muscle biopsies were performed at 0 and 2 hours of recovery. Participants returned to the laboratory the following morning to measure energy substrate oxidation and perform a 1-hour distance test. Results: The exercise protocol resulted in depletion of muscle glycogen stores (250 mmol/kg dry weight) and mild body-water losses (BM loss = 1.8%). Neither recovery beverage replenished muscle glycogen stores (279 mmol/kg dry weight) nor prevented a decrease in bacterially stimulated neutrophil function (−21%). Both recovery beverages increased phosphorylation of mTOR(Ser2448) (main effect of time, P < .001) and returned hydration status to baseline. A greater fold increase in p-GSK-3β(Ser9)/total-GSK-3β occurred on CHO (P = .012). Blood glucose (P = .005) and insulin (P = .012) responses were significantly greater on CHO (618 mmol/L per 2 h and 3507 μIU/mL per 2 h, respectively) than on L-CHO (559 mmol/L per 2 h and 1147 μIU/mL per 2 h, respectively). Rates of total fat oxidation were greater on CHO, but performance was not affected. Conclusion: A lower-carbohydrate recovery beverage consumed after exercise in a “train-low” nutritional protocol does not negatively impact recovery optimization outcomes.
Tim J. Mosey and Lachlan J.G. Mitchell
Objectives: The purpose of this study was to document the longitudinal strength and power characteristic changes and race performance changes of a skeleton athlete. Methods: Longitudinal strength and power changes were assessed with strength and power diagnostic testing over a 9-year period. Trends over 9 years for relative strength were analyzed using a linear model. Push-start time was recorded across multiple tracks. Trends over 9 years for start performance at each track were assessed using a mixed-effects linear model to account for the impact of different tracks. Lower-body strength and power changes were assessed via a 1-repetition-maximum squat and a body-weight countermovement jump. The relationship between strength and power changes was assessed over time. The relationship between strength changes and start performance was determined by assessing the fixed effect of relative strength changes on push-start time. Results: Relative lower-body strength ranged from 1.6 kg per body weight to 1.9 kg per body weight and showed a significant mean improvement of 0.05 kg per body weight per year (R² = .71, P < .01). A negative correlation (R² = .79) between relative strength changes and push-start performance across multiple tracks was found. The mixed-effects model indicated that push-start time improved significantly year to year (0.02 s; P < .001; R² = .74) when controlling for the effect of track. Conclusions: The longitudinal analysis of push-start time and the associations with changes in strength suggest that training this quality can have a positive effect on push-start performance.
Even Brøndbo Dahl, Eivind Øygard, Gøran Paulsen, Bjarne Rud, and Thomas Losnegard
Purpose: Preconditioning exercise is a widely used strategy believed to enhance performance later the same day. The authors examined the influence of preconditioning exercises 6 hours prior to a time-to-exhaustion (TTE) test during treadmill running. Methods: Ten male competitive runners (age = 26 y, height = 184 cm, weight = 73 kg, maximum oxygen consumption = 72 mL·kg−1·min−1) performed a preconditioning session of running (RUN) or resistance exercise (RES), or no morning exercise (NoEx), in a randomized order, separated by >72 hours. The RUN consisted of 15 minutes of low-intensity running and 4 × 15 seconds at race pace (21–24 km·h−1) on a treadmill; RES involved 5 minutes of low-intensity running and 2 × 3 repetitions of isokinetic 1-leg shallow squats with maximal mobilization. Following a 6-hour break, electrically evoked force (m. vastus medialis), countermovement jump, running economy, and a TTE of approximately 2 minutes were examined. Results: Relative to NoEx, no difference was seen for RUN or RES in TTE (mean ± 95% CI: −1.3% ± 3.4% and −0.5% ± 6.0%) or running economy (0.2% ± 1.6% and 1.9% ± 2.7%; all Ps > .05). Jump height was not different for the RUN condition (1.0% ± 2.7%) but tended to be higher in RES than in the NoEx condition (1.5% ± 1.6%, P = .07). The electrically evoked force tended to reveal low-frequency fatigue (reduced 20:50-Hz peak force ratio) only after RES compared to NoEx (−4.5% ± 4.6%, P = .06). Conclusion: Neither RUN nor RES performed 6 hours prior to an approximately 2-minute TTE running test improved performance in competitive runners.
Kolbjørn Lindberg, Ingrid Eythorsdottir, Paul Solberg, Øyvind Gløersen, Olivier Seynnes, Thomas Bjørnsen, and Gøran Paulsen
Purpose: The aim of this study was to examine the concurrent validity of force–velocity (FV) variables assessed across 5 Keiser leg press devices. Methods: A linear encoder and 2 independent force plates (MuscleLab devices) were mounted on each of the 5 leg press devices. A total of 997 leg press executions, covering a wide range of forces and velocities, were performed by 14 participants (29 y, 181 cm, 82 kg) across the 5 devices. Average and peak force, velocity, and power values were collected simultaneously from the Keiser and MuscleLab devices for each repetition. Individual FV profiles were fitted to each participant from peak and average force and velocity measurements. Theoretical maximal force, velocity, and power were deduced from the FV relationship. Results: Average and peak force and velocity had coefficients of variation of 1.5% to 8.6%, near-perfect correlations (.994–.999), and systematic biases of 0.7% to 7.1% when compared with reference measurements. Average and peak power showed larger coefficients of variation (11.6% and 17.2%), despite excellent correlations (.977 and .952), and trivial to small biases (3.9% and 8.4%). Extrapolated FV variables showed near-perfect correlations (.983–.997) with trivial to small biases (1.4%–11.2%) and coefficients of variation of 1.4% to 5.9%. Conclusions: The Keiser leg press device can obtain valid measurements over a wide range of forces and velocities across different devices. To accurately measure power, theoretical maximal power calculated from the FV profile is recommended over average and peak power values from single repetitions, due to the lower random error observed for theoretical maximal power.
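The extrapolation step described above rests on the standard linear FV model, in which theoretical maximal force (F0) and velocity (V0) are the two intercepts of a least-squares line through the force–velocity points, and theoretical maximal power is Pmax = F0·V0/4 (the apex of the resulting parabolic power–velocity curve). A minimal sketch of that calculation, assuming this conventional model; the function name and sample data are illustrative, not taken from the study:

```python
def fv_profile(velocities, forces):
    """Fit the linear force-velocity model F = F0 - a*v by least squares
    and derive the theoretical maxima F0, V0, and Pmax = F0*V0/4.
    (Illustrative sketch; not the authors' analysis code.)"""
    n = len(velocities)
    mv = sum(velocities) / n
    mf = sum(forces) / n
    # Ordinary least-squares slope and intercept of the F-v line.
    slope = sum((v - mv) * (f - mf) for v, f in zip(velocities, forces)) / \
            sum((v - mv) ** 2 for v in velocities)
    f0 = mf - slope * mv      # force-axis intercept: theoretical maximal force
    v0 = -f0 / slope          # velocity-axis intercept: theoretical maximal velocity
    pmax = f0 * v0 / 4.0      # apex of the parabolic power-velocity curve
    return f0, v0, pmax
```

For example, points lying on F = 2000 − 1000·v yield F0 = 2000 N, V0 = 2 m·s−1, and Pmax = 1000 W.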
Mark J. Kilgallon, Michael J. Johnston, Liam P. Kilduff, and Mark L. Watsford
Purpose: To compare resistance training using a velocity loss threshold with training to repetition failure on upper-body strength parameters in professional Australian footballers. Methods: A total of 26 professional Australian footballers (23.9 [4.2] y, 189.9 [7.8] cm, 88.2 [8.8] kg) tested 1-repetition-maximum strength (FPmax) and mean barbell velocity at 85% of 1-repetition maximum on floor press (FPvel). They were then assigned to 2 training groups: 20% velocity loss threshold training (VL; n = 12, maximum-effort lift velocity) or training to repetition failure (TF; n = 14, self-selected lift velocity). Subjects trained twice per week for 3 weeks before being reassessed on FPmax and FPvel. Training volume (total repetitions) was recorded for all training sessions. No differences were present between groups on any pretraining measure. Results: The TF group significantly improved FPmax (105.2–110.9 kg, +5.4%), while the VL group did not (107.5–109.2 kg, +1.6%, P > .05). Both groups significantly increased FPvel (0.38–0.46 m·s−1, +19.1% and 0.37–0.42 m·s−1, +16.7%, respectively) with no between-groups differences evident (P > .05). The TF group completed significantly more training volume (12.2 vs 6.8 repetitions per session, P < .05). Conclusions: Training to repetition failure improved FPmax, while training using a velocity loss threshold of 20% did not. Both groups demonstrated similar improvements in FPvel despite the VL group completing 45% less total training volume than the TF group. The reduction in training volume associated with implementing a 20% velocity loss threshold may negatively impact the development of upper-body maximum strength while still enhancing submaximal movement velocity.
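A 20% velocity loss threshold, as prescribed for the VL group, is conventionally implemented by terminating a set once a repetition's mean velocity falls more than 20% below the fastest repetition of that set. A minimal sketch of that stopping rule, assuming this conventional implementation; the function name and velocity values are illustrative, not study data:

```python
def reps_before_cutoff(rep_velocities, loss_threshold=0.20):
    """Count repetitions completed before mean barbell velocity falls
    more than `loss_threshold` below the fastest rep of the set.
    (Illustrative sketch of a conventional velocity-loss stopping rule.)"""
    best = 0.0
    completed = 0
    for v in rep_velocities:
        best = max(best, v)              # fastest rep of the set so far
        if v < best * (1 - loss_threshold):
            break                        # velocity loss exceeded: terminate set
        completed += 1
    return completed
```

With velocities of 0.50, 0.48, 0.45, 0.42, then 0.39 m·s−1, the cutoff is 0.40 m·s−1 (80% of 0.50), so the set ends after 4 repetitions.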
Andrew A. Flatt, Jeff R. Allen, Clay M. Keith, Matthew W. Martinez, and Michael R. Esco
Purpose: To track cardiac-autonomic functioning, indexed by heart-rate variability, in American college football players throughout a competitive period. Methods: Resting heart rate (RHR) and the natural logarithm root mean square of successive differences (LnRMSSD) were obtained throughout preseason and ∼3 times weekly leading up to the national championship among 8 linemen and 12 nonlinemen. Seated 1-minute recordings were performed via mobile device and standardized for time of day and proximity to training. Results: Relative to preseason, linemen exhibited suppressed LnRMSSD during camp-style preparation for the playoffs (P = .041, effect size [ES] = −1.01), the week of the national semifinal (P < .001, ES = −1.27), and the week of the national championship (P = .005, ES = −1.16). As a combined group, increases in RHR (P < .001) were observed at the same time points (nonlinemen ES = 0.48–0.59, linemen ES = 1.03–1.10). For all linemen, RHR trended upward (positive slopes, R² = .02–.77) while LnRMSSD trended downward (negative slopes, R² = .02–.62) throughout the season. Preseason to postseason changes in RHR (r = .50, P = .025) and LnRMSSD (r = −.68, P < .001) were associated with body mass. Conclusions: Heart-rate variability tracking revealed progressive autonomic imbalance in the lineman position group, with individual players showing suppressed values by midseason. Attenuated parasympathetic activation is a hallmark of impaired recovery and may contribute to cardiovascular maladaptations reported to occur in linemen following a competitive season. Thus, a descending pattern may serve as an easily identifiable red flag requiring attention from performance and medical staff.
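LnRMSSD, the index tracked above, has a standard definition: the natural logarithm of the root mean square of successive differences between adjacent R–R intervals. A minimal sketch of that computation; the function name and sample intervals are illustrative, not study data:

```python
import math

def ln_rmssd(rr_ms):
    """Natural log of the root mean square of successive differences
    between adjacent R-R intervals (in ms), a common vagally mediated
    heart-rate-variability index. (Illustrative sketch.)"""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return math.log(rmssd)
```

Logging the RMSSD compresses its skewed distribution, which is why field studies such as this one typically report LnRMSSD rather than raw RMSSD.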
Antonis Kesisoglou, Andrea Nicolò, Lucinda Howland, and Louis Passfield
Purpose: To examine the effect of continuous (CON) and intermittent (INT) running training sessions of different durations and intensities on subsequent performance and calculated training load (TL). Methods: Runners (N = 11) performed a 1500-m time trial as a baseline and after completing 4 different running training sessions. The training sessions were performed in a randomized order and were either maximal for 10 minutes (10CON and 10INT) or submaximal for 25 minutes (25CON and 25INT). An acute performance decrement (APD) was calculated as the percentage change in 1500-m time-trial speed measured after training compared with baseline. The pattern of APD response was compared with that for several TL metrics (bTRIMP, eTRIMP, iTRIMP, running training stress score, and session rating of perceived exertion) for the respective training sessions. Results: Average speed (P < .001,
Bjarne Rud, Eivind Øygard, Even B. Dahl, Gøran Paulsen, and Thomas Losnegard
Purpose: We tested whether a single session of heavy-load resistance priming conducted in the morning improved double-poling (DP) performance in the afternoon. Methods: Eight national-level male cross-country skiers (mean [SD]: 23 y, 184 cm, 73 kg, maximum oxygen consumption = 69 mL·kg−1·min−1) carried out 2 days of afternoon performance tests. In the morning, 5 hours before the tests, subjects completed, in a counterbalanced order, either a session of 3 × 3 repetitions (approximately 85%–90% of 1-repetition maximum) of squat and sitting pullover exercises or no exercise. Performance was evaluated in DP as time to exhaustion (TTE) (approximately 3 min) on a treadmill and 30-m indoor sprints before and after the TTE (30-m DP pre/post). Furthermore, submaximal DP oxygen cost, countermovement jump, and isometric knee-extension force during electrical stimulation were measured. Participants reported perceived readiness on test days. Results: The resistance exercise session and no exercise did not differ for TTE (mean ± 95% confidence interval = 3.6% ± 6.0%; P = .29; effect size [ES], Cohen d = 0.27), 30-m DP pre (−0.56% ± 0.80%; P = .21; ES = 0.20), 30-m DP post (−0.18% ± 1.13%; P = .76; ES = 0.03), countermovement jump (−2.0% ± 2.8%; P = .21; ES = 0.12), DP oxygen cost (−0.13% ± 2.04%; P = .91; ES = 0.02), or perceived readiness (P ≥ .11). Electrically stimulated force did not differ in contraction or relaxation time but revealed low-frequency fatigue in the afternoon for the resistance exercise session only (−12% [7%]; P = .01; ES = 1.3). Conclusion: A single session of heavy-load, low-volume resistance exercise in the morning did not increase afternoon DP performance of short duration in high-level skiers. However, leg low-frequency fatigue after resistance priming, together with the presence of small positive effects in 2 out of 3 DP tests, may indicate that the preconditioning was too strenuous.
Maria Misailidi, Konstantinos Mantzios, Christos Papakonstantinou, Leonidas G. Ioannou, and Andreas D. Flouris
Purpose: We investigated the environmental conditions in which all outdoor International Tennis Federation (ITF) junior tournaments (athlete ages: <18 y) were held during 2010–2019. Thereafter, we performed a crossover trial (ClinicalTrials.gov: NCT04197375) assessing the efficacy of head–neck precooling for mitigating the heat-induced psychophysical and performance impacts on junior athletes during tennis match play. Methods: ITF junior tournament information was collected. We identified meteorological data from nearby (13.6 [20.3] km) weather stations for 3056 (76%) tournaments. Results: Overall, 30.1% of tournaments were held in hot (25°C–30°C wet-bulb globe temperature [WBGT]; 25.9%), very hot (30°C–35°C WBGT; 4.1%), or extremely hot (>35°C WBGT; 0.1%) conditions. Thereafter, 8 acclimatized male junior tennis athletes (age = 16.0 [0.9] y; height = 1.82 [0.04] m; weight = 71.3 [11.1] kg) were evaluated during 2 matches: one with head–neck precooling (27.7°C [2.2°C] WBGT) and one without (27.9°C [1.8°C] WBGT). Head–neck precooling reduced athletes’ core temperature from 36.9°C (0.2°C) to 36.4°C (0.2°C) (P = .001; d = 2.4), an effect that was attenuated by the warm-up. Head–neck precooling reduced skin temperature (by 0.3°C [1.3°C]) for the majority of the match and led to improved (P < .05) perceived exertion (by 13%), thermal comfort (by 14%), and thermal sensation (by 15%). Muscle temperature, heart rate, body weight, and urine specific gravity remained unaffected (P ≥ .05; d < 0.2). Small or moderate improvements were observed in most performance parameters assessed (d = 0.20–0.79). Conclusions: Thirty percent of the last decade’s ITF junior tournaments were held in hot, very hot, or extremely hot conditions (25°C–36°C WBGT). In such conditions, head–neck precooling may somewhat lessen the physiological and perceptual heat strain and lead to small to moderate improvements in the match-play performance of adolescent athletes.
Charli Sargent, Michele Lastella, Shona L. Halson, and Gregory D. Roach
Purpose: Anecdotal reports indicate that many elite athletes are dissatisfied with their sleep, but little is known about their actual sleep requirements. Therefore, the aim of this study was to compare the self-assessed sleep need of elite athletes with an objective measure of their habitual sleep duration. Methods: Participants were 175 elite athletes (n = 30 females), age 22.2 (3.8) years (mean [SD]), from 12 individual and team sports. The athletes answered the question “how many hours of sleep do you need to feel rested?” and they kept a self-report sleep diary and wore a wrist activity monitor for ∼12 nights during a normal phase of training. For each athlete, a sleep deficit index was calculated by subtracting their average sleep duration from their self-assessed sleep need. Results: The athletes needed 8.3 (0.9) hours of sleep to feel rested, their average sleep duration was 6.7 (0.8) hours, and they had a sleep deficit index of 96.0 (60.6) minutes. Only 3% of athletes obtained enough sleep to satisfy their self-assessed sleep need, and 71% of athletes fell short by an hour or more. Notably, habitual sleep duration was shorter in athletes from individual sports than in athletes from team sports (F(1,173) = 13.1, P < .001; d = 0.6, medium), despite their similar sleep need (F(1,173) = 1.40, P = .24; d = 0.2, small). Conclusions: The majority of elite athletes obtain substantially less than their self-assessed sleep need. This is a critical finding, given that insufficient sleep may compromise an athlete’s capacity to train effectively and/or compete optimally.
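The sleep deficit index defined in the Methods is simple arithmetic: self-assessed sleep need minus habitual sleep duration. The reported group means reproduce the reported deficit exactly: 8.3 h − 6.7 h = 1.6 h = 96 minutes. A minimal sketch of that calculation; the function name is illustrative, not from the study:

```python
def sleep_deficit_index(need_h, duration_h):
    """Sleep deficit index in minutes: self-assessed sleep need minus
    habitual (measured) sleep duration, both given in hours.
    (Illustrative sketch of the study's stated definition.)"""
    return (need_h - duration_h) * 60.0
```

Applied per athlete, a positive value indicates habitual sleep falling short of self-assessed need; the group means above give a deficit of 96 minutes.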