We quantified the effect of an extended live high-train low (LHTL) simulated altitude exposure followed by a series of training camps at natural moderate altitude on competitive performance in seven elite middle-distance runners (VO2max 71.4 ± 3.4 mL·min−1·kg−1, mean ± SD). Runners spent 44 ± 7 nights (mean ± SD) at a simulated altitude of 2846 ± 32 m, followed by four 7- to 10-d training blocks at natural moderate altitude (1700–2200 m) before racing. The combination of simulated LHTL and natural altitude training improved competitive performance by 1.9% (90% confidence limits, 1.3–2.5%). Middle-distance runners can confidently use a combination of simulated and natural altitude to stimulate adaptations responsible for improving performance.
Philo U. Saunders, Richard D. Telford, David B. Pyne, Christopher J. Gore and Allan G. Hahn
Philo U. Saunders, Amanda J. Cox, Will G. Hopkins and David B. Pyne
It is unclear whether physiological measures monitored in an incremental treadmill test during a training season provide useful diagnostic information about changes in distance running performance.
To quantify the relationship between changes in physiological measures and performance (peak running speed) over a training season.
Well-trained distance runners (34 males; VO2max 64 ± 6 mL⋅kg-1⋅min-1, mean ± SD) completed four incremental treadmill tests over 17 wk. The tests provided values of peak running speed, VO2max, running economy, and lactate threshold (as speed and %VO2max). The physiological measures were included in simple and multiple linear regression models to quantify the relationship between changes in these measures and changes in peak speed.
The typical within-subject variation in peak speed from test to test was 2.5%, whereas those for physiological measures were VO2max (mL⋅min-1⋅kg-1) 3.0%, economy (m⋅kg⋅mL–1) 3.6%, lactate threshold (%VO2max) 8.7%, and body mass 1.8%. In simple models these typical changes predicted the following changes in performance: VO2max 1.4%, economy 0.8%, lactate threshold –0.3%, and body mass –0.2% (90% confidence limits ~±0.7%); the corresponding correlations with performance were 0.57, 0.33, –0.05, and –0.13 respectively (~±0.20). In a multiple linear regression model, the contribution of each physiological variable to performance changed little after adjustment for the other variables.
Change in VO2max in an incremental test during a running season is a good predictor of change in peak running speed, change in running economy is a moderate predictor, and lactate threshold and body mass provide little additional information.
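The simple and multiple linear regression analysis described above can be sketched as follows. This is a hypothetical illustration only: the data are simulated, and the coefficients and noise levels are assumptions, not the study's measurements.

```python
# Sketch of relating percentage changes in physiological measures to
# percentage changes in peak treadmill speed, via simple and multiple
# linear regression. All values are simulated for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 34  # number of runners, as in the study

# Simulated within-subject percentage changes; the typical variations
# (~3.0% for VO2max, ~3.6% for economy) follow the abstract, but the
# regression coefficients below are illustrative assumptions.
d_vo2max = rng.normal(0.0, 3.0, n)
d_economy = rng.normal(0.0, 3.6, n)
d_peak_speed = 0.45 * d_vo2max + 0.2 * d_economy + rng.normal(0.0, 1.5, n)

# Simple regression: slope and correlation for change in VO2max alone.
slope, intercept = np.polyfit(d_vo2max, d_peak_speed, 1)
r = np.corrcoef(d_vo2max, d_peak_speed)[0, 1]

# Multiple regression: VO2max coefficient adjusted for economy,
# via ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), d_vo2max, d_economy])
coef, *_ = np.linalg.lstsq(X, d_peak_speed, rcond=None)
```

Comparing `slope` (the unadjusted coefficient) with `coef[1]` (adjusted for economy) mirrors the abstract's observation that each variable's contribution changed little after adjustment for the others.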
Louise M. Burke, Clare Wood, David B. Pyne, Richard D. Telford and Philo U. Saunders
Eighteen highly trained runners ran two half marathons in mild environmental conditions, 3 wk apart, consuming either 426 ± 227 mL of a flavored placebo drink (PLACEBO) or an equivalent volume of water (386 ± 185 mL) and a commercial gel (GEL) supplying 1.1 ± 0.2 g/kg body mass (BM) of carbohydrate (CHO). Voluntary consumption of this fluid was associated with a mean BM change of ~2.4%. Runners performed better in their second race by 0.9% or 40 s (P = 0.03). Three runners complained of gastrointestinal discomfort in the GEL trial, which produced a clear impairment of half-marathon performance of 2.4% or 105 s (P = 0.03). The overall effect of GEL on performance was trivial: time improved by 0.3% or 14 s compared with PLACEBO (P = 0.52). Consuming the gel was associated with a 2.4% slower time through the 2 × 200-m feed zone, adding a trivial ~2 s to race time. Although benefits to half-marathon performance were not detected, the theoretical improvement during 1-h exercise with CHO intake merits further investigation.
Philo U. Saunders, Laura A. Garvican-Lewis, Robert F. Chapman and Julien D. Périard
High-level athletes are always looking for ways to maximize training adaptations for competition performance, and the use of altered environmental conditions to achieve this outcome has become increasingly popular among elite athletes. Furthermore, a series of potential nutrition and hydration interventions may also optimize the adaptation to altered environments. Altitude training was first used to prepare for competition at altitude, and it still is today; more often now, however, elite athletes embark on a series of altitude training camps to try to improve sea-level performance. Similarly, the use of heat acclimation/acclimatization to optimize performance in hot/humid environmental conditions is a common practice among high-level athletes and is well supported in the scientific literature. More recently, the use of heat training to improve exercise capacity in temperate environments has been investigated and appears to have positive outcomes. This consensus statement will detail the use of both heat and altitude training interventions to optimize performance capacities in elite athletes in both normal and extreme environmental conditions (hot and/or high), with a focus on the nutritional strategies required in these extreme environments to maximize adaptations conducive to competitive performance enhancement.
Lachlan J.G. Mitchell, Ben Rattray, Paul Wu, Philo U. Saunders and David B. Pyne
Purpose: Critical speed (CS) and supra-CS distance capacity (D′) are useful metrics for monitoring changes in swimmers’ physiological and performance capacities. However, the utility of these metrics across a season has not been systematically evaluated in high-level swimmers. Methods: A total of 27 swimmers (mean [SD]: 18 females, age = 19.1 [2.9] y, and 9 males, age = 19.5 [1.9] y) completed the 12 × 25-m swimming test multiple times (4 tests/swimmer) across a 2-y period. Season-best times in all distances for the test stroke were sourced from publicly available databases. Swimmers’ distance speciality was determined as the event with the time closest to the world record. Four metrics were calculated from the 12 × 25-m test: CS, D′, peak speed, and drop-off %. Results: Guyatt responsiveness index values were calculated to ascertain the practically relevant sensitivity of each 12 × 25-m metric: CS = 1.5, peak speed = 2.3, D′ = 2.1, and drop-off % = 2.6. These values are modified effect sizes; all are large effects. Bayesian mixed modeling showed substantial between-subjects differences between genders and strokes for each variable but minimal within-subject changes across the season. Drop-off % was lower in 200-m swimmers (14.0% [3.3%]) than in 100-m swimmers (18.1% [4.1%], P = .003, effect size = 1.10). Conclusions: The 12 × 25-m test is best suited to differentiating between swimmers of different strokes and events. Further development is needed to improve its utility in quantifying meaningful changes over a season for individual swimmers.
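CS and D′ are commonly derived from the linear distance–time relationship d = D′ + CS·t fitted across trials of different durations. The sketch below illustrates that general approach with hypothetical times (not the study's data, and not the specific computation used for the 12 × 25-m test):

```python
# Illustrative derivation of critical speed (CS) and supra-CS distance
# capacity (D') from the linear model d = D' + CS * t.
# Distances and times below are hypothetical, for illustration only.
import numpy as np

distances = np.array([100.0, 200.0, 400.0])  # m, assumed trial distances
times = np.array([52.0, 114.0, 245.0])       # s, hypothetical best times

# Least-squares fit of distance on time:
# slope = CS (m/s), intercept = D' (m)
cs, d_prime = np.polyfit(times, distances, 1)
print(f"CS = {cs:.2f} m/s, D' = {d_prime:.1f} m")
```

For these hypothetical times the fit yields a CS of about 1.55 m/s and a D′ of about 21 m, values in a plausible range for trained swimmers.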
Erin L. McCleave, Katie M. Slattery, Rob Duffield, Philo U. Saunders, Avish P. Sharma, Stephen Crowcroft and Aaron J. Coutts
Purpose: To determine whether combining training in heat with “Live High, Train Low” hypoxia (LHTL) further improves thermoregulatory and cardiovascular responses to a heat-tolerance test compared with independent heat training. Methods: A total of 25 trained runners (peak oxygen uptake = 64.1 [8.0] mL·min−1·kg−1) completed 3-wk training in 1 of 3 conditions: (1) heat training combined with “LHTL” hypoxia (H+H; FiO2 = 14.4% [3000 m], 13 h·d−1; train at <600 m, 33°C, 55% relative humidity [RH]), (2) heat training (HOT; live and train <600 m, 33°C, 55% RH), and (3) temperate training (CONT; live and train <600 m, 13°C, 55% RH). Heat adaptations were determined from a 45-min heat-response test (33°C, 55% RH, 65% velocity corresponding to the peak oxygen uptake) at baseline and immediately and 1 and 3 wk postexposure (baseline, post, 1 wkP, and 3 wkP, respectively). Core temperature, heart rate, sweat rate, sodium concentration, plasma volume, and perceptual responses were analyzed using magnitude-based inferences. Results: Submaximal heart rate (effect size [ES] = −0.60 [−0.89; −0.32]) and core temperature (ES = −0.55 [−0.99; −0.10]) were reduced in HOT until 1 wkP. Sweat rate (ES = 0.36 [0.12; 0.59]) and sweat sodium concentration (ES = −0.82 [−1.48; −0.16]) were, respectively, increased and decreased until 3 wkP in HOT. Submaximal heart rate (ES = −0.38 [−0.85; 0.08]) was likely reduced in H+H at 3 wkP, whereas CONT had unclear physiological changes. Perceived exertion and thermal sensation were reduced across all groups. Conclusions: Despite greater physiological stress from combined heat training and “LHTL” hypoxia, thermoregulatory adaptations are limited in comparison with independent heat training. The combined stimuli provide no additional physiological benefit during exercise in hot environments.
Amy L. Woods, Avish P. Sharma, Laura A. Garvican-Lewis, Philo U. Saunders, Anthony J. Rice and Kevin G. Thompson
High altitude exposure can increase resting metabolic rate (RMR) and induce weight loss in obese populations, but there is a lack of research regarding RMR in athletes at moderate elevations common to endurance training camps. The present study aimed to determine whether 4 weeks of classical altitude training affects RMR in middle-distance runners. Ten highly trained athletes were recruited for 4 weeks of endurance training, undertaking identical programs at either 2200m in Flagstaff, Arizona (ALT, n = 5) or 600m in Canberra, Australia (CON, n = 5). RMR, anthropometry, energy intake, and hemoglobin mass (Hbmass) were assessed pre- and posttraining. Weekly run distance during the training block was: ALT 96.8 ± 18.3km; CON 103.1 ± 5.6km. A significant Time*Group interaction was observed posttraining for both absolute RMR (kJ.day-1) (F(1,8) = 13.890, p = .01) and relative RMR (F(1,8) = 653.453, p = .003). No significant changes in anthropometry were observed in either group. Energy intake was unchanged (mean ± SD of difference, ALT: 195 ± 3921kJ, p = .25; CON: 836 ± 7535kJ, p = .75). A significant main effect for time was demonstrated for total Hbmass (g) (F(1,8) = 13.380, p = .01), but no significant interactions were observed for either variable [Total Hbmass (g): F(1,8) = 1.706, p = .23; Relative Hbmass (g.kg-1): F(1,8) = 0.609, p = .46]. These novel findings have important practical application to endurance athletes routinely training at moderate altitude, and to those seeking to optimize energy management without compromising training adaptation. Altitude exposure may increase RMR and enhance training adaptation. During training camps at moderate altitude, an increased energy intake is likely required to support the elevated RMR and provide sufficient energy for training and performance.
Avish P. Sharma, Philo U. Saunders, Laura A. Garvican-Lewis, Brad Clark, Jamie Stanley, Eileen Y. Robertson and Kevin G. Thompson
To determine the effect of training at 2100-m natural altitude on running speed (RS) during training sessions over a range of intensities relevant to middle-distance running performance.
In an observational study, 19 elite middle-distance runners (mean ± SD: age 25 ± 5 y, VO2max 71 ± 5 mL · kg–1 · min–1) completed either 4–6 wk of sea-level training (CON, n = 7) or a 4- to 5-wk natural altitude-training camp living at 2100 m and training at 1400–2700 m (ALT, n = 12) after a period of sea-level training. Each training session was recorded on a GPS watch, and athletes also provided a score for session rating of perceived exertion (sRPE). Training sessions were grouped according to duration and intensity. RS (km/h) and sRPE from matched training sessions completed at sea level and 2100 m were compared within ALT, with sessions completed at sea level in CON describing normal variation.
In ALT, RS was reduced at altitude compared with sea level, with the greatest decrements observed during threshold- and VO2max-intensity sessions (5.8% and 3.6%, respectively). Velocity of low-intensity and race-pace sessions completed at a lower altitude (1400 m) and/or with additional recovery was maintained in ALT, though at a significantly greater sRPE (P = .04 and .05, respectively). There was no change in velocity or sRPE at any intensity in CON.
RS in elite middle-distance athletes is adversely affected at 2100-m natural altitude, with levels of impairment dependent on the intensity of training. Maintenance of RS at certain intensities while training at altitude can result in a higher perceived exertion.
Avish P. Sharma, Philo U. Saunders, Laura A. Garvican-Lewis, Brad Clark, Marijke Welvaert, Christopher J. Gore and Kevin G. Thompson
Purpose: To determine the effect of altitude training at 1600 and 1800 m on sea-level (SL) performance in national-level runners. Methods: After 3 wk of SL training, 24 runners completed a 3-wk sojourn at 1600 m (ALT1600, n = 8), 1800 m (ALT1800, n = 9), or SL (CON, n = 7), followed by up to 11 wk of SL racing. Race performance was measured at SL during the lead-in period and repeatedly postintervention. Training volume (in kilometers) and load (session rating of perceived exertion) were calculated for all sessions. Hemoglobin mass was measured via CO rebreathing. Between-groups differences were evaluated using effect sizes (Hedges g). Results: Performance improved in both ALT1600 (mean [SD] 1.5% [0.9%]) and ALT1800 (1.6% [1.3%]) compared with CON (0.4% [1.7%]); g = 0.83 (90% confidence limits −0.10, 1.66) and 0.81 (−0.09, 1.62), respectively. Season-best performances occurred 5 to 71 d postaltitude in ALT1600/1800. There were large increases in training load from lead-in to intervention in ALT1600 (48% [32%]) and ALT1800 (60% [31%]) compared with CON (18% [20%]); g = 1.24 (0.24, 2.08) and 1.69 (0.65, 2.55), respectively. Hemoglobin mass increased in ALT1600 and ALT1800 (∼4%) but not CON. Conclusions: Larger improvements in performance after altitude training may be due to the greater overall load of training in hypoxia compared with normoxia, combined with a hypoxia-mediated increase in hemoglobin mass. A wide time frame for peak performances suggests that the optimal window to race postaltitude is individual, and factors other than altitude exposure per se may be important.
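The between-groups comparisons above use Hedges g, a standardized mean difference with a small-sample bias correction. A minimal sketch, using the reported summary statistics for ALT1600 vs CON (note that the published g = 0.83 was computed from the raw data, so plugging in rounded summary values gives only an approximation):

```python
# Hedges' g: standardized mean difference with small-sample correction.
# Inputs below are the summary statistics reported in the abstract;
# the result only approximates the published value.
import math

def hedges_g(m1, s1, n1, m2, s2, n2):
    # Pooled SD across the two groups
    sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp  # Cohen's d
    # Small-sample bias correction factor J
    j = 1 - 3 / (4 * (n1 + n2) - 9)
    return d * j

# ALT1600: 1.5% (SD 0.9), n = 8; CON: 0.4% (SD 1.7), n = 7
g = hedges_g(1.5, 0.9, 8, 0.4, 1.7, 7)
```

This yields g ≈ 0.78, in the vicinity of the reported 0.83; small discrepancies are expected when working from rounded group means and SDs rather than individual data.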
Amelia J. Carr, Laura A. Garvican-Lewis, Brent S. Vallance, Andrew P. Drake, Philo U. Saunders, Clare E. Humberstone and Christopher J. Gore
Purpose: To compare the effects of natural altitude training (NAT) and simulated (SIM) live high:train low altitude training on road-race walking performance (min), as well as treadmill threshold walking speed (km·h−1) at 4 mmol·L−1 and maximal oxygen consumption, at 1380 m. Methods: Twenty-two elite-level male (n = 15) and female (n = 7) race walkers completed 14 d of NAT at 1380 m (n = 7), SIM live high:train low at 3000:600 m (n = 7), or control conditions (600-m altitude; CON, n = 8). All preintervention and postintervention testing procedures were conducted at 1380 m and consisted of an incremental treadmill test, completed prior to a 5 × 2-km road-race walking performance test. Differences between groups were analyzed via mixed-model analysis of variance and magnitude-based inferences, with a substantial change detected with >75% likelihood of exceeding the smallest worthwhile change. Results: The improvement in total performance time for the 5 × 2-km test in NAT was not substantially different from SIM but was substantially greater (85% likely) than CON. The improvement in percentage decrement in the 5 × 2-km performance test in NAT was greater than in both SIM (93% likely) and CON (93% likely). The increase in maximal oxygen consumption was substantially greater (91% likely) in NAT than in SIM. Improvement in threshold walking speed was substantially greater than CON for both SIM (91% likely) and NAT (90% likely). Conclusions: Both NAT and SIM may allow athletes to achieve reasonable acclimation prior to competition at low altitude.