Many endurance athletes perform specific blocks of training in hot environments in “heat stress training camps.” It is not known if physiological threshold heart rates measured in temperate conditions are reflective of those under moderate environmental heat stress. A total of 16 endurance-trained cyclists and triathletes performed incremental exercise assessments in 18°C and 35°C (both 60% relative humidity) to determine heart rates at absolute blood lactate and ventilatory thresholds. Heart rate at fixed blood lactate concentrations of 2, 3, and 4 mmol·L−1 and ventilatory thresholds were not significantly different between environments (P > .05), despite significant heat stress-induced reductions in power output of approximately 10% to 17% (P < .05, effect size = 0.65–1.15). The coefficient of variation for heart rate at these blood lactate concentrations (1.4%–2.9%) and ventilatory thresholds (2.3%–2.7%) between conditions was low, with significant strong positive correlations between measurements in the 2 environments (r = .92–.95, P < .05). These data indicate heart rates measured at physiological thresholds in temperate environments are reflective of measurements taken under moderate environmental heat stress. Therefore, endurance athletes embarking on heat stress training camps can use heart rate–based thresholds ascertained in temperate environments to prescribe training under moderate environmental heat stress.
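The between-condition coefficient of variation reported for threshold heart rates can be sketched in a few lines; the athlete values, function name, and pairwise-SD formula below are illustrative assumptions, not the study's data or analysis code:

```python
import math

def between_condition_cv(cond_a, cond_b):
    """Mean within-athlete coefficient of variation (%) between paired
    measurements taken in two environments. Illustrative sketch only."""
    cvs = []
    for a, b in zip(cond_a, cond_b):
        pair_mean = (a + b) / 2
        pair_sd = abs(a - b) / math.sqrt(2)  # SD of a two-value sample
        cvs.append(pair_sd / pair_mean * 100)
    return sum(cvs) / len(cvs)

# Hypothetical heart rates (beats/min) at 4 mmol·L−1 lactate, 18°C vs 35°C
hr_18c = [168, 172, 160, 175, 165]
hr_35c = [166, 174, 158, 177, 163]
print(round(between_condition_cv(hr_18c, hr_35c), 2))  # prints 0.84
```

A between-condition CV of this magnitude, well below day-to-day biological variation in heart rate, is what supports treating the two sets of threshold heart rates as interchangeable.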
Ed Maunder, Daniel J. Plews, Fabrice Merien, and Andrew E. Kilding
Jeffrey A. Rothschild, Andrew E. Kilding, and Daniel J. Plews
Athletes may choose to perform exercise in the overnight-fasted state for a variety of reasons related to convenience, gut comfort, or augmenting the training response, but it is unclear how many endurance athletes use this strategy. We investigated the prevalence and determinants of exercise performed in the overnight-fasted state among endurance athletes using an online survey and examined differences based on sex, competitive level, and habitual dietary pattern. The survey was completed by 1,950 endurance athletes (51.0% female, mean age 40.9 ± 11.1 years). The use of fasted training was reported by 62.9% of athletes, with significant effects of sex (p < .001, Cramer’s V [φc] = 0.18, 90% CI [0.14, 0.22]), competitive level (p < .001, φc = 0.09, 90% CI [0.05, 0.13]), and habitual dietary pattern noted (p < .001, φc = 0.26, 90% CI [0.22, 0.29]). Males, nonprofessional athletes, and athletes following a low-carbohydrate, high-fat diet were most likely to perform fasted training. The most common reasons for fasted training related to utilizing fat as a fuel source (42.9%), gut comfort (35.5%), and time constraints/convenience (31.4%), whereas the most common reasons for avoiding it were beliefs that it does not help training (47.0%), worse performance when training fasted (34.7%), and greater hunger (34.6%). Overall, some athletes perform fasted training because they think it helps their training, whereas others avoid it because they think it is detrimental to their training goals, highlighting a need for future research. These findings offer insights into the beliefs and practices related to fasted-state endurance training.
Jeffrey A. Rothschild, Matthieu Delcourt, Ed Maunder, and Daniel J. Plews
Purpose: To present a case report of an elite ultra-endurance cyclist, who was the winner and course record holder of 2 distinct races within a 4-month span: a 24-hour solo cycling race and a 2-man team multiday race (Race Across America).
Methods: The athlete’s raw data (cycling power, heart rate [HR], speed, and distance) were obtained and analyzed for 2 ultra-endurance races and 11 weeks of training in between.
Results: For the 24-hour race, the athlete completed 861.6 km (average speed 35.9 km·h−1, average power 210 W [2.8 W·kg−1], average HR 121 beats per minute) with a 37% decrease in power and a 22% decrease in HR throughout the race. During the 11 weeks between the 24-hour race and Race Across America, training intensity distribution (Zone 1/2/3) based on HR was 51%/39%/10%. For the Race Across America, total team time to complete the 4939-km race was 6 days, 10 hours, 39 minutes, at an average speed of 31.9 km·h−1. Of this, the athlete featured in this case study rode 75.2 hours, completing 2532 km (average speed 33.7 km·h−1, average power 203 W [2.7 W·kg−1]), with a 12% decrease in power throughout the race. Power during daytime segments was greater than during nighttime segments (212 vs 189 W, P < .001).
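The Zone 1/2/3 training-intensity distribution quoted for the between-race block can be computed from a heart-rate trace once the two threshold heart rates are known. A minimal sketch follows; the thresholds and sample trace are hypothetical, not the athlete's values:

```python
def intensity_distribution(hr_samples, lt1_hr, lt2_hr):
    """Percentage of samples in Zone 1 (below LT1 HR), Zone 2 (between
    LT1 and LT2 HR), and Zone 3 (at or above LT2 HR) for an evenly
    sampled HR trace. Hypothetical helper; thresholds are assumed."""
    z1 = sum(1 for hr in hr_samples if hr < lt1_hr)
    z3 = sum(1 for hr in hr_samples if hr >= lt2_hr)
    z2 = len(hr_samples) - z1 - z3
    n = len(hr_samples)
    return tuple(round(100 * z / n, 1) for z in (z1, z2, z3))

# Hypothetical 1-Hz HR samples, with assumed thresholds of 140 and 160 beats/min
trace = [120, 135, 150, 145, 158, 165, 130, 125, 142, 170]
print(intensity_distribution(trace, lt1_hr=140, lt2_hr=160))  # (40.0, 40.0, 20.0)
```

Because the samples are evenly spaced, time-in-zone percentages reduce to simple sample counts.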
Daniel J. Plews, Paul B. Laursen, and Martin Buchheit
Heart-rate variability (HRV) is a popular tool for monitoring autonomic nervous system status and training adaptation in athletes. It is believed that increases in HRV indicate effective training adaptation, but these are not always apparent in elite athletes.
Resting HRV was recorded in 4 elite rowers (rowers A, B, C, and D) over the 7-wk period before their success at the 2015 World Rowing Championships. The natural logarithm of the square root of the mean sum of the squared differences between R–R intervals (Ln rMSSD), Ln rMSSD:R-R ratio trends, and the Ln rMSSD–to–R-R-interval relationship were assessed for each championship-winning rower.
The time course of change in Ln rMSSD was athlete-dependent, with stagnation and decreases apparent. However, there were consistent substantial reductions in the Ln rMSSD:R-R ratio: rower A, baseline toward wk 5 (–2.35 ± 1.94); rower B, baseline to wk 4 and 5 (–0.41 ± 0.48 and –0.64 ± 0.65, respectively); rower C, baseline to wk 4 (–0.58 ± 0.66); and rower D, baseline to wk 4, 5, and 6 (–1.15 ± 0.91, –0.81 ± 0.74, and –1.43 ± 0.69, respectively).
Reductions in Ln rMSSD concurrent with reductions in the Ln rMSSD:R-R ratio are indicative of parasympathetic saturation. Consequently, 3 of 4 rowers displayed substantial increases in parasympathetic activity despite having decreases in Ln rMSSD. These results confirm that a combination of indices should be used to monitor cardiac autonomic activity.
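The two indices used in this analysis can be computed directly from an R-R series. A minimal sketch follows; the example series is hypothetical, and the scaling of the R-R denominator in the ratio is an assumption (conventions differ between studies):

```python
import math

def ln_rmssd(rr_ms):
    """Natural log of the square root of the mean sum of squared
    differences between successive R-R intervals (ms)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return math.log(rmssd)

def ln_rmssd_rr_ratio(rr_ms):
    """Ln rMSSD divided by the mean R-R interval (in seconds here,
    an assumed scaling). A falling ratio while the R-R interval is
    stable or lengthening points to parasympathetic saturation."""
    mean_rr_s = sum(rr_ms) / len(rr_ms) / 1000
    return ln_rmssd(rr_ms) / mean_rr_s

# Hypothetical short R-R series (ms)
rr = [850, 870, 860, 880, 855, 865, 875]
print(round(ln_rmssd(rr), 2))           # ≈ 2.83
print(round(ln_rmssd_rr_ratio(rr), 2))  # ≈ 3.27
```

Tracking the ratio alongside Ln rMSSD itself is what distinguishes saturation (ratio falls as R-R lengthens) from a genuine loss of parasympathetic activity.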
Jamie Douglas, Daniel J. Plews, Phil J. Handcock, and Nancy J. Rehrer
To determine whether a facilitated recovery via cold-water immersion (CWI) after simulated rugby sevens would influence parasympathetic reactivation and repeated-sprint (RS) performance across 6 matches in a 2-d tournament.
Ten male team-sport athletes completed 6 rugby sevens match simulations over 2 d with either postmatch passive recovery (PAS) or CWI in a randomized crossover design. Parasympathetic reactivation was determined via the natural logarithm of the square root of the mean of the sum of the squares of differences between adjacent R-R intervals (ln rMSSD). RS performance was calculated as time taken (s) to complete 6 × 30-m sprints within the first half of each match.
There were large increases in postintervention ln rMSSD between CWI and PAS after all matches (ES 90% CL: +1.13; ±0.21). Average heart rate (HR) during the RS performance task (HRAverage RS) was impaired from baseline from match 3 onward for both conditions. However, HRAverage RS was higher with CWI than with PAS (ES 90% CL: 0.58; ±0.58). Peak HR during the RS performance task (HRPeak RS) was similarly impaired from baseline for match 3 onward during PAS and for match 4 onward with CWI. HRPeak RS was very likely higher with CWI than with PAS (ES 90% CL: +0.80; ±0.56). No effects of match or condition were observed for RS performance, although there were moderate correlations between the changes in HRAverage RS (r 90% CL: –0.33; ±0.14), HRPeak RS (r 90% CL: –0.38; ±0.13), and RS performance.
CWI facilitated cardiac parasympathetic reactivation after a simulated rugby sevens match. The decline in average and peak HR across matches was partially attenuated by CWI. This decline was moderately correlated with a reduction in RS performance.
Ed Maunder, Andrew E. Kilding, Christopher J. Stevens, and Daniel J. Plews
A common practice among endurance athletes is to purposefully train in hot environments during a “heat stress camp.” However, combined exercise-heat stress poses threats to athlete well-being, and therefore, heat stress training has the potential to induce maladaptation. This case study describes the monitoring strategies used in a successful 3-week heat stress camp undertaken by 2 elite Ironman triathletes, namely resting heart rate variability, self-report well-being, and careful prescription of training based on previously collected physiological data. Despite the added heat stress, training volume very likely increased in both athletes, and training load very likely increased in one of the athletes, while resting heart rate variability and self-report well-being were maintained. There was also some evidence of favorable metabolic changes during routine laboratory testing following the camp. The authors therefore recommend that practitioners working with endurance athletes embarking on a heat stress training camp consider using the simple strategies employed in the present case study to reduce the risk of maladaptation and nonfunctional overreaching.
Ana C. Holt, Daniel J. Plews, Katherine T. Oberlin-Brown, Fabrice Merien, and Andrew E. Kilding
Purpose: To determine the effect of different high-intensity interval-training (IT) sessions on the postexercise recovery response and time course across varying recovery measures. Methods: A total of 13 highly trained rowers (10 male and 3 female, peak oxygen uptake during a 6-min maximal test 4.9 [0.7] L·min−1) completed 3 IT sessions on a rowing ergometer separated by 7 d. Sessions consisted of 5 × 3.5 min, 4-min rest periods (maximal oxygen uptake [VO2max]); 10 × 30 s, 5-min rest periods (glycolytic); and 5 × 10 min, 4-min rest periods (threshold). Participants were instructed to perform intervals at the highest maintainable pace. Blood lactate and salivary cortisol were measured preexercise and postexercise. Resting heart-rate (HR) variability, post-submaximal-exercise HR variability, submaximal-exercise HR, HR recovery, and modified Wingate peak and mean power were measured preexercise and 1, 10, 24, 34, 48, 58, and 72 h postexercise. Participants resumed training throughout the measurement period. Results: Between-session differences in the short-term response (1 h post-IT) were trivial or unclear for all recovery variables. However, post-submaximal-exercise HR variability demonstrated the longest recovery time course (mean [confidence limits]: threshold = 37.8 [14.2] h, glycolytic = 20.2 [11.0] h, and VO2max = 20.6 [15.2] h). Conclusion: Short-term responses to threshold, glycolytic, and VO2max IT in highly trained male and female rowers were similar. Recovery time course was longest following threshold training compared with glycolytic and VO2max-focused training, suggesting a durational influence on recovery time course at HR intensities ≥80% HRmax. As such, this provides valuable information around the programming and sequencing of high-intensity IT for endurance athletes.
Daniel J. Plews, Paul B. Laursen, Yann Le Meur, Christophe Hausswirth, Andrew E. Kilding, and Martin Buchheit
To establish the minimum number of days that heart-rate-variability (HRV, ie, the natural logarithm of the square root of the mean sum of the squared differences between R-R intervals, Ln rMSSD) data should be averaged to achieve results equivalent to data averaged over a 1-wk period.
Standardized changes in Ln rMSSD between different phases of training (normal training, functional overreaching (FOR), overall training, and taper) and the correlation coefficients of percentage changes in performance vs changes in Ln rMSSD were compared when averaging Ln rMSSD from 1 to 7 d, randomly selected within the week.
Standardized Ln rMSSD changes (90% confidence limits, CL) from baseline to overload (FOR) were 0.20 ± 0.28, 0.33 ± 0.26, 0.49 ± 0.33, 0.48 ± 0.28, 0.47 ± 0.26, 0.45 ± 0.26, and 0.43 ± 0.29 on days 1 to 7, respectively. Correlations (90% CL) over the same time sequence and training phase were –.02 ± .23, –.07 ± .23, –.17 ± .22, –.25 ± .22, –.26 ± .22, –.28 ± .21, and –.25 ± .22 on days 1 to 7. There were almost perfect quadratic relationships between the standardized changes/r values and the number of days Ln rMSSD was averaged (r² = .92 and .97, respectively) in trained triathletes during FOR, indicating that the magnitude of the standardized changes and r values plateaued after 3 and 4 d of averaging, respectively.
Practitioners using HRV to monitor training adaptation should use a minimum of 3 (randomly selected) valid data points per week.
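The recommendation could be implemented along the following lines; the function name, None-for-missing convention, and sample week are illustrative assumptions, not the paper's analysis code:

```python
import random
import statistics

def weekly_hrv_average(daily_ln_rmssd, n_days=3, seed=None):
    """Average Ln rMSSD over n randomly selected valid (non-None)
    days from a 7-day window, per the minimum-3-days finding.
    Hypothetical helper."""
    rng = random.Random(seed)
    valid = [v for v in daily_ln_rmssd if v is not None]
    if len(valid) < n_days:
        raise ValueError(f"need at least {n_days} valid daily recordings")
    return statistics.mean(rng.sample(valid, n_days))

# Hypothetical week of morning readings: two mornings missed
week = [4.21, None, 4.05, 4.18, None, 4.10, 4.25]
print(round(weekly_hrv_average(week, seed=1), 2))
```

In practice many applications simply average all available valid days; the random selection here mirrors the study's design of sampling days within the week.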
Daniel J. Plews, Paul B. Laursen, Andrew E. Kilding, and Martin Buchheit
Elite endurance athletes may train in a polarized fashion, such that their training-intensity distribution preserves autonomic balance. However, field data supporting this are limited.
The authors examined the relationship between heart-rate variability and training-intensity distribution in 9 elite rowers during the 26-wk build-up to the 2012 Olympic Games (2 won gold and 2 won bronze medals). Weekly averaged log-transformed square root of the mean sum of the squared differences between R-R intervals (Ln rMSSD) was examined with respect to changes in total training time (TTT) and training time below the first lactate threshold (<LT1), above the second lactate threshold (>LT2), and between LT1 and LT2 (LT1–LT2).
After substantial increases in training time in a particular training zone or load, standardized changes in Ln rMSSD were +0.13 (unclear) for TTT, +0.20 (51% chance increase) for time <LT1, –0.02 (trivial) for time LT1–LT2, and –0.20 (53% chance decrease) for time >LT2. Correlations (±90% confidence limits) for Ln rMSSD were small vs TTT (r = .37 ± .80), moderate vs time <LT1 (r = .43 ± .10), unclear vs LT1–LT2 (r = .01 ± .17), and small vs time >LT2 (r = –.22 ± .50).
These data provide supportive rationale for the polarized model of training, showing that training phases with increased time spent at high intensity suppress parasympathetic activity, while low-intensity training preserves and increases it. As such, periodized low-intensity training may be beneficial for optimal training programming.
Daniel J. Plews, Ben Scott, Marco Altini, Matt Wood, Andrew E. Kilding, and Paul B. Laursen
Purpose: To establish the validity of smartphone photoplethysmography (PPG) and a heart-rate sensor in the measurement of heart-rate variability (HRV). Methods: A total of 29 healthy subjects were measured at rest during 5 min of guided breathing and normal breathing using smartphone PPG, a heart-rate chest strap, and electrocardiography (ECG). The square root of the mean sum of the squared differences between R–R intervals (rMSSD) was determined from each device. Results: Compared with ECG, the technical error of estimate (TEE) was acceptable for all conditions (average TEE CV% [90% CI] = 6.35 [5.13; 8.5]). When assessed as a standardized difference, all differences were deemed “trivial” (average standardized difference [90% CI] = 0.10 [0.08; 0.13]). Both PPG- and heart-rate-sensor-derived measures had almost perfect correlations with ECG (R = 1.00 [0.99; 1.00]). Conclusion: Both PPG and heart-rate sensors provide acceptable agreement for the measurement of rMSSD when compared with ECG. Smartphone PPG technology may be a preferred method of HRV data collection for athletes due to its practicality and ease of use in the field.