It has been suggested that elite endurance athletes train in a polarized fashion, with a training-intensity distribution that preserves autonomic balance; however, field data supporting this are limited.
The authors examined the relationship between heart-rate variability and training-intensity distribution in 9 elite rowers during the 26-wk build-up to the 2012 Olympic Games (2 won gold and 2 won bronze medals). Weekly averaged log-transformed square root of the mean sum of the squared differences between R-R intervals (Ln rMSSD) was examined with respect to changes in total training time (TTT) and in training time below the first lactate threshold (<LT1), between LT1 and the second lactate threshold (LT1–LT2), and above LT2 (>LT2).
After substantial increases in training time in a particular training zone or load, standardized changes in Ln rMSSD were +0.13 (unclear) for TTT, +0.20 (51% chance of increase) for time <LT1, –0.02 (trivial) for time LT1–LT2, and –0.20 (53% chance of decrease) for time >LT2. Correlations (±90% confidence limits) between Ln rMSSD and training time were small vs TTT (r = .37 ± .80), moderate vs time <LT1 (r = .43 ± .10), unclear vs time LT1–LT2 (r = .01 ± .17), and small vs time >LT2 (r = –.22 ± .50).
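For readers unfamiliar with the metric, rMSSD is the root of the mean of the squared successive differences between adjacent R-R intervals, and Ln rMSSD is its natural logarithm. A minimal sketch in Python (the function name and R-R series are illustrative, not from the study):

```python
import math

def ln_rmssd(rr_intervals_ms):
    """Ln rMSSD: natural log of the square root of the mean of the
    squared differences between successive R-R intervals (in ms)."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return math.log(rmssd)

# Illustrative R-R series (ms); real recordings span minutes of beats
rr = [820, 850, 815, 860, 830, 845]
print(round(ln_rmssd(rr), 2))  # prints 3.48
```

In practice such values are computed from waking or resting recordings and, as in this study, averaged over each training week before analysis.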
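The "standardized changes" above are Cohen-type effect sizes. The abstract does not state the exact computation; one common form, assumed here for illustration, divides the change in group means by the between-subject SD of the baseline values (the Ln rMSSD numbers below are invented):

```python
import statistics

def standardized_change(pre, post):
    """Cohen-type standardized change: difference in means divided by
    the between-subject (sample) SD of the baseline values."""
    return (statistics.mean(post) - statistics.mean(pre)) / statistics.stdev(pre)

# Hypothetical weekly Ln rMSSD values for a small squad, pre vs post
pre = [3.1, 3.3, 3.0, 3.4]
post = [3.2, 3.5, 3.1, 3.6]
print(round(standardized_change(pre, post), 2))  # prints 0.82
```

By conventional thresholds, standardized changes near 0.2 (as reported for time <LT1 and time >LT2) are small effects, which is why the accompanying probabilistic qualifiers ("51% chance of increase") matter for interpretation.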
These data provide supportive rationale for the polarized training model, showing that training phases with increased time spent at high intensity suppress parasympathetic activity, whereas low-intensity training preserves and even increases it. As such, periodizing substantial low-intensity training may benefit optimal training programming.
Plews and Laursen are with High Performance Sport New Zealand, Auckland, New Zealand. Kilding is with the Sports Performance Research Inst New Zealand (SPRINZ), Auckland University of Technology, Auckland, New Zealand. Buchheit is with the Sport Science Unit, Myorobie Association, Montvalezan, France. Address author correspondence to Daniel Plews at email@example.com.