The purpose of this study was to compare physiological responses and perceived exertion among well-trained cyclists (n = 63) performing 3 high-intensity interval-training (HIIT) prescriptions that differed in work-bout duration and accumulated work duration but were all prescribed with maximal session effort. Subjects (male, mean ± SD 38 ± 8 y, VO2peak 62 ± 6 mL · kg–1 · min–1) completed up to 24 HIIT sessions over 12 wk as part of a training-intervention study. Sessions were prescribed as 4 × 16, 4 × 8, or 4 × 4 min with 2-min recovery periods (8 sessions of each prescription, balanced over time). Power output, HR, and RPE were collected during and after each work bout, session RPE (sRPE) was reported after each session, and blood lactate samples were collected throughout the 12 wk. Physiological and perceptual responses during >1400 training sessions were analyzed. HIIT sessions were performed at 95% ± 5%, 106% ± 5%, and 117% ± 6% of 40-min time-trial power during 4 × 16-, 4 × 8-, and 4 × 4-min sessions, respectively, with peak HR in each work bout averaging 89% ± 2%, 91% ± 2%, and 94% ± 2% of HRpeak. Blood lactate concentrations were 4.7 ± 1.6, 9.2 ± 2.4, and 12.7 ± 2.7 mmol/L, respectively. Despite the common prescription of maximal session effort, RPE and sRPE increased with decreasing accumulated work duration (AWD), tracking relative HR. Only 8% of 4 × 16-min sessions reached RPE 19–20, vs 61% of 4 × 4-min sessions. The authors conclude that, within this range of work-bout durations, performing at “maximal session effort” over a reduced AWD is associated with higher perceived exertion both acutely and postexercise, which may have important implications for HIIT prescription choices.
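As a hypothetical worked example of the intensities reported above, the mean work-bout power for each format can be expressed in absolute watts for a rider with an assumed 40-min time-trial power of 300 W, alongside each format's accumulated work duration. The 300-W value and the variable names are illustrative only; the percentages are the group means from the abstract.

```python
# Hypothetical illustration: convert the reported mean intensities
# (fraction of 40-min time-trial power) into absolute watts, and compute
# each format's accumulated work duration (AWD). TT_POWER_W is assumed.

TT_POWER_W = 300  # assumed 40-min time-trial power for an example rider

# format name -> (work-bout duration in minutes, mean fraction of TT power)
formats = {
    "4 x 16 min": (16, 0.95),
    "4 x 8 min":  (8,  1.06),
    "4 x 4 min":  (4,  1.17),
}

for name, (bout_min, frac_tt) in formats.items():
    awd = 4 * bout_min              # accumulated work duration, minutes
    watts = TT_POWER_W * frac_tt    # mean work-bout power, watts
    print(f"{name}: AWD {awd} min at ~{watts:.0f} W")
```

This makes the inverse duration-intensity relationship concrete: the format with one quarter of the AWD (16 vs 64 min) is ridden roughly 23% harder.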
Stephen Seiler and Øystein Sylta
Øystein Sylta, Espen Tønnessen and Stephen Seiler
The authors directly compared 3 frequently used methods of heart-rate-based training-intensity-distribution (TID) quantification in a large sample of training sessions performed by elite endurance athletes.
Twenty-nine elite cross-country skiers (16 male, 13 female; 25 ± 4 y; 70 ± 11 kg; 76 ± 7 mL · min−1 · kg−1 VO2max) conducted 570 training sessions during a ~14-d altitude-training camp. Three analysis methods were used: time in zone (TIZ), session goal (SG), and a hybrid session-goal/time-in-zone (SG/TIZ) approach. The proportion of training in zone 1, zone 2, and zone 3 was quantified using total training time or frequency of sessions, and simple conversion factors across different methods were calculated.
Comparing the TIZ and SG/TIZ methods, 96.1% and 95.5% of total training time, respectively, was spent in zone 1 (P < .001), with 2.9% and 3.6% in zone 2 and 1.1% and 0.8% in zone 3 (P < .001). Using SG, the corresponding distribution was 86.6% of sessions in zone 1 and 11.1% and 2.4% in zones 2 and 3. Estimated conversion factors from TIZ or SG/TIZ to SG and vice versa were 0.9 and 1.1, respectively, in the low-intensity training range (zone 1) and 3.0 and 0.33 in the high-intensity training range (zones 2 and 3).
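A minimal sketch of how such conversion factors could be applied in practice, using the factors and group-mean percentages reported above. The function name and the decision to pool zones 2 and 3 into a single high-intensity band are illustrative assumptions, not part of the original analysis code.

```python
# Hypothetical illustration: translate a time-in-zone (TIZ) distribution
# into an approximate session-goal (SG) equivalent using the reported
# conversion factors (0.9 for zone 1, 3.0 for pooled zones 2-3).

TIZ_TO_SG_LOW = 0.9   # zone 1 (low-intensity) conversion factor
TIZ_TO_SG_HIGH = 3.0  # zones 2-3 (high-intensity) conversion factor

def tiz_to_sg(z1_pct: float, z2_pct: float, z3_pct: float) -> tuple[float, float]:
    """Convert TIZ percentages into an approximate SG split (low, high)."""
    low = z1_pct * TIZ_TO_SG_LOW
    high = (z2_pct + z3_pct) * TIZ_TO_SG_HIGH
    return low, high

# Group means reported above: 96.1% zone 1, 2.9% zone 2, 1.1% zone 3 (TIZ)
low, high = tiz_to_sg(96.1, 2.9, 1.1)
print(f"SG-equivalent: ~{low:.1f}% low intensity, ~{high:.1f}% high intensity")
```

The output (~86.5% low, ~12% high) lands close to the SG distribution actually observed (86.6% zone 1, 13.5% zones 2-3 combined), which is what a simple group-level conversion factor can be expected to achieve.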
This study provides a direct comparison of the most common heart-rate-based TID-quantification methods, together with practical conversion factors for comparing results across studies that employ different methods.
Øystein Sylta, Espen Tønnessen and Stephen Seiler
The purpose of this study was to validate the accuracy of self-reported (SR) training duration and intensity distribution in elite endurance athletes.
Twenty-four elite cross-country skiers (25 ± 4 y, 67.9 ± 9.9 kg, 75.9 ± 6.5 mL · min−1 · kg−1) self-reported all training sessions during an ~14-d altitude-training camp. Heart rate (HR) and selected blood lactate measurements were collected during 466 training sessions. SR training was compared with training duration recorded by HR monitors, and SR intensity distribution was compared with expert analysis (EA) of all session data.
SR training duration was nearly perfectly correlated with recorded training duration (r = .99) but was 1.7% lower (P < .001). SR training duration was also nearly perfectly correlated (r = .95) with recorded training duration >55% HRmax but was 11.4% higher (P < .001) because SR included time <55% HRmax. No significant differences were observed in intensity distribution in zones 1–2 between SR and EA comparisons, but small discrepancies were found in zones 3–4 (P < .001).
This study provides evidence that elite endurance athletes report their training data accurately, although some small differences were observed, partly reflecting the lack of an SR “gold standard.” Daily SR training is a valid method of quantifying training duration and intensity distribution in elite endurance athletes. However, additional common reporting guidelines would further enhance accuracy.
Monica Klungland Torstveit, Ida Fahrenholtz, Thomas B. Stenqvist, Øystein Sylta and Anna Melin
Endurance athletes are at increased risk of relative energy deficiency associated with metabolic perturbation and impaired health. We aimed to estimate and compare within-day energy balance in male athletes with suppressed and normal resting metabolic rate (RMR) and explore whether within-day energy deficiency is associated with endocrine markers of energy deficiency. A total of 31 male cyclists, triathletes, and long-distance runners recruited from regional competitive sports clubs were included. The protocol comprised measurements of RMR by ventilated hood and energy intake and energy expenditure to predict RMRratio (measured RMR/predicted RMR), energy availability, 24-hr energy balance and within-day energy balance in 1-hr intervals, assessment of body composition by dual-energy X-ray absorptiometry, and blood plasma analysis. Subjects were categorized as having suppressed (RMRratio < 0.90, n = 20) or normal (RMRratio > 0.90, n = 11) RMR. Despite there being no observed differences in 24-hr energy balance or energy availability between the groups, subjects with suppressed RMR spent more time in an energy deficit exceeding 400 kcal (20.9 [18.8–21.8] hr vs. 10.8 [2.5–16.4] hr, p = .023) and had larger single-hour energy deficits compared with subjects with normal RMR (−3,265 ± 1,963 kcal vs. −1,340 ± 2,439 kcal, p = .023). Larger single-hour energy deficits were associated with higher cortisol levels (r = −.499, p = .004) and a lower testosterone:cortisol ratio (r = .431, p = .015), but no associations with triiodothyronine or fasting blood glucose were observed. In conclusion, within-day energy deficiency was associated with suppressed RMR and catabolic markers in male endurance athletes.
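The two quantities at the core of the design above, the RMRratio cutoff and the hour-by-hour energy-balance bookkeeping, can be sketched as follows. This is a minimal illustration with invented data and helper names, not the study's analysis code; the 0.90 cutoff and the 400-kcal deficit threshold are taken from the abstract.

```python
# Minimal sketch (assumed data and function names) of the within-day
# energy-balance logic described above: cumulative balance is intake
# minus expenditure per hour, and hours spent in a deficit larger than
# 400 kcal are counted. RMR classification uses the 0.90 cutoff.

def rmr_status(measured_rmr: float, predicted_rmr: float) -> str:
    """Classify resting metabolic rate as suppressed or normal (cutoff 0.90)."""
    return "suppressed" if measured_rmr / predicted_rmr < 0.90 else "normal"

def hours_in_deficit(hourly_intake, hourly_expenditure, threshold=-400.0):
    """Count hours in which cumulative energy balance falls below `threshold` kcal."""
    balance, hours = 0.0, 0
    for intake, expenditure in zip(hourly_intake, hourly_expenditure):
        balance += intake - expenditure
        if balance < threshold:
            hours += 1
    return hours

# Invented example day: nothing eaten for the first 8 hours,
# three meals later on, steady 100 kcal/hr expenditure.
intake = [0] * 8 + [800, 0, 0, 0, 700, 0, 0, 0, 0, 900, 0, 0, 0, 0, 0, 0]
expenditure = [100] * 24

print(rmr_status(1500, 1800))                 # 1500/1800 = 0.83 -> "suppressed"
print(hours_in_deficit(intake, expenditure))  # 4 hours below a 400-kcal deficit
```

Note that the count depends on meal timing, not just the 24-hr total: the same daily intake eaten earlier would yield fewer deficit hours, which is exactly the within-day effect the study distinguishes from 24-hr energy balance.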
Helen G. Hanstock, Andrew D. Govus, Thomas B. Stenqvist, Anna K. Melin, Øystein Sylta and Monica K. Torstveit
Intensive training periods may negatively influence immune function, but the immunological consequences of specific high-intensity training (HIT) prescriptions are not well defined.
This study explored whether three different HIT prescriptions influence multiple health-related biomarkers and whether biomarker responses to HIT were associated with upper respiratory illness (URI) risk.
Twenty-five male cyclists and triathletes were randomised to three HIT groups and completed twelve HIT sessions over four weeks. Peak oxygen consumption (V̇O2peak) was determined using an incremental cycling protocol, while resting serum biomarkers (cortisol, testosterone, 25(OH)D and ferritin), salivary immunoglobulin-A (s-IgA) and energy availability (EA) were assessed before and after the training intervention. Participants self-reported upper respiratory symptoms during the intervention and episodes of URI were identified retrospectively.
Fourteen athletes reported URIs, but there were no differences in incidence, duration or severity between groups. Increased risk of URI was associated with higher s-IgA secretion rates (odds ratio = 0.90; 90% CI, 0.83–0.97). Lower pre-intervention cortisol and higher EA predicted a 4% increase in URI duration. Participants with higher V̇O2peak reported higher total symptom scores (incidence rate ratio = 1.07; 90% CI, 1.01–1.13).
Although multiple biomarkers were weakly associated with risk of URI, the directions of the associations between s-IgA, cortisol, EA and URI risk were opposite to previous observations and physiological rationale. There was a cluster of URIs within the first week of the training intervention, but no samples were collected at this time-point. Future studies should incorporate more frequent sample time-points, especially around the onset of new training regimes, and include athletes with suspected or known nutritional deficiencies.