The adaptive response to training is determined by the combination of training intensity, volume, and frequency. Various periodized approaches to training are used by aquatic sports athletes to achieve performance peaks. Nutritional support to optimize training adaptations should take periodization into consideration; that is, nutrition should also be periodized to optimally support training and facilitate adaptations. Moreover, other aspects of training (e.g., overload training, tapering, and detraining) should be considered when making nutrition recommendations for aquatic athletes. There is evidence, albeit not in aquatic sports, that restricting carbohydrate availability may enhance some training adaptations. More research is needed, particularly in aquatic sports, to determine how best to periodize carbohydrate intake to optimize adaptations. Protein nutrition is an important consideration for optimal training adaptations. Factors other than the total amount of daily protein intake should be considered. For instance, the type of protein, the timing and pattern of protein intake, and the amount of protein ingested at any one time influence the metabolic response to protein ingestion. Body mass and composition are important for aquatic sport athletes in relation to power-to-mass ratio and for aesthetic reasons. Protein may be particularly important for athletes who want to maintain muscle while losing body mass. Nutritional supplements, such as β-alanine and sodium bicarbonate, may be particularly useful for supporting aquatic athletes’ training adaptations.
Iñigo Mujika, Trent Stellingwerff and Kevin Tipton
Oliver C. Witard, Ina Garthe and Stuart M. Phillips
Dietary protein is widely regarded as a key nutrient for allowing optimal training adaptation (Tipton, 2008) and optimizing body composition (Hector & Phillips, 2018; Murphy et al., 2015) in athletes, including track and field athletes. Track and field athletics encompasses a broad spectrum of
Gary J. Slater, Jennifer Sygo and Majke Jorgensen
the training facility, as well as academic, professional, or personal commitments. This schedule can have an important impact on meal timing and access to food to support pretraining fuelling and recovery and should be considered when developing a sprinter’s nutrition plan. Sprint-training adaptations
Daniel J. Plews, Paul B. Laursen, Yann Le Meur, Christophe Hausswirth, Andrew E. Kilding and Martin Buchheit
To establish the minimum number of days over which heart-rate-variability data (HRV, i.e., Ln rMSSD, the natural logarithm of the square root of the mean sum of the squared differences between successive R-R intervals) should be averaged to achieve results equivalent to data averaged over a 1-wk period.
Standardized changes in Ln rMSSD between different phases of training (normal training, functional overreaching (FOR), overall training, and taper) and the correlation coefficients of percentage changes in performance vs changes in Ln rMSSD were compared when averaging Ln rMSSD from 1 to 7 d, randomly selected within the week.
Standardized Ln rMSSD changes (90% confidence limits, CL) from baseline to overload (FOR) were 0.20 ± 0.28, 0.33 ± 0.26, 0.49 ± 0.33, 0.48 ± 0.28, 0.47 ± 0.26, 0.45 ± 0.26, and 0.43 ± 0.29 on days 1 to 7, respectively. Correlations (90% CL) over the same time sequence and training phase were –.02 ± .23, –.07 ± .23, –.17 ± .22, –.25 ± .22, –.26 ± .22, –.28 ± .21, and –.25 ± .22 on days 1 to 7. There were almost perfect quadratic relationships between the standardized changes/r values and the number of days over which Ln rMSSD was averaged (r² = .92 and .97, respectively) in trained triathletes during FOR, indicating that the magnitude of the standardized changes/r values plateaus after 3 and 4 d of averaging, respectively.
Practitioners using HRV to monitor training adaptation should use a minimum of 3 (randomly selected) valid data points per week.
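The Ln rMSSD index and the weekly averaging of a minimum of 3 randomly selected valid data points can be sketched in a few lines of Python. This is a minimal illustration only; the function names and the example values are ours, not from the study:

```python
import math
import random

def ln_rmssd(rr_intervals_ms):
    """Ln rMSSD: natural logarithm of the square root of the mean sum of
    squared successive differences between R-R intervals (in ms)."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return math.log(rmssd)

def weekly_ln_rmssd(daily_values, n_days=3, seed=None):
    """Average Ln rMSSD over n_days valid recordings randomly selected
    from a week's worth of daily values (minimum of 3 recommended)."""
    rng = random.Random(seed)
    sample = rng.sample(daily_values, n_days)
    return sum(sample) / n_days

# Example: one morning's R-R series, then a week of daily Ln rMSSD values
daily = [3.80, 3.90, 4.00, 3.70, 3.95, 3.85, 3.90]
print(ln_rmssd([800, 850, 790, 820]))       # single-day index
print(weekly_ln_rmssd(daily, seed=42))      # 3-day random average
```

Averaging a random subset of days mimics real-world practice, where athletes rarely record HRV every morning; the study's finding is that 3 such days track the 7-day average closely.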
Daniel J. Plews, Paul B. Laursen, Andrew E. Kilding and Martin Buchheit
The aim of this study was to compare 2 different methodological assessments when analyzing the relationship between performance and heart-rate (HR)-derived indices (resting HR [RHR] and HR variability [HRV]) to evaluate positive adaptation to training. The relative change in estimated maximum aerobic speed (MAS) and 10-km-running performance was correlated with the relative change in RHR and the natural logarithm of the square root of the mean sum of the squared differences between R-R intervals on an isolated day (RHRday; Ln rMSSDday) or when averaged over 1 wk (RHRweek; Ln rMSSDweek) in 10 runners who responded to a 9-wk training intervention. Moderate and small correlations existed between changes in MAS and 10-km-running performance and RHRday (r = .35, 90% CI [–.35, .76] and r = –.21 [–.68, .39]), compared with large and very large correlations for RHRweek (r = –.62 [–.87, –.11] and r = .73 [.30, .91]). While a trivial correlation was observed for MAS vs Ln rMSSDday (r = –.06 [–.59, .51]), a very large correlation existed with Ln rMSSDweek (r = .72 [.28, .91]). Similarly, changes in 10-km-running performance showed a small correlation with Ln rMSSDday (r = –.17 [–.66, .42]) but a very large correlation with Ln rMSSDweek (r = –.76 [–.92, –.36]). In conclusion, averaging RHR and HRV values over a 1-wk period appears to be a superior method for evaluating positive adaptation to training compared with assessing their values on a single isolated day.
Ville Vesterinen, Ari Nummela, Sami Äyrämö, Tanja Laine, Esa Hynynen, Jussi Mikkola and Keijo Häkkinen
Regular monitoring of adaptation to training is important for optimizing training load and recovery, which are key factors in successful training.
To investigate the usefulness of a novel submaximal running test (SRT) in field conditions for predicting and tracking changes in endurance performance.
Thirty-five endurance-trained men and women (age 20–55 y) completed an 18-wk endurance-training program. A maximal incremental running test was performed at weeks 0, 9, and 18 for determination of maximal oxygen consumption (VO2max), running speed (RS) at exhaustion (RSpeak), and lactate thresholds (LTs). In addition, the subjects performed a 3-stage SRT weekly, including a postexercise heart-rate-recovery (HRR) measurement. The subjects were retrospectively grouped into 4 clusters according to changes in SRT results.
Large correlations (r = .60–.89) were observed between RS during all stages of the SRT and all endurance-performance variables (VO2max, RSpeak, RS at LT2, and RS at LT1). HRR correlated only with VO2max (r = .46). Large relationships were also found between changes in RS during the 80% and 90% HRmax stages of the SRT and the change in RSpeak (r = .57 and r = .79). In addition, cluster analysis revealed different trends in RS during the 80% and 90% stages over the course of training between the clusters, which showed different improvements in VO2max and RSpeak.
The current SRT showed great potential as a practical tool for regular monitoring of individual adaptation to endurance training without time-consuming and expensive laboratory tests.
Eric Joseph Rosario, Rudolph Gino Villani, Jeff Harris and Rudi Klein
Aging generally results in muscle and bone atrophy, with accelerated loss in the first few years after menopause contributing to declines in strength, balance, and mobility. This investigation compared the effects of 1 year of periodized high-intensity strength training on a group of less-than-5-years (LF) postmenopausal women (n = 10, mean age 51 years) with its effects on a more-than-10-years (MT) postmenopausal group (n = 11, mean age 60 years). Mean lean body mass, strength, and balance increased over the intervention period for both groups, with no significant intergroup differences. Mean total fat mass significantly decreased for both groups, with no significant difference between groups. Total and regional bone density and mineral content did not significantly change in either group. These results indicate that even during the accelerated muscle-loss period after menopause, women can gain muscle and strength with resistance training to a similar extent as older women.
Llion A. Roberts, Kris Beattie, Graeme L. Close and James P. Morton
To test the hypothesis that antioxidants can attenuate high-intensity interval training–induced improvements in exercise performance.
Two groups of recreationally active males performed a high-intensity interval running protocol 4 times per week for 4 wk. Group 1 (n = 8) consumed 1 g of vitamin C daily throughout the training period, whereas Group 2 (n = 7) consumed a visually identical placebo. Pre- and posttraining, subjects were assessed for VO2max, 10-km time-trial performance, running economy at 12 km/h, and distance run on the Yo-Yo intermittent recovery test levels 1 and 2 (YoYoIRT1/2). Subjects also performed a 60-min run before and after training at a running velocity of 65% of pretraining VO2max to assess training-induced changes in substrate-oxidation rates.
Training improved (P < .0005) VO2max, 10-km time-trial performance, running economy, YoYoIRT1, and YoYoIRT2 in both groups, although there was no difference (P = .31, .29, .24, .76, and .59) between groups in the magnitude of training-induced improvements in any of the aforementioned parameters. Similarly, training also decreased (P < .0005) mean carbohydrate and increased mean fat oxidation rates during submaximal exercise in both groups, although no differences (P = .98 and .94) existed between training conditions.
Daily oral consumption of 1 g of vitamin C during a 4-wk high-intensity interval-training period does not impair training-induced improvements in the exercise performance of recreationally active males.
Christian J. Cook, Liam P. Kilduff and C. Martyn Beaven
To examine the effects of moderate-load exercise with and without blood-flow restriction (BFR) on strength, power, and repeated-sprint ability, along with acute and chronic salivary hormonal parameters.
Twenty male semiprofessional rugby union athletes were randomized to a lower-body BFR intervention (an occlusion cuff inflated to 180 mmHg worn intermittently on the proximal thighs) or a control intervention that trained without occlusion in a crossover design. Experimental sessions were performed 3 times a week for 3 wk with 5 sets of 5 repetitions of bench press, leg squat, and pull-ups performed at 70% of 1-repetition maximum.
Greater improvements were observed (occlusion training vs control) in bench press (5.4 ± 2.6 vs 3.3 ± 1.4 kg), squat (7.8 ± 2.1 vs 4.3 ± 1.4 kg), maximum sprint time (–0.03 ± 0.03 vs –0.01 ± 0.02 s), and leg power (168 ± 105 vs 68 ± 50 W). Greater exercise-induced salivary testosterone (ES 0.84–0.61) and cortisol responses (ES 0.65–0.20) were observed after the occlusion intervention sessions compared with the nonoccluded controls; however, the acute cortisol increases were attenuated across the training block.
Occlusion training can potentially improve the rate of strength-training gains and fatigue resistance in trained athletes, possibly allowing greater gains from lower loading that could be of benefit during high training loads, in competitive seasons, or in a rehabilitative setting. The clear improvement in bench-press strength resulting from lower-body occlusion suggests a systemic effect of BFR training.
Training quantification is fundamental to evaluating an endurance athlete’s responses to training loads, ensuring an adequate stress/recovery balance, and determining the relationship between training and performance. Quantifying both external and internal workload is important because external workload alone does not capture the biological stress imposed by exercise sessions. Commonly used quantification methods include retrospective questionnaires, diaries, direct observation, and physiological monitoring, often based on measurements of oxygen uptake, heart rate, and blood lactate concentration. Other methods in use in endurance sports include the measurement of speed and of power output, made possible by recent technological advances such as power meters in cycling and triathlon. Among subjective quantification methods, the rating of perceived exertion stands out because of its wide use. Concurrent assessments with the various quantification methods allow researchers and practitioners to evaluate stress/recovery balance, adjust individual training programs, and determine the relationships between external load, internal load, and athletes’ performance. This brief review summarizes the most relevant external- and internal-workload-quantification methods in endurance sports and provides practical examples of their implementation to adjust the training programs of elite athletes in accordance with their individualized stress/recovery balance.
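One concrete subjective internal-load method of the kind described above is the session-RPE approach (Foster’s method), which multiplies a session’s rating of perceived exertion (CR-10 scale) by its duration in minutes; weekly monotony and strain indices are commonly derived from these loads. The sketch below is our own illustration, with hypothetical function names, not an implementation from this review:

```python
def session_rpe_load(rpe, duration_min):
    """Internal load in arbitrary units: session RPE (CR-10 scale)
    multiplied by session duration in minutes."""
    return rpe * duration_min

def weekly_monotony_and_strain(daily_loads):
    """Training monotony (mean daily load / SD of daily loads, population SD)
    and strain (total weekly load x monotony), derived from sRPE loads."""
    n = len(daily_loads)
    mean = sum(daily_loads) / n
    var = sum((x - mean) ** 2 for x in daily_loads) / n
    sd = var ** 0.5
    monotony = mean / sd if sd else float("inf")
    return monotony, sum(daily_loads) * monotony

# Example week: RPE 7 for 60 min yields a load of 420 AU
print(session_rpe_load(7, 60))
print(weekly_monotony_and_strain([300, 400, 500]))
```

High monotony (little day-to-day variation) combined with high weekly load inflates strain, which practitioners use as a flag for an unfavorable stress/recovery balance.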