Ingesting carbohydrate (CHO) beverages during prolonged, continuous heavy exercise results in smaller changes in the plasma concentrations of several cytokines and attenuates a decline in neutrophil function. In contrast, ingesting CHO during prolonged intermittent exercise appears to have negligible influence on these responses, probably due to the overall moderate intensity of these intermittent exercise protocols. Therefore, we examined the effect of CHO ingestion on plasma interleukin-6 (IL-6), tumor necrosis factor-α (TNF-α) and lipopolysaccharide (LPS)-stimulated neutrophil degranulation responses to high-intensity intermittent running. Six trained male soccer players performed 2 exercise trials, 7 days apart, in a randomized, counterbalanced design. On each occasion, they completed six 15-min periods of intermittent running consisting of maximal sprinting interspersed with less intense periods of running and walking. Subjects consumed either CHO or artificially sweetened placebo (PLA) beverages immediately before and at 15-min intervals during the exercise. At 30 min post-exercise, CHO versus PLA was associated with a higher plasma glucose concentration (p < .01), a lower plasma cortisol and IL-6 concentration (p < .02), and fewer circulating neutrophils (p < .05). Following the exercise, LPS-stimulated elastase release per neutrophil fell 31% below baseline values on the PLA trial (p = .06) compared with 11% on the CHO trial (p = .30). Plasma TNF-α concentration increased following the exercise (main effect of time, p < .001) but was not affected by CHO. These data indicate that CHO ingestion attenuates changes in plasma IL-6 concentration, neutrophil trafficking, and LPS-stimulated neutrophil degranulation in response to intermittent exercise that involves bouts of very high intensity exercise.
Nicolette C. Bishop, Michael Gleeson, Ceri W. Nicholas and Ajmol Ali
Deborah K. Fletcher and Nicolette C. Bishop
This study investigated the effect of a high and low dose of caffeine on antigen-stimulated natural killer (NK) cell (CD3−CD56+) activation after prolonged, strenuous cycling, as assessed by the early-activation molecule CD69. In a randomized crossover design, 12 healthy male endurance-trained cyclists cycled for 90 min at 70% VO2peak 60 min after ingesting either 0 (PLA), 2 (2CAF), or 6 (6CAF) mg/kg body mass of caffeine. Whole blood was stimulated with Pediacel (5 in 1) vaccine. A high dose of caffeine (6CAF) increased the number of CD3−CD56+ cells in the circulation immediately postexercise compared with PLA (p < .05). For both 2CAF and 6CAF, the geometric mean fluorescence intensity (GMFI) of CD69+ expression on unstimulated CD3−CD56+ cells was significantly higher than with PLA (both p < .05). When cells were stimulated with antigen, the GMFI of CD69 expression remained significantly higher with 2CAF than with PLA 1 hr postexercise (p < .05). Although not achieving statistical significance, 6CAF also followed a similar trend when stimulated (p = .09). There were no differences in GMFI of CD69 expression between 2CAF and 6CAF. These results suggest that a high (6 mg/kg) dose of caffeine was associated with the recruitment of NK cells into the circulation and that both a high and low (2 mg/kg) dose of caffeine increased unstimulated and antigen-stimulated NK-cell activation 1 hr after high-intensity exercise. Furthermore, there does not appear to be a dose-dependent effect of caffeine on NK-cell activation 1 hr after prolonged intensive cycling.
Martin J. Turner and Alberto P. Avolio
International guidelines suggest limiting sodium intake to 86–100 mmol/day, but average intake exceeds 150 mmol/day. Participants in physical activities are, however, advised to increase sodium intake before, during and after exercise to ensure euhydration, replace sodium lost in sweat, speed rehydration and maintain performance. A similar range of health benefits is attributable to exercise and to reduction in sodium intake, including reductions in blood pressure (BP) and the increase of BP with age, reduced risk of stroke and other cardiovascular diseases, and reduced risk of osteoporosis and dementia. Sweat typically contains 40–60 mmol/L of sodium, leading to approximately 20–90 mmol of sodium lost in one exercise session with sweat rates of 0.5–1.5 L/h. Reductions in sodium intake of 20–90 mmol/day have been associated with substantial health benefits. Homeostatic systems reduce sweat sodium as low as 3–10 mmol/L to prevent excessive sodium loss. “Salty sweaters” may be individuals with high sodium intake who perpetuate their “salty sweat” condition by continual replacement of sodium excreted in sweat. Studies of prolonged high intensity exercise in hot environments suggest that sodium supplementation is not necessary to prevent hyponatremia during exercise lasting up to 6 hr. We examine the novel hypothesis that sodium excreted in sweat during physical activity offsets a significant fraction of excess dietary sodium, and hence may contribute part of the health benefits of exercise. Replacing sodium lost in sweat during exercise may improve physical performance, but may attenuate the long-term health benefits of exercise.
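The sweat-sodium arithmetic above (40–60 mmol/L sodium at sweat rates of 0.5–1.5 L/h giving roughly 20–90 mmol per session) can be sketched as a small calculation; the function name and the one-hour session duration are illustrative, and the concentration and sweat-rate values are simply the ranges quoted in the abstract.

```python
def sweat_sodium_loss(sweat_rate_l_per_h: float, duration_h: float,
                      sodium_mmol_per_l: float) -> float:
    """Total sodium lost in sweat (mmol) for one exercise session."""
    return sweat_rate_l_per_h * duration_h * sodium_mmol_per_l

# Ranges quoted above: sweat sodium 40-60 mmol/L, sweat rate 0.5-1.5 L/h,
# here over an assumed 1-hour session.
low = sweat_sodium_loss(0.5, 1.0, 40)    # lower bound: 20 mmol
high = sweat_sodium_loss(1.5, 1.0, 60)   # upper bound: 90 mmol
print(low, high)
```

The 20–90 mmol range this produces is the same magnitude as the 20–90 mmol/day dietary sodium reductions the abstract associates with health benefits, which is the basis of the hypothesis stated above.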
John Hough, Caroline Robertson and Michael Gleeson
This study examined the influence of 10 days of intensified training on salivary cortisol and testosterone responses to 30-min, high-intensity cycling (55/80) in a group of male elite triathletes.
Seven elite male triathletes (age 19 ± 1 y, V̇O2max 67.6 ± 4.5 mL · kg–1 · min–1) completed the study. Swim distances increased by 45%. Running and cycling training hours increased by 25% and 229%, respectively. REST-Q questionnaires assessed mood status before, during, and after the training period. Unstimulated saliva samples were collected before, after, and 30 min after a continuous, high-intensity exercise test. Salivary cortisol and testosterone concentrations were assessed.
Compared with pretraining, blunted exercise-induced salivary testosterone responses to the posttraining 55/80 were found (P = .004). The absolute response of salivary testosterone concentrations to the 55/80 decreased pretraining to posttraining from 114% to 85%. No changes were found in exercise-induced salivary cortisol concentration responses to the 55/80. REST-Q scores indicated no changes in the participants’ psychological stress–recovery levels over the training camp.
The blunted exercise-induced salivary testosterone is likely due to decreased testicular testosterone production and/or secretion, possibly attributable to hypothalamic dysfunction or reduced testicular blood flow. REST-Q scores suggest that the triathletes coped well with training-load elevations, which could account for the finding of no change in the exercise-induced salivary cortisol concentration. Overall, these findings suggest that the 55/80 can detect altered exercise-induced salivary testosterone concentrations in an elite athletic population due to increased training stress. However, this alteration occurs independently of a perceived elevation of training stress.
Kellie R. Pritchard-Peschek, David G. Jenkins, Mark A. Osborne and Gary J. Slater
The aim of the current study was to investigate the effect of 180 mg of pseudoephedrine (PSE) on cycling time-trial (TT) performance. Six well-trained male cyclists and triathletes (age 33 ± 2 yr, mass 81 ± 8 kg, height 182.0 ± 6.7 cm, VO2max 56.8 ± 6.8 ml ⋅ kg−1 ⋅ min−1; M ± SD) underwent 2 performance trials in which they completed a 25-min variable-intensity (50–90% maximal aerobic power) warm-up, followed by a cycling TT in which they completed a fixed amount of work (7 kJ/kg body mass) in the shortest possible time. Sixty minutes before the start of exercise, they orally ingested 180 mg of PSE or a cornstarch placebo (PLA) in a randomized, crossover, double-blind manner. Venous blood was sampled immediately pre- and postexercise for the analysis of pH, lactate, glucose, and norepinephrine (NE). PSE improved cycling TT performance by 5.1% (95% CI 0–10%) compared with PLA (28:58.9 ± 4:26.5 and 30:31.7 ± 4:36.7 min, respectively). There was a significant Treatment × Time interaction (p = .04) for NE, with NE increasing during the PSE trial only. Similarly, blood glucose also showed a trend (p = .06) for increased levels postexercise in the PSE trial. The ingestion of 180 mg of PSE 60 min before the onset of high-intensity exercise improved cycling TT performance in well-trained athletes. It is possible that changes in metabolism or an increase in central nervous system stimulation are responsible for the observed ergogenic effect of PSE.
Dawn M. Maffucci and Robert G. McMurray
The purpose of this study was to compare the effect of a 6-hr versus a 3-hr prefeeding regimen on exercise performance. The subjects were 8 active women (21.4 ± 0.9 years, 60.4 ± 2.4 kg, 19.9 ± 1.3% body fat, and 165.6 ± 2.1 cm). All women completed 2 exercise trials (separated by 3–6 d) on a treadmill where they ran at moderate intensity for 30 min with 30-s sprints at 5-min intervals, followed immediately by incremental increases in grade until volitional fatigue was achieved. The exercise trials were performed 3 hr and 6 hr after consuming a 40 ± 3 kJ/kg meal. Time to exhaustion was 0.75 min shorter (p = .0001) for the 6-H trials compared to the 3-H trials. There were no significant differences in submaximal or peak oxygen uptake, heart rate, or rating of perceived exertion (p > .05). The 6-H trials compared to the 3-H trials resulted in 0.05 lower RERs (p = .0002) and a 2 mmol/L lower blood lactate at exhaustion (p = .012). Blood glucose levels and cortisol responses to exercise were similar between trials (p > .05). However, both resting and postexercise insulin levels were lower during the 6-H trials. It was concluded that performance of moderate- to high-intensity exercise lasting 35–40 min is improved by consuming a moderately high-carbohydrate, low-fat, low-protein meal 3 hr before exercise compared to a similar meal consumed 6 hr prior to exercise. Thus, athletes should not skip meals before competition or training sessions.
Hee-Tae Roh, Su-Youn Cho, Hyung-Gi Yoon and Wi-Young So
We investigated the effects of aerobic exercise intensity on oxidative–nitrosative stress, neurotrophic factor expression, and blood–brain barrier (BBB) permeability. Fifteen healthy men performed treadmill running under low-intensity (LI), moderate-intensity (MI), and high-intensity (HI) conditions. Blood samples were collected immediately before exercise (IBE), immediately after exercise (IAE), and 60 min after exercise (60MAE) to examine oxidative–nitrosative stress (reactive oxygen species [ROS]; nitric oxide [NO]), neurotrophic factors (brain-derived neurotrophic factor [BDNF]; nerve growth factor [NGF]), and BBB permeability (S-100β; neuron-specific enolase [NSE]). ROS concentration increased significantly IAE following HI (4.9 ± 1.7 mM) compared with LI (2.8 ± 1.4 mM) exercise (p < .05). At 60MAE, ROS concentration was higher following HI (2.5 ± 1.2 mM) than following LI (1.5 ± 0.5 mM) and MI (1.4 ± 0.3 mM) exercise (p < .05). Plasma NO increased significantly IAE following MI and HI exercise (p < .05). Serum BDNF, NGF, and S-100β levels were significantly higher IAE following MI and HI exercise (p < .05). BDNF and S-100β were higher IAE following MI (29.6 ± 3.4 ng/mL and 87.1 ± 22.8 ng/L, respectively) and HI (31.4 ± 3.8 ng/mL and 100.6 ± 21.2 ng/L, respectively) than following LI (26.5 ± 3.0 ng/mL and 64.8 ± 19.2 ng/L, respectively) exercise (p < .05). At 60MAE, S-100β was higher following HI (71.1 ± 14.5 ng/L) than following LI (56.2 ± 14.7 ng/L) exercise (p < .05). NSE levels did not differ significantly among intensity conditions or time points (p > .05). Moderate- and/or high-intensity exercise may induce greater oxidative–nitrosative stress than low-intensity exercise, which can increase peripheral neurotrophic factor levels by increasing BBB permeability.
Enda F. Whyte, Nicola Gibbons, Grainne Kerr and Kieran A. Moran
Context: Determination of return to play (RTP) after sport-related concussion (SRC) is critical given the potential consequences of premature RTP. Current RTP guidelines may not identify persistent exercise-induced neurocognitive deficits in asymptomatic athletes after SRC. Therefore, postexercise neurocognitive testing has been recommended to further inform RTP determination. To implement this recommendation, the effect of exercise on neurocognitive function in healthy athletes should be understood. Objective: To examine the acute effects of a high-intensity intermittent-exercise protocol (HIIP) on neurocognitive function assessed by the Symbol Digits Modality Test (SDMT) and Stroop Interference Test. Design: Cohort study. Setting: University laboratory. Participants: 40 healthy male athletes (age 21.25 ± 1.29 y, education 16.95 ± 1.37 y). Intervention: Each participant completed the SDMT and Stroop Interference Test at baseline and after random allocation to a condition (HIIP vs control). A mixed between-within-subjects ANOVA assessed time (pre- vs postcondition) × condition interaction effects. Main Outcome Measures: SDMT and Stroop Interference Test scores. Results: There was a significant time × condition interaction effect (P < .001, η2 = .364) for the Stroop Interference Test scores, indicating that the HIIP group scored significantly lower (56.05 ± 9.34) postcondition than the control group (66.39 ± 19.6). There was no significant time × condition effect (P = .997, η2 < .001) for the SDMT, indicating that there was no difference between SDMT scores for the HIIP and control groups (59.95 ± 10.7 vs 58.56 ± 14.02). Conclusions: In healthy athletes, the HIIP results in a reduction in neurocognitive function as assessed by the Stroop Interference Test, with no effect on function as assessed by the SDMT.
Postexercise neurocognitive testing should also be considered in RTP decisions for athletes after SRC, in conjunction with the existing recommended RTP protocol. These results may provide an initial reference point for future research investigating the effects of an HIIP on the neurocognitive function of athletes recovering from SRC.
Rob Duffield, Johann Edge, Robert Merrells, Emma Hawke, Matt Barnes, David Simcock and Nicholas Gill
The aim of this study was to determine whether compression garments improve intermittent-sprint performance and aid performance or self-reported recovery from high-intensity efforts on consecutive days.
Following familiarization, 14 male rugby players performed two randomized testing conditions (with or without garments) involving consecutive days of a simulated team sport exercise protocol, separated by 24 h of recovery within each condition and 2 weeks between conditions. Each day involved an 80-min high-intensity exercise circuit, with exercise performance determined by repeated 20-m sprints and peak power on a cart dynamometer (single-man scrum machine). Measures of nude mass, heart rate, skin and tympanic temperature, and blood lactate (La−) were recorded throughout each day; also, creatine kinase (CK) and muscle soreness were recorded each day and 48 h following exercise.
No differences (P = .20–.40) were present between conditions on either day of the exercise protocol for repeated 20-m sprint efforts or peak power on a cart dynamometer. Heart rate, tympanic temperature, and body mass did not significantly differ between conditions; however, skin temperature was higher under the compression garments. Although no differences (P = .50) in La− or CK were present, participants reported reduced perceived muscle soreness in the ensuing 48 h postexercise when wearing the garments (2.5 ± 1.7 vs 3.5 ± 2.1 for garment and control; P = .01).
The use of compression garments did not improve or hamper simulated team-sport activity on consecutive days. Despite benefits of reduced self-reported muscle soreness when wearing garments during and following exercise each day, no improvements in performance or recovery were apparent.
Dajo Sanders, Mathieu Heijboer, Ibrahim Akubat, Kenneth Meijer and Matthijs K. Hesselink
To assess whether short-duration (5 to ~300 s) high-power performance can be accurately predicted using the anaerobic power reserve (APR) model in professional cyclists.
Data from 4 professional cyclists from a World Tour cycling team were used. Using the maximal aerobic power, sprint peak power output, and an exponential constant describing the decrement in power over time, a power-duration relationship was established for each participant. To test the predictive accuracy of the model, several all-out field trials of different durations were performed by each cyclist. The power output achieved during the all-out trials was compared with the predicted power output by the APR model.
The power output predicted by the model showed very large to nearly perfect correlations to the actual power output obtained during the all-out trials for each cyclist (r = .88 ± .21, .92 ± .17, .95 ± .13, and .97 ± .09). Power output during the all-out trials remained within an average of 6.6% (53 W) of the predicted power output by the model.
This preliminary pilot study presents 4 case studies on the applicability of the APR model in professional cyclists using a field-based approach. The decrement in all-out performance during high-intensity exercise seems to conform to a general relationship with a single exponential-decay model describing the decrement in power vs increasing duration. These results are in line with previous studies using the APR model to predict performance during brief all-out trials. Future research should evaluate the APR model with a larger sample size of elite cyclists.
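The power-duration relationship described above can be sketched as a short function: predicted power decays exponentially from sprint peak power toward maximal aerobic power as effort duration grows, with the difference between the two being the anaerobic power reserve. This is a minimal sketch of that general form only; the MAP, peak power, and decay-constant values below are illustrative assumptions, not data from the study.

```python
import math

def apr_predicted_power(t_s: float, map_w: float, peak_w: float, k: float) -> float:
    """APR-model prediction of mean power (W) for an all-out effort of t_s seconds.

    map_w  : maximal aerobic power (W)
    peak_w : sprint peak power output (W)
    k      : exponential constant describing the decrement in power over time (1/s)
    """
    apr = peak_w - map_w  # anaerobic power reserve
    return map_w + apr * math.exp(-k * t_s)

# Illustrative values only (not taken from the 4 cyclists in the study).
MAP, PEAK, K = 430.0, 1250.0, 0.025
for t in (5, 30, 120, 300):
    print(f"{t:>3} s -> {apr_predicted_power(t, MAP, PEAK, K):.0f} W")
```

As duration increases, the prediction converges on maximal aerobic power, matching the study's description of a single exponential-decay model for the decrement in all-out power with increasing duration.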