and U.S. Department of Agriculture, 2015). Some tools have also integrated other messages, such as sustainability, to reduce the environmental impact of food choices (Food and Agriculture Organization of the United Nations, 2016). However, there are few valid tools specifically designed for
Alba Reguant-Closa, Margaret M. Harris, Tim G. Lohman and Nanna L. Meyer
Graeme L. Close, Craig Sale, Keith Baar and Stephane Bermon
Preparticipation predictors for Championships injury and illness have been identified (Timpka et al., 2017). For instance, athletes who reported an illness symptom causing anxiety before the competition were five times more likely to sustain an injury during the championships. Moreover, intensive training camps
Pål Haugnes, Per-Øyvind Torvik, Gertjan Ettema, Jan Kocbach and Øyvind Sandbakk
pacing strategy, with reduced cycle rate being the main factor underlying the decline in speed. Based on these findings, we would advise sprint XC skiers to develop both these capacities concurrently and to employ technical strategies that sustain a high cycle rate as fatigue occurs. However
Xiaoya Ma, Kaitlyn J. Patterson, Kayla M. Gieschen and Peter F. Bodary
The prevalence of iron deficiency tends to be higher in athletic populations, especially among endurance-trained females. Recent studies have provided evidence that the iron-regulating hormone hepcidin is transiently increased with acute exercise and suggest that this may contribute to iron deficiency anemia in athletes. The purpose of this study was to determine whether resting serum hepcidin is significantly elevated in highly trained female distance runners compared with a low-exercise control group. Due to the importance of the monocyte in the process of iron recycling, monocyte expression of hepcidin was also measured. A single fasted blood sample was collected midseason from twenty female distance runners averaging 81.9 ± 14.2 km of running per week. Ten age-, gender-, and BMI-matched low-exercise control subjects provided samples during the same period using identical collection procedures. There was no difference between the runners (RUN) and control subjects (CON) for serum hepcidin levels (p = .159). In addition, monocyte hepcidin gene expression was not different between the two groups (p = .635). Furthermore, no relationship between weekly training volume and serum hepcidin concentration was evident among the trained runners. The results suggest that hepcidin is not chronically elevated with sustained training in competitive collegiate runners. This is an important finding because the current clinical conditions that link hepcidin to anemia include a sustained elevation in serum hepcidin. Nevertheless, additional studies are needed to determine the clinical relevance of the well-documented, transient rise in hepcidin that follows acute sessions of exercise.
Scott J. Montain, Samuel N. Cheuvront and Henry C. Lukaski
Uncertainty exists regarding the effect of sustained sweating on sweat mineral-element composition.
To determine the effect of multiple hours of exercise-heat stress on sweat mineral concentrations.
Seven heat-acclimated subjects (6 males, 1 female) completed 5 × 60 min of treadmill exercise (1.56 m/s, 2% grade) with 20 min rest between exercise periods in 2 weather conditions (27 °C, 40% relative humidity, 1 m/s air speed and 35 °C, 30% relative humidity, 1 m/s air speed). Sweat was collected from a sweat-collection pouch attached to the upper back during exercise bouts 1, 3, and 5. Mineral elements were determined by using inductively coupled plasma-emission spectrography.
At 27 °C, sweat sodium (863 µg/mL; mean [SD]), potassium (222 µg/mL), calcium (16 µg/mL), magnesium (1265 ng/mL), and copper (80 ng/mL) remained similar to baseline over 7 h of exercise-heat stress, whereas sweat zinc declined 42–45% after the initial hour of exercise-heat stress (Ex1 = 655, Ex3 = 382, Ex5 = 355 µg/mL, P < 0.05). Similar outcomes were observed for sweat zinc at 35 °C when sweat rates were higher. Sweat rate had no effect on sweat trace-element composition.
Sweat sodium, potassium, and calcium losses during multiple hours of sustained sweating can be predicted from initial sweat composition. Estimates of sweat zinc losses, however, will be overestimated if sweat zinc conservation is not accounted for in sweat zinc-loss estimates.
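The prediction described in this abstract amounts to simple arithmetic: total loss is concentration times sweat volume. A minimal sketch in Python, with the 1 L/h sweat rate as a hypothetical illustrative value (not reported in the abstract):

```python
def mineral_loss(conc_mg_per_l: float, sweat_rate_l_per_h: float, hours: float) -> float:
    """Estimate total mineral loss (mg) as concentration x sweat volume.

    Note: 1 ug/mL == 1 mg/L, so sweat concentrations reported in ug/mL can
    be used directly. For zinc, this constant-concentration model would
    overestimate losses, because sweat zinc declined 42-45% after the first
    hour of exercise-heat stress.
    """
    return conc_mg_per_l * sweat_rate_l_per_h * hours

# Example: sodium at 863 ug/mL over 7 h, assuming a 1 L/h sweat rate
sodium_loss_mg = mineral_loss(863, 1.0, 7)  # 6041.0 mg
```

Applying the same constant-concentration formula with the initial zinc value (Ex1) rather than the later, lower values (Ex3, Ex5) illustrates the overestimation the authors caution against.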
Ralph Beneke, Renate M. Leithäuser and Oliver Ochentel
A link between lactate and muscular exercise was recognized more than 200 years ago. The blood lactate concentration (BLC) is sensitive to changes in exercise intensity and duration. Multiple BLC threshold concepts define different points on the BLC power curve during various tests with increasing power (INCP). The INCP test results are affected by the increase in power over time. The maximal lactate steady state (MLSS) is measured during a series of prolonged constant power (CP) tests. It detects the highest aerobic power without metabolic energy from continuing net lactate production, which is usually sustainable for 30 to 60 min. BLC threshold and MLSS power are highly correlated with the maximum aerobic power and athletic endurance performance. The idea that training at threshold intensity is particularly effective is not supported by evidence. Three BLC-orientated intensity domains have been established: (1) training up to an intensity at which the BLC clearly exceeds resting BLC, light- and moderate-intensity training focusing on active regeneration or high-volume endurance training (Intensity < Threshold); (2) heavy endurance training at work rates up to MLSS intensity (Threshold ≤ Intensity ≤ MLSS); and (3) severe exercise intensity training between MLSS and maximum oxygen uptake intensity mostly organized as interval and tempo work (Intensity > MLSS). High-performance endurance athletes combining very high training volume with high aerobic power dedicate 70 to 90% of their training to intensity domain 1 (Intensity < Threshold) in order to keep glycogen homeostasis within sustainable limits.
Neil M. Johannsen and Rick L. Sharp
The purpose of this study was to investigate differences in substrate oxidation between dextrose (DEX) and unmodified (UAMS) and acid/alcohol-modified (MAMS) cornstarches. Seven endurance-trained men (VO2peak = 59.1 ± 5.4 mL·kg−1·min−1) participated in 2 h of exercise (66.4% ± 3.3% VO2peak) 30 min after ingesting 1 g/kg body weight of the experimental carbohydrate or placebo (PLA). Plasma glucose and insulin were elevated after DEX (P < 0.05) compared with UAMS, MAMS, and PLA. Although MAMS and DEX raised carbohydrate oxidation rate through 90 min of exercise, only MAMS persisted throughout 120 min (P < 0.05 compared with all trials). Exogenous-carbohydrate oxidation rate was higher in DEX than in MAMS and UAMS until 90 min of exercise. Acid/alcohol modification resulted in augmented carbohydrate oxidation with a small, sustained increase in exogenous-carbohydrate oxidation rate. MAMS appears to be metabolizable and available for oxidation during exercise.
Heather R. Clark, Margo E. Barker and Bernard M. Corfe
Mountain marathons are 2-d, self-supported adventure races, during which competitors must carry all nutritional requirements to sustain athletic effort. This requires a compromise between the energy required to perform and the weight penalty of carrying it. We undertook a nutritional survey of event competitors in the UK using a questionnaire-based approach and monitored dehydration during the event. We found that competitors in longer-distance classes (> 50 km) carry a significantly lower mass of food, which is more energy dense, but whose total calorific value is lower than that carried by competitors in shorter classes. Carbohydrate and protein consumption were both positively associated with performance. Competitors became progressively dehydrated throughout the event. Counterintuitively, the better-performing subjects became the most dehydrated. Competitors at all distances should make more effort to rehydrate during breaks in the event. Competitors at shorter distances could choose more energy-dense foods to reduce the weight penalty.
Ben Desbrow, Nicholas A. Burd, Mark Tarnopolsky, Daniel R. Moore and Kirsty J. Elliott-Sale
Adolescent, female, and masters athletes have unique nutritional requirements as a consequence of undertaking daily training and competition in addition to the specific demands of age- and gender-related physiological changes. Dietary education and recommendations for these special population athletes require a focus on eating for long-term health, with special consideration given to “at-risk” dietary patterns and nutrients (e.g., sustained restricted eating, low calcium, vitamin D and/or iron intakes relative to requirements). Recent research highlighting strategies to address age-related changes in protein metabolism and the development of tools to assist in the management of Relative Energy Deficiency in Sport are of particular relevance to special population athletes. Whenever possible, special population athletes should be encouraged to meet their nutrient needs by the consumption of whole foods rather than supplements. The recommendation of dietary supplements (particularly to young athletes) overemphasizes their ability to manipulate performance in comparison with other training/dietary strategies.
Katherine A. Beals and Melinda M. Manore
This study examined the prevalence of and relationship between the disorders of the female athlete triad in collegiate athletes participating in aesthetic, endurance, or team/anaerobic sports. Participants were 425 female collegiate athletes from 7 universities across the United States. Disordered eating, menstrual dysfunction, and musculoskeletal injuries were assessed by a health/medical, dieting and menstrual history questionnaire, the Eating Attitudes Test (EAT-26), and the Eating Disorder Inventory Body Dissatisfaction Subscale (EDI-BD). The percentage of athletes reporting a clinical diagnosis of anorexia and bulimia nervosa was 3.3% and 2.3%, respectively; mean (±SD) EAT and EDI-BD scores were 10.6 ± 9.6 and 9.8 ± 7.6, respectively. The percentage of athletes with scores indicating “at-risk” behavior for an eating disorder was 15.2% using the EAT-26 and 32.4% using the EDI-BD. A similar percentage of athletes in aesthetic, endurance, and team/anaerobic sports reported a clinical diagnosis of anorexia or bulimia. However, athletes in aesthetic sports scored higher on the EAT-26 (13.5 ± 10.9) than athletes in endurance (10.0 ± 9.3) or team/anaerobic sports (9.9 ± 9.0, p < .02); and more athletes in aesthetic versus endurance or team/anaerobic sports scored above the EAT-26 cut-off score of 20 (p < .01). Menstrual irregularity was reported by 31% of the athletes not using oral contraceptives, and there were no group differences in the prevalence of self-reported menstrual irregularity. Muscle and bone injuries sustained during the collegiate career were reported by 65.9% and 34.3% of athletes, respectively, and more athletes in aesthetic versus endurance and team/anaerobic sports reported muscle (p = .005) and/or bone injuries (p < .001). Athletes “at risk” for eating disorders more frequently reported menstrual irregularity (p = .004) and sustained more bone injuries (p = .003) during their collegiate career.
These data indicate that while the prevalence of clinical eating disorders is low in female collegiate athletes, many are “at risk” for an eating disorder, which places them at increased risk for menstrual irregularity and bone injuries.