Got Beer? A Systematic Review of Beer and Exercise
Jaison L. Wynne and Patrick B. Wilson
Beer is used to socialize postexercise, celebrate sport victory, and commiserate postdefeat. Rich in polyphenols, beer has antioxidant effects when consumed in moderation, but its alcohol content may confer some negative effects. Despite beer’s popularity, no review has explored its effects on exercise performance, recovery, and adaptation. Thus, a systematic literature search of three databases (PubMed, SPORTDiscus, and Web of Science) was conducted by two reviewers. The search resulted in 16 studies that were appraised and reviewed. The mean PEDro score was 5.1. When individuals are looking to rehydrate postexercise, a low-alcohol beer (<4%) may be more effective. If choosing a beer higher in alcohol content (>4%), it is advised to pair it with a nonalcoholic option to limit diuresis, particularly when relatively large volumes of fluid (>700 ml) are consumed. Adding Na+ to alcoholic beer may improve rehydration by decreasing fluid losses, but palatability may decrease. These conclusions are largely based on studies that standardized beverage volume, and the results may not apply equally to situations where people ingest fluids and food ad libitum. Ingesting nonalcoholic, polyphenol-rich beer could be an effective strategy for preventing respiratory infections during heavy training. When beer is consumed in moderation, body composition and strength qualities seem largely unaffected. The mixed results, which limit sweeping conclusions, stem from variations in study design (e.g., hydration and exercise protocols). Future research should incorporate exercise protocols with higher ecological validity, recruit more women, prioritize chronic study designs, and use ad libitum fluid replacement protocols to enable more robust conclusions.
The Utility of the Low Energy Availability in Females Questionnaire to Detect Markers Consistent With Low Energy Availability-Related Conditions in a Mixed-Sport Cohort
Margot A. Rogers, Michael K. Drew, Renee Appaneal, Greg Lovell, Bronwen Lundy, David Hughes, Nicole Vlahovich, Gordon Waddington, and Louise M. Burke
The Low Energy Availability in Females Questionnaire (LEAF-Q) was validated to identify risk of the female athlete triad (triad) in female endurance athletes. This study explored the ability of the LEAF-Q to detect conditions related to low energy availability (LEA) in a mixed sport cohort of female athletes. Data included the LEAF-Q, SCOFF Questionnaire for disordered eating, dual-energy X-ray absorptiometry-derived body composition and bone mineral density, Mini International Neuropsychiatric Interview, blood pressure, and blood metabolic and reproductive hormones. Participants were grouped according to LEAF-Q score (≥8 or <8), and a comparison of means was undertaken. Sensitivity, specificity, and predictive values of the overall score and subscale scores were calculated in relation to the triad and biomarkers relevant to LEA. Fisher’s exact test explored differences in prevalence of these conditions between groups. Seventy-five athletes (18–32 years) participated. Mean LEAF-Q score was 8.0 ± 4.2 (55% scored ≥8). Injury and menstrual function subscale scores identified low bone mineral density (100% sensitivity, 95% confidence interval [15.8%, 100%]) and menstrual dysfunction (80.0% sensitivity, 95% confidence interval [28.4%, 99.5%]), respectively. The gastrointestinal subscale did not detect surrogate markers of LEA. LEAF-Q score cannot be used to classify athletes as “high risk” of conditions related to LEA, nor can it be used as a surrogate diagnostic tool for LEA given the low specificity identified. Our study supports its use as a screening tool to rule out risk of LEA-related conditions or to create selective low-risk groups that do not need management as there were generally high negative predictive values (range 76.5–100%) for conditions related to LEA.
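The sensitivity, specificity, and predictive values reported above all derive from a 2 × 2 confusion matrix comparing questionnaire classification against the reference diagnosis. As an illustration only (the counts below are hypothetical, not the study's data), a minimal Python sketch:

```python
def diagnostic_stats(tp, fp, fn, tn):
    """Screening-tool metrics from a 2x2 confusion matrix.

    tp/fp/fn/tn = true/false positives and negatives against the
    reference diagnosis (here, a condition related to LEA).
    """
    sensitivity = tp / (tp + fn)  # proportion of true cases flagged by the tool
    specificity = tn / (tn + fp)  # proportion of non-cases correctly cleared
    ppv = tp / (tp + fp)          # positive predictive value
    npv = tn / (tn + fn)          # negative predictive value
    return sensitivity, specificity, ppv, npv

# Hypothetical counts for illustration (not taken from the study)
sens, spec, ppv, npv = diagnostic_stats(tp=4, fp=30, fn=1, tn=40)
```

A high negative predictive value combined with low specificity, the pattern the authors describe, is exactly what makes a questionnaire useful for ruling risk out but not for ruling it in.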
Volume 31 (2021): Issue 4 (Jul 2021)
Disordered Eating, Development of Menstrual Irregularity, and Reduced Bone Mass Change After a 3-Year Follow-Up in Female Adolescent Endurance Runners
Michelle T. Barrack, Marta D. Van Loan, Mitchell Rauh, and Jeanne F. Nichols
This prospective study evaluated the 3-year change in menstrual function and bone mass among 40 female adolescent endurance runners (age 15.9 ± 1.0 years) according to baseline disordered eating status. Three years after initial data collection, runners underwent follow-up measures including the Eating Disorder Examination Questionnaire and a survey evaluating menstrual function, running training, injury history, and prior sports participation. Dual-energy X-ray absorptiometry was used to measure bone mineral density and body composition. Runners with a weight concern, shape concern, or global score ≥4.0 or reporting >1 pathologic behavior in the past 28 days were classified with disordered eating. Compared with runners with normal Eating Disorder Examination Questionnaire scores at baseline, runners with disordered eating at baseline reported fewer menstrual cycles/year (6.4 ± 4.5 vs. 10.5 ± 2.8, p = .005), more years of amenorrhea (1.6 ± 1.4 vs. 0.3 ± 0.5, p = .03), and a higher proportion of menstrual irregularity (75.0% vs. 31.3%, p = .02) and failed to increase lumbar spine or total hip bone mineral density at the 3-year follow-up. In a multivariate model including body mass index and menstrual cycles in the past year at baseline, baseline shape concern score (B = −0.57, p = .001) was inversely related to the annual number of menstrual cycles between assessments. Weight concern score (B = −0.40, p = .005) was inversely associated with lumbar spine bone mineral density Z-score change between assessments according to a multivariate model adjusting for age and body mass index. These findings support associations between disordered eating at baseline and future menstrual irregularities or reduced accrual of lumbar spine bone mass in female adolescent endurance runners.
Individual Participant Data Meta-Analysis Provides No Evidence of Intervention Response Variation in Individuals Supplementing With Beta-Alanine
Gabriel Perri Esteves, Paul Swinton, Craig Sale, Ruth M. James, Guilherme Giannini Artioli, Hamilton Roschel, Bruno Gualano, Bryan Saunders, and Eimear Dolan
Currently, little is known about the extent of interindividual variability in response to beta-alanine (BA) supplementation, nor what proportion of said variability can be attributed to external factors or to the intervention itself (intervention response). To investigate this, individual participant data on the effect of BA supplementation on a high-intensity cycling capacity test (CCT110%) were meta-analyzed. Changes in time to exhaustion (TTE) and muscle carnosine were the primary and secondary outcomes. Multilevel distributional Bayesian models were used to estimate the mean and SD of BA and placebo group change scores. The relative sizes of group SDs were used to infer whether observed variation in change scores was due to intervention or non-intervention-related effects. Six eligible studies were identified, and individual data were obtained from four of these. Analyses showed a group effect of BA supplementation on TTE (7.7, 95% credible interval [CrI] [1.3, 14.3] s) and muscle carnosine (18.1, 95% CrI [14.5, 21.9] mmol/kg DM). A large intervention response variation was identified for muscle carnosine (σIR = 5.8, 95% CrI [4.2, 7.4] mmol/kg DM) while equivalent change score SDs were shown for TTE in both the placebo (16.1, 95% CrI [13.0, 21.3] s) and BA (15.9, 95% CrI [13.0, 20.0] s) conditions, with the probability that SD was greater in placebo being 0.64. In conclusion, the similarity in observed change score SDs between groups for TTE indicates the source of variation is common to both groups, and therefore unrelated to the supplement itself, likely originating instead from external factors such as nutritional intake, sleep patterns, or training status.
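The logic of comparing change-score SDs between arms can be illustrated with a simpler frequentist analogue of the Bayesian distributional models the authors used (the SD-difference method for individual responses): the true intervention-response SD is estimated as the square root of the difference between the squared change-score SDs of the intervention and control arms. This is an illustrative sketch, not the study's model:

```python
import math

def intervention_response_sd(sd_intervention, sd_control):
    """Estimate the SD of true individual responses to an intervention.

    Assumes random within-subject variation in change scores is the same
    in both arms, so any excess SD in the intervention arm reflects
    genuine response variation.
    """
    diff = sd_intervention**2 - sd_control**2
    if diff <= 0:
        return 0.0  # no response variation beyond shared random error
    return math.sqrt(diff)

# TTE change-score SDs from the abstract: 15.9 s (BA) vs. 16.1 s (placebo)
print(intervention_response_sd(15.9, 16.1))  # -> 0.0
```

Because the BA arm's SD is no larger than the placebo arm's, the estimate collapses to zero, matching the conclusion that TTE variability stems from factors common to both groups rather than from the supplement.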
Effect of Exercising in the Heat on Intestinal Fatty Acid-Binding Protein, Endotoxins, and Lipopolysaccharide-Binding Protein Markers in Trained Athletic Populations: A Systematic Literature Review
Alice Wallett, Julien D. Périard, Philo Saunders, and Andrew McKune
Along with digestion and absorption of nutrients, the gastrointestinal epithelium acts as a primary intestinal defense layer, preventing luminal pathogens from entering the circulation. During exercise in the heat, epithelial integrity can become compromised, allowing bacteria and bacterial endotoxins to translocate into circulation, triggering a systemic inflammatory response and exacerbating gastrointestinal damage. While this relationship seems clear in the general population in endurance/ultraendurance exercise, the aim of this systematic review was to evaluate the effect of exercise in the heat on blood markers of gastrointestinal epithelial disturbance in well-trained individuals. Following the 2009 Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, five electronic databases were searched for appropriate research, and 1,885 studies were identified. Five studies met the inclusion criteria and were subjected to full methodological appraisal by two reviewers. Critical appraisal of the studies was conducted using the McMasters Critical Review Form. The studies investigated changes in markers of gastrointestinal damage (intestinal fatty acid–binding protein, endotoxin, and/or lipopolysaccharide-binding protein) following acute exercise in warm to hot conditions (≥ 30 °C) and included trained or well-trained participants with direct comparisons to a control temperate condition (≤ 22 °C). The studies found that prolonged submaximal and strenuous exercise in hot environmental conditions can acutely increase epithelial disturbance compared with exercise in cooler conditions, although these disturbances were not clinically relevant. Moreover, trained and well-trained populations appear to tolerate exercise-induced gastrointestinal disturbance in the heat. Whether this is an acquired tolerance related to regular training remains to be investigated.
Dietary Intake and Gastrointestinal Integrity in Runners Undertaking High-Intensity Exercise in the Heat
Naroa Etxebarria, Nicole A. Beard, Maree Gleeson, Alice Wallett, Warren A. McDonald, Kate L. Pumpa, and David B. Pyne
Gastrointestinal disturbances are one of the most common issues for endurance athletes during training and competition in the heat. The relationship between typical dietary intake or nutritional interventions and perturbations in or maintenance of gut integrity is unclear. Twelve well-trained male endurance athletes (peak oxygen consumption = 61.4 ± 7.0 ml·kg−1·min−1) completed two trials in a randomized order in 35 °C (heat) and 21 °C (thermoneutral) conditions and kept a detailed nutritional diary for eight consecutive days between the two trials. The treadmill running trials consisted of 15 min at 60% peak oxygen consumption, 15 min at 75% peak oxygen consumption, followed by 8 × 1-min high-intensity efforts. Venous blood samples were taken at the baseline, at the end of each of the three exercise stages, and 1 hr postexercise to measure concentrations of the gut integrity and permeability biomarkers intestinal fatty-acid-binding protein, lipopolysaccharide, and lipopolysaccharide-binding protein. The runners self-reported gut symptoms 1 hr postexercise and 3 days postexercise. The heat condition induced large (45–370%) increases in intestinal fatty-acid-binding protein, lipopolysaccharide-binding protein, and lipopolysaccharide concentrations compared with the baseline, but induced only mild gastrointestinal symptoms. Carbohydrate and polyunsaturated fat intake 24 hr preexercise were associated with less lipopolysaccharide translocation. Protein, carbohydrate, total fat, and polyunsaturated fat intake (8 days) were positively associated with the percentage increase of intestinal fatty-acid-binding protein in both conditions (range of correlations, 95% confidence interval = .62–.93 [.02, .98]). Typical nutrition intake partly explained increases in biomarkers and the attenuation of symptoms induced by moderate- and high-intensity exercise under both heat and thermoneutral conditions.
Commentary in Response to “A Review of Nonpharmacological Strategies in the Treatment of Relative Energy Deficiency in Sport”
Nicole C.A. Strock, Kristen J. Koltun, and Emily A. Ricker
Embracing Change: The Evolving Science of Relative Energy Deficiency in Sport
Megan A. Kuikman, Margo Mountjoy, Trent Stellingwerff, and Jamie F. Burr
The Impact of Low Energy Availability on Nonexercise Activity Thermogenesis and Physical Activity Behavior in Recreationally Trained Adults
Alexandra Martin, Hande Hofmann, Clemens Drenowatz, Birgit Wallmann-Sperlich, Billy Sperlich, and Karsten Koehler
Energy availability describes the amount of dietary energy remaining for physiological functionality after the energy cost of exercise is deducted. The physiological and hormonal consequences of low energy availability (LEA) are well established, but the impact of LEA on physical activity behavior outside of exercise and, specifically, nonexercise activity thermogenesis (NEAT) has not been systematically examined. The authors conducted a secondary analysis of a repeated-measures crossover study in which recreationally trained young men (n = 6, 25 ± 1.0 years) underwent two 4-day conditions of LEA (15 kcal·kg fat-free mass−1 ·day−1) with and without endurance exercise (LEA + EX and LEA − EX) and two energy-balanced control conditions (CON + EX and CON − EX). The duration and intensity of physical activity outside of prescribed exercise were assessed using the SenseWear Pro3 armband. LEA did not alter NEAT (p = .41), nor time spent in moderate to vigorous (p = .20) and low-intensity physical activity (p = .17). However, time spent in low-intensity physical activity was lower in LEA + EX than LEA − EX (13.7 ± 0.3 vs. 15.2 ± 0.3 hr/day; p = .002). Short-term LEA does not seem to impact NEAT per se, but the way it is attained may impact physical activity behavior outside of exercise. As the participants expended similar amounts of energy during NEAT (900–1,300 kcal/day = 12.5–18.0 kcal·kg fat-free mass−1·day−1) and prescribed exercise bouts (15.0 kcal·kg fat-free mass−1·day−1), excluding it as a component of energy expenditure may skew the true energy available for physiological functionality in active populations.
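The opening definition corresponds to the standard formula: energy availability = (energy intake − exercise energy expenditure) / fat-free mass, expressed per day. A minimal sketch (the example numbers are illustrative, not drawn from the study):

```python
def energy_availability(intake_kcal, exercise_kcal, fat_free_mass_kg):
    """Energy availability in kcal per kg fat-free mass per day."""
    return (intake_kcal - exercise_kcal) / fat_free_mass_kg

# Illustrative values only: 2,500 kcal intake, 600 kcal exercise cost, 60 kg FFM
ea = energy_availability(2500, 600, 60)
print(round(ea, 1))  # -> 31.7
```

The LEA conditions above correspond to holding this quantity at 15 kcal·kg fat-free mass−1·day−1, well below the roughly 45 kcal·kg−1·day−1 typically associated with energy balance in active populations.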