This study examined the prevalence of and relationship between the disorders of the female athlete triad in collegiate athletes participating in aesthetic, endurance, or team/anaerobic sports. Participants were 425 female collegiate athletes from 7 universities across the United States. Disordered eating, menstrual dysfunction, and musculoskeletal injuries were assessed by a health/medical, dieting, and menstrual history questionnaire; the Eating Attitudes Test (EAT-26); and the Eating Disorder Inventory Body Dissatisfaction Subscale (EDI-BD). The percentages of athletes reporting a clinical diagnosis of anorexia and bulimia nervosa were 3.3% and 2.3%, respectively; mean (±SD) EAT-26 and EDI-BD scores were 10.6 ± 9.6 and 9.8 ± 7.6, respectively. The percentage of athletes with scores indicating “at-risk” behavior for an eating disorder was 15.2% using the EAT-26 and 32.4% using the EDI-BD. A similar percentage of athletes in aesthetic, endurance, and team/anaerobic sports reported a clinical diagnosis of anorexia or bulimia. However, athletes in aesthetic sports scored higher on the EAT-26 (13.5 ± 10.9) than athletes in endurance (10.0 ± 9.3) or team/anaerobic sports (9.9 ± 9.0, p < .02), and more athletes in aesthetic than in endurance or team/anaerobic sports scored above the EAT-26 cutoff score of 20 (p < .01). Menstrual irregularity was reported by 31% of the athletes not using oral contraceptives, with no group differences in the prevalence of self-reported menstrual irregularity. Muscle and bone injuries sustained during the collegiate career were reported by 65.9% and 34.3% of athletes, respectively, and more athletes in aesthetic than in endurance and team/anaerobic sports reported muscle (p = .005) and/or bone injuries (p < .001). Athletes “at risk” for eating disorders more frequently reported menstrual irregularity (p = .004) and sustained more bone injuries (p = .003) during their collegiate career.
These data indicate that while the prevalence of clinical eating disorders is low in female collegiate athletes, many are “at risk” for an eating disorder, which places them at increased risk for menstrual irregularity and bone injuries.
Katherine A. Beals and Melinda M. Manore
Dru A. Henson, David C. Nieman, Andy D. Blodgett, Diane E. Butterworth, Alan Utter, J. Mark Davis, Gerald Sonnenfeld, Darla S. Morton, Omar R. Fagoaga and Sandra L. Nehlsen-Cannarella
The influence of exercise mode and 6% carbohydrate (C) versus placebo (P) beverage ingestion on lymphocyte proliferation, natural killer cell cytotoxicity (NKCA), interleukin (IL)-1β production, and hormonal responses to 2.5 hr of intense running and cycling (~75%
Joshua Lowndes, Robert F. Zoeller, George A. Kyriazis, Mary P. Miles, Richard L. Seip, Niall M. Moyna, Paul S. Visich, Linda S. Pescatello, Paul M. Gordon, Paul D. Thompson and Theodore J. Angelopoulos
The purpose of this study was to examine whether baseline leptin levels affect the response of leptin to exercise training (ET) and whether this response is also affected by C-reactive protein (CRP) or the three common apolipoprotein E (APOE) genotypes. Ninety-seven (male = 45, female = 52) sedentary individuals underwent 6 months of supervised ET. Blood was sampled before the initiation of ET, and again 24 and 72 hr after completion of the final training session. ET resulted in a small reduction in body mass (80.47 ± 18.03 vs. 79.42 ± 17.34 kg, p < .01). Leptin was reduced 24 hr after the final exercise session (p < .01) but returned to baseline after 72 hr (p > .05) (pre: 13.51 ± 12.27, 24 hr: 12.14 ± 12.34, 72 hr: 12.98 ± 11.40 ng/mL). The most hyperleptinemic individuals had a greater initial response, which was sustained through to 72 hr after the final session in the pooled study population (p < .01) and in both males (p < .05) and females (p < .05) separately. CRP was related to leptin independently of body weight and was positively related to the reductions in leptin. APOE genotype was not related to leptin levels and did not affect the response to ET. Leptin levels may only be reduced by ET in those with hyperleptinemia. In addition, both the initial extent of hyperleptinemia and the subsequent reduction in leptin may be related to low-grade chronic systemic inflammation.
Veronika Leichtfried, Friedrich Hanser, Andrea Griesmacher, Markus Canazei and Wolfgang Schobersberger
Demands on concentration and cognitive performance are high in sport shooting and vary in a circadian pattern driven by internal and external stimuli. The most prominent external stimulus is light, and bright light (BL) has been shown to have some impact on cognitive and physical performance.
The aim of this study was to evaluate the impact of a single half hour of BL exposure in the morning hours on physical and cognitive performance in 15 sport shooters. In addition, the time courses of 6-sulfatoxymelatonin (aMT6s), tryptophan (TRP), and kynurenine (KYN) were monitored.
In a crossover design, 15 sport shooters were exposed to 30 min of BL and dim light (DL) in the early-morning hours. Shooting performance, balance, visuomotor performance, and courses of aMT6s, TRP, and KYN were evaluated.
Shooting performance was 365.4 (349.7–381.0) and 368.5 (353.9–383.1), similar in both light setups. The number of correct reactions (sustained attention) and deviations from the horizontal plane (a balance-related measure) were higher after BL. TRP concentrations decreased from 77.5 (73.5–81.4) to 66.9 (60.7–67.0) in the DL setup only.
The 2 light conditions generated heterogeneous visuomotor and physiological effects in sport shooters. The authors therefore suggest that a single half hour of BL exposure is effective in improving cognitive aspects of performance, but not physical performance. Further research is needed to evaluate BL’s impact on biochemical parameters.
David M. Morris, Joshua R. Huot, Adam M. Jetton, Scott R. Collier and Alan C. Utter
Dehydration has been shown to hinder performance of sustained exercise in the heat. Consuming fluids before exercise can result in hyperhydration, delay the onset of dehydration during exercise, and improve exercise performance. However, humans normally drink only in response to thirst, which does not result in hyperhydration. Thirst and voluntary fluid consumption have been shown to increase following oral ingestion or infusion of sodium into the bloodstream. We measured the effects of acute sodium ingestion on voluntary water consumption and retention during a 2-hr hydration period before exercise. Subjects then performed a 60-min submaximal dehydration ride (DR) followed immediately by a 200-kJ performance time trial (PTT) in a warm (30 °C) environment. Water consumption and retention during the hydration period were greater following sodium ingestion (1380 ± 580 mL consumed, 821 ± 367 mL retained) compared with placebo (815 ± 483 mL consumed, 244 ± 402 mL retained) and no treatment (782 ± 454 mL consumed, 148 ± 289 mL retained). Dehydration levels following the DR were significantly lower after sodium ingestion (0.7 ± 0.6%) than after placebo (1.3 ± 0.7%) and no treatment (1.6 ± 0.4%). Time to complete the PTT was significantly shorter following sodium consumption (773 ± 158 s) than following placebo (851 ± 156 s) and no treatment (872 ± 190 s). These results suggest that voluntary hyperhydration can be induced by acute consumption of sodium and has a favorable effect on hydration status and performance during subsequent exercise in the heat.
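The dehydration percentages quoted above are conventionally derived from body-mass loss. A minimal sketch of that standard calculation (the function name and example masses are illustrative, not data from the study):

```python
def percent_dehydration(pre_kg: float, post_kg: float) -> float:
    """Dehydration expressed as body-mass loss relative to pre-exercise mass."""
    return (pre_kg - post_kg) / pre_kg * 100.0

# A rider starting at 70.0 kg and finishing at 69.0 kg has lost ~1.4% of
# body mass, comparable to the placebo-condition level reported above.
loss = percent_dehydration(70.0, 69.0)
```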
Marcin Baranowski, Jan Górski, Barbara Klapcinska, Zbigniew Waskiewicz and Ewa Sadowska-Krepa
We have previously shown that acute exercise increases the level of sphingosine-1-phosphate (S1P) in plasma and of ceramide in erythrocytes of untrained subjects. The aim of the current study was to examine the effect of an ultramarathon run on the plasma and erythrocyte levels of the following bioactive sphingolipids: S1P, sphinganine-1-phosphate (SA1P), sphingosine, sphinganine, and ceramide. Blood samples were collected from seven male amateur runners participating in a 48-hr ultramarathon race before the run, after 24 and 48 hr of running, and following 24 and 48 hr of recovery. The sphingolipids were quantified by means of HPLC. Sustained running for 48 hr resulted in a progressive decline in plasma S1P to a level significantly lower than the prerace value, which then remained stable over the next 48 hr of recovery. In erythrocytes, S1P content was stable until 24 hr of recovery, then rose abruptly to reach peak values after 48 hr of recovery. The plasma level of SA1P decreased progressively during the competition and remained unchanged over the recovery. In erythrocytes, the level of SA1P increased after 24 hr of running and normalized thereafter. The level of ceramide, in both plasma and erythrocytes, was not significantly affected by the ultraendurance run. We speculate that the reduction in the plasma level of S1P during and after the run reduces its biological actions and might be responsible for some negative side effects of the ultraendurance effort.
Naroa Etxebarria, Shaun D’Auria, Judith M. Anson, David B. Pyne and Richard A. Ferguson
The patterns of power output in the ~1-h cycle section of Olympic-distance triathlon races are not well documented. Here the authors establish a typical cycling-race profile derived from several International Triathlon Union elite-level drafting-legal triathlon races.
The authors collated 12 different race power profiles from elite male triathletes (N = 5, age 25 ± 5 y, body mass 65.5 ± 5.6 kg; mean ± SD) during 7 international races. Power output was recorded using SRM cranks and analyzed with proprietary software.
The mean power output was 252 ± 33 W, or 3.9 ± 0.5 W/kg in relative terms, with a coefficient of variation of 71% ± 13%. Normalized power (the constant power output an athlete could sustain at the same physiological cost as the observed variable power) for the entire cycle section was 291 ± 29 W, or 40 ± 13 W higher than the actual mean power output. There were 34 ± 14 peaks of power output above 600 W, and ~18% of time was spent at >100% of maximal aerobic power.
Cycling during Olympic-distance triathlon, characterized by frequent and large power variations including repeat supramaximal efforts, equates to a higher workload than cycling at constant power.
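Normalized power of the kind quoted above is typically computed with the 30-s rolling-average algorithm popularized by Coggan; a minimal sketch under that assumption (the sample power traces are illustrative, not race data):

```python
import numpy as np

def normalized_power(power_w, hz=1, window_s=30):
    """Normalized power: take the 30-s rolling average of power, raise it
    to the 4th power, average, then take the 4th root of that average."""
    power_w = np.asarray(power_w, dtype=float)
    n = int(window_s * hz)
    # rolling mean over the smoothing window (valid region only)
    rolling = np.convolve(power_w, np.ones(n) / n, mode="valid")
    return float(np.mean(rolling**4) ** 0.25)

# For a steady effort, normalized power equals mean power; a surging effort
# with the same 252 W mean yields a higher normalized power, reflecting the
# greater physiological cost of variable riding.
steady = [252.0] * 600                       # 10 min at 252 W, sampled at 1 Hz
surging = ([100.0] * 60 + [404.0] * 60) * 5  # same 252 W mean, large surges
```

With the surging trace, `normalized_power` returns a value above the 252 W mean, mirroring the 291 W versus 252 W gap reported for the race data.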
Jace A. Delaney, Heidi R. Thornton, Grant M. Duthie and Ben J. Dascombe
Rugby league coaches adopt replacement strategies for their interchange players to maximize running intensity; however, it is important to understand the factors that may influence match performance.
The aim was to assess the independent factors affecting the running intensity sustained by interchange players during professional rugby league match play.
Global positioning system (GPS) data were collected from all interchange players (starters and nonstarters) in a professional rugby league squad across 24 matches of a National Rugby League season. A multilevel mixed-model approach was employed to establish the effect of various technical (attacking and defensive involvements), temporal (bout duration, time in possession, etc.), and situational (season phase, recovery cycle, etc.) factors on the relative distance covered and average metabolic power (Pmet) during competition. Significant effects were standardized using correlation coefficients, and the likelihood of each effect was described using magnitude-based inferences.
Superior intermittent running ability resulted in very likely large increases in both relative distance and Pmet. As the length of a bout increased, both measures of running intensity exhibited a small decrease. There were at least likely small increases in running intensity for matches played after short recovery cycles and against strong opposition. During a bout, the number of collision-based involvements increased running intensity, whereas time in possession and ball time out of play decreased demands.
These data demonstrate a complex interaction of individual- and match-based factors that require consideration when developing interchange strategies, and the manipulation of training loads during shorter recovery periods and against stronger opponents may be beneficial.
Ana Sousa, Pedro Figueiredo, David Pendergast, Per-Ludvik Kjendlie, João P. Vilas-Boas and Ricardo J. Fernandes
Swimming has been an important area of sport-science research since the 1970s, with bioenergetic factors assuming a fundamental performance-influencing role. The purpose of this study was to conduct a critical evaluation of the literature concerning oxygen-uptake (VO2) assessment in swimming, describing the equipment and methods used and emphasizing recent work conducted in ecological conditions. Due to the technical constraints imposed by the water environment, assessment of VO2max in swimming was not accomplished until the 1960s. Later, the development of automated portable measurement devices allowed VO2max to be assessed more easily, even in ecological swimming conditions, although few studies have been conducted in swimming-pool conditions with portable breath-by-breath telemetric systems. An inverse relationship exists between the velocity corresponding to VO2max and the time a swimmer can sustain that velocity. The energy cost of swimming also varies with velocity and its variability. Because the supply of oxygen (whose limitation may be due to central factors, namely O2 delivery and transport to the working muscles, or to peripheral factors, namely O2 diffusion and utilization in the muscles) is one of the critical factors determining swimming performance, VO2 kinetics and maximal values are critical for understanding swimmers’ behavior in competition and for developing efficient training programs.
Randall L. Wilber
“Live high-train low” (LH+TL) altitude training allows athletes to “live high” to facilitate altitude acclimatization, characterized by a significant and sustained increase in endogenous erythropoietin and a subsequent increase in erythrocyte volume, while simultaneously enabling them to “train low,” replicating sea-level training intensity and oxygen flux and thereby inducing beneficial metabolic and neuromuscular adaptations. In addition to natural/terrestrial LH+TL, several simulated LH+TL devices have been developed, including nitrogen apartments, hypoxic tents, and hypoxicator devices. One of the key issues in the practical application of LH+TL is the optimal hypoxic dose needed to facilitate altitude acclimatization and produce the expected beneficial physiological responses and sea-level performance effects. The purpose of this review is to examine this issue from a research-based and applied perspective by addressing the following questions: What is the optimal altitude at which to live, how many days are required at altitude, and how many hours per day are required? It appears that for athletes to derive the hematological benefits of LH+TL while using natural/terrestrial altitude, they need to live at an elevation of 2000–2500 m for >4 wk for >22 h/d. For athletes using LH+TL in a simulated altitude environment, fewer hours (12–16 h/d) of hypoxic exposure might be necessary, but a higher elevation (2500–3000 m) is required to achieve similar physiological responses.