Despite consistent reports of poor bone health in male jockeys, it is not yet known if this is a consequence of low energy availability or a lack of an osteogenic stimulus. Given the rationale that low energy availability is a contributing factor in low bone health, we tested the hypothesis that both hip and lumbar bone mineral density (BMD) would progressively worsen with years of riding. In a cross-sectional design, male apprentice (n = 17) and senior (n = 14) jockeys (matched for body mass and fat-free mass) were assessed for hip and lumbar spine BMD, as well as both measured and predicted resting metabolic rate (RMR). Despite differences (p < .05) in years of race riding (3.4 ± 2.0 vs. 16.3 ± 6.8), no differences were apparent (p > .05) in hip (−0.9 ± 1.1 vs. −0.8 ± 0.7) or lumbar Z-scores (−1.3 ± 1.4 vs. −1.5 ± 1.0) or in measured RMR (1,459 ± 160 vs. 1,500 ± 165 kcal/day) between apprentice and senior jockeys, respectively. Additionally, years of race riding did not correlate significantly (p > .05) with either hip or lumbar spine BMD. Measured RMR was also not different (p > .05) from predicted RMR in either apprentice (1,520 ± 44 kcal/day) or senior jockeys (1,505 ± 70 kcal/day). When considered with previously published data examining underreporting of energy intake and direct assessments of energy expenditure, we suggest that low BMD in jockeys is not due to low energy availability per se but rather the lack of an osteogenic stimulus associated with riding.
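The measured-versus-predicted RMR comparison above is a common screening marker for energy deficiency: a ratio of measured to predicted RMR below about 0.90 is often taken to suggest metabolic suppression. The sketch below is illustrative only; the helper names and the 0.90 cutoff are field conventions assumed here, not taken from the study.

```python
def rmr_ratio(measured_kcal: float, predicted_kcal: float) -> float:
    """Ratio of measured to predicted resting metabolic rate."""
    return measured_kcal / predicted_kcal


def suggests_suppression(ratio: float, cutoff: float = 0.90) -> bool:
    """An RMR ratio below ~0.90 is a commonly used (assumed here,
    not study-specific) flag for possible energy deficiency."""
    return ratio < cutoff


# Group means reported in the abstract for apprentice jockeys:
ratio = rmr_ratio(measured_kcal=1459, predicted_kcal=1520)
print(round(ratio, 2))  # ~0.96: measured RMR close to predicted
```

A ratio this close to 1.0 is consistent with the abstract's conclusion that measured RMR did not differ from predicted RMR in either group.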
George Wilson, Dan Martin, James P. Morton and Graeme L. Close
Graeme L. Close, Craig Sale, Keith Baar and Stephane Bermon
Injuries are an inevitable consequence of athletic performance, with most athletes sustaining one or more during their athletic careers. As many as one in 12 athletes incurs an injury during international competitions, many of which result in time lost from training and competition. Injuries to skeletal muscle account for over 40% of all injuries, with the lower leg being the predominant site of injury. Other common injuries include fractures, especially stress fractures in athletes with low energy availability, and injuries to tendons and ligaments, especially in athletes involved in high-impact sports, such as jumping. Given the high prevalence of injury, it is not surprising that there has been a great deal of interest in factors that may reduce the risk of injury or decrease the recovery time if an injury should occur; one of the main variables explored is nutrition. This review investigates the evidence around various nutrition strategies, including macro- and micronutrients, as well as total energy intake, to reduce the risk of injury and improve recovery time, focusing upon injuries to skeletal muscle, bone, tendons, and ligaments.
Louise M. Burke, Graeme L. Close, Bronwen Lundy, Martin Mooses, James P. Morton and Adam S. Tenforde
Low energy availability (LEA) is a key element of the Female Athlete Triad. Causes of LEA include failure of energy intake to match high exercise energy expenditure (unintentional), pathological behaviors of disordered eating (compulsive), and overzealous weight control programs (misguided but intentional). Recognition of such scenarios in male athletes contributed to the pronouncement of the more inclusive Relative Energy Deficiency in Sport (RED-S) syndrome. This commentary describes the insights and experience of the current group of authors around the apparently heightened risk of LEA in some populations of male athletes: road cyclists, rowers (lightweight and open weight), athletes in combat sports, distance runners, and jockeys. The frequency, duration, and magnitude of the LEA state appear to vary between populations. Common risk factors include cyclical management of challenging body mass and composition targets (including “making weight”) and the high energy cost of some training programs or events that is not easily matched by energy intake. However, additional factors such as food insecurity and lack of finances may also contribute to impaired nutrition in some populations. Collectively, these insights substantiate the concept of RED-S in male athletes and suggest that a specific understanding of a sport, subpopulation, or culture may identify a complex series of factors that can contribute to LEA and the type and severity of its outcomes. This commentary provides a perspective on the range of risk factors that should be addressed in future surveys of RED-S in athletic populations and targeted for specific investigation and modification.
Harry E. Routledge, Stuart Graham, Rocco Di Michele, Darren Burgess, Robert M. Erskine, Graeme L. Close and James P. Morton
The authors aimed to quantify (a) the periodization of physical loading and daily carbohydrate (CHO) intake across an in-season weekly microcycle of Australian Football and (b) the quantity and source of CHO consumed during game play and training. Physical loading (via global positioning system technology) and daily CHO intake (via a combination of 24-hr recall, food diaries, and remote food photographic method) were assessed in 42 professional male players during two weekly microcycles comprising a home and away fixture. The players also reported the source and quantity of CHO consumed during all games (n = 22 games) and on the training session completed 4 days before each game (n = 22 sessions). The total distance was greater (p < .05) on game day (GD; 13 km) versus all training days. The total distance differed between training days, where GD-2 (8 km) was higher than GD-1, GD-3, and GD-4 (3.5, 0, and 7 km, respectively). The daily CHO intake was also different between training days, with reported intakes of 1.8, 1.4, 2.5, and 4.5 g/kg body mass on GD-4, GD-3, GD-2, and GD-1, respectively. The CHO intake was greater (p < .05) during games (59 ± 19 g) compared with training (1 ± 1 g), where in the former, 75% of the CHO consumed was from fluids as opposed to gels. Although the data suggest that Australian Football players practice elements of CHO periodization, the low absolute CHO intakes likely represent considerable underreporting in this population. Even when accounting for potential underreporting, the data also suggest Australian Football players underconsume CHO in relation to the physical demands of training and competition.
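The daily carbohydrate intakes above are reported relative to body mass (g/kg), so converting between absolute and relative intake is a one-line calculation. The sketch below uses the reported group means; the 85 kg body mass and the helper name are illustrative assumptions, not values from the study.

```python
def cho_per_kg(cho_grams: float, body_mass_kg: float) -> float:
    """Carbohydrate intake relative to body mass (g/kg)."""
    return cho_grams / body_mass_kg


# Mean relative CHO intakes reported across the microcycle (g/kg body mass):
reported = {"GD-4": 1.8, "GD-3": 1.4, "GD-2": 2.5, "GD-1": 4.5}

# For a hypothetical 85 kg player, the GD-1 intake corresponds to an
# absolute amount of roughly 4.5 * 85 = 382.5 g of carbohydrate:
absolute_gd1 = reported["GD-1"] * 85
print(absolute_gd1)
```

Framing intakes per kilogram is what allows comparison against published guidelines, which scale fueling recommendations to body mass rather than prescribing absolute amounts.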
Andreas M. Kasper, S. Andy Sparks, Matthew Hooks, Matthew Skeer, Benjamin Webb, Houman Nia, James P. Morton and Graeme L. Close
Rugby is characterized by frequent high-intensity collisions, resulting in muscle soreness. Players consequently seek strategies to reduce soreness and accelerate recovery, with an emerging method being cannabidiol (CBD), despite anti-doping risks. The prevalence of, and rationale for, CBD use in rugby has not been explored; therefore, we recruited professional male players to complete a survey on CBD. Goodness-of-fit chi-square (χ²) was used to assess CBD use between codes and player positions. Effects of age on use were determined using χ² tests of independence. Twenty-five teams provided 517 player responses. While the majority of players had never used CBD (p < .001, V = 0.24), 26% had either used it previously (18%) or were still using it (8%). Significantly more CBD use was observed in rugby union compared with rugby league (p = .004, V = 0.13), but player position was not a factor (p = .760, V = 0.013). CBD use increased with players’ age (p < .001, V = 0.28), with mean use reaching 41% among players aged 28 years and older (p < .0001). The players using CBD primarily used the Internet (73%) or another teammate (61%) to obtain information, with only 16% consulting a nutritionist. The main reasons for CBD use were improving recovery/pain (80%) and sleep (78%), with 68% of players reporting a perceived benefit. These data highlight the need for immediate education on the risks of CBD, as well as the need to explore the claims regarding pain and sleep.
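The survey analysis above pairs goodness-of-fit χ² tests with Cramér's V as an effect size; for k response categories and n respondents, V = sqrt(χ² / (n·(k − 1))). A stdlib-only sketch of both calculations follows; the observed counts are made-up illustrative numbers, not the survey data, and no p-value is computed (that would require the χ² distribution).

```python
import math


def chi_square_gof(observed: list, expected: list) -> float:
    """Goodness-of-fit chi-square statistic: sum of (O - E)^2 / E."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))


def cramers_v(chi2: float, n: int, k: int) -> float:
    """Cramér's V effect size for a goodness-of-fit test
    with k categories and n observations."""
    return math.sqrt(chi2 / (n * (k - 1)))


# Hypothetical counts (never used / used previously / still using),
# tested against a uniform expectation across the three categories:
observed = [380, 95, 42]
n = sum(observed)
expected = [n / 3] * 3

chi2 = chi_square_gof(observed, expected)
print(round(chi2, 1), round(cramers_v(chi2, n, k=3), 2))
```

Reporting V alongside p, as the abstract does, separates statistical significance (sensitive to the 517-player sample size) from the practical size of the effect.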
Louise M. Burke, Linda M. Castell, Douglas J. Casa, Graeme L. Close, Ricardo J. S. Costa, Ben Desbrow, Shona L. Halson, Dana M. Lis, Anna K. Melin, Peter Peeling, Philo U. Saunders, Gary J. Slater, Jennifer Sygo, Oliver C. Witard, Stéphane Bermon and Trent Stellingwerff
The International Association of Athletics Federations recognizes the importance of nutritional practices in optimizing an Athlete’s well-being and performance. Although Athletics encompasses a diverse range of track-and-field events with different performance determinants, there are common goals around nutritional support for adaptation to training, optimal performance for key events, and reducing the risk of injury and illness. Periodized guidelines can be provided for the appropriate type, amount, and timing of intake of food and fluids to promote optimal health and performance across different scenarios of training and competition. Some Athletes are at risk of relative energy deficiency in sport arising from a mismatch between energy intake and exercise energy expenditure. Competition nutrition strategies may involve pre-event, within-event, and between-event eating to address requirements for carbohydrate and fluid replacement. Although a “food first” policy should underpin an Athlete’s nutrition plan, there may be occasions for the judicious use of medical supplements to address nutrient deficiencies or sports foods that help the Athlete to meet nutritional goals when it is impractical to eat food. Evidence-based supplements include caffeine, bicarbonate, beta-alanine, nitrate, and creatine; however, their value is specific to the characteristics of the event. Special considerations are needed for travel; challenging environments (e.g., heat and altitude); special populations (e.g., females, young and masters Athletes); and restricted dietary choice (e.g., vegetarian). Ideally, each Athlete should develop a personalized, periodized, and practical nutrition plan via collaboration with their coach and accredited sports nutrition experts to optimize their performance.