Louise M. Burke and Peter Peeling
Many expert sporting bodies now support a pragmatic acceptance of the use of performance supplements that have passed a risk:benefit analysis of being safe, effective, and permitted for use, while also being appropriate to the athlete's age and maturation in their sport. However, gaining evidence of the performance benefits of these supplements is a process challenged by the scarcity of research relative to the number of available products, and by the poor quality of some studies. While meta-analyses and systematic reviews can help to provide information about the general use of performance supplements, the controlled scientific trial provides the basis on which these reviews are undertaken, as well as an opportunity to address more specific questions about supplement applications. Guidelines for the design of studies include the choice of well-trained athletes who are familiarized with performance tasks that have been chosen on the basis of their known reliability and validity. Supplement protocols should be chosen to maximize the likely benefits, and researchers should also make efforts to control confounding variables while keeping conditions similar to real-life practices. Performance changes should be interpreted in light of what is meaningful to the outcomes of sporting competition. Issues that have been poorly addressed to date include the use of several supplements in combination and the use of the same supplement over successive events, both within a single competition day and across multiple competition days. Strategies to isolate and explain the variability of benefits to individuals are also a topic for future investigation.
Louise M. Burke and Inigo Mujika
Postexercise recovery is an important topic among aquatic athletes and involves interest in the quality, quantity, and timing of intake of food and fluids after workouts or competitive events to optimize processes such as refueling, rehydration, repair, and adaptation. Recovery processes that help to minimize the risk of illness and injury are also important but are less well documented. Recovery between workouts or competitive events may have two separate goals: (a) restoration of body losses and changes caused by the first session to restore performance for the next and (b) maximization of the adaptive responses to the stress provided by the session to gradually make the body become better at the features of exercise that are important for performance. In some cases, effective recovery occurs only when nutrients are supplied, and an early supply of nutrients may also be valuable in situations in which the period immediately after exercise provides an enhanced stimulus for recovery. This review summarizes contemporary knowledge of nutritional strategies to promote glycogen resynthesis, restoration of fluid balance, and protein synthesis after different types of exercise stimuli. It notes that some scenarios benefit from a proactive approach to recovery eating, whereas others may not need such attention. In fact, in some situations it may actually be beneficial to withhold nutritional support immediately after exercise. Each athlete should use a cost–benefit analysis of the approaches to recovery after different types of workouts or competitive events and then periodize different recovery strategies into their training or competition programs.
Louise M. Burke and Mary P. Miles
Reid Reale, Gary Slater and Louise M. Burke
Purpose: Combat sport athletes undertake chronic and rapid weight loss (RWL) practices to qualify for weight divisions lower than their training weight. Variation between sports in the prevalence, methods, and magnitude of weight loss, as well as recovery practices, may be influenced by factors including competition level and culture. Differences in the methodologies of previous research in combat sports make direct comparisons difficult; thus, this study aimed to examine weight loss practices among all Olympic combat sports in Australia, using standardized methodology. Methods: High-caliber competitors in wrestling, boxing, judo, and taekwondo (n = 260) at Australian competitions were surveyed using a validated tool that quantifies how extreme an athlete's weight loss practices are: the rapid weight loss score (RWLS). Additional qualitative and quantitative survey data were also collected. Results: Neither sport, sex, nor weight division group had an effect on RWLS; however, a significant effect of athlete caliber was detected (F(2,215) = 4.953, mean square error = 4.757, P = .00792). Differences between sports were also evident for most weight ever lost in order to compete (H = 19.92, P = .0002), age at which weight cutting began (H = 16.34, P = .001), and selected methods/patterns of RWL (P < .001). Weight cycling between competitions was common across all sports, as were reported influences on athletes' behaviors. Conclusions: Although many similarities in weight loss practices and experiences exist between combat sports, specific differences were evident. Nuanced, context/culturally specific guidelines should be devised to assist fighters in optimizing performance while minimizing health implications.
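The between-sport comparisons above (H statistics) reflect Kruskal-Wallis tests on outcomes such as most weight ever lost. A minimal sketch of this analysis, using hypothetical illustrative values rather than the study's data:

```python
from scipy import stats

# Hypothetical "most weight ever lost to compete" (kg) samples for three
# combat sports; these numbers are illustrative, not the study's data.
wrestling = [6.0, 7.5, 5.0, 8.0, 6.5, 7.0]
boxing = [3.0, 4.0, 3.5, 5.0, 2.5, 3.0]
judo = [5.0, 4.5, 6.0, 5.5, 7.0, 5.0]

# Kruskal-Wallis H test: a rank-based comparison of more than two groups,
# appropriate for ordinal or non-normally distributed survey outcomes.
h_stat, p_value = stats.kruskal(wrestling, boxing, judo)
print(f"H = {h_stat:.2f}, p = {p_value:.4f}")
```

A significant H would then typically be followed by pairwise post hoc comparisons to locate which sports differ.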
Nikki A. Jeacocke and Louise M. Burke
When testing is undertaken to monitor an athlete’s progress toward competition goals or the effect of an intervention on athletic outcomes, sport scientists should aim to minimize extraneous variables that influence the reliability, sensitivity, or validity of performance measurement. Dietary preparation is known to influence metabolism and exercise performance. Few studies, however, systematically investigate the outcomes of protocols that acutely control or standardize dietary intake in the hours and days before a performance trial. This review discusses the nutrients and dietary components that should be standardized before performance testing and reviews current approaches to achieving this. The replication of habitual diet or dietary practices, using tools such as food diaries or dietary recalls to aid compliance and monitoring, is a common strategy, and the use of education aids to help athletes achieve dietary targets offers a similarly low burden on the researcher. However, examination of dietary intake from real-life examples of these protocols reveals large variability between and within participants. Providing participants with prepackaged diets reduces this variability but can increase the burden on participants, as well as the researcher. Until studies can better quantify the effect of different protocols of dietary standardization on performance testing, sport scientists can only use a crude cost–benefit analysis to choose the protocols they implement. At the least, study reports should provide a more comprehensive description of the dietary-standardization protocols used in the research and the effect of these on the dietary intake of participants during the period of interest.
Gregory Shaw, Gary Slater and Louise M. Burke
Thirty-nine elite Australian swimmers (13 AIS, 26 OTHER) completed a standardized questionnaire regarding their supplement use during a precompetition camp. The data were compared with a similar study conducted 11 years earlier (11 AIS, 23 OTHER) and framed around the classification system of the Sport Supplement Program of the Australian Institute of Sport. The prevalence of supplement use remained constant over time (2009: 97%, 1998: 100%). However, the current swimmers used a greater number of dietary supplements (9.2 ± 3.7 and 5.9 ± 2.9; p = .001), accounted for by an increase in the reported use of supplements with a greater evidence base (Sports Foods, Ergogenics, and Group B supplements). In contrast, fewer supplements considered less reputable (Group C and D) were reported by the 2009 cohort (0.7 ± 1.0 and 1.6 ± 1.3; p = .003). AIS swimmers reported a greater use of Ergogenics (4.3 ± 1.8 and 3.1 ± 1.7; p = .002) and less use of Group C and D supplements overall (0.8 ± 1.2 and 1.3 ± 1.2; p = .012), which was explained primarily by a smaller number of these supplements reported by the 2009 group (1998 AIS: 1.5 ± 1.4, 2009 AIS: 0.2 ± 0.6; p = .004). Although the prevalence of supplement use has not changed over time, there has been a significant increase in the number and type of products swimmers are using. The potential that these changes can be attributed to a Sports Supplement Program merits investigation.
Reid Reale, Gary Slater and Louise M. Burke
It is common for athletes in weight-category sports to try to gain a theoretical advantage by competing in weight divisions that are lower than their day-to-day body mass (BM). Weight loss is achieved not only through chronic strategies (body-fat losses) but also through acute manipulations before weigh-in ("making weight"). Both have performance implications. This review focuses on Olympic combat sports, noting that the varied nature of regulations surrounding the weigh-in procedures, weight requirements, and recovery opportunities in these sports provides an opportunity for a wider discussion of factors that can be applied to other weight-category sports. The authors summarize previous literature that has examined the performance effects of weight-making practices before investigating the physiological nature of these BM losses. Practical recommendations in the form of a decision tree are provided to guide the achievement of acute BM loss while minimizing performance decrements.
Gregory Shaw, Gary Slater and Louise M. Burke
This study examined the influence the Australian Institute of Sport (AIS) Sport Supplement Program had on the supplement practices of elite Australian swimmers, comparing those guided by the Program with others in the same national team. Thirty-nine elite swimmers (13 AIS, 26 Other; 20 female, 19 male; age 21.8 ± 3.3 y) completed a questionnaire investigating supplement use. Ninety-seven percent of swimmers reported taking supplements or sports foods over the preceding 12 months. AIS swimmers reported using more total brands (p = .02) and supplements considered Ergogenic (p = .001) than Other swimmers, who used more supplements considered to be lacking scientific support (p = .028). Swimmers rated the risk of a negative outcome from the use of supplements available in Australia (Mdn = 3.0) as less than the risk of supplements from international sources (Mdn = 4.0; p < .001). AIS swimmers were more likely to report dietitians (p < .001) and sports physicians (p = .017) as advisors of their supplement use. Other swimmers more frequently reported fellow athletes as a source of supplement advice (p = .03). AIS swimmers sourced a greater percentage of their supplements from an organized program (94 ± 16%) compared with Other (40 ± 32%; p < .001), who sourced a greater percentage (30 ± 30%) of their dietary supplements from supermarkets. These findings suggest that swimmers influenced by this sport supplement program more frequently use supplements that are recommended by allied health-trained practitioners, classified as evidence based, and provided by the program.
Julia L. Bone and Louise M. Burke
Low energy availability can place athletes at increased risk of injury and illness and can be detected by a lower metabolic rate. The lowest metabolic rate is captured at the bedside, after an overnight fast, and is termed inpatient resting energy expenditure (REE). Measurements done in a laboratory with a shorter overnight fast are termed outpatient REE. Although it is important to know the lowest energy expenditure, a bedside measure and/or 12-hr fast is not always practical or logistically possible, particularly when an athlete's training schedule is taken into account. The aim of this investigation was to compare a bedside measure of resting metabolism with a laboratory measure in athletes following an 8-hr fast. Thirty-two athletes (24 females and eight males) underwent measures of resting metabolism using indirect calorimetry, once at their bedside (inpatient) and once in a simulated laboratory setting (outpatient). A paired t test was used to compare the mean ± SD differences between the two protocols. Inpatient REE was 7,302 ± 1,272 kJ/day and outpatient REE was 7,216 ± 1,116 kJ/day (p = .448). Thirteen participants repeated the outpatient protocol and 17 repeated the inpatient protocol to assess the day-to-day variation. Reliability was assessed using the intraclass correlation coefficient and typical error. For the inpatient protocol, the intraclass correlation coefficient was 96%, with a typical error of 336.2 kJ/day. For the outpatient protocol, the intraclass correlation coefficient and typical error were 87% and 477.6 kJ/day, respectively. Results indicate no difference in REE when measured under inpatient and outpatient conditions; however, the inpatient protocol has greater reliability.
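The reliability measures reported above (intraclass correlation coefficient and typical error) can be computed from repeated trials. A minimal sketch using hypothetical test-retest REE values (not the study's data), with a one-way ICC; the study does not state which ICC model was used, so this is an assumption:

```python
import numpy as np

# Hypothetical test-retest REE measurements (kJ/day) for six athletes;
# illustrative values only, not the study's data.
trial1 = np.array([7100.0, 6900.0, 7500.0, 8200.0, 6800.0, 7300.0])
trial2 = np.array([7250.0, 6800.0, 7400.0, 8350.0, 7000.0, 7200.0])

# Typical error of measurement: SD of the between-trial differences / sqrt(2).
diffs = trial2 - trial1
typical_error = np.std(diffs, ddof=1) / np.sqrt(2)

# One-way random-effects ICC(1,1) from a subjects x trials array:
# (MS_between - MS_within) / (MS_between + (k - 1) * MS_within).
data = np.stack([trial1, trial2], axis=1)
n, k = data.shape
subject_means = data.mean(axis=1)
grand_mean = data.mean()
ms_between = k * np.sum((subject_means - grand_mean) ** 2) / (n - 1)
ms_within = np.sum((data - subject_means[:, None]) ** 2) / (n * (k - 1))
icc = (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

print(f"typical error = {typical_error:.1f} kJ/day, ICC = {icc:.2f}")
```

With REE data, between-athlete variation is large relative to trial-to-trial noise, which is why ICCs near 0.9 and above, as in the study, are common.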