Search Results
You are looking at 1 - 9 of 9 items for
- Author: Katherine Black
- Refine by Access: All Content
Macronutrient Intakes of Male Rugby Union Players: A Review
Katherine Elizabeth Black, Alistair David Black, and Dane Frances Baker
Rugby union is an intermittent team sport played worldwide. Players tend to be heavier than the majority of the team-sport athletes on whom current dietary guidelines have been developed. The aim of this review was therefore to describe the dietary intakes of male rugby union players. Article databases were searched up to February 2017, and articles were included if they were published in English and reported the dietary intakes of male rugby union players. Across the studies identified, energy intakes were lower than energy expenditure in two of the three studies reporting both, which would suggest the players were losing weight; this is somewhat supported by the decreases in skinfolds observed during preseason. However, it should also be noted that there are errors in the measurement of both energy intake and energy expenditure. Carbohydrate intakes ranged from 2.6 to 6.5 g·kg−1·day−1, which is lower than the current body-mass-relative recommendations, although this would not be classed as a low-carbohydrate diet. The consistently low carbohydrate intakes suggest either that these intake levels may be sufficient for performance, given the players' greater body mass, or that there are errors in the measurements. However, there is currently no evidence on the carbohydrate needs of rugby union players with respect to performance. The intakes below expenditure would suggest the players were losing weight, and previous research shows that rugby union players lose body fat during preseason training.
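To put the body-mass-relative carbohydrate range reported above into absolute terms, the short sketch below multiplies hypothetical player masses by the 2.6 and 6.5 g·kg−1·day−1 bounds from the review; the body masses are illustrative assumptions, not values from the included studies.

```python
# Illustrative only: converts body-mass-relative carbohydrate intakes into
# absolute amounts, to show why the 2.6-6.5 g/kg/day range reported above
# translates into large absolute intakes for heavier rugby union players.
# The body masses below are hypothetical examples, not values from the review.

def absolute_cho(body_mass_kg: float, cho_g_per_kg: float) -> float:
    """Absolute carbohydrate intake (g/day) from a relative intake (g/kg/day)."""
    return body_mass_kg * cho_g_per_kg

for mass in (70, 100, 120):          # e.g. endurance athlete, back, forward
    low = absolute_cho(mass, 2.6)    # lower bound reported in the review
    high = absolute_cho(mass, 6.5)   # upper bound reported in the review
    print(f"{mass} kg player: {low:.0f}-{high:.0f} g carbohydrate per day")
```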
Female Recreational Exercisers at Risk for Low Energy Availability
Joanne Slater, Rebecca McLay-Cooke, Rachel Brown, and Katherine Black
Low energy availability (LEA) describes the disruption of normal physiological function that occurs when insufficient energy intake is combined with exercise. To conserve energy, a range of endocrine adaptations occur, impairing health and athletic performance. The prevalence of LEA has not been fully established, especially among recreational exercisers. Identifying recreational exercisers at risk of LEA may help to maximize prevention, early diagnosis, and treatment. The design of this study was a cross-sectional online survey. One hundred and nine female recreational exercisers, with a mean age of 23.8 (SD 6.9) years, were recruited via gyms and fitness centers throughout New Zealand. Participants completed an online questionnaire including questions from the LEAF-Q (Low Energy Availability in Females Questionnaire). A total of 45.0% (CI 35.4%, 54.8%) of participants were classified as "at risk" of LEA. For every extra hour of exercise per week, the odds of being at risk of LEA were 1.13 times greater (CI 1.02, 1.25, p = .016). All participants reporting previous stress fracture injuries (n = 4) were classified as at risk of LEA. Significantly more participants competing in an individual sport were classified as at risk of LEA (69.6%, CI 24.3%, 54.8%) than those in team sports (34.8%, CI 18.7%, 40.5%) (p = .006). The high prevalence of female recreational exercisers at risk of LEA is of concern, emphasizing the importance of increasing awareness of the issue and promoting prevention and early detection strategies, so that treatment can be implemented before health is severely compromised.
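The LEAF-Q screens for risk rather than measuring energy availability directly, but the underlying quantity is conventionally calculated as energy intake minus exercise energy expenditure, divided by fat-free mass. A minimal sketch of that calculation follows; all values, and the commonly cited 30 kcal·kg FFM−1·day−1 threshold, are assumptions for illustration rather than data from this study.

```python
# Minimal sketch of the energy availability (EA) calculation underlying the
# concept of LEA screened for above. All numbers are hypothetical examples;
# the 30 kcal/kg FFM/day threshold is a commonly cited cut-off, not a study value.

def energy_availability(intake_kcal: float,
                        exercise_expenditure_kcal: float,
                        fat_free_mass_kg: float) -> float:
    """EA in kcal per kg fat-free mass per day."""
    return (intake_kcal - exercise_expenditure_kcal) / fat_free_mass_kg

ea = energy_availability(intake_kcal=1800,          # hypothetical daily intake
                         exercise_expenditure_kcal=700,  # hypothetical exercise cost
                         fat_free_mass_kg=48)            # hypothetical fat-free mass
print(f"EA = {ea:.1f} kcal/kg FFM/day ->",
      "below" if ea < 30 else "at or above",
      "the commonly cited 30 kcal/kg FFM/day threshold")
```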
Energy Intakes of Ultraendurance Cyclists During Competition, an Observational Study
Katherine E. Black, Paula M.L. Skidmore, and Rachel C. Brown
Endurance events >10 hr are becoming increasingly popular but provide numerous physiological challenges, several of which can be attenuated with optimal nutritional intakes. Previous studies in ultraendurance races have reported large energy deficits during events. The authors therefore aimed to assess nutritional intakes in relation to performance among ultraendurance cyclists. This observational study included 18 cyclists in a 384-km cycle race. At race registration each cyclist’s support crew was provided with a food diary for their cyclist. On completion of the race, cyclists were asked to recall their race food and drink intakes. All food and fluids were analyzed using a computer software package. Mean (SD) time to complete the race was 16 hr 21 min (2 hr 2 min). Mean (SD) energy intake was 18.7 (8.6) MJ, compared with an estimated energy requirement for the race of 25.5 (7.4) MJ. There was a significant negative relationship between energy intake and time taken to complete the race (p = .023, r² = −.283). Mean (SD) carbohydrate, fat, and protein intakes were 52 (27), 15.84 (56.43), and 2.94 (7.25) g/hr, respectively. Only carbohydrate (p = .015, r² = −.563) and fat intake (p = .037, r² = −.494) were associated with time taken to complete the race. This study demonstrates the difficulties in meeting the high energy demands of ultraendurance cycling. The relationship between energy intake and performance suggests that reducing the energy deficit may be advantageous. Given the high carbohydrate intakes of these athletes, increasing energy intake from fat should be investigated as a means of decreasing energy deficits.
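As a rough check on the scale of the deficit reported above, the sketch below converts the mean intake and estimated requirement from MJ to kcal and spreads the shortfall over the mean finishing time; only the means quoted in the abstract are used, and the unit conversion is standard.

```python
# Back-of-envelope arithmetic for the mean energy deficit reported above.
# Uses only the means from the abstract (18.7 MJ intake, 25.5 MJ estimated
# requirement, 16 h 21 min mean race time); the kJ-to-kcal factor is standard.

KCAL_PER_MJ = 1000 / 4.184          # ~239 kcal per MJ

intake_mj, requirement_mj = 18.7, 25.5
race_hours = 16 + 21 / 60           # mean finishing time in hours

deficit_mj = requirement_mj - intake_mj
print(f"Mean deficit: {deficit_mj:.1f} MJ "
      f"(~{deficit_mj * KCAL_PER_MJ:.0f} kcal), "
      f"or ~{deficit_mj * KCAL_PER_MJ / race_hours:.0f} kcal per racing hour")
```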
Case Study: Nutritional Strategies of a Cyclist With Celiac Disease During an Ultraendurance Race
Katherine Elizabeth Black, Paula Skidmore, and Rachel Clare Brown
Food intolerance is becoming increasingly prevalent, and increasing numbers of athletes have celiac disease. This poses challenges, as dietary recommendations for exercise are largely based on gluten-containing carbohydrate-rich foods. The K4 cycle race covers 384 km around the Coromandel Peninsula, New Zealand. Lack of sleep, darkness, and temperature variations pose a number of nutritional challenges. Limited food choices present those with celiac disease with even greater challenges. This case study describes the intakes of one such athlete while training for and competing in the K4. Nutritional intakes were obtained during training using weighed-food records and during the race via dietary recall and the weighing of foods pre- and postrace. As simple substitution of gluten-containing foods with gluten-free foods leads to increased energy intake, alternatives need to be considered. During the race, insufficient energy was consumed to meet the nutritional guidelines for endurance performance. This was probably due to the nature of the course, racing conditions, the consistency of gluten-free food, and, toward the end of the race, sensory-specific satiety.
Fluid and Sodium Balance of Elite Wheelchair Rugby Players
Katherine Elizabeth Black, Jody Huxford, Tracy Perry, and Rachel Clare Brown
Blood sodium concentration of tetraplegics during exercise has not been investigated. This study aimed to measure changes in blood sodium in relation to fluid intakes and thermal comfort in tetraplegics during wheelchair rugby training. Twelve international male wheelchair rugby players volunteered, and measures were taken during 2 training sessions. Body mass, blood sodium concentration, and subjective thermal comfort on a 10-point scale were recorded before and after both training sessions. Fluid intake and the distance covered were measured during both sessions. The mean (SD) percentage changes in body mass during the morning and afternoon training sessions were +0.41% (0.65%) and +0.69% (1.24%), respectively. There was a tendency for fluid-intake rate to be correlated with the percentage change in blood sodium concentration (p = .072, r² = .642) during the morning training session; this correlation reached significance during the afternoon session (p = .004, r² = .717). Fluid intake was significantly correlated with the change in thermal comfort in the morning session (p = .018, r² = .533), with this correlation showing a tendency in the afternoon session (p = .066, r² = .151). This is the first study to investigate blood sodium concentrations in a group of tetraplegics. Over the day, blood sodium concentrations declined significantly; 2 players recorded concentrations of 135 mmol/L, and 5 recorded concentrations of 136 mmol/L. Excessive fluid intake as a means of attenuating thermal discomfort appears to be the primary cause of low blood sodium concentrations in tetraplegic athletes. Findings from this study could aid in the design of fluid-intake strategies for tetraplegics.
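Percentage change in body mass from pre- to post-session weighing is the core fluid-balance calculation used in studies such as this one; the sketch below shows the arithmetic with hypothetical values rather than data from the players described above.

```python
# Sketch of the pre-to-post body-mass change calculation used in fluid-balance
# studies such as the one above. The masses and intake are hypothetical
# examples, not data from the study.

def percent_body_mass_change(pre_kg: float, post_kg: float) -> float:
    """Percentage change in body mass across a training session."""
    return (post_kg - pre_kg) / pre_kg * 100

pre, post, fluid_intake_l = 68.0, 68.4, 1.2    # hypothetical session values
change = percent_body_mass_change(pre, post)
print(f"Body mass change: {change:+.2f}%")     # a gain suggests intake exceeded losses
print(f"Fluid intake: {fluid_intake_l:.1f} L over the session")
```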
Hepcidin as a Prospective Individualized Biomarker for Individuals at Risk of Low Energy Availability
Claire E. Badenhorst, Katherine E. Black, and Wendy J. O’Brien
Hepcidin, a peptide hormone with an acknowledged evolutionary function in iron homeostasis, was discovered at the turn of the 21st century. Since then, increased hepcidin activity has been investigated as a potential contributor to the increased risk of iron deficiency in various health settings. Such implications are particularly relevant in the sporting community, where peaks in hepcidin postexercise (∼3–6 hr) are suggested to reduce iron absorption and recycling and to contribute to the development of exercise-induced iron deficiency in athletes. Over the last decade, hepcidin research in sport has focused on acute and chronic hepcidin activity following single and repeated training blocks. This research has led to investigations examining possible methods of attenuating postexercise hepcidin expression through dietary interventions. The majority of macronutrient dietary interventions have focused on manipulating the carbohydrate content of the diet in an attempt to determine the health of athletes adopting low-carbohydrate or ketogenic diets, a practice that is a growing trend among endurance athletes. In the course of these macronutrient intervention studies, an association between increased cumulative hepcidin activity and low energy availability has emerged. Therefore, this review aims to summarize the existing literature on nutritional interventions and hepcidin activity, highlighting the link between hepcidin and energy availability, while also making a case for the use of hepcidin as an individualized biomarker for low energy availability in males and females.
A 0.1% L-Menthol Mouth Swill in Elite Male Rugby Players Has Different Effects in Forwards and Backs
Marcia L. Jerram, Dane Baker, Tiaki B. Smith, Phil Healey, Lee Taylor, and Katherine Black
Purpose: Menthol mouth swills can improve endurance performance in the heat, which is attributed to attenuations in nonthermally derived thermal sensation (TS) and perception of effort. However, research in elite team-sport athletes is absent. Therefore, this study investigated the performance and TS responses to a 0.1% menthol mouth rinse (MR) or placebo (PLA) among elite male rugby union players. Method: Twenty-seven (15 Forwards and 12 Backs) elite male Super Rugby players completed two 3-minute 15-a-side rugby-specific conditioning blocks, with MR or PLA provided at the start of training (baseline), at the start of each 3-minute block (swill 1 [S1] and swill 2 [S2]), and at the end of training (swill 3 [S3]). TS was assessed using the American Society of Heating, Refrigerating and Air-Conditioning Engineers 9-point Analog Sensation Scale after each swill and at baseline (preconditioning block). Acceptability was measured after baseline swill and S3 using a 5-question Likert scale. Physical performance was measured throughout training using global positioning system metrics. Results: MR attenuated TS from baseline to S1 (P = .003, SD = 1.01) and S2 (P = .002, SD = 1.09) in Forwards only, compared with PLA. Acceptability was higher only for Forwards in MR versus PLA at baseline (P = .003, SD = 1.3) and S3 (P = .004, SD = 0.75). MR had no effect on physical performance metrics (P > .05). Conclusion: MR attenuated the rise in TS with higher acceptability at S1 and S3 (in Forwards only) with no effect on selected physical performance metrics. Longer-duration exercise (eg, a match) in hot–humid conditions eliciting markedly increased body temperatures could theoretically allow favorable changes in TS to enhance performance—these postulations warrant experimental investigation.
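The abstract does not describe how the 0.1% swill was prepared, but a 0.1% (w/v) solution is a simple concentration calculation; the sketch below shows the arithmetic for a hypothetical batch volume.

```python
# Illustrative sketch of the concentration arithmetic for a 0.1% (w/v) L-menthol
# mouth swill like the one described above. The batch volume is a hypothetical
# example; preparation details are not given in the abstract.

def menthol_grams(volume_ml: float, concentration_percent_wv: float = 0.1) -> float:
    """Grams of L-menthol needed for a given volume at a % w/v concentration."""
    return volume_ml * concentration_percent_wv / 100

batch_ml = 1000                        # hypothetical 1 L batch
print(f"{menthol_grams(batch_ml):.1f} g L-menthol in {batch_ml} mL "
      f"gives a 0.1% w/v swill")
```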
Youth Sport Parenting Styles and Practices
Nicholas L. Holt, Katherine A. Tamminen, Danielle E. Black, James L. Mandigo, and Kenneth R. Fox
The purpose of this study was to examine parenting styles and associated parenting practices in youth sport. Following a season-long period of fieldwork, primary data were collected via interviews with 56 parents and supplemented by interviews with 34 of their female children. Data analysis was guided by Grolnick's (2003) theory of parenting styles. Analyses produced five findings: (1) Autonomy-supportive parents provided appropriate structure for their children and allowed them to be involved in decision making. These parents were also able to read their children's mood and reported open bidirectional communication. (2) Controlling parents did not support their children's autonomy, were not sensitive to their children's mood, and tended to report more closed modes of communication. (3) In some families, there were inconsistencies between the styles employed by the mother and father. (4) Some parenting practices varied across different situations. (5) Children had some reciprocal influences on their parents' behaviors. These findings reveal information about the multiple social interactions associated with youth sport parenting.
Levels of Social Complexity and Dimensions of Peer Experiences in Youth Sport
Nicholas L. Holt, Danielle E. Black, Katherine A. Tamminen, Kenneth R. Fox, and James L. Mandigo
We assessed young adolescent female soccer players’ perceptions of their peer group experiences. Data were collected via interviews with 34 girls from two youth soccer teams (M age = 13.0 years). Following inductive discovery analysis, data were subjected to an interpretive theoretical analysis guided by a model of peer experiences (Rubin, Bukowski, & Parker, 2006). Five categories of peer experiences were identified across three levels of social complexity. At the interaction level players integrated new members into the team and learned to interact with different types of people. At the relationship level players learned about managing peer conflict. At the group level a structure of leadership emerged and players learned to work together. Findings demonstrated interfaces between peer interactions, relationships, and group processes while also simplifying some apparently complex systems that characterized peer experiences on the teams studied.