Browse

You are looking at 141–150 of 5,393 items for:

  • Sport and Exercise Science/Kinesiology
  • Physical Education and Coaching
  • All content

Cédric Leduc, Julien Robineau, Jason C. Tee, Jeremy Cheradame, Ben Jones, Julien Piscione, and Mathieu Lacome

Purpose: To explore the effects of travel related to international rugby sevens competition on sleep patterns. Methods: A total of 17 international male rugby sevens players participated in this study. Actigraphic and subjective sleep assessments were performed daily during 2 separate Sevens World Series competition legs (Oceania and America). The duration of each competition leg was subdivided into key periods (pretour, precompetition, tournament 1, relocation, tournament 2, and posttour) lasting 2 to 7 nights. Linear mixed models in combination with magnitude-based decisions were used to assess (1) the difference between preseason and key periods and (2) the effect of travel direction (eastward or westward). Results: Shorter total sleep time (hours:minutes) was observed during tournament 2 (mean [SD], 06:16 [01:08]), relocation (06:09 [01:09]), and the pretour week (06:34 [01:24]) compared with the preseason (06:52 [01:00]). Worse sleep quality (arbitrary units) was observed during tournaments 1 (6.1 [2.0]) and 2 (5.7 [1.2]) and during the relocation week (6.3 [1.5]) than during the preseason (6.5 [1.8]). When traveling eastward compared with westward, earlier fall-asleep time was observed during tournament 1 (ES −0.57; 90% CI, −1.12 to −0.01), the relocation week (−0.70 [−1.11 to −0.28]), and the posttour (−0.57 [−0.95 to −0.18]). However, possibly trivial and unclear differences were observed during the precompetition week (0.15 [−0.15 to 0.45]) and tournament 2 (0.81 [−0.29 to 1.91]). Conclusion: The sleep patterns of elite rugby sevens players are robust to the effects of long-haul travel and jet lag. However, support staff should consider promoting sleep during tournament and relocation weeks.
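
For readers unfamiliar with the effect-size statistics reported above, a standardized mean difference (Cohen's d) with a 90% CI can be approximated as in the sketch below. This is a generic illustration only: the pooled-SD formula and normal-approximation CI are one common approach, not the authors' magnitude-based decision framework, and every number is invented rather than taken from the study.

```python
# Cohen's d (pooled SD) with an approximate 90% CI for a between-condition
# difference, e.g., fall-asleep time eastward vs westward.
# All values below are invented for illustration; they are not study data.
import math

def cohens_d_ci(mean1, sd1, n1, mean2, sd2, n2, z=1.645):  # z = 1.645 -> 90% CI
    # Pooled standard deviation of the two groups
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (mean1 - mean2) / sp
    # Large-sample standard error of d (Hedges & Olkin approximation)
    se = math.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, d - z * se, d + z * se

# Hypothetical fall-asleep times in decimal hours (eastward vs westward)
print(cohens_d_ci(22.4, 0.8, 9, 22.9, 0.9, 8))
```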


Leanne K. Elliott, Jonathan A. Weiss, and Meghann Lloyd

Early motor skill interventions have been shown to improve the motor skill proficiency of children with autism spectrum disorder; however, little is known about the secondary effects associated with these types of interventions (e.g., influence on behavior, social skills, family dynamics). The purpose of this qualitative study was to (a) investigate parents’ perceptions of the child-level benefits associated with a fundamental motor skill intervention for their 4-year-olds with autism spectrum disorder and (b) explore how child-level benefits influenced the family unit. Eight parents (N = 8) completed semistructured interviews about their experiences with the intervention for their child(ren); the study was grounded in phenomenology. Five main child-level benefits emerged, including improvements in (a) motor skills, (b) social skills, (c) listening skills, (d) turn-taking skills, and (e) transition skills. The child-level benefits then extended to family members in a number of ways (e.g., more positive sibling interactions). These findings highlight several important secondary effects that should be investigated in future research.


Wing-Chun V. Yeung, Chris Bishop, Anthony N. Turner, and Sean J. Maloney

Purpose: Previously, it has been shown that loaded warm-up (LWU) can improve change-of-direction speed (CODS) in professional badminton players. However, the effect of asymmetry on CODS in badminton players and the influence of LWU on asymmetry have not been examined. Methods: A total of 21 amateur badminton players (age 29.5 [8.4] y, playing experience 8.4 [4.2] y) completed 2 trials. In the first, they performed a control warm-up. In the second, they performed the same warm-up but with 3 exercises loaded with a weight vest (LWU). Following both warm-ups, players completed single-leg countermovement jump and badminton-specific CODS tests. Results: No significant differences between the control warm-up and LWU were observed for CODS, single-leg countermovement jump, or single-leg countermovement jump asymmetry. However, small effect sizes suggested faster CODS (mean difference: −5%; d = −0.32) and lower asymmetries (mean difference: −3%; d = −0.39) following LWU. Five players (24%) experienced CODS improvements greater than the minimum detectable change, while 2 (10%) responded negatively. Asymmetry was not correlated with CODS following the control warm-up (ρ = .079; P = .733) but was negatively associated with CODS after LWU (ρ = −.491; P = .035). Conclusion: LWU may be a strategy worth trialing on an individual basis, but generic recommendations should not be applied.
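
As an illustration of how inter-limb asymmetry and its relationship with CODS are commonly quantified, the sketch below computes a percentage asymmetry from single-leg jump heights and its Spearman correlation with CODS times. The formula, 100 × (stronger − weaker) / stronger, is one widely used convention and not necessarily the authors' exact calculation; all data are invented.

```python
# Inter-limb asymmetry for single-leg countermovement jump (SLCMJ) heights and
# its Spearman correlation with change-of-direction speed (CODS) times.
# Formula choice and data are illustrative; they are not taken from the study.
import numpy as np
from scipy.stats import spearmanr

slcmj_right = np.array([18.2, 20.1, 16.5, 22.3, 19.0])  # cm, invented
slcmj_left = np.array([16.9, 19.5, 14.8, 21.9, 16.4])   # cm, invented
cods_time = np.array([2.95, 2.81, 3.10, 2.70, 3.02])    # s, invented

stronger = np.maximum(slcmj_right, slcmj_left)
weaker = np.minimum(slcmj_right, slcmj_left)
asymmetry_pct = 100 * (stronger - weaker) / stronger     # % asymmetry per player

rho, p = spearmanr(asymmetry_pct, cods_time)
print(f"asymmetry %: {np.round(asymmetry_pct, 1)}, rho={rho:.2f}, p={p:.3f}")
```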


Thomas Mullen, Craig Twist, Matthew Daniels, Nicholas Dobbin, and Jamie Highton

Purpose: To identify the association of several contextual match factors, technical performance, and external movement demands with the subjective task load of elite rugby league players. Methods: Individual subjective task load, quantified using the National Aeronautics and Space Administration Task Load Index (NASA-TLX), was collected from 29 professional rugby league players from one club competing in the European Super League throughout the 2017 season. The sample consisted of 26 matches (441 individual data points). Linear mixed modeling was used to assess how combinations of contextual factors, technical performance, and movement demands were associated with subjective task load. Results: A greater number of tackles (effect size correlation ± 90% confidence intervals; η² = .18 ± .11), errors (η² = .15 ± .08), and decelerations (η² = .12 ± .08), increased sprint distance (η² = .13 ± .08), losing matches (η² = .36 ± .08), and increased perception of effort (η² = .27 ± .08) led to most likely–very likely increases in subjective total task load. The independent variables included in the final model for subjective mental demand (match outcome, time played, and number of accelerations) showed unclear associations, except for a likely small correlation with technical errors (η² = .10 ± .08). Conclusions: These data provide a greater understanding of subjective task load and its association with several contextual factors, technical performance, and external movement demands during rugby league competition. Practitioners could use this detailed quantification of internal loads to inform recovery sessions and current training practices.
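
For readers unfamiliar with the instrument, the NASA-TLX total workload is conventionally derived from six subscale ratings (mental, physical, and temporal demand, performance, effort, frustration), either weighted by pairwise comparisons or simply averaged ("raw TLX"). The sketch below shows the raw-TLX calculation with made-up ratings; it does not reproduce the scoring used in the study.

```python
# Raw (unweighted) NASA-TLX: the mean of the six subscale ratings (0-100).
# Ratings below are invented for illustration, not data from the study.
subscales = {
    "mental_demand": 55,
    "physical_demand": 80,
    "temporal_demand": 60,
    "performance": 40,
    "effort": 75,
    "frustration": 35,
}

raw_tlx = sum(subscales.values()) / len(subscales)
print(f"Raw TLX total workload: {raw_tlx:.1f}")
```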


Ricardo Augusto Silva de Souza, André Guedes da Silva, Magda Ferreira de Souza, Liliana Kataryne Ferreira Souza, Hamilton Roschel, Sandro Fernandes da Silva, and Bryan Saunders

CrossFit® is a high-intensity functional training method consisting of daily workouts called “workouts of the day.” No nutritional recommendations supported by scientific evidence exist for CrossFit® regarding the energetic demands of this type of activity or dietary and supplement interventions. This systematic review, performed in accordance with PRISMA guidelines, aimed to identify studies that determined (a) the physiological and metabolic demands of CrossFit® and (b) the effects of nutritional strategies on CrossFit® performance, in order to guide nutritional recommendations for optimal recovery, adaptations, and performance for CrossFit® athletes and to direct future research in this emerging area. Three databases were searched for studies that investigated physiological responses to CrossFit® and dietary or supplementation interventions on CrossFit® performance. Various physiological measures revealed the intense nature of all CrossFit® workouts of the day, reflected in substantial muscle fatigue and damage. Dietary and supplementation studies provided unclear insight into effective strategies to improve performance and enhance adaptations and recovery, owing to methodological shortcomings across studies. This systematic review showed that CrossFit® is a high-intensity sport with fairly homogeneous anaerobic and aerobic characteristics, resulting in substantial metabolic stress, leading to metabolite accumulation (e.g., lactate and hydrogen ions) and increased markers of muscle damage and muscle fatigue. Limited interventional data exist on dietary and supplementation strategies to optimize CrossFit® performance, and most are of moderate to very low quality with some critical methodological limitations, precluding solid conclusions on their efficacy. High-quality work is needed to confirm the ideal dietary and supplemental strategies for optimal performance and recovery for CrossFit® athletes; this is an exciting avenue for further research.


Carl Foster, Daniel Boullosa, Michael McGuigan, Andrea Fusco, Cristina Cortis, Blaine E. Arney, Bo Orton, Christopher Dodge, Salvador Jaime, Kim Radtke, Teun van Erp, Jos J. de Koning, Daniel Bok, Jose A. Rodriguez-Marroyo, and John P. Porcari

The session rating of perceived exertion (sRPE) method was developed 25 years ago as a modification of the Borg concept of rating of perceived exertion (RPE), designed to estimate the intensity of an entire training session. It appears to be well accepted as a marker of the internal training load. Early studies demonstrated that sRPE correlated well with objective measures of internal training load, such as the percentage of heart rate reserve and blood lactate concentration. It has been shown to be useful in a wide variety of exercise activities, ranging from aerobic exercise to resistance training to games, and in populations ranging from patients to elite athletes. The sRPE is a reasonable measure of the average RPE acquired across an exercise session. Originally designed to be acquired ∼30 minutes after a training bout, to prevent the terminal elements of an exercise session from unduly influencing the rating, sRPE has been shown to be temporally robust across periods ranging from 1 minute to 14 days following an exercise session. Within the training impulse concept, sRPE, or other indices derived from it, has been shown to account for both positive and negative training outcomes and has contributed to our understanding of how training is periodized to optimize outcomes and of maladaptations such as overtraining syndrome. The sRPE as a method of monitoring training has the advantage of extreme simplicity. While it is not ideal for precisely recording the details of the external training load, it has large advantages for evaluating the internal training load.
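
The arithmetic behind sRPE-derived training load is simple: the CR-10 session rating is multiplied by the session duration in minutes to give a load in arbitrary units (AU), from which weekly load, monotony, and strain can be derived. The sketch below shows one commonly cited formulation of these indices with an invented training week; it is illustrative code, not taken from the article.

```python
# Session load via the sRPE method: CR-10 rating x duration (min) = load (AU).
# The training week below is invented for illustration only.
from statistics import mean, stdev

week = {"Mon": (6, 75), "Tue": (0, 0), "Wed": (8, 60), "Thu": (0, 0),
        "Fri": (4, 45), "Sat": (7, 50), "Sun": (0, 0)}  # (RPE, minutes)

daily_load = [rpe * minutes for rpe, minutes in week.values()]  # AU per day
weekly_load = sum(daily_load)                    # weekly training load (AU)
monotony = mean(daily_load) / stdev(daily_load)  # mean daily load / SD of daily load
strain = weekly_load * monotony                  # weekly load x monotony
print(weekly_load, round(monotony, 2), round(strain))
```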


Wolfgang Schobersberger, Michael Mairhofer, Simon Haslinger, Arnold Koller, Christian Raschner, Sibylle Puntscher, and Cornelia Blank

Purpose: To analyze the predictive value of parameters of submaximal and maximal cardiopulmonary exercise performance on International Ski Federation (Fédération Internationale de Ski) World Cup ranking (FIS ranking) in elite Austrian Alpine skiers. Methods: Over 7 World Cup seasons (2012–2018), exercise data (maximal oxygen uptake and maximum power output, lactate threshold 2, and ventilatory threshold 2, based on stepwise cycle spiroergometry) were analyzed to determine whether there was a correlation between world FIS ranking and exercise capacity of male and female elite Alpine skiers. Results: The data of 39 male skiers (age: 27.67 [4.20] y, body mass index: 26.03 [1.25] kg/m²) and 36 female skiers (age: 25.49 [3.18] y, body mass index: 22.97 [1.71] kg/m²) were included in this study. The maximum power output and maximum oxygen uptake ranged from 4.37 to 4.42 W/kg and 53.41 to 54.85 mL/kg/min in men and from 4.17 to 4.30 W/kg and 45.96 to 49.16 mL/kg/min in women, respectively, over the 7 seasons; the yearly mean FIS ranking ranged from 17 to 24 in men and 9 to 18 in women. In a fixed-effects model used for the subsequent panel regression analysis, no statistically significant effect on FIS ranking was found for the exercise parameters of interest. Conclusions: Neither maximal oxygen uptake nor maximum power output significantly predicted competitive performance, as indexed by the FIS ranking. This reinforces the assumption that no single parameter determines competition performance in this complex sport. Therefore, identifying the optimum amount of endurance training remains a major challenge for athletes and coaches, as does identifying and improving the factors that determine performance.
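
To illustrate the kind of fixed-effects panel regression described above, the sketch below demeans each athlete's data (the "within" transformation, which removes time-invariant athlete effects) and regresses season ranking on oxygen uptake and relative power. The variable names, simulated data, and estimator details are assumptions for illustration, not the authors' dataset or software.

```python
# Hypothetical sketch: athlete fixed-effects ("within") panel regression of
# season FIS ranking on VO2max and relative peak power output.
# Data, variable names, and results are illustrative only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
athletes, seasons = 10, 7
n = athletes * seasons
df = pd.DataFrame({
    "athlete": np.repeat(np.arange(athletes), seasons),
    "season": np.tile(np.arange(seasons), athletes),
    "vo2max": rng.normal(54, 2, n),    # mL/kg/min
    "power": rng.normal(4.4, 0.2, n),  # W/kg
    "fis_rank": rng.normal(20, 6, n),
})

# Within transformation: subtract each athlete's own mean from outcome and
# predictors, which sweeps out the athlete fixed effects.
demeaned = df.groupby("athlete")[["fis_rank", "vo2max", "power"]].transform(
    lambda s: s - s.mean()
)

X = demeaned[["vo2max", "power"]].to_numpy()
y = demeaned["fis_rank"].to_numpy()
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["vo2max", "power"], beta)))
```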


Manuel Santiago Martin, Fernando Pareja Blanco, and Eduardo Saez De Villarreal

Purpose: This study aimed to compare the effects of 5 different 18-week in-season strength training programs on strength gains and specific water polo performance. Methods: A total of 56 young male water polo players were randomly assigned to the following 5 training groups: dry-land strength training, in-water-specific strength training, combined (dry-land and in-water) strength training, ballistic training, and eccentric-overload training. Physical performance was assessed before (Pre) and after (Post) the training period using the following battery of tests: in-water boost and countermovement jump, muscle strength in the bench press and full squat, throwing speed (ThS), in-water agility, and 20-m maximal sprint swim. Results: Significant group × time interactions were observed for countermovement jump and in-water boost. Eccentric-overload training showed significantly higher gains in ThS and bench-press and full-squat strength than the rest of the training groups. In addition, all training groups (except in-water-specific strength training) induced significant improvements (P ≤ .05) in countermovement jump, in-water boost, and bench-press and full-squat strength. All training groups significantly increased (P ≤ .001) ThS. Moreover, all training groups improved (P ≤ .05) in-water agility (except dry-land strength training) and swimming sprint performance (except in-water-specific strength training and ballistic training). Conclusion: The findings indicate that the 18-week in-season strength training programs induced improvements in strength and specific water polo skills. The eccentric-overload training resulted in greater improvements in muscle strength (in both upper and lower body) and ThS than the other training methods examined in the study.


Simon A. Feros, Kris Hinck, and Jake Dwyer

Purpose: This study investigated the acute warm-up effects of modified-implement bowling on bowling speed, accuracy, perceived rhythm, and perceived sensation with a regular ball. Methods: A total of 13 male amateur pace bowlers completed 3 sessions in a randomized, counterbalanced order. Each session comprised a warm-up of 21 progressive-effort deliveries with either a regular (156 g), 10% heavier (171.6 g), or 10% lighter (140.4 g) cricket ball, followed by a 4-over pace-bowling assessment with a regular ball. Bowling speed was assessed with a radar gun, while accuracy was calculated via the radial error. Subjects rated their perceived exertion (0%–100%), rhythm (1–5 Likert scale), and sensation (1–5 Likert scale) after each delivery. Results: The linear mixed models revealed a significant effect of warm-up condition on perceived delivery sensation (F(2, 916.404) = 24.137, P < .001), with a significant pairwise difference between the regular- and heavier-ball warm-up conditions of 0.20 ± 0.07 points (estimated marginal mean ± 95% confidence interval, P < .001). There were no statistically significant effects of warm-up condition on bowling speed, accuracy, or perceived delivery rhythm. Conclusions: These findings indicate that although the regular ball felt lighter to bowl with after using the heavier ball, there were no overall potentiating or detrimental effects of this particular modified-implement warm-up on bowling speed, accuracy, or perceived rhythm in amateur pace bowlers. Future research is encouraged to trial other protocols for eliciting potentiation to ultimately enhance bowling speed in training or in shorter match formats (e.g., Twenty20).
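
Radial error, the accuracy measure used above, is conventionally the straight-line (Euclidean) distance between where each delivery pitches and the intended target. The sketch below shows that calculation with invented coordinates; the target definition and data are assumptions for illustration, not the study's measurement setup.

```python
# Radial error: Euclidean distance from each delivery's pitch location to the
# target point. Coordinates (m) are invented for illustration only.
import math

target = (0.0, 0.0)                                      # intended pitch point
deliveries = [(0.12, -0.30), (-0.25, 0.10), (0.05, 0.45)]  # (lateral, length) offsets

radial_errors = [math.dist(d, target) for d in deliveries]
mean_radial_error = sum(radial_errors) / len(radial_errors)
print([round(e, 2) for e in radial_errors], round(mean_radial_error, 2))
```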