Purpose: To investigate the relationships between physiological and perceptual recovery and stress responses to elite netball tournament workloads. Methods: Nine elite female netballers were observed across a 3-day (T1–3), 4-match tournament. Participants provided salivary samples for cortisol and alpha-amylase analysis, completed the Short Recovery Stress Scale (SRSS), and reported session ratings of perceived exertion. Inertial measurement units and heart-rate monitors determined player load, changes of direction (COD), summated heart-rate zones, and jumps. Results: Analysis revealed 6 significant SRSS time effects: (1) decreased recovery markers of physical performance (P = .042), emotional balance (P = .034), and overall recovery (P = .001) and (2) increased perceptual stress markers of muscular stress (P = .001), negative emotional state (P = .026), and overall stress (P = .010). Salivary cortisol decreased over the tournament (T1–3) before progressively increasing posttournament, with greater salivary cortisol concentrations on T+2 compared with T3 (P = .014, ES = −1.29 [−2.24 to −0.22]) and T+1 (P = .031, ES = −1.54 [−2.51 to −0.42]). SRSS overall recovery moderately negatively correlated with COD (r = −.41, P = .028) and session ratings of perceived exertion (r = −.40, P = .034). Cumulative workload did not relate to posttournament perceptual or salivary responses. Percentage change in salivary variables related (P < .05) to total player load, total COD, and overall recovery across specific cumulative time periods. Conclusions: During and after an elite netball tournament, athletes reported increased perceptual stress and reduced recovery. The SRSS is a valuable tool for recovery–stress monitoring in elite tournament netball. It is recommended that practitioners monitor COD due to its negative influence on perceived overall recovery.
Suzanna Russell, Marni J. Simpson, Angus G. Evans, Tristan J. Coulter, and Vincent G. Kelly
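The session ratings of perceived exertion used for monitoring in the study above are conventionally converted to internal-load units via the session-RPE method (session load = RPE × duration in minutes). A minimal sketch of that standard calculation — not the authors' code, and with hypothetical session values:

```python
# Illustrative session-RPE (sRPE) internal-load calculation.
# sRPE load (arbitrary units) = session RPE (CR-10 scale) x duration (min).
# Session values below are hypothetical, not data from the study.

def srpe_load(rpe, duration_min):
    """Arbitrary-unit internal load for a single session."""
    return rpe * duration_min

# Hypothetical tournament day: two matches plus a light recovery session
sessions = [(7, 60), (8, 60), (3, 30)]  # (RPE, minutes)
daily_load = sum(srpe_load(rpe, mins) for rpe, mins in sessions)
print(daily_load)  # 990 AU
```

Summing daily loads across the tournament gives the kind of cumulative workload the abstract relates to perceptual and salivary responses.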
Cristian Ieno, Roberto Baldassarre, Maddalena Pennacchi, Antonio La Torre, Marco Bonifazi, and Maria Francesca Piacentini
Purpose: To analyze training-intensity distribution (TID) using different independent monitoring systems for internal training load in a group of elite open-water swimmers. Methods: One hundred sixty training sessions were monitored in 4 elite open-water swimmers (2 females and 2 males: 23.75 [4.86] y, 62.25 [6.18] kg, 167 [6.68] cm) during 5 weeks of regular training. Heart-rate-based methods, such as time in zone (TIZ), session goal (SG), and hybrid (SG/TIZ), were used to analyze TID. Similar to SG/TIZ, a new hybrid approach, rating of perceived exertion (RPE)/TIZ, was used for a more accurate analysis of TID. Moreover, based on the 3-zone model, the session ratings of perceived exertion of the swimmers and the coach were compared. Results: Heart-rate- and RPE-based TID methods were significantly different in quantifying Z1 (P = .012; effect size [ES] = 0.490) and Z2 (P = .006; ES = 0.778), while no difference was observed in the quantification of Z3 (P = .428; ES = 0.223). The heart-rate-based data for Z1, Z2, and Z3 were 83.2%, 7.4%, and 8.1% for TIZ; 80.8%, 8.3%, and 10.8% for SG/TIZ; and 55%, 15.6%, and 29.4% for SG. The RPE-based data were 70.9%, 19.9%, and 9.2% for RPE/TIZ% and 41.2%, 48.9%, and 9.7% for the session rating of perceived exertion. No differences were observed between the coach’s and the swimmers’ session ratings of perceived exertion in the 3 zones (Z1: P = .663, ES = −0.187; Z2: P = .110, ES = 0.578; Z3: P = .149, ES = 0.420). Conclusion: Using RPE-based TID methods, Z2 was significantly larger compared with Z1. These results show that RPE-based TID methods in elite open-water swimmers are affected by both intensity and volume.
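A time-in-zone TID like the Z1/Z2/Z3 percentages reported above is obtained by pooling the time spent in each zone across all monitored sessions and normalizing to the total. A minimal sketch of that calculation under the 3-zone model — illustrative only (the zone minutes are hypothetical, and this is not the monitoring software used in the study):

```python
# Illustrative 3-zone time-in-zone (TIZ) training-intensity distribution.
# Each session is recorded as minutes spent in (Z1, Z2, Z3); the TID is
# each zone's share of total training time. Session data are hypothetical.

def tid_percentages(sessions):
    """Sum minutes per zone across sessions; return Z1/Z2/Z3 shares in %."""
    totals = [0.0, 0.0, 0.0]
    for z1, z2, z3 in sessions:
        totals[0] += z1
        totals[1] += z2
        totals[2] += z3
    grand_total = sum(totals)
    return [round(100 * t / grand_total, 1) for t in totals]

# Hypothetical training week: (min in Z1, Z2, Z3) per session
week = [(80, 10, 5), (95, 0, 10), (70, 15, 0), (100, 5, 12)]
print(tid_percentages(week))  # [85.8, 7.5, 6.7]
```

The RPE-based variants differ only in how each session's minutes are assigned to zones (by perceived exertion rather than heart rate), which is why the resulting distributions can diverge.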
Wassim Moalla, Mohamed Saifeddin Fessi, Sabeur Nouira, Alberto Mendez-Villanueva, Valter Di Salvo, and Said Ahmaidi
Purpose: To investigate the optimal pretaper duration on match running performance in a professional soccer team. Methods: The training load was monitored during daily training sessions and matches across 2 seasons according to different periodization strategies. Match running distances were collected using a match-analysis system. The data were analyzed in 3 types of mesocycle blocks of 5 (M5), 4 (M4), and 3 (M3) weeks, each concluding with 1 taper week. Results: Significant decreases in the training load during the taper weeks compared with standard weeks were observed in all 3 types of mesocycle blocks (d ≥ 5; P < .01). An increase in overall match running performance was observed in matches played after the taper weeks compared with matches played after the standard weeks during M4 for all speed ranges (d ≥ 1.3; P < .05). In M5 and M3, respectively, the increase was only observed in low-intensity running (d = 1.3; P < .04) and in total distance, low-intensity running, and intense running (d ≥ 1.3; P < .05). Match running performance following the taper weeks differed between the 3 mesocycle durations, being significantly higher in M4 for the number of high-speed runs, sprints, and high-intensity runs (P < .05). The greatest enhancement of match running performance was observed in M4, when the training load was decreased by approximately 18% during the tapering period. Conclusion: This study suggests that a period of 3 standard weeks of training followed by 1 taper week is the optimal tapering strategy among the pretaper durations compared.
Peter W. Harrison, Lachlan P. James, David G. Jenkins, Michael R. McGuigan, Robert W. Schuster, and Vincent G. Kelly
Purpose: The aim of this study was to map responses over 32 hours following high-load (HL) and moderate-load (ML) half-squat priming. Methods: Fifteen participants completed control, HL (87% 1RM), and ML (65% 1RM) activities in randomized, counterbalanced order. Countermovement jump (CMJ), squat jump (SJ), saliva testosterone, saliva cortisol, and perceptual measures were assessed before and 5 minutes, 8 hours, 24 hours, and 32 hours after each activity. Results are presented as percentage change from baseline and 95% confidence interval (CI). Cliff delta was used to determine the threshold for group changes. Results: SJ height increased by 4.5% (CI = 2.2–6.8, Cliff delta = 0.20) 8 hours following HL. CMJ and SJ improved by 6.1% (CI = 2.1–7.8, Cliff delta = 0.27) and 6.5% (CI = 1.2–11.8, Cliff delta = 0.30), respectively, 32 hours after ML. No clear diurnal changes in CMJ or SJ occurred 8 hours following control; however, increases of 3.9% (CI = 2.9–9.2, Cliff delta = 0.26) and 4.5% (CI = 0.9–8.1, Cliff delta = 0.24), respectively, were observed after 32 hours. Although diurnal changes in saliva hormone concentration occurred (Cliff delta = 0.37–0.92), the influence of priming was unclear. Perceived “physical feeling” was greater 8 hours following HL (Cliff delta = 0.36) and 32 hours after ML and control (Cliff delta = 0.17–0.34). Conclusions: HL priming in the morning may result in small improvements in jump output and psychophysiological state in the afternoon. Similar improvements were observed in the afternoon the day after ML priming.
Ali Brian, Sally Taunton Miedema, Jerraco L. Johnson, and Isabel Chica
Fundamental motor skills (FMS) are an underlying mechanism driving physical activity behavior and promoting positive developmental trajectories for health. However, little is known about the FMS of preschool-aged children with visual impairments (VI). The purpose of this study was to examine the FMS of preschool-aged children (N = 25) with (n = 10) and without (n = 15) VI as measured using the Test of Gross Motor Development-3. Children without VI performed significantly higher than their peers for locomotor (M = +11.87, p = .014, η2 = .31) and ball skills (M = +13.69, p < .001, η2 = .56). Regardless of the presence of a VI, many participants struggled with developing FMS, with the greatest disparity in ball skills. These findings help to clarify the FMS levels of preschool-aged children with VI. Thus, there is a need for both further inquiry and intervention for all children.
Helmi Chaabene, Yassine Negra, Senda Sammoud, Jason Moran, Rodrigo Ramirez-Campillo, Urs Granacher, and Olaf Prieske
Purpose: To examine the effects of balance exercises conducted prior to complex training (bCT) versus complex training (CT) only on measures of physical fitness in young female elite handball players. Methods: Participants aged 17 years were randomly assigned to bCT (n = 11) or CT (n = 12). The 2 training interventions lasted 8 weeks with 2 sessions per week in replacement of some technical/tactical handball exercises and were matched for total training volume. Before and after training, tests were performed for the evaluation of proxies of muscle power (countermovement jump height, standing long-jump distance, and reactive strength index), muscle strength (back half-squat 1-repetition maximum), dynamic balance (Y-balance test), linear sprint speed (20-m sprint test), and change-of-direction speed (T test). Results: Two-factor repeated-measures analysis of variance revealed significant group × time interactions for the reactive strength index (d = 0.99, P = .03) and Y-balance test score (d = 1.32, P < .01). Post hoc analysis indicated significant pre–post reactive strength index improvements in CT (d = 0.69, P = .04) only. For the Y-balance test, significant pre–post increases were found in bCT (d = 0.71, P = .04) with no significant changes in CT (d = 0.61, P = .07). In addition, significant main effects of time were observed for half-squat 1-repetition maximum, countermovement jump, standing long jump, and T test performance (d = 1.50 to 3.10, P < .05). Conclusions: Both bCT and CT interventions were effective in improving specific measures of physical fitness in young elite female handball players. If the training goal additionally includes balance improvement, balance exercises can be conducted within a CT session, prior to the CT exercises.
Daniel L. Plotkin, Kenneth Delcastillo, Derrick W. Van Every, Kevin D. Tipton, Alan A. Aragon, and Brad J. Schoenfeld
Branched-chain amino acids (BCAA) are one of the most popular sports supplements, marketed under the premise that they enhance muscular adaptations. Despite their prevalent consumption among athletes and the general public, the efficacy of BCAA has been an ongoing source of controversy in the sports nutrition field. Early support for BCAA supplementation was derived from extrapolation of mechanistic data on their role in muscle protein metabolism. Of the three BCAA, leucine has received the most attention because of its ability to stimulate the initial acute anabolic response. However, a substantial body of both acute and longitudinal research has now accumulated on the topic, affording the ability to scrutinize the effects of BCAA and leucine from a practical standpoint. This article aims to critically review the current literature and draw evidence-based conclusions about the putative benefits of BCAA or leucine supplementation on muscle strength and hypertrophy as well as illuminate gaps in the literature that warrant future study.
Ed Maunder, Daniel J. Plews, Fabrice Merien, and Andrew E. Kilding
Many endurance athletes perform specific blocks of training in hot environments in “heat stress training camps.” It is not known if physiological threshold heart rates measured in temperate conditions are reflective of those under moderate environmental heat stress. A total of 16 endurance-trained cyclists and triathletes performed incremental exercise assessments in 18°C and 35°C (both 60% relative humidity) to determine heart rates at absolute blood lactate and ventilatory thresholds. Heart rate at fixed blood lactate concentrations of 2, 3, and 4 mmol·L−1 and ventilatory thresholds were not significantly different between environments (P > .05), despite significant heat stress–induced reductions in power output of approximately 10% to 17% (P < .05, effect size = 0.65–1.15). The coefficient of variation for heart rate at these blood lactate concentrations (1.4%–2.9%) and ventilatory thresholds (2.3%–2.7%) between conditions was low, with significant strong positive correlations between measurements in the 2 environments (r = .92–.95, P < .05). These data indicate heart rates measured at physiological thresholds in temperate environments are reflective of measurements taken under moderate environmental heat stress. Therefore, endurance athletes embarking on heat stress training camps can use heart rate–based thresholds ascertained in temperate environments to prescribe training under moderate environmental heat stress.
Bent R. Rønnestad, Sjur J. Øfsteng, Fabio Zambolin, Truls Raastad, and Daniel Hammarström
Purpose: To compare the effects of a 1-week high-intensity aerobic-training shock microcycle composed of either 5 short-interval sessions (SI; n = 9, 5 series with 12 × 30-s work intervals interspersed with 15-s recovery and 3-min recovery between series) or 5 long-interval sessions (LI; n = 8, 6 series of 5-min work intervals with 2.5-min recovery between series) on indicators of endurance performance in well-trained cyclists. Methods: Both groups were tested for physiological determinants of endurance performance before the 1-week high-intensity aerobic-training shock microcycle and again following 6 days of standardized training loads after it. Results: From pretraining to posttraining, SI achieved a larger improvement than LI in maximal oxygen uptake (5.7%; 95% confidence interval, 1.3–10.3; P = .015) and power output at a blood lactate concentration of 4 mmol·L−1 (3.8%; 95% confidence interval, 0.2–7.4; P = .038). There were no group differences in changes of fractional use of maximal oxygen uptake at a workload corresponding to a blood lactate concentration of 4 mmol·L−1, gross efficiency, or the 1-minute peak power output from the maximal-oxygen-uptake test. Conclusion: The SI protocol may induce superior changes in indicators of endurance performance compared with the LI protocol, indicating that SI can be a good strategy during a 1-week high-intensity aerobic-training shock microcycle in well-trained cyclists.
Shaun Abbott, Goshi Yamauchi, Mark Halaki, Marcela Torres Castiglioni, James Salter, and Stephen Cobley
Purpose: The study aimed to (1) accurately examine longitudinal relationships between maturity status and both technical skill indices and performance in Australian male (N = 64) age-group Front-crawl swimmers (10–15 y) and (2) determine whether individual differences in maturation influenced relationships between technical skill level and swimming performance. Methods: A repeated-measures design was used to assess maturity status and performance on a 200-m Front-crawl trial across 2 competition seasons (2018–2020). Assessments were made on 3 to 5 occasions (median = 3) separated by approximately 4 months. Average horizontal velocity and stroke frequency were used to calculate technical skill indices, specifically stroke index and arm propelling efficiency. Relationships between variables were assessed using linear mixed models, identifying fixed and random effect estimates. Results: Curvilinear trends best described significant longitudinal relationships between maturity status with horizontal velocity (F = 10.33 [1, 233.77]; P = .002) and stroke index (F = 5.55 [1, 217.9]; P = .02) during 200-m Front-crawl trials. Maturity status was not significantly related to arm propelling efficiency (P = .08). However, arm propelling efficiency was an independent predictor of Front-crawl velocity (F = 55.89 [1, 210.45]; P < .001). Conclusions: Maturity status predicted assessment of swimmer technical skill (stroke index) and swimming performance. However, technical skill assessed via arm propelling efficiency was independent of maturation and was predictive of performance. Maturity status influences performance evaluation based on technical skill and velocity. Findings highlight the need to account for maturation and technical skill in age-group swimmers to better inform swimmer evaluation.
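The stroke index referenced above is conventionally derived from average velocity and stroke frequency: stroke length is velocity divided by stroke rate, and stroke index is velocity multiplied by stroke length. A minimal sketch under those standard definitions — not necessarily the study's exact computation, and with hypothetical trial values:

```python
# Illustrative stroke-metric calculation from a swim trial, using the
# conventional definitions: stroke length SL = v / SF, and
# stroke index SI = v * SL = v^2 / SF. Trial values are hypothetical.

def stroke_metrics(velocity_mps, stroke_freq_hz):
    """Return (stroke length in m/stroke, stroke index in m^2/s)."""
    stroke_length = velocity_mps / stroke_freq_hz
    stroke_index = velocity_mps * stroke_length
    return stroke_length, stroke_index

# Hypothetical 200-m trial: 1.5 m/s average velocity, 0.75 strokes/s
sl, si = stroke_metrics(1.5, 0.75)
print(sl, si)  # 2.0 m/stroke, 3.0 m^2/s
```

Because stroke index scales with the square of velocity, it rewards swimmers who hold speed at a lower stroke rate, which is why it serves as a proxy for technical skill.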