Purpose: To examine the dose–response relationship between match-play high-speed running (HSR), very high-speed running (VHSR), and sprint (SPR) distances versus subsequent ratings of fatigue and soreness. Methods: Thirty-six outfield players competing in the professional National Women’s Soccer League (NWSL, United States) with a minimum of five 90-minute match observations were monitored during the 2016 and 2017 seasons (408 match observations, 11 per player). HSR (≥3.47 m·s−1), VHSR (≥5.28 m·s−1), and SPR (≥6.25 m·s−1) were determined generically (GEN) in players using a 10-Hz global positioning system. HSR, VHSR, and SPR speed thresholds were also reconfigured according to each player’s peak speed, both alone and in combination with the final velocity achieved in the 30-15 Intermittent Fitness Test (the locomotor approach to establishing individual speed zones). On the morning following matches (match day [MD + 1]), players recorded subjective wellness ratings of fatigue and soreness using 7-point Likert scales. Results: Fatigue (−2.32; 95% CI, −2.60 to −2.03 au; P < .0001) and soreness (−2.05; 95% CI, −2.29 to −1.81; P < .0001) ratings worsened on MD + 1. Standardized unit changes in HSRGEN (fatigue: −0.05; 95% CI, −0.11 to 0.02; soreness: −0.02; 95% CI, −0.07 to 0.04) and VHSRGEN (fatigue: −0.06; 95% CI, −0.12 to 0.00; soreness: −0.04; 95% CI, −0.10 to 0.02) had no influence on wellness ratings at MD + 1. Individualized speed thresholds did not improve the model fit. Conclusions: Subjective ratings of fatigue and soreness are not sensitive to substantial within-player changes in match physical performance. HSR, VHSR, and SPR thresholds customized to individual players’ athletic qualities did not improve the dose–response relationship between external load and wellness ratings.
Dawn Scott, Dean Norris and Ric Lovell
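The individualized-threshold idea in the abstract above can be sketched in code. This is a hypothetical illustration only: the function names and the anchor fractions of the speed reserve are assumptions for demonstration, not the formulas published by the study.

```python
# Generic (GEN) thresholds reported in the abstract, in m/s.
GENERIC = {"HSR": 3.47, "VHSR": 5.28, "SPR": 6.25}

def individualized_thresholds(peak_speed, v_ift):
    """Rescale speed zones to each player's own capacities.

    peak_speed: fastest speed recorded for the player (m/s)
    v_ift: final velocity in the 30-15 Intermittent Fitness Test (m/s)

    The anchor fractions below (0.0, 0.5, 0.8 of the span between
    v_ift and peak speed) are illustrative placeholders, not the
    values used in the study.
    """
    span = peak_speed - v_ift  # the player's speed reserve above v_ift
    return {
        "HSR": v_ift + 0.0 * span,   # HSR anchored at v_ift itself
        "VHSR": v_ift + 0.5 * span,  # midpoint of the reserve
        "SPR": v_ift + 0.8 * span,   # near-maximal sprinting
    }

# A made-up player: peak speed 8.0 m/s, 30-15 IFT final velocity 5.0 m/s.
zones = individualized_thresholds(peak_speed=8.0, v_ift=5.0)
```

With these invented inputs, the player's VHSR zone would start at 6.5 m/s rather than the generic 5.28 m/s, which is the kind of within-player rescaling the abstract evaluates.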
Gianluca Vernillo, Adrien Mater, Gregory Doucende, Johan Cassirame and Laurent Mourot
Purpose: To study the consequences of a fatiguing 6-hour ultratrail run on self-optimizing capability during uphill and downhill (DR) running. Methods: The authors collected temporal stride kinematics and metabolic data in 8 male runners (experimental group) before and after the ultratrail run and in 6 male ultramarathon runners (control group) who did not run but stayed awake and performed normal daily physical activities, avoiding strenuous exercise, over the 6-hour period. For each subject, preferred and optimal stride frequencies were measured, with stride frequency systematically varied above and below the preferred one (±4% and ±8%) while running 3 conditions (level, 5% uphill, or 5% DR) in a randomized order. Results: Preferred and optimal stride frequencies across grade, group, and time showed no significant differences (P ≥ .184). Metabolic cost and the energetically optimum metabolic cost showed a grade × group × time interaction (P ≤ .011), with an ∼11% increase in the 2 variables only during the DR bouts (P ≤ .037). Conclusions: Despite maintaining similar dynamics of stride frequency adjustments during the DR bout, the experimental group was not able to optimize its gait. This suggests that the DR sections of ultratrail runs can introduce a perturbing factor into the runners’ optimization process, highlighting the need to incorporate DR bouts in the training programs of ultratrail runners to minimize the deleterious effects of DR on the energetically optimal gait.
Laurent Schmitt, Stéphane Bouthiaux and Grégoire P. Millet
Purpose: To report the changes in the training characteristics, performance, and heart-rate variability (HRV) of the world’s most successful male biathlete of the last decade. Methods: During the analyzed 11-year (2009–2019) period, the participant won 7 Big Crystal Globes, awarded to the overall winner of the International Biathlon Union World Cup. The training characteristics are reported as yearly volume (in hours) of low-intensity training (LIT), moderate- and high-intensity training, and speed and strength training. Performance was quantified by the number of World Cup top-3 positions per season. HRV was expressed as low- and high-frequency spectral power (in milliseconds squared), root-mean-square difference of successive R–R intervals (in milliseconds), and heart rate (in beats per minute). Results: The training volume increased from 530 to ∼700 hours per year in 2009–2019, with a large polarization in training-intensity distribution (ie, LIT 86.3% [2.9%]; moderate-intensity training 3.4% [1.5%]; high-intensity training 4.0% [0.7%]; strength 6.3% [1.6%]). The number of top-3 positions increased from 2 to 24–26 in 2009–2018 but decreased to 6 in 2019. The mean supine values of the root-mean-square difference of successive R–R intervals and of high-frequency spectral power divided by heart rate increased until 2015, remained stable over 2016–2018, and decreased in 2019. The number of top-3 positions was related to the total (r = .66, P = .02) and LIT (r = .92, P < .001) volume and to several markers of supine parasympathetic activity. Conclusion: The improvement in the participant’s performance was mainly determined by the progressive increase in training volume, especially at low intensity, and was correlated with parasympathetic activity markers. This case study confirms the effectiveness, in an elite endurance athlete, of a training method with a large amount of LIT and of regular HRV monitoring.
Matteo Bonato, Antonio La Torre, Marina Saresella, Ivana Marventano, Giampiero Merati, Giuseppe Banfi and Jacopo A. Vitale
Purpose: The authors compared sleep quality and salivary cortisol concentration after high-intensity interval training (HIIT) and small-sided games (SSGs) performed at the habitual training time in nonprofessional male soccer players. Methods: A total of 32 players (age = 24 y, height = 1.77 [0.06] m, and body mass = 75 kg) were randomized into an HIIT group or an SSG group. Actual sleep time, sleep efficiency (SE), sleep latency, immobility time (IT), moving time (MT), and fragmentation index were monitored using actigraphy before (PRE) and 2 nights after (POST 1 and POST 2) the training session. Salivary cortisol levels were measured before (PRE) and after (POST) training, and the cortisol awakening response was evaluated. Results: Significant intragroup differences in the HIIT group were noted for actual sleep time (P < .0001), SE (P < .0001), sleep latency (P = .047), IT (P < .0001), MT (P < .0001), and fragmentation index (P < .0001) between PRE and POST 1 and for SE (P = .035), IT (P = .004), MT (P = .006), and fragmentation index (P = .048) between PRE and POST 2. Intergroup differences between the HIIT and SSG groups were observed at POST 1 for actual sleep time (P = .014), SE (P = .048), IT (P < .0001), and MT (P = .046). Significant intragroup variations were observed in PRE and POST salivary cortisol levels (P < .0001 for HIIT; P = .0003 for SSGs) and in the cortisol awakening response (P < .0001 for HIIT; P < .0001 for SSGs). Significant intergroup differences between the HIIT and SSG groups were found at POST (P < .0001) and in the cortisol awakening response (P = .017). Conclusions: Changes in actigraphy-based sleep parameters and salivary cortisol levels were greater after an acute session of HIIT than after SSGs in this sample of nonprofessional male soccer players.
Piia Kaikkonen, Esa Hynynen, Arto Hautala and Juha P. Ahtiainen
Purpose: It is known that modifying the endurance-type training load of athletes may result in altered cardiac autonomic modulation that can be estimated with heart-rate variability (HRV). However, the specific effects of intensive resistance-type training remain unclear. The main aim of this study was to determine whether an intensive 2-wk resistance training period affects the nocturnal HRV and strength performance of healthy participants. Methods: Young healthy men (N = 13, age 24 y) performed a 2-wk baseline training period, a 2-wk intensive training period, and a 9-d tapering period, with 2, 5, and 2 hypertrophic whole-body resistance exercise sessions per week, respectively. Maximal isometric and dynamic strength were tested at the end of each training period, and nocturnal HRV was analyzed at the same time points. Results: As a main finding, the nocturnal root mean square of differences of successive R-R intervals decreased (P = .004; from 49 to 43 ms; 95% CI, 2.4–10.4; effect size = 0.97) during the 2-wk intensive resistance training period. In addition, maximal isometric strength improved slightly (P = .045; from 3933 to 4138 N; 95% CI, 5.4–404; effect size = 0.60). No changes were found in 1-repetition-maximum leg press or leg press repetitions at 80% of 1-repetition maximum. Conclusions: The present data suggest that the increased training load of a short-term intensive resistance training period can be detected by nocturnal HRV. However, despite short-term accumulated physiological stress, a tendency toward improved strength performance was detected.
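The nocturnal HRV marker reported above, the root mean square of successive R-R interval differences (RMSSD), has a standard time-domain definition that is straightforward to compute. A minimal sketch, using made-up sample values rather than study data:

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive R-R interval differences (ms).

    Standard time-domain HRV formula: square each successive
    difference, average the squares, and take the square root.
    """
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# A short, invented nocturnal R-R series (ms), for illustration only:
example = rmssd([812, 845, 830, 860, 841])
```

Lower RMSSD reflects reduced beat-to-beat (parasympathetically mediated) variability, which is the direction of change the study observed during the intensive training period.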
Ewan R. Williams, James McKendry, Paul T. Morgan and Leigh Breen
Purpose: Compression garments are widely used as a tool to accelerate recovery from intense exercise and have also gained traction as a performance aid, particularly during periods of limited recovery. This study tested the hypothesis that increased pressure levels applied via high-pressure compression garments would enhance “multiday” exercise performance. Methods: A single-blind crossover design incorporating 3 experimental conditions—loose-fitting gym attire (CON), low-compression (LC), and high-compression (HC) garments—was adopted. A total of 10 trained male cyclists reported to the laboratory on 6 occasions, collated into 3 blocks of 2 consecutive visits. Each “block” consisted of 3 parts: an initial high-intensity protocol, a 24-hour period of controlled rest in the applied condition (CON, LC, or HC), and a subsequent 8-km cycling time trial in the same condition. Subjective discomfort questionnaires and blood pressure were assessed prior to each exercise bout. Power output, oxygen consumption, and heart rate were measured continuously throughout exercise, with plasma lactate, creatine kinase, and myoglobin concentrations assessed at baseline, at the end of exercise, and at 30 and 60 minutes postexercise. Results: Time-trial performance was significantly improved in HC compared with both CON and LC (HC = 277 W, CON = 266 W, and LC = 265 W; P < .05). In addition, plasma lactate was significantly lower at 30 and 60 minutes postexercise on day 1 in HC compared with CON. No significant differences were observed for oxygen consumption, heart rate, creatine kinase, or subjective markers of discomfort. Conclusion: The pressure levels exerted by lower-limb compression garments influence their effectiveness for cycling performance, particularly in the face of limited recovery.
Christopher J. Stevens, Megan L.R. Ross, Amelia J. Carr, Brent Vallance, Russ Best, Charles Urwin, Julien D. Périard and Louise Burke
Purpose: Hot-water immersion (HWI) after training in temperate conditions has been shown to induce thermophysiological adaptations and improve endurance performance in the heat; however, the potential additive effects of HWI and training in hot outdoor conditions remain unknown. Therefore, this study aimed to determine the effect of repeated postexercise HWI in athletes training in a hot environment. Methods: A total of 13 (9 female) elite/preelite racewalkers completed a 15-day training program in outdoor heat (mean afternoon high temperature = 34.6°C). Athletes were divided into 2 matched groups that completed either HWI (40°C for 30–40 min) or seated rest in 21°C (CON) following 8 training sessions. Pre–post testing included a 30-minute fixed-intensity walk in heat, a laboratory incremental walk to exhaustion, and a 10,000-m outdoor time trial. Results: Training frequency and volume were similar between groups (P = .54). Core temperature was significantly higher during immersion in HWI (38.5°C [0.3°C]) than in CON (37.8°C [0.2°C]; P < .001). There were no differences between groups in resting or exercising rectal temperature or heart rate, skin temperature, sweat rate, speed at lactate threshold 2, maximal O2 uptake, or 10,000-m performance (P > .05). There were significant (P < .05) pre–post differences in both groups for submaximal exercising heart rate (∼11 beats·min−1), sweat rate (0.34–0.55 L·h−1), thermal comfort (1.2–1.5 arbitrary units), and 10,000-m racewalking performance time (∼3 min). Conclusions: Both groups demonstrated significant improvements in markers of heat adaptation and performance; however, the addition of HWI did not provide further enhancement. Improvements in adaptation appeared to be maximized by the training program in hot conditions.
Harry G. Banyard, James J. Tufano, Jonathon J.S. Weakley, Sam Wu, Ivan Jukic and Kazunori Nosaka
Purpose: To compare the effects of velocity-based training (VBT) and 1-repetition-maximum (1RM) percentage-based training (PBT) on changes in strength, loaded countermovement jump (CMJ), and sprint performance. Methods: A total of 24 resistance-trained males performed 6 weeks of full-depth free-weight back squats 3 times per week in a daily undulating format, with groups matched for sets and repetitions. The PBT group lifted with fixed relative loads varying from 59% to 85% of preintervention 1RM. The VBT group aimed for a sessional target velocity prescribed from pretraining individualized load–velocity profiles; thus, real-time velocity feedback dictated the VBT set-by-set training-load adjustments. Pretraining and posttraining assessments included the 1RM, peak velocity during the CMJ at 30% of 1RM (PV-CMJ), 20-m sprint (including 5 and 10 m), and 505 change-of-direction test (COD). Results: The VBT group maintained faster training repetitions (effect size [ES] = 1.25) with less perceived difficulty (ES = 0.72) compared with the PBT group. The VBT group had likely to very likely improvements in the COD (ES = −1.20 to −1.27), 5-m sprint (ES = −1.17), 10-m sprint (ES = −0.93), 1RM (ES = 0.89), and PV-CMJ (ES = 0.79). The PBT group had almost certain improvements in the 1RM (ES = 1.41) and possibly beneficial improvements in the COD (ES = −0.86). Very likely favorable between-group effects were observed for VBT compared with PBT in the PV-CMJ (ES = 1.81), 5-m sprint (ES = 1.35), and 20-m sprint (ES = 1.27); likely favorable between-group effects were observed in the 10-m sprint (ES = 1.24) and nondominant-leg COD (ES = 0.96), whereas the dominant-leg COD effect (ES = 0.67) was possibly favorable. PBT had a small (ES = 0.57) but unclear difference in 1RM improvement compared with VBT.
Conclusions: Both training methods improved 1RM and COD times, but PBT may be slightly favorable for stronger individuals focusing on maximal strength, whereas VBT was more beneficial for PV-CMJ, sprint, and COD improvements.
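The VBT prescription described above rests on an individualized load–velocity profile. As a rough sketch of the general technique (not the authors' exact procedure; the profile points and target velocity below are invented), a linear fit of velocity against load can be inverted to pick the sessional load expected to produce the target velocity:

```python
def fit_load_velocity(loads_kg, velocities):
    """Least-squares line v = a + b*load from a load-velocity profile.

    Assumes an approximately linear load-velocity relationship, as is
    common for barbell lifts; returns the intercept a and slope b.
    """
    n = len(loads_kg)
    mx = sum(loads_kg) / n
    my = sum(velocities) / n
    b = sum((x - mx) * (y - my) for x, y in zip(loads_kg, velocities)) / \
        sum((x - mx) ** 2 for x in loads_kg)
    a = my - b * mx
    return a, b

def load_for_target_velocity(a, b, target_v):
    # Invert v = a + b*load (b is negative: velocity falls as load rises).
    return (target_v - a) / b

# Invented profile: mean velocity (m/s) recorded at four warm-up loads.
a, b = fit_load_velocity([60, 80, 100, 120], [1.00, 0.80, 0.60, 0.40])
prescribed = load_for_target_velocity(a, b, target_v=0.70)
```

In practice the set-by-set adjustment works the same way in reverse: if measured repetition velocity drifts below the sessional target, the load is reduced for the next set, which is what the real-time feedback in the VBT group accomplished.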