Purpose: The authors investigated the effect of foot cooling (FC) between sets in a leg press pyramid workout with resistance-trained participants. Methods: A total of 12 resistance-trained men (age = 21.8 [0.6] y; training experience = 1.7 y) performed a pyramid workout, including 4 sets of 85% to 90% 1-repetition maximum leg press exercise to exhaustion with interset FC or noncooling in a repeated-measures crossover design separated by 5 days. The authors immersed the participants’ feet in 10°C water for 2.5 minutes between sets. Results: Two-way repeated-measures analysis of variance revealed that FC elicited significantly higher repetitions and electromyography (EMG) values of the vastus lateralis (simple main effect of condition) than did noncooling (P < .05) in the second (repetitions: 11 [3.5] vs 7.75 [3.2]; EMG: 63.4% [19.4%] vs 54.5% [18.4%]), third (repetitions: 8.9 [3.2] vs 6.4 [2.1]; EMG: 71.5% [17.4%] vs 60.6% [19.4%]), and fourth (repetitions: 7.5 [2.7] vs 5.1 [2.2]; EMG: 75.2% [19.6%] vs 59.3% [23.5%]) sets. The authors also detected a simple main effect of set in the FC and noncooling conditions on repetitions (P < .05) and in the FC condition on the vastus lateralis EMG values. Although no time × trial interaction was observed for the rating of perceived exertion, the authors detected a main effect of set (7.7–9.6 vs 7.9–9.3, P < .05). Conclusions: Interset FC provides an ergogenic effect on a leg press pyramid workout and may offset fatigue, as indicated by higher repetitions and EMG responses, without increasing perceived exertion.
Zong-Yan Cai, Wen-Yi Wang, Yi-Ming Huang, and Chih-Min Wu
Pablo Nájera-Ferrer, Carlos Pérez-Caballero, Juan José González-Badillo, and Fernando Pareja-Blanco
Purpose: This study aimed to analyze the response to 4 concurrent training interventions differing in the training sequence and in the velocity loss (VL) threshold during strength training (20% vs 40%) on subsequent endurance and strength performance. Methods: A randomized crossover research design was used. Sixteen trained men performed 4 training interventions consisting of endurance training (ET) followed by resistance training (RT), with 20% and 40% VL, respectively (ET + RT20 and ET + RT40), and RT with 20% and 40% VL, respectively, followed by ET (RT20 + ET and RT40 + ET). The ET consisted of running for 10 minutes at 90% of maximal aerobic velocity. The RT consisted of 3 squat sets with 60% of 1-repetition maximum. A 5-minute rest was given between exercises. The oxygen uptake throughout the ET and repetition velocity during RT were recorded. The blood lactate concentration, vertical jump, and squat velocity were measured at preexercise and after the endurance and strength exercises. Results: The RT40 + ET protocol showed impaired running time along with higher ventilatory equivalents compared with the protocols that performed the ET without previous fatigue. No significant differences were observed in the repetitions per set performed for a given VL threshold, regardless of the exercise sequence. The protocols with a 40% VL threshold induced greater reductions in jump height and squat velocity, along with elevated blood lactate concentration. Conclusions: A high VL magnitude (40% VL) induced higher metabolic and mechanical stress, as well as greater residual fatigue, on the subsequent ET performance.
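The 20% and 40% VL thresholds described above terminate a set once repetition velocity has dropped by that fraction from the fastest repetition of the set. A minimal sketch of this stopping rule; the function name and velocity values are illustrative, not taken from the study:

```python
def reps_within_velocity_loss(rep_velocities, vl_threshold):
    """Repetitions completed before reaching the velocity-loss cutoff.

    rep_velocities: mean concentric velocity (m/s) of each repetition, in order.
    vl_threshold: allowed velocity loss as a fraction (0.20 for 20% VL).
    The set stops at the first repetition whose velocity falls below
    (1 - vl_threshold) times the fastest repetition of the set so far.
    """
    kept = []
    fastest = 0.0
    for velocity in rep_velocities:
        fastest = max(fastest, velocity)
        if velocity < (1.0 - vl_threshold) * fastest:
            break
        kept.append(velocity)
    return kept

# Illustrative velocities for one squat set at 60% 1RM, decaying with fatigue.
set_velocities = [0.80, 0.78, 0.75, 0.70, 0.66, 0.62, 0.55, 0.47]
print(len(reps_within_velocity_loss(set_velocities, 0.20)))  # -> 5 repetitions
print(len(reps_within_velocity_loss(set_velocities, 0.40)))  # -> 7 repetitions
```

With the same decaying set, the stricter 20% threshold ends the set 2 repetitions earlier than the 40% threshold, which is why the 40% VL protocols accumulate more mechanical work and, as the study reports, more fatigue.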
Steffen Held, Anne Hecksteden, Tim Meyer, and Lars Donath
Purpose: The present intervention study examined the effects of intensity-matched velocity-based strength training with a 10% velocity loss (VL10) versus traditional 1-repetition maximum (1RM) based resistance training to failure (TRF) on 1RM and maximal oxygen uptake (
Sylvia Moeskops, Jon L. Oliver, Paul J. Read, Gregory D. Myer, and Rhodri S. Lloyd
Purpose: To quantify speed, peak momentum, standing long jump (SLJ), and the ratio of vertical to horizontal take-off velocity (Ratiovert–hori TOV) in young female gymnasts of different maturity status and their influence on vaulting vertical TOV. Methods: One hundred twenty gymnasts aged 5–14 years were subdivided into maturity groupings using percentage of predicted adult height. Participants performed three 20-m sprints, SLJs, and straight jump vaults that were recorded using 2-dimensional video and analyzed using digitizing software. Results: All speed intervals, peak speed, peak momentum, SLJ distance, vault height, and vertical TOV increased between the early prepubertal and late prepubertal (P < .001; d = 0.65–1.10) and early prepubertal and pubertal (P < .001; d = 0.75–1.00) groups. No differences in these metrics were observed between the 2 most mature groups (d = 0.01–0.55). Multiple regression analyses revealed that peak speed had the strongest association with vertical TOV (R2 = 59%) and also identified the Ratiovert–hori as a secondary determinant (R2 = 12%). A separate regression model indicated that maturity status (percentage of predicted adult height) moderately influences vertical TOV during vaulting (R2 = 41%). Conclusion: Speed and SLJ performance increase between the early prepubertal and late prepubertal years in young female gymnasts. However, given that peak speed and the Ratiovert–hori combined to explain 71% of the total variance in vaulting vertical TOV, to increase aerial time for more advanced vaulting, practitioners should enhance peak speed alongside take-off technique to develop gymnasts’ ability to transfer linear speed to vertical TOV.
Pedro L. Valenzuela, Jaime Gil-Cabrera, Eduardo Talavera, Lidia B. Alejo, Almudena Montalvo-Pérez, Cecilia Rincón-Castanedo, Iván Rodríguez-Hernández, Alejandro Lucia, and David Barranco-Gil
Purpose: To compare the effectiveness of resistance power training (RPT, training with the individualized load and repetitions that maximize power output) and cycling power training (CPT, short sprint training) in professional cyclists. Methods: The participants (20 y, peak oxygen uptake 78.0 [4.4] mL·kg−1·min−1) were randomly assigned to perform CPT (n = 8) or RPT (n = 10) in addition to their usual training regime for 7 weeks (2 sessions/wk). The training loads were continuously registered using the session rating of perceived exertion. The outcomes included endurance performance (8-min time trial and incremental test), as well as measures of muscle strength/power (1-repetition maximum and mean maximum propulsive power on the squat, hip thrust, and lunge exercises) and body composition (assessed by dual-energy X-ray absorptiometry). Results: No between-group differences were found for training loads or for any outcome (P > .05). Both interventions resulted in increased time-trial performance, as well as in improvements in other endurance-related outcomes (ie, ventilatory threshold, respiratory compensation point; P < .05). A significant or quasi-significant increase (P = .068 and .047 for CPT and RPT, respectively) in bone mineral content was observed after both interventions. A significant reduction in fat mass (P = .017), along with a trend (P = .059) toward a reduced body mass, was observed after RPT, but not CPT (P = .076 for the group × time interaction effect). Significant benefits (P < .05) were also observed for most strength-related outcomes after RPT, but not CPT. Conclusion: CPT and RPT are both effective strategies for the improvement of endurance performance and bone health in professional cyclists, although the latter tends to result in greater improvements in body composition and muscle strength/power.
Steve W. Thompson, David Rogerson, Alan Ruddock, Harry G. Banyard, and Andrew Barnes
Purpose: This study compared pooled against individualized load–velocity profiles (LVPs) in the free-weight back squat and power clean. Methods: A total of 10 competitive weightlifters completed baseline 1-repetition maximum assessments in the back squat and power clean. Three incremental LVPs were completed, separated by 48 to 72 hours. Mean and peak velocity were measured via a linear-position transducer (GymAware). Linear and nonlinear (second-order polynomial) regression models were applied to all pooled and individualized LVP data. A combination of coefficient of variation (CV), intraclass correlation coefficient, typical error of measurement, and limits of agreement assessed between-subject variability and within-subject reliability. Acceptable reliability was defined a priori as intraclass correlation coefficient > .7 and CV < 10%. Results: Very high to practically perfect inverse relationships were evident in the back squat (r = .83–.96) and power clean (r = .83–.89) for both regression models; however, stronger correlations were observed in the individualized LVPs for both exercises (r = .85–.99). Between-subject variability was moderate to large across all relative loads in the back squat (CV = 8.2%–27.8%) but smaller in the power clean (CV = 4.6%–8.5%). The power clean met our criteria for acceptable reliability across all relative loads; however, the back squat revealed large CVs in loads ≥90% of 1-repetition maximum (13.1%–20.5%). Conclusions: Evidently, load–velocity characteristics are highly individualized, with acceptable levels of reliability observed in the power clean but not in the back squat (≥90% of 1-repetition maximum). If practitioners want to adopt load–velocity profiling as part of their testing and monitoring procedures, an individualized LVP should be utilized over pooled LVPs.
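An individualized load–velocity profile of the kind compared above is obtained by regressing repetition velocity on relative load for a single athlete; the second-order polynomial variant simply adds a squared term. A minimal sketch of the linear model, with made-up incremental-loading data for one hypothetical lifter:

```python
def fit_linear_lvp(loads, velocities):
    """Ordinary least-squares fit of velocity = slope * load + intercept."""
    n = len(loads)
    mean_load = sum(loads) / n
    mean_vel = sum(velocities) / n
    sxx = sum((x - mean_load) ** 2 for x in loads)
    sxy = sum((x - mean_load) * (v - mean_vel) for x, v in zip(loads, velocities))
    slope = sxy / sxx
    return slope, mean_vel - slope * mean_load

# Hypothetical incremental-loading data for one lifter:
loads = [40, 50, 60, 70, 80, 90, 100]                    # relative load, %1RM
velocities = [1.05, 0.93, 0.80, 0.68, 0.55, 0.42, 0.30]  # mean velocity, m/s

slope, intercept = fit_linear_lvp(loads, velocities)
predicted_85 = slope * 85 + intercept  # predicted mean velocity at 85% 1RM
print(round(predicted_85, 3))  # -> 0.487
```

Fitting this model to each athlete's own data, rather than to pooled group data, is what the abstract means by an individualized LVP: the slope and intercept differ between lifters, which is why pooled profiles showed larger between-subject variability.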
Carl James, Aishwar Dhawan, Timothy Jones, and Olivier Girard
Purpose: To quantify the demands of specific on- and off-court sessions, using internal and external training-load metrics, in elite squash. Methods: A total of 15 professional squash players (11 males and 4 females) wore a 100-Hz triaxial accelerometer/global positioning system unit and heart-rate monitor during on-court “Group,” “Feeding,” “Ghosting,” “Matchplay,” and off-court “Conditioning” sessions across a 2-week in-season microcycle. Comparisons of absolute training load (total values) and relative intensity (per minute) were made between sessions for internal (session rating of perceived exertion, differential rating of perceived exertion, TRIMP) and external (PlayerLoad, very high-intensity movements [>3.5 m·s−2]) metrics. Results: The Group sessions were the longest (79 min), followed by Feeding (55 min), Matchplay (46 min), Conditioning (37 min), and Ghosting (35 min). Time >90% maximum heart rate was the lowest during Feeding (vs all others, P < .05), but the other sessions did not differ (all P > .05). Relative PlayerLoad during Conditioning (14.3 [3.3] arbitrary units [a.u.] per min, all P < .05) was higher than during Ghosting (7.5 [1.2] a.u./min) and Matchplay (6.9 [1.5] a.u./min), with no difference between these 2 sessions (P ≥ .999). Conditioning produced the highest PlayerLoads (519 a.u., all P < .001), with the highest on-court PlayerLoads from Group sessions (450 a.u., all P < .001). The highest session rating of perceived exertion (all P < .001), Edwards’ TRIMP (all P < .001), and TEAM-TRIMP (all P < .019) occurred during the Group sessions. Conclusions: Squash Matchplay does not systematically produce the highest training intensities and loads. Group sessions provide the highest training loads for many internal and external parameters and, therefore, play a central role within the training process. These findings facilitate planning or adjustment of intensity, volume, and frequency of sessions to achieve desirable physical outcomes.
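The Edwards TRIMP used above is an internal-load metric that weights the minutes spent in five heart-rate zones (50–60% up to 90–100% of maximal heart rate) by factors 1 to 5 and sums them. A minimal sketch of that calculation; the function name and the heart-rate data are illustrative, not from the study:

```python
def edwards_trimp(hr_samples, hr_max, sample_seconds=1):
    """Edwards TRIMP: minutes spent in five heart-rate zones, weighted 1-5.

    Zones (as %HRmax): 50-60 -> 1, 60-70 -> 2, 70-80 -> 3, 80-90 -> 4,
    90-100 -> 5. Time below 50% HRmax scores 0.
    hr_samples: heart-rate readings (beats/min) at a fixed sampling interval.
    """
    trimp = 0.0
    for hr in hr_samples:
        decile = int(10 * hr / hr_max)   # deciles 5..9 map to zones 1..5
        if decile >= 5:
            zone = min(decile - 4, 5)    # cap at zone 5 for readings >= HRmax
            trimp += zone * sample_seconds / 60.0
    return trimp

# Illustrative session: 10 readings taken 60 s apart at 150 beats/min,
# ie, 75% of a 200 beats/min maximum, which falls in zone 3.
print(edwards_trimp([150] * 10, hr_max=200, sample_seconds=60))  # -> 30.0
```

Because each minute above 90% HRmax scores 5 times as much as a minute at 50% to 60% HRmax, sustained high-intensity sessions such as the Group sessions accumulate TRIMP far faster than low-intensity technical work such as Feeding.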
Jeffrey A. Rothschild, Matthieu Delcourt, Ed Maunder, and Daniel J. Plews
Purpose: To present a case report of an elite ultra-endurance cyclist, who was the winner and course record holder of 2 distinct races within a 4-month span: a 24-hour solo cycling race and a 2-man team multiday race (Race Across America). Methods: The athlete’s raw data (cycling power, heart rate [HR], speed, and distance) were obtained and analyzed for 2 ultra-endurance races and 11 weeks of training in between. Results: For the 24-hour race, the athlete completed 861.6 km (average speed 35.9 km·h−1, average power 210 W [2.8 W·kg−1], average HR 121 beats per minute) with a 37% decrease in power and a 22% decrease in HR throughout the race. During the 11 weeks between the 24-hour race and Race Across America, training intensity distribution (Zone 1/2/3) based on HR was 51%/39%/10%. For the Race Across America, total team time to complete the 4939-km race was 6 days, 10 hours, 39 minutes, at an average speed of 31.9 km·h−1. Of this, the athlete featured in this case study rode 75.2 hours, completing 2532 km (average speed 33.7 km·h−1, average power 203 W [2.7 W·kg−1]), with a 12% decrease in power throughout the race. Power during daytime segments was greater than nighttime (212 vs 189 W, P < .001,
Simon A. Feros, Damon A. Bednarski, and Peter J. Kremer
Purpose: To investigate the relationship between prescribed (preDI), perceived (perDI), and actual delivery intensity (actDI) in cricket pace bowling. Methods: Fourteen male club-standard pace bowlers (mean [SD]: age 24.2 [3.2] y) completed 1 bowling session comprising 45 deliveries. The first 15 deliveries composed the warm-up, where participants bowled 3 deliveries each at a preDI of 60%, 70%, 80%, 90%, and 95%. Bowlers reported the perDI after each delivery. The fastest delivery in the session was used as a reference to calculate relative ball-release speed for the warm-up deliveries, with this measure representing the actDI. Ball-release speed was captured by a radar gun. Results: For perDI, there was a very large relationship with preDI (rs = .90, P < .001). Similarly, for actDI, there was a large relationship with preDI (rs = .52, P < .001). Higher concordance was observed between perDI and preDI from 60% to 80% preDI. A plateau was observed for actDI from 70% to 95% preDI. Conclusions: The relationships of perDI and actDI with preDI were very large and large, respectively, indicating that both variables can be used to monitor delivery intensity against the planned intensity and thus ensure healthy training adaptation. The optimal preDI that allowed pace bowlers to operate at submaximal perDI but still achieve close to maximal ball-release speeds was 70%. Bowling at the optimal preDI may significantly reduce the psychophysiological load per delivery in exchange for a trivial loss in ball-release speed.
Spencer S.H. Roberts, Emma Falkenberg, Alysha Stevens, Brad Aisbett, Michele Lastella, and Dominique Condo
Purpose: Australian football has elite men’s (Australian Football League; AFL) and women’s (Australian Football League Women’s; AFLW) competitions. This study compared AFL and AFLW players’ sleep and characterized players’ sleep in the context of current sleep recommendations. Methods: A total of 70 players (36 AFL, 34 AFLW) had their sleep monitored via actigraphy over a 10-day preseason period. Sleep outcomes and their intraindividual variation were compared between AFL and AFLW players using linear mixed models. Proportions of players sleeping ≥7 and ≥8 hours per night, and achieving ≥85% sleep efficiency, were compared using chi-square analyses. Results: Compared with AFL players, AFLW players slept less (AFL vs AFLW: 7.9 [0.5] vs 7.1 [0.6] h, P < .001), had lower sleep efficiency (89.5% [2.8%] vs 84.0% [4.4%], P < .001), and showed greater intraindividual variation in sleep efficiency (3.1% [0.9%] vs 5.1% [2.1%], P < .001). A total of 47% of AFLW versus 3% of AFL players averaged <7 hours of sleep (χ2 = 18.6, P < .001). A total of 88% of AFLW versus 50% of AFL players averaged <8 hours of sleep (χ2 = 11.9, P = .001). A total of 53% of AFLW versus 14% of AFL players averaged <85% sleep efficiency (χ2 = 12.1, P = .001). Conclusions: AFLW players slept less and had poorer sleep quality than AFL players. Many AFLW players do not meet current sleep duration or sleep quality recommendations. Research should test strategies to improve sleep among Australian rules footballers, particularly among elite women.