Restricted access

Mark J. Kilgallon, Michael J. Johnston, Liam P. Kilduff, and Mark L. Watsford

Purpose: To compare resistance training using a velocity loss threshold with training to repetition failure on upper-body strength parameters in professional Australian footballers. Methods: A total of 26 professional Australian footballers (23.9 [4.2] y, 189.9 [7.8] cm, 88.2 [8.8] kg) tested 1-repetition-maximum strength (FPmax) and mean barbell velocity at 85% of 1-repetition maximum on floor press (FPvel). They were then assigned to 2 training groups: 20% velocity loss threshold training (VL; n = 12, maximum-effort lift velocity) or training to repetition failure (TF; n = 14, self-selected lift velocity). Subjects trained twice per week for 3 weeks before being reassessed on FPmax and FPvel. Training volume (total repetitions) was recorded for all training sessions. No differences were present between groups on any pretraining measure. Results: The TF group significantly improved FPmax (105.2–110.9 kg, +5.4%), while the VL group did not (107.5–109.2 kg, +1.6%) (P > .05). Both groups significantly increased FPvel (0.38–0.46 m·s−1, +19.1% and 0.37–0.42 m·s−1, +16.7%, respectively) with no between-groups differences evident (P > .05). The TF group performed significantly more training volume (12.2 vs 6.8 repetitions per session, P < .05). Conclusions: Training to repetition failure improved FPmax, while training using a velocity loss threshold of 20% did not. Both groups demonstrated similar improvements in FPvel despite the VL group completing 45% less total training volume than the TF group. The reduction in training volume associated with implementing a 20% velocity loss threshold may negatively impact the development of upper-body maximum strength while still enhancing submaximal movement velocity.
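The 20% velocity loss threshold described above amounts to a simple stopping rule: terminate the set once a repetition's mean barbell velocity falls more than 20% below the fastest repetition of that set. The sketch below is illustrative only (the function name is invented, and the study does not specify whether loss is referenced to the fastest or the first repetition; the fastest is assumed here):

```python
def reps_before_velocity_loss(rep_velocities, loss_threshold=0.20):
    """Count repetitions completed before mean barbell velocity drops
    below (1 - loss_threshold) of the fastest rep of the set so far."""
    completed = 0
    best = 0.0
    for v in rep_velocities:
        best = max(best, v)
        if v < (1.0 - loss_threshold) * best:
            break  # velocity loss threshold exceeded: stop the set
        completed += 1
    return completed

# Example: mean rep velocities (m/s) across a hypothetical set to failure
set_velocities = [0.42, 0.41, 0.38, 0.35, 0.31, 0.27]
print(reps_before_velocity_loss(set_velocities))  # set stops after 4 reps
```

Under this rule the final two repetitions would not be performed, which is how the VL group accumulated roughly half the training volume of the TF group.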

Restricted access

Alice Wallett, Julien D. Périard, Philo Saunders, and Andrew McKune

Along with digestion and absorption of nutrients, the gastrointestinal epithelium acts as a primary intestinal defense layer, preventing luminal pathogens from entering the circulation. During exercise in the heat, epithelial integrity can become compromised, allowing bacteria and bacterial endotoxins to translocate into circulation, triggering a systemic inflammatory response and exacerbating gastrointestinal damage. While this relationship seems clear in the general population during endurance/ultraendurance exercise, the aim of this systematic review was to evaluate the effect of exercise in the heat on blood markers of gastrointestinal epithelial disturbance in well-trained individuals. Following the 2009 Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, five electronic databases were searched for appropriate research, and 1,885 studies were identified. Five studies met the inclusion criteria and were subjected to full methodological appraisal by two reviewers. Critical appraisal of the studies was conducted using the McMaster Critical Review Form. The studies investigated changes in markers of gastrointestinal damage (intestinal fatty acid–binding protein, endotoxin, and/or lipopolysaccharide-binding protein) following acute exercise in warm to hot conditions (≥ 30 °C) and included trained or well-trained participants with direct comparisons to a control temperate condition (≤ 22 °C). The studies found that prolonged submaximal and strenuous exercise in hot environmental conditions can acutely increase epithelial disturbance compared with exercise in cooler conditions, although the disturbances were not clinically relevant. Trained and well-trained populations therefore appear to tolerate exercise-induced gastrointestinal disturbance in the heat. Whether this is an acquired tolerance related to regular training remains to be investigated.

Restricted access

Andrew A. Flatt, Jeff R. Allen, Clay M. Keith, Matthew W. Martinez, and Michael R. Esco

Purpose: To track cardiac-autonomic functioning, indexed by heart-rate variability, in American college football players throughout a competitive period. Methods: Resting heart rate (RHR) and the natural logarithm root mean square of successive differences (LnRMSSD) were obtained throughout preseason and ∼3 times weekly leading up to the national championship among 8 linemen and 12 nonlinemen. Seated 1-minute recordings were performed via mobile device and standardized for time of day and proximity to training. Results: Relative to preseason, linemen exhibited suppressed LnRMSSD during camp-style preparation for the playoffs (P = .041, effect size [ES] = −1.01), the week of the national semifinal (P < .001, ES = −1.27), and the week of the national championship (P = .005, ES = −1.16). As a combined group, increases in RHR (P < .001) were observed at the same time points (nonlinemen ES = 0.48–0.59, linemen ES = 1.03–1.10). For all linemen, RHR trended upward (positive slopes, R² = .02–.77) while LnRMSSD trended downward (negative slopes, R² = .02–.62) throughout the season. Preseason to postseason changes in RHR (r = .50, P = .025) and LnRMSSD (r = −.68, P < .001) were associated with body mass. Conclusions: Heart-rate variability tracking revealed progressive autonomic imbalance in the lineman position group, with individual players showing suppressed values by midseason. Attenuated parasympathetic activation is a hallmark of impaired recovery and may contribute to cardiovascular maladaptations reported to occur in linemen following a competitive season. Thus, a descending pattern may serve as an easily identifiable red flag requiring attention from performance and medical staff.
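LnRMSSD, the metric tracked above, is the natural logarithm of the root mean square of successive differences between adjacent R-R intervals. A minimal sketch of the computation (function name and sample values are illustrative, not from the study):

```python
import math

def ln_rmssd(rr_intervals_ms):
    """Natural log of the root mean square of successive differences
    (RMSSD) between adjacent R-R intervals, given in milliseconds."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return math.log(rmssd)

# Example: a short series of R-R intervals (ms) from a seated recording
rr = [820, 850, 810, 840, 830, 860]
print(round(ln_rmssd(rr), 2))
```

In practice the study derived LnRMSSD from 1-minute seated recordings; shrinking successive differences (vagal withdrawal) drive the value down, which is the descending pattern flagged in linemen.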

Open access

Naroa Etxebarria, Nicole A. Beard, Maree Gleeson, Alice Wallett, Warren A. McDonald, Kate L. Pumpa, and David B. Pyne

Gastrointestinal disturbances are one of the most common issues for endurance athletes during training and competition in the heat. The relationship between typical dietary intake or nutritional interventions and perturbations in or maintenance of gut integrity is unclear. Twelve well-trained male endurance athletes (peak oxygen consumption = 61.4 ± 7.0 ml·kg−1·min−1) completed two trials in a randomized order in 35 °C (heat) and 21 °C (thermoneutral) conditions and kept a detailed nutritional diary for eight consecutive days between the two trials. The treadmill running trials consisted of 15 min at 60% peak oxygen consumption, 15 min at 75% peak oxygen consumption, followed by 8 × 1-min high-intensity efforts. Venous blood samples were taken at the baseline, at the end of each of the three exercise stages, and 1 hr postexercise to measure gut integrity and the permeability biomarker concentration for intestinal fatty-acid-binding protein, lipopolysaccharide, and lipopolysaccharide-binding protein. The runners self-reported gut symptoms 1 hr postexercise and 3 days postexercise. The heat condition induced large (45–370%) increases in intestinal fatty-acid-binding protein, lipopolysaccharide-binding protein, and lipopolysaccharide concentrations compared with the baseline, but induced mild gastrointestinal symptoms. Carbohydrate and polyunsaturated fat intake 24 hr preexercise were associated with less lipopolysaccharide translocation. Protein, carbohydrate, total fat, and polyunsaturated fat intake (8 days) were positively associated with the percentage increase of intestinal fatty-acid-binding protein in both conditions (range of correlations, 95% confidence interval = .62–.93 [.02, .98]). Typical nutrition intake partly explained increases in biomarkers and the attenuation of symptoms induced by moderate- and high-intensity exercise under both heat and thermoneutral conditions.

Restricted access

Antonis Kesisoglou, Andrea Nicolò, Lucinda Howland, and Louis Passfield

Purpose: To examine the effect of continuous (CON) and intermittent (INT) running training sessions of different durations and intensities on subsequent performance and calculated training load (TL). Methods: Runners (N = 11) performed a 1500-m time trial as a baseline and after completing 4 different running training sessions. The training sessions were performed in a randomized order and were either maximal for 10 minutes (10CON and 10INT) or submaximal for 25 minutes (25CON and 25INT). An acute performance decrement (APD) was calculated as the percentage change in 1500-m time-trial speed measured after training compared with baseline. The pattern of APD response was compared with that for several TL metrics (bTRIMP, eTRIMP, iTRIMP, running training stress score, and session rating of perceived exertion) for the respective training sessions. Results: Average speed (P < .001, ηp² = .924) was different for each of the initial training sessions, which all resulted in a significant APD. This APD was similar when compared across the sessions except for a greater APD found after 10INT versus 25CON (P = .02). In contrast, most TL metrics were different and showed the opposite response to APD, being higher for CON versus INT and lower for 10- versus 25-minute sessions (P < .001, ηp² > .563). Conclusion: An APD was observed consistently after running training sessions, but it was not consistent with most of the calculated TL metrics. The lack of agreement found between APD and TL suggests that current methods for quantifying TL are flawed when used to compare CON and INT running training sessions of different durations and intensities.
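The acute performance decrement defined above is simply the percentage change in mean 1500-m time-trial speed relative to the baseline trial. A sketch (function and example values are hypothetical, not taken from the study's data):

```python
def acute_performance_decrement(baseline_speed, post_speed):
    """APD: percentage change in 1500-m time-trial speed after a
    training session versus baseline (negative = performance decrement)."""
    return (post_speed - baseline_speed) / baseline_speed * 100.0

# Example: baseline 6.00 m/s, 5.70 m/s after a maximal 10-minute session
print(round(acute_performance_decrement(6.00, 5.70), 1))  # -5.0 (% decrement)
```

The study's point is that this directly measured decrement ordered the sessions differently from model-based TL metrics such as bTRIMP or session RPE.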

Restricted access

Bjarne Rud, Eivind Øygard, Even B. Dahl, Gøran Paulsen, and Thomas Losnegard

Purpose: We tested whether a single session of heavy-load resistance priming conducted in the morning improved double-poling (DP) performance in the afternoon. Methods: Eight national-level male cross-country skiers (mean [SD]: 23 [3] y, 184 [6] cm, 73 [7] kg, maximum oxygen consumption = 69 [6] mL·kg−1·min−1) carried out 2 days of afternoon performance tests. In the morning, 5 hours before tests, subjects were counterbalanced to either a session of 3 × 3 repetitions (approximately 85%–90% 1-repetition maximum) of squat and sitting pullover exercises or no exercise. The performance was evaluated in DP as time to exhaustion (TTE) (approximately 3 min) on a treadmill and 30-m indoor sprints before and after TTE (30-m DP pre/post). Furthermore, submaximal DP oxygen cost, countermovement jump, and isometric knee-extension force during electrical stimulation were conducted. Participants reported perceived readiness on test days. Results: Resistance exercise session versus no exercise did not differ for TTE (mean ± 95% confidence interval = 3.6% ± 6.0%; P = .29; effect size [ES], Cohen d = 0.27), 30-m DP pre (−0.56% ± 0.80%; P = .21; ES = 0.20), 30-m DP post (−0.18% ± 1.13%; P = .76; ES = 0.03), countermovement jump (−2.0% ± 2.8%; P = .21; ES = 0.12), DP oxygen cost (−0.13% ± 2.04%; P = .91; ES = 0.02), or perceived readiness (P ≥ .11). Electrical stimulation force was not different in contraction or relaxation time but revealed low-frequency fatigue in the afternoon for the resistance exercise session only (−12% [7%]; P = .01; ES = 1.3). Conclusion: A single session of heavy-load, low-volume resistance exercise in the morning did not increase afternoon DP performance of short duration in high-level skiers. However, leg low-frequency fatigue after resistance priming, together with the presence of small positive effects in 2 out of 3 DP tests, may indicate that the preconditioning was too strenuous.

Open access

Megan A. Kuikman, Margo Mountjoy, Trent Stellingwerff, and Jamie F. Burr

Restricted access

Maria Misailidi, Konstantinos Mantzios, Christos Papakonstantinou, Leonidas G. Ioannou, and Andreas D. Flouris

Purpose: We investigated the environmental conditions in which all outdoor International Tennis Federation (ITF) junior tournaments (athlete ages: <18 y) were held during 2010–2019. Thereafter, we performed a crossover trial (ClinicalTrials.gov: NCT04197375) assessing the efficacy of head–neck precooling for mitigating the heat-induced psychophysical and performance impacts on junior athletes during tennis match play. Methods: ITF junior tournament information was collected. We identified meteorological data from nearby (13.6 [20.3] km) weather stations for 3056 (76%) tournaments. Results: Overall, 30.1% of tournaments were held in hot (25°C–30°C wet-bulb globe temperature [WBGT]; 25.9%), very hot (30°C–35°C WBGT; 4.1%), or extremely hot (>35°C WBGT; 0.1%) conditions. Thereafter, 8 acclimatized male junior tennis athletes (age = 16.0 [0.9] y; height = 1.82 [0.04] m; weight = 71.3 [11.1] kg) were evaluated during 2 matches: one with head–neck precooling (27.7°C [2.2°C] WBGT) and one without (27.9°C [1.8°C] WBGT). Head–neck precooling reduced athletes’ core temperature from 36.9°C (0.2°C) to 36.4°C (0.2°C) (P = .001; d = 2.4), an effect reduced by warm-up. Head–neck precooling reduced skin temperature (by 0.3°C [1.3°C]) for the majority of the match and led to improved (P < .05) perceived exertion (by 13%), thermal comfort (by 14%), and thermal sensation (by 15%). Muscle temperature, heart rate, body weight, and urine specific gravity remained unaffected (P ≥ .05; d < 0.2). Small or moderate improvements were observed in most performance parameters assessed (d = 0.20–0.79). Conclusions: Thirty percent of the last decade’s ITF junior tournaments were held in hot, very hot, or extremely hot conditions (25°C–36°C WBGT). In such conditions, head–neck precooling may somewhat lessen the physiological and perceptual heat strain and lead to small to moderate improvements in the match-play performance of adolescent athletes.
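The heat categories used above map directly onto WBGT bands. A small classifier under the thresholds stated in the abstract (the function name is invented, and the treatment of the shared 30 °C boundary between "hot" and "very hot" is an assumption, since the reported bands overlap at their edges):

```python
def heat_category(wbgt_c):
    """Classify outdoor conditions by wet-bulb globe temperature (°C),
    using the bands reported in the abstract; boundary handling assumed."""
    if wbgt_c > 35:
        return "extremely hot"
    if wbgt_c >= 30:
        return "very hot"
    if wbgt_c >= 25:
        return "hot"
    return "below hot"

print(heat_category(27.9))  # the study's match-play condition falls in "hot"
```

Applied to the 3056 tournaments with meteorological data, these bands yield the 25.9% / 4.1% / 0.1% split reported above.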

Open access

Charli Sargent, Michele Lastella, Shona L. Halson, and Gregory D. Roach

Purpose: Anecdotal reports indicate that many elite athletes are dissatisfied with their sleep, but little is known about their actual sleep requirements. Therefore, the aim of this study was to compare the self-assessed sleep need of elite athletes with an objective measure of their habitual sleep duration. Methods: Participants were 175 elite athletes (n = 30 females), age 22.2 (3.8) years (mean [SD]) from 12 individual and team sports. The athletes answered the question “how many hours of sleep do you need to feel rested?” and they kept a self-report sleep diary and wore a wrist activity monitor for ∼12 nights during a normal phase of training. For each athlete, a sleep deficit index was calculated by subtracting their average sleep duration from their self-assessed sleep need. Results: The athletes needed 8.3 (0.9) hours of sleep to feel rested, their average sleep duration was 6.7 (0.8) hours, and they had a sleep deficit index of 96.0 (60.6) minutes. Only 3% of athletes obtained enough sleep to satisfy their self-assessed sleep need, and 71% of athletes fell short by an hour or more. Habitual sleep duration was shorter in athletes from individual sports than in athletes from team sports (F1,173 = 13.1, P < .001; d = 0.6, medium), despite their similar sleep need (F1,173 = 1.40, P = .24; d = 0.2, small). Conclusions: The majority of elite athletes obtain substantially less than their self-assessed sleep need. This is a critical finding, given that insufficient sleep may compromise an athlete’s capacity to train effectively and/or compete optimally.
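The sleep deficit index defined above is a simple difference, converted here to minutes (a sketch; the function name is illustrative, not from the study):

```python
def sleep_deficit_index_minutes(self_assessed_need_h, avg_sleep_duration_h):
    """Sleep deficit index: self-assessed sleep need minus habitual
    sleep duration, expressed in minutes (positive = sleep deficit)."""
    return (self_assessed_need_h - avg_sleep_duration_h) * 60.0

# Group means from the study: need 8.3 h, habitual duration 6.7 h
print(round(sleep_deficit_index_minutes(8.3, 6.7), 1))
```

Applied to the group means, this reproduces the 96-minute mean deficit reported in the Results.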