Search Results

You are looking at 1–10 of 14 items for

  • Author: Christopher J. Stevens

Free access

Strategies to Involve End Users in Sport-Science Research

Christopher J. Stevens and Christian Swann

Restricted access

Development of a “Cooling” Menthol Energy Gel for Endurance Athletes: Effect of Menthol Concentration on Acceptability and Preferences

Christopher J. Stevens, Megan L.R. Ross, and Roxanne M. Vogel

Menthol is effective at stimulating thermosensitive neurons that evoke pleasant cooling sensations. Internal application of menthol can be ergogenic for athletes, and hence the addition of menthol to sports nutrition products may be beneficial. The aim of this study was to develop a menthol energy gel for consumption during exercise and to determine acceptability and preferences for gels with different menthol concentrations. In a randomized, crossover, double-blind, placebo-controlled design, 40 endurance athletes (20 females) ingested an energy gel with a menthol additive at a high (0.5%; HIGH) or low (0.1%; LOW) concentration, or a mint-flavored placebo (CON), on separate occasions during outdoor endurance training sessions. The athletes rated the gels for cooling sensation, mint flavor intensity, sweetness, and overall experience and provided feedback. Results are reported as median (interquartile range). Both menthol gels successfully delivered a cooling sensation, with a significantly greater response for HIGH (5.0 [4.0–5.0]) compared with LOW (3.5 [3.0–4.0]; p = .022) and CON (1.0 [1.0–2.0]; p < .0005), and for LOW compared with CON (p < .0005). Ratings of mint flavor intensity followed the same trend as cooling sensation, while ratings of overall experience were significantly worse for HIGH (2.0 [1.0–3.0]) compared with LOW (4.0 [2.0–4.0]; p = .001) and CON (4.0 [3.0–4.0]; p < .0005). An energy gel with added menthol at 0.1%–0.5% provides a dose-dependent cooling sensation for athletes when ingested during exercise. The 0.1% concentration is recommended to maximize the overall experience of the gel.
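
The abstract reports paired ordinal ratings as median (interquartile range) with pairwise p values but does not name the statistical tests used. As a hedged illustration only, the Python sketch below shows one common approach for this kind of three-condition crossover data (a Friedman omnibus test with pairwise Wilcoxon signed-rank follow-ups) on simulated 1–5 ratings; all data and names here are hypothetical.

```python
# Illustrative sketch only: simulated 1-5 ratings for three gel conditions in a
# crossover design, summarized as median [IQR] and compared with a Friedman test
# plus pairwise Wilcoxon signed-rank tests. These are NOT the study's data or
# necessarily the authors' tests.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 40  # 40 endurance athletes, each rating all three gels

high = rng.integers(4, 6, n)   # HIGH: 0.5% menthol (hypothetical ratings)
low = rng.integers(3, 5, n)    # LOW: 0.1% menthol
con = rng.integers(1, 3, n)    # CON: mint-flavored placebo

def median_iqr(x):
    q1, med, q3 = np.percentile(x, [25, 50, 75])
    return f"{med:.1f} [{q1:.1f}-{q3:.1f}]"

for name, x in [("HIGH", high), ("LOW", low), ("CON", con)]:
    print(name, median_iqr(x))

# Omnibus comparison across the three conditions, then pairwise follow-ups
print("Friedman:", stats.friedmanchisquare(high, low, con))
print("HIGH vs LOW:", stats.wilcoxon(high, low))
print("HIGH vs CON:", stats.wilcoxon(high, con))
print("LOW vs CON:", stats.wilcoxon(low, con))
```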

Restricted access

Dose–Response and Time Course Effects of Acute Resistance Exercise on Executive Function

Christopher J. Brush, Ryan L. Olson, Peter J. Ehmann, Steven Osovsky, and Brandon L. Alderman

The purpose of this study was to examine possible dose–response and time course effects of an acute bout of resistance exercise on the core executive functions of inhibition, working memory, and cognitive flexibility. Twenty-eight participants (14 female; M age = 20.5 ± 2.1 years) completed a control condition and resistance exercise bouts performed at 40%, 70%, and 100% of their individual 10-repetition maximum. An executive function test battery was administered at 15 min and 180 min postexercise to assess immediate and delayed effects of exercise on executive functioning. At 15 min postexercise, high-intensity exercise resulted in less interference and improved reaction time (RT) for the Stroop task, while at 180 min low- and moderate-intensity exercise resulted in improved performance on plus–minus and Simon tasks, respectively. These findings suggest a limited and task-specific influence of acute resistance exercise on executive function in healthy young adults.

Restricted access

Changes in Core Temperature During an Elite Female Rugby Sevens Tournament

Mitchell J. Henderson, Bryna C.R. Chrismas, Christopher J. Stevens, Aaron J. Coutts, and Lee Taylor

Purpose: To characterize player core temperature (Tc) across a World Rugby Women’s Sevens Series (WRWSS) tournament day and determine the efficacy of commonly employed cold-water-immersion (CWI) protocols. Methods: Tc was measured in 12 elite female rugby sevens athletes across 3 games (G1–3) from day 1 of the Sydney WRWSS tournament. Symptoms of exertional heat illness, perceptual scales, CWI details, playing minutes, external-load data (measured by global positioning systems), and wet-bulb globe temperature (range 18.5°C–20.1°C) were also collected. Linear mixed models and magnitude-based inferences were used to assess differences in Tc between periods (G1–3 and warm-ups [WU]). Results: Average Tc was “very likely” lower in G1 than in G2 (effect size; ±90% confidence limit: −0.33; ±0.18). Peak Tc was “very likely” (0.71; ±0.28) associated with increased playing time. CWI did not remove the rise in Tc accumulated during the WU and match-play activity (a ∼1°C–2°C rise in Tc was still present, relative to Tc at WU onset, for players with ≥6 min of match play). Conclusions: Elite WRWSS athletes experienced high Tc during WU (Tc peak 37.9–39.0°C) and matches (Tc peak 37.9–39.8°C), a magnitude known to reduce intermittent high-intensity physical performance (≥39°C). Following the CWI protocol, players with ≥6 min of match play still had a Tc ∼1°C to 2°C above their Tc at WU onset.
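
For readers unfamiliar with the linear-mixed-model step mentioned in the abstract, a minimal sketch is given below: core temperature is modeled across tournament periods with a random intercept per athlete. The data, variable names, and period coding are assumptions for illustration, and the magnitude-based-inference step is not reproduced.

```python
# Minimal sketch of a random-intercept mixed model for core temperature (Tc)
# across tournament periods. Simulated data; not the study's dataset or code.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
athletes = [f"A{i:02d}" for i in range(1, 13)]      # 12 elite athletes
periods = ["WU1", "G1", "WU2", "G2", "WU3", "G3"]   # assumed period coding

rows = []
for athlete in athletes:
    baseline = rng.normal(37.0, 0.2)                # athlete-specific baseline Tc
    for j, period in enumerate(periods):
        rows.append({"athlete": athlete, "period": period,
                     "tc": baseline + 0.3 * j + rng.normal(0, 0.2)})
df = pd.DataFrame(rows)

# Fixed effect of period, random intercept per athlete
model = smf.mixedlm("tc ~ C(period)", df, groups=df["athlete"]).fit()
print(model.summary())
```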

Restricted access

Heat Stress Training Camps for Endurance Sport: A Descriptive Case Study of Successful Monitoring in 2 Ironman Triathletes

Ed Maunder, Andrew E. Kilding, Christopher J. Stevens, and Daniel J. Plews

A common practice among endurance athletes is to purposefully train in hot environments during a “heat stress camp.” However, combined exercise-heat stress poses threats to athlete well-being, and therefore, heat stress training has the potential to induce maladaptation. This case study describes the monitoring strategies used in a successful 3-week heat stress camp undertaken by 2 elite Ironman triathletes, namely resting heart rate variability, self-reported well-being, and careful prescription of training based on previously collected physiological data. Despite the added heat stress, training volume very likely increased in both athletes, and training load very likely increased in one of the athletes, while resting heart rate variability and self-reported well-being were maintained. There was also some evidence of favorable metabolic changes during routine laboratory testing following the camp. The authors therefore recommend that practitioners working with endurance athletes embarking on a heat stress training camp consider using the simple strategies employed in the present case study to reduce the risk of maladaptation and nonfunctional overreaching.

Restricted access

Comparison of 5 Normalization Methods for Knee Joint Moments in the Single-Leg Squat

Steven M. Hirsch, Christopher J. Chapman, David M. Frost, and Tyson A.C. Beach

Ratio scaling is the most common magnitude normalization approach for net joint moment (NJM) data. Generally, researchers compute a ratio between NJM and (some combination of) physical body characteristics (eg, mass, height, limb length). However, 3 assumptions must be verified when normalizing NJM data this way. First, the regression line between NJM and the characteristic(s) used passes through the origin. Second, normalizing NJM eliminates its correlation with the characteristic(s). Third, the statistical interpretations following normalization are consistent with those from adjusted linear models. The study purpose was to assess these assumptions using data collected from 16 males and 16 females who performed a single-leg squat. Standard inverse dynamics analyses were conducted, and ratios were computed between the mediolateral and anteroposterior components of the knee NJM and participant mass, height, leg length, mass × height, and mass × leg length. Normalizing NJM-mediolateral by mass × height and mass × leg length satisfied all 3 assumptions. Normalizing NJM-anteroposterior by height and leg length satisfied all 3 assumptions. Therefore, if normalization of the knee NJM is deemed necessary to address a given research question, it cannot be assumed that using (any combination of) participant mass, height, or leg length as the denominator is appropriate, nor that the same denominator is appropriate across joint axes.
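
The three assumption checks described above lend themselves to a short worked example. The Python sketch below runs each check on simulated data; the variable names, values, and model choices are illustrative assumptions, not the study's data or code.

```python
# Sketch of the three ratio-scaling checks on simulated data:
# (1) does the NJM-vs-mass regression line pass through the origin,
# (2) is NJM/mass still correlated with mass after normalization, and
# (3) does a group comparison on the ratio agree with a mass-adjusted linear model?
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(2)
n = 32
mass = rng.normal(72, 10, n)                          # body mass (kg), hypothetical
group = np.repeat([0, 1], n // 2)                     # e.g., 16 males, 16 females
njm = 0.9 * mass + 5 * group + rng.normal(0, 6, n)    # knee NJM component (N*m)

# (1) Intercept of NJM ~ mass: ratio scaling assumes it is compatible with zero
fit = sm.OLS(njm, sm.add_constant(mass)).fit()
print("intercept:", fit.params[0], "95% CI:", fit.conf_int()[0])

# (2) Correlation between the ratio-scaled NJM and mass: ~0 if normalization worked
ratio = njm / mass
print("r(NJM/mass, mass):", stats.pearsonr(ratio, mass))

# (3) Group effect on the ratio vs. the group effect in a mass-adjusted model
print("ratio ~ group p-value:", sm.OLS(ratio, sm.add_constant(group)).fit().pvalues[1])
X = sm.add_constant(np.column_stack([group, mass]))
print("NJM ~ group + mass p-value:", sm.OLS(njm, X).fit().pvalues[1])
```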

Open access

Blood Flow Restriction Therapy Versus Standard Care for Reducing Quadriceps Atrophy After Anterior Cruciate Ligament Reconstruction

Lauren Anne Lipker, Caitlyn Rae Persinger, Bradley Steven Michalko, and Christopher J. Durall

Clinical Scenario: Quadriceps atrophy and weakness are common after anterior cruciate ligament reconstruction (ACLR). Blood flow restriction (BFR) therapy, alone or in combination with exercise, has shown some promise in promoting muscular hypertrophy. This review was conducted to ascertain the extent to which current evidence supports the use of BFR for reducing quadriceps atrophy following ACLR in comparison with standard care. Clinical Question: Is BFR more effective than standard care for reducing quadriceps atrophy after ACLR? Summary of Key Findings: The literature was searched for studies that directly compared BFR treatment to standard care in patients with ACLR. Three level I randomized controlled trials retrieved from the literature search met the inclusion criteria. Clinical Bottom Line: Reviewed data suggest that a short duration (13 d) of moderate-pressure BFR combined with low-resistance muscular training does not measurably affect quadriceps cross-sectional area. However, a relatively long duration (15 wk) of moderate-pressure BFR combined with low-resistance muscular training may increase quadriceps cross-sectional area to a greater extent than low-resistance muscular training alone. The results of the third randomized controlled trial suggest that employing BFR during immobilization in the early postoperative period may reduce quadriceps atrophy following ACLR. Additional data are needed to establish whether the benefits of BFR on quadriceps atrophy after ACLR outweigh the inherent risks and costs. Strength of Recommendation: All evidence for this review was level 1 (randomized controlled trial) based on the Centre for Evidence-Based Medicine criteria. However, the findings were inconsistent across the 3 studies regarding the effects of BFR on quadriceps atrophy, resulting in a grade “B” strength of recommendation.

Restricted access

Limiting the Rise in Core Temperature During a Rugby Sevens Warm-Up With an Ice Vest

Lee Taylor, Christopher J. Stevens, Heidi R. Thornton, Nick Poulos, and Bryna C.R. Chrismas

Purpose: To determine how a cooling vest worn during a warm-up could influence selected performance (countermovement jump [CMJ]), physical (global positioning system [GPS] metrics), and psychophysiological (body temperature and perceptual) variables. Methods: In a randomized, crossover design, 12 elite male World Rugby Sevens Series athletes completed an outdoor (wet bulb globe temperature 23–27°C) match-specific, externally valid 30-min warm-up wearing a phase-change cooling vest (VEST) and without one (CONTROL), on separate occasions 7 d apart. CMJ was assessed before and after the warm-up, with GPS indices and heart rate monitored during the warm-ups, while core temperature (Tc; ingestible telemetric pill; n = 6) was recorded throughout the experimental period. Measures of thermal sensation (TS) and thermal comfort (TC) were obtained pre-warm-up and post-warm-up, with rating of perceived exertion (RPE) taken post-warm-up. Results: Athletes in VEST had a lower ΔTc (mean [SD]: VEST = 1.3°C [0.1°C]; CONTROL = 2.0°C [0.2°C]) from pre-warm-up to post-warm-up (effect size; ±90% confidence limit: −1.54; ±0.62) and a lower Tc peak (mean [SD]: VEST = 37.8°C [0.3°C]; CONTROL = 38.5°C [0.3°C]) at the end of the warm-up (−1.59; ±0.64) compared with CONTROL. Athletes in VEST demonstrated a decrease in ΔTS (−1.59; ±0.72) and ΔTC (−1.63; ±0.73) from pre-warm-up to post-warm-up, with a lower RPE post-warm-up (−1.01; ±0.46) than CONTROL. Changes in CMJ and GPS indices were trivial between conditions (effect size < 0.2). Conclusions: Wearing the vest prior to and during a warm-up can elicit favorable alterations in physiological (Tc) and perceptual (TS, TC, and RPE) warm-up responses, without compromising the utilized warm-up characteristics or physical-performance measures.
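
The “effect size; ±90% confidence limit” style of reporting used above can be illustrated with a short sketch. The abstract does not state the exact effect-size formula, so the code below shows one plausible reading (a paired standardized effect with a bootstrap 90% interval) on simulated ΔTc values; it is not the authors' calculation.

```python
# Illustrative sketch: paired standardized effect size with a bootstrap 90% CI
# on simulated pre-to-post warm-up Tc rises. Not the study's data or method.
import numpy as np

rng = np.random.default_rng(3)
n = 6  # Tc was recorded in 6 athletes

delta_vest = rng.normal(1.3, 0.1, n)      # hypothetical Tc rise with the vest (deg C)
delta_control = rng.normal(2.0, 0.2, n)   # hypothetical Tc rise without it (deg C)

def paired_effect_size(a, b):
    diff = a - b
    return diff.mean() / diff.std(ddof=1)

d = paired_effect_size(delta_vest, delta_control)

# Bootstrap the paired effect size to obtain a 90% interval
boot = []
for _ in range(5000):
    idx = rng.integers(0, n, n)
    if np.unique(idx).size > 1:           # skip degenerate resamples (zero variance)
        boot.append(paired_effect_size(delta_vest[idx], delta_control[idx]))
lo, hi = np.percentile(boot, [5, 95])
print(f"effect size {d:.2f}, 90% CI [{lo:.2f}, {hi:.2f}]")
```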

Restricted access

Limiting Rise in Heat Load With an Ice Vest During Elite Female Rugby Sevens Warm-Ups

Mitchell J. Henderson, Bryna C.R. Chrismas, Christopher J. Stevens, Job Fransen, Aaron J. Coutts, and Lee Taylor

Purpose: To determine the effect of wearing a phase-change cooling vest in elite female rugby sevens athletes during (1) a simulated match-day warm-up in hot conditions prior to a training session and (2) a prematch warm-up during a tournament in cool conditions. Methods: This study consisted of 2 randomized independent group designs (separated by 16 d) where athletes completed the same 23- to 25-minute match-day warm-up (1) in hot conditions (range = 28.0°C to 35.1°C wet bulb globe temperature [WBGT]) prior to training and (2) in cool conditions (range = 18.8°C to 20.1°C WBGT) prior to a World Rugby Women’s Sevens Series match. In both conditions, athletes were randomly assigned to wearing either (1) the standardized training/playing ensemble (synthetic rugby shorts and training tee/jersey) or (2) the standardized training/playing ensemble plus a commercial phase-change athletic cooling vest. Group-wise differences in core temperature rise from baseline, global positioning system–measured external locomotive output, and perceptual thermal load were compared. Results: Core temperature rise during a match warm-up was lower in the hot condition only (−0.65°C [95% confidence interval = −1.22°C to −0.08°C], ηp² = .23 [95% confidence interval = .00 to .51], P = .028). No differences in various external-load variables were observed. Conclusions: Phase-change cooling vests can be worn by athletes prior to, and during, a prematch warm-up in hot conditions to limit excess core temperature rise without adverse effects on thermal perceptions or external locomotion output.

Restricted access

Additional Clothing Increases Heat Load in Elite Female Rugby Sevens Players

Mitchell J. Henderson, Bryna C.R. Chrismas, Christopher J. Stevens, Andrew Novak, Job Fransen, Aaron J. Coutts, and Lee Taylor

Purpose: To determine whether elite female rugby sevens players are exposed to core temperatures (Tc) during training in the heat that replicate the temperate match demands previously reported and to investigate whether additional clothing worn during a hot training session meaningfully increases the heat load experienced. Methods: A randomized parallel-group study design was employed, with all players completing the same approximately 70-minute training session (27.5°C–34.8°C wet bulb globe temperature) and wearing a standardized training ensemble (synthetic rugby shorts and training tee [control (CON); n = 8]) or additional clothing (standardized training ensemble plus compression garments and full tracksuit [additional clothing (AC); n = 6]). Groupwise differences in Tc, sweat rate, GPS-measured external locomotive output, rating of perceived exertion, and perceptual thermal load were compared. Results: Mean (P = .006, ηp² = .88) and peak (P < .001, ηp² = .97) Tc were higher in AC compared with CON during the training session. There were no differences in external load (F4,9 = 0.155, P = .956, Wilks Λ = 0.935, ηp² = .06) or sweat rate (P = .054, Cohen d = 1.09). A higher rating of perceived exertion (P = .016, Cohen d = 1.49) was observed in AC compared with CON. No exertional-heat-illness symptomology was reported in either group. Conclusions: Player Tc is similar between training performed in hot environments and match play in temperate conditions when players are involved for >6 minutes. Additional clothing is a viable and effective method to increase heat strain in female rugby sevens players without compromising training specificity or external locomotive capacity.
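
As a quick worked example of the partial eta squared (ηp²) values reported above: for a univariate F test, ηp² can be recovered from the F statistic and its degrees of freedom as F × df_effect / (F × df_effect + df_error). Applied to the reported external-load test (F4,9 = 0.155), this reproduces the reported value of roughly .06. The snippet below is illustrative arithmetic only, not the authors' analysis code.

```python
# Partial eta squared from an F statistic and its degrees of freedom
# (univariate formula: SS_effect / (SS_effect + SS_error)).
def partial_eta_squared(f_stat: float, df_effect: int, df_error: int) -> float:
    return (f_stat * df_effect) / (f_stat * df_effect + df_error)

# Worked example using the external-load test reported above, treated here as if
# it were a univariate F test: F(4,9) = 0.155 -> approximately .06.
print(round(partial_eta_squared(0.155, 4, 9), 2))

# For a two-group multivariate test, 1 - Wilks' lambda gives a comparable value.
print(round(1 - 0.935, 2))
```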