Yuri Hosokawa and Gretchen D. Oliver
Yuri Hosokawa, William M. Adams and Douglas J. Casa
Context: It is unknown how esophageal, rectal, and gastrointestinal temperatures (TES, TRE, and TGI) compare after exercise-induced hyperthermia under different hydration states. Objective: To examine the differences between TES, TRE, and TGI during passive rest following exercise-induced hyperthermia under 2 different hydration states: euhydrated (EU) and hypohydrated (HY). Design: Randomized crossover design. Setting: Controlled laboratory setting. Participants: 9 recreationally active male participants (mean ± SD age 24 ± 4 y, height 177.3 ± 9.9 cm, body mass 76.7 ± 11.6 kg, body fat 14.7% ± 5.8%). Intervention: Participants completed 2 trials (EU and HY), each consisting of a bout of treadmill exercise (a 10-min walk at 4.8-7.2 km/h at a 5% grade followed by a 20-min jog at 8.0-12.1 km/h at a 1% grade) in a hot environment (ambient temperature 39.3 ± 1.0°C, relative humidity 37.6% ± 6.0%, wet bulb globe temperature 31.3 ± 1.5°C) followed by passive rest. Main Outcome Measures: Root-mean-squared difference (RMSD) was used to compare the variance of temperature readings at corresponding time points for TRE vs TGI, TRE vs TES, and TGI vs TES in EU and HY. RMSD values were compared using 3-way repeated-measures ANOVA. Post hoc analysis of significant main effects was performed using the Tukey honestly significant difference test, with significance set at P < .05. Results: RMSD values (°C) for all device comparisons differed significantly between EU (TRE-TGI, 0.11 ± 0.12; TRE-TES, 1.58 ± 1.01; TGI-TES, 2.04 ± 1.19) and HY (TRE-TGI, 0.22 ± 0.28; TRE-TES, 1.27 ± 0.61; TGI-TES, 1.16 ± 0.76) (P < .01). Across the 45-min bout of passive rest, there were no differences in TRE, TGI, and TES between the EU and HY trials (P = .468).
Conclusions: During passive rest after exercise in the heat, TRE and TGI were in good agreement when tracking body temperature, with better agreement in participants who maintained euhydration than in those who became hypohydrated during exercise; however, this small difference does not appear to be clinically significant. Large differences were observed when comparing TGI and TRE with TES.
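As an illustration of the agreement metric used above, a minimal sketch of RMSD between two paired temperature series (the values below are synthetic, for illustration only, not the study's data):

```python
import math

def rmsd(series_a, series_b):
    """Root-mean-squared difference between paired readings at corresponding time points (°C)."""
    if len(series_a) != len(series_b):
        raise ValueError("series must be paired at corresponding time points")
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(series_a, series_b)) / len(series_a))

# Illustrative (synthetic) readings at 5-min intervals during passive rest
t_re = [39.2, 39.0, 38.7, 38.4, 38.1]  # rectal temperature
t_gi = [39.3, 39.1, 38.8, 38.5, 38.2]  # gastrointestinal temperature
print(round(rmsd(t_re, t_gi), 2))  # → 0.1 (small RMSD = close agreement)
```

A smaller RMSD means the two devices tracked each other more closely, which is the sense in which TRE and TGI agreed better under euhydration in the results above.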
William M. Adams, Yuri Hosokawa and Douglas J. Casa
Although body cooling has both performance and safety benefits, knowledge of how to optimize cooling during specific sport competitions is limited.
To identify the optimal timing for body cooling during sport competition and the body-cooling modalities that best enhance safety and maximize sport performance.
A comprehensive literature search was conducted to identify articles addressing body cooling, sport performance, and cooling modalities used during sport competition. Scientific peer-reviewed literature was first examined to establish the influence of body cooling on exercise performance; a subsequent search identified the cooling modalities shown to improve exercise performance.
The cooling modalities that are most effective during sport competition depend on the sport, the timing of cooling, and feasibility within the constraints of the sport's rules and regulations. The length of breaks (halftime, substitutions, etc), the equipment worn during competition, and the cooling modalities offering the greatest cooling capacity must be considered for each individual sport.
Scientific evidence supports body cooling as a method of improving performance during sport competition. Developing evidence-based strategies for using cooling modalities to improve performance while maximizing athletes’ safety warrants further investigation.
Douglas J. Casa, Yuri Hosokawa, Luke N. Belval, William M. Adams and Rebecca L. Stearns
Exertional heat stroke (EHS) is among the leading causes of sudden death during sport and physical activity. However, previous research has shown that EHS is 100% survivable when it is rapidly recognized and appropriately treated. Establishing policies that address the prevention and treatment of EHS, including heat acclimatization, environment-based activity modification, body temperature assessment using rectal thermometry, and immediate, onsite treatment using cold-water immersion, attenuates the risk of EHS mortality and morbidity. This article provides an overview of the current evidence regarding EHS prevention and management. The transfer of scientific knowledge to clinical practice has shown great success in saving EHS patients. Further efforts are needed to implement evidence-based policies not only to mitigate EHS fatalities but also to reduce the overall incidence of EHS.
William M. Adams, Yuri Hosokawa, Robert A. Huggins, Stephanie M. Mazerolle and Douglas J. Casa
Evidence-based best practices for the recognition and treatment of exertional heat stroke (EHS) indicate that rectal thermometry and immediate, aggressive cooling via cold-water immersion ensure survival from this medical condition. However, little is known about the recovery, medical follow-up, and return to activity after an athlete has suffered EHS.
To highlight the transfer of evidence-based research into clinical practice by chronicling the treatment, recovery, and return to activity of a runner who suffered an EHS during a warm-weather road race.
Setting: Warm-weather road race.
Participant: 53-y-old recreationally active man.
Intervention: The runner’s treatment, recovery, and return to activity after EHS, and his 2014 Falmouth Road Race performance.
Main Outcome Measures: The runner’s perceptions of and experiences with EHS, body temperature, heart rate, hydration status, and exercise intensity.
The runner successfully completed the 2014 Falmouth Road Race without incident of EHS. Four dominant themes emerged from the data: predisposing factors, ideal treatment, lack of medical follow-up, and patient education. The first theme identified 3 predisposing factors that contributed to the runner’s EHS: hydration, sleep loss, and lack of heat acclimatization. The runner received ideal treatment using evidence-based best practices. A lack of long-term medical care following the EHS with no guidance on the runner’s return to full activity was observed. The runner knew very little about EHS before the 2013 race, which drove him to seek knowledge as to why he suffered EHS. Using this newly learned information, he successfully completed the 2014 Falmouth Road Race without incident.
This case supports prior literature examining the factors that predispose individuals to EHS. Although evidence-based best practices regarding prompt recognition and treatment of EHS ensure survival, this case highlights the lack of medical follow-up and physician-guided return to activity after EHS.
Zachary Y. Kerr, Susan W. Yeargin, Yuri Hosokawa, Rebecca M. Hirschhorn, Lauren A. Pierpoint and Douglas J. Casa
Context: Recent data on exertional heat illness (EHI) in high school sports are limited yet warranted to identify specific settings with the highest risk of EHI. Objective: To describe the epidemiology of EHI in high school sports during the 2012/2013–2016/2017 academic years. Design: Descriptive epidemiology study. Setting: Aggregate injury and exposure data collected from athletic trainers working in high school sports in the United States. Patients or Other Participants: High school athletes during the 2012/2013–2016/2017 academic years. Intervention: High School Reporting Information Online surveillance system data from the 2012/2013–2016/2017 academic years were analyzed. Main Outcome Measures: EHI counts, rates per 10,000 athlete exposures (AEs), and distributions were examined by sport, event type, and US census region. EHI management strategies provided by athletic trainers were analyzed. Injury rate ratios with 95% confidence intervals (CIs) compared EHI rates. Results: Overall, 300 EHIs were reported, for an overall rate of 0.13/10,000 AE (95% CI, 0.11 to 0.14). Of these, 44.3% occurred in American football preseason practices; 20.7% occurred in American football preseason practices with a registered air temperature ≥90°F and ≥1 hour into practice. The EHI rate was higher in American football than in all other sports (0.52 vs 0.04/10,000 AE; injury rate ratio = 11.87; 95% CI, 9.22 to 15.27). However, girls’ cross-country had the highest competition EHI rate (1.18/10,000 AE). The EHI rate was higher in the South US census region than in all other US census regions (0.23 vs 0.08/10,000 AE; injury rate ratio = 2.96; 95% CI, 2.35 to 3.74). Common EHI management strategies included having medical staff on-site at the onset of EHI (92.7%), removing the athlete from play (85.0%), and giving the athlete fluids by mouth (77.7%).
Conclusions: American football continues to have the highest overall EHI rate, although the high competition EHI rate in girls’ cross-country merits additional examination. Regional differences in EHI incidence, coupled with sport-specific variations in management, may highlight the need for region- and sport-specific EHI prevention guidelines.
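The rates and rate ratios reported above follow standard incidence arithmetic. A minimal sketch, using hypothetical event counts and athlete-exposure denominators (the study's raw counts are not given here) and the usual log-transform 95% CI for a rate ratio:

```python
import math

def rate_per_10k(events, athlete_exposures):
    """Injury rate per 10,000 athlete exposures (AEs)."""
    return events / athlete_exposures * 10_000

def rate_ratio_with_ci(events_a, ae_a, events_b, ae_b, z=1.96):
    """Injury rate ratio (group A vs group B) with a 95% CI via the log-transform method."""
    irr = (events_a / ae_a) / (events_b / ae_b)
    se_log = math.sqrt(1 / events_a + 1 / events_b)  # SE of ln(IRR)
    lower = math.exp(math.log(irr) - z * se_log)
    upper = math.exp(math.log(irr) + z * se_log)
    return irr, lower, upper

# Hypothetical counts for illustration only
print(round(rate_per_10k(50, 2_500_000), 2))  # → 0.2
```

An injury rate ratio of 1.0 means the two groups' rates are equal; a CI excluding 1.0 indicates a statistically significant difference between the groups' rates.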
Amanda L. Zaleski, Linda S. Pescatello, Kevin D. Ballard, Gregory A. Panza, William Adams, Yuri Hosokawa, Paul D. Thompson and Beth A. Taylor
Context: Compression socks have become increasingly popular with athletes due to perceived enhancement of exercise performance and recovery. However, research examining the efficacy of compression socks to reduce exercise-associated muscle damage has been equivocal, with few direct measurements of markers of muscle damage. Objective: To examine the influence of compression socks worn during a marathon on creatine kinase (CK) levels. Design: A randomized controlled trial. Setting: 2013 Hartford Marathon, Hartford, CT. Participants: Adults (n = 20) randomized to control (CONTROL; n = 10) or compression sock (SOCK; n = 10) groups. Main Outcome Measures: Blood samples were collected 24 hours before, immediately after, and 24 hours following the marathon for the analysis of CK, a marker of muscle damage. Results: Baseline CK levels did not differ between CONTROL (89.3 [41.2] U/L) and SOCK (100.0 [56.2] U/L) (P = .63). Immediately following the marathon (≤1 h), CK increased 273% from baseline (P < .001 for time), with no difference in exercise-induced changes in CK from baseline between CONTROL (+293.9 [278.2] U/L) and SOCK (+233.1 [225.3] U/L; P = .60 for time × group). The day following the marathon (≤24 h), CK further increased 1094% from baseline (P < .001 for time), with no difference in changes in CK from baseline between CONTROL (+1191.9 [1194.8] U/L) and SOCK (+889.1 [760.2] U/L; P = .53 for time × group). These similar trends persisted despite controlling for potential covariates such as age, body mass index, and race finishing time (Ps > .29). Conclusions: Compression socks worn during a marathon do not appear to mitigate objectively measured markers of muscle damage immediately following and 24 hours after a marathon.