Despite the growing popularity of mindfulness- and acceptance-based performance enhancement methods in applied sport psychology, evidence for their efficacy is scarce. The purpose of the current study was to test the feasibility and effect of a psychological training program based on Acceptance and Commitment Training (ACT) developed for ice hockey players. A controlled-group feasibility study was conducted with 21 elite male ice hockey players. The ACT program consisted of four weekly sessions with homework assignments between sessions. The results showed a significant increase in psychological flexibility for the players in the training group. The outcome was positive for all feasibility measures. Participants found the psychological training program important to them as ice hockey players and helpful in their ice hockey development. Future studies should include objective performance data as an outcome measure to provide more valid evidence for performance enhancement methods in applied sport psychology.
Tobias Lundgren, Gustaf Reinebo, Markus Näslund and Thomas Parling
Ui-Jae Hwang, Sung-Hoon Jung, Hyun-A Kim, Jun-Hee Kim and Oh-Yun Kwon
Context: Electrical muscle stimulation (EMS) was designed for artificial muscle activation or superimposed training. Objectives: To compare the effects of 8 weeks of superimposed technique (ST; application of electrical stimulation during a voluntary muscle action) and EMS on the cross-sectional area of the rectus abdominis, lateral abdominal wall, and on lumbopelvic control. Setting: University research laboratory. Design: Randomized controlled trial. Participants: Fifty healthy subjects were recruited and randomly assigned to either the ST or EMS group. Intervention: The participants engaged with the electrical stimulation techniques (ST or EMS) for 8 weeks. Main Outcome Measures: In all participants, the cross-sectional area of the rectus abdominis and lateral abdominal wall was measured by magnetic resonance imaging and lumbopelvic control, quantified using the single-leg and double-leg lowering tests. Results: There were no significant differences in the cross-sectional area of the rectus abdominis (right: P = .70, left: P = .99) or lateral abdominal wall (right: P = .07, left: P = .69) between groups. There was a significant difference between groups in the double-leg lowering test (P = .03), but not in the single-leg lowering test (P = .88). There were significant differences between the preintervention and postintervention in the single-leg (P < .001) and double-leg lowering tests (P < .001). Conclusions: ST could improve lumbopelvic control in the context of athletic training and fitness.
Kenneth Färnqvist, Stephen Pearson and Peter Malliaras
Context: Exercise is seen as the most evidence-based treatment for managing tendinopathy, and although the type of exercise used to manage tendinopathy may induce adaptation in healthy tendons, it is not clear whether these adaptations occur in tendinopathy and, if so, whether they are associated with improved clinical outcomes. Objective: The aim of the study was to synthesize available evidence for adaptation of the Achilles tendon to eccentric exercise and the relationship between adaptation (change in tendon thickness) and clinical outcomes among people with Achilles tendinopathy. Evidence Acquisition: The search was performed in September 2018 in several databases. Studies investigating the response (clinical outcome and imaging on ultrasound/magnetic resonance imaging) of pathological tendons (tendinopathy, tendinosis, and partial rupture) to at least 12 weeks of eccentric exercise were included. Multiple studies that investigated the same interventions and outcome were pooled and presented as effect size estimates, mean differences, and 95% confidence intervals if measurement scales were the same, or standardized mean differences and 95% confidence intervals if measurement scales were different. Where data could not be pooled, the studies were qualitatively synthesized based on van Tulder et al. Evidence Synthesis: Eight studies met the inclusion and exclusion criteria and were included in the review. There was strong evidence that Achilles tendon thickness does not decrease in parallel with improved clinical outcomes. Conclusions: Whether a longer time to follow-up is more important than the intervention (ie, just the time per se) for a change in tendon thickness remains unknown. Future studies should investigate whether exercise (or other treatments) can be tailored to optimize tendon adaptation and function, and whether this relates to clinical outcomes.
Natalie L. Myers, Guadalupe Mexicano and Kristin V. Aguilar
Clinical Scenario: Workload monitoring and management of an athlete is viewed by many as an essential training component to determine if an athlete is adapting to a training program and to minimize injury risk. Although training workload may be measured a variety of different ways, session rating of perceived exertion (sRPE) is often used in the literature due to its clinical ease. In recent years, sports scientists have been investigating sRPE as a measure of internal workload and its relationship to injury in elite-level athletes using a metric known as the acute:chronic workload ratio (ACWR). This critically appraised topic was conducted to determine if internal workload using the ACWR is associated with injury. Focused Clinical Question: In elite-level athletes, is there an association between the ACWR for sRPE and noncontact injuries? Summary of Search, Best Evidence Appraised, and Key Findings: The literature was searched for studies investigating the association between noncontact injuries and the sRPE ACWR in elite athletes. Three prospective cohort studies were included. Two studies found that a high ACWR led to 2.0 to 4.5 times greater injury risk compared with a more balanced ACWR. One study found that low chronic workloads coupled with a low ACWR were associated with injury. Clinical Bottom Line: The majority of evidence suggests that when the acute workload exceeds the chronic workload, there is an increase in injury risk. The evidence also supports that a low chronic workload with a low ACWR should be considered as an injury risk factor. Strength of Recommendation: Based on the American Family Physician’s Strength of Recommendation Taxonomy, there is level A evidence to support the sRPE ACWR as a risk factor for noncontact injuries in elite athletes.
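The sRPE workload and ACWR described above are straightforward to compute. A minimal sketch in Python, assuming the common convention of sRPE load = session RPE × session duration, a 1-week acute window, and a 4-week rolling-average chronic window (the function names and windows here are illustrative, not taken from the appraised studies):

```python
def session_rpe_load(rpe, minutes):
    """Internal workload for one session: session RPE (0-10 scale) x duration in minutes."""
    return rpe * minutes

def acwr(weekly_loads):
    """Acute:chronic workload ratio.

    weekly_loads: total weekly sRPE loads, oldest first.
    Acute window = most recent week; chronic window = mean of the last 4 weeks.
    """
    if len(weekly_loads) < 4:
        raise ValueError("need at least 4 weeks of data")
    acute = weekly_loads[-1]
    chronic = sum(weekly_loads[-4:]) / 4
    return acute / chronic

# Example: a sudden spike in the final week pushes the ACWR well above 1.0,
# the pattern the appraised studies associate with elevated injury risk.
loads = [1500, 1600, 1550, 2400]
print(round(acwr(loads), 2))  # 1.36
```

Note that an ACWR near 1.0 indicates the acute load roughly matches what the athlete has been chronically prepared for, which is why both sharp spikes and chronically low loads are flagged in the literature.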
Mahsa Jafari, Vahid Zolaktaf and Gholamali Ghasemi
Purpose: Firefighters require a high level of functional fitness to operate safely, effectively, and efficiently. The authors studied the distribution of functional movement screen (FMS) scores in firefighters and examined whether an 8-week corrective exercise program based on National Academy of Sports Medicine guidelines could improve them. Methods: All 524 active firefighters of a city completed the baseline FMS testing. Those who obtained a score of 14 or less, a sign of movement dysfunction, and volunteered to continue their participation were randomly assigned to either an experimental (n = 51) or a control (n = 45) group. Both groups participated in an 8-week training program. The control group used their own usual training routine, but the experimental group used the specific protocol designed for the study. Results: The FMS scores of 43% of the population were less than 14. Repeated-measures analysis of variance revealed a significant interaction between FMS scores of the groups (F1,94 = 165, P < .001). The experimental group showed a 69% improvement from pretest (10.6) to posttest (17.8), whereas the control group showed only a 3% improvement from pretest (11.8) to posttest (12.1). Conclusions: Preceding studies have shown that FMS scores less than 14 increase the injury risk. The findings showed that, using our proposed training protocol, low FMS scores could be improved to 14 and higher. Considering the high injury rate of firefighters, the authors suggest administering the FMS periodically and using a training protocol such as ours to increase functional fitness and reduce injury risk.
Ashley N. Marshall, Alison R. Snyder Valier, Aubrey Yanda and Kenneth C. Lam
Context: There has been an increased interest in understanding how ankle injuries impact patient outcomes; however, it is unknown how the severity of a previous ankle injury influences health-related quality of life (HRQOL). Objective: To determine the impact of a previous ankle injury on current HRQOL in college athletes. Design: Cross-sectional study. Setting: Athletic training clinics. Participants: A total of 270 participants were grouped by the severity of a previous ankle injury (severe = 62, mild = 65, and no injury = 143). Main Outcome Measures: Participants completed the Foot and Ankle Ability Measure (FAAM) and the Short Form 12 (SF-12). Methods: A 2-way analysis of variance with 2 factors (injury group and sex) was used to identify interaction and main effects for the FAAM and SF-12. Results: No interactions were identified between injury group and sex. Significant main effects were observed for injury group, where the severe injury group scored lower than athletes with mild and no injuries on the FAAM activities of daily living, FAAM Global, and SF-12 mental health subscale scores. In addition, a main effect was present for sex in the SF-12 general health, social functioning, and mental health subscales, in which females reported significantly lower scores than males. Conclusions: Our findings suggest that a severe ankle injury impacts HRQOL, even after returning to full participation. In addition, females tended to report lower scores than males for aspects of the SF-12, suggesting that sex should be considered when evaluating HRQOL postinjury. As a result, clinicians should consider asking athletes about their previous injury history, including how much time was lost due to the injury, and should be mindful of returning athletes to play before they are physiologically and psychologically ready, as there could be long-term negative effects on the patients’ region-specific function as well as aspects of their HRQOL.
Mark A. Sutherlin, L. Colby Mangum, Shawn Russell, Susan Saliba, Jay Hertel and Joe M. Hart
Context: Reduced spinal stabilization, delayed onset of muscle activation, and increased knee joint stiffness have been reported in individuals with a history of low back pain (LBP). Biomechanical adaptations resulting from LBP may increase the risk for future injury due to suboptimal loading of the lower-extremity or lumbar spine. Assessing landing mechanics in these individuals could help identify which structures might be susceptible to future injury. Objective: To compare vertical and joint stiffness of the lower-extremity and lumbar spine between individuals with and without a previous history of LBP. Design: Cross-sectional study. Setting: Research laboratory. Participants: There were 45 participants (24 without a previous history of LBP—age 23 y, height 169.0 [8.5] cm, mass 69.8 [13.8] kg; 21 with a previous history of LBP—age 25 y, height 170.0 [8.0] cm, mass 70.2 [11.8] kg). Interventions: Single-limb landing trials on the dominant and nondominant limb from a 30-cm box. Main Outcome Measures: Vertical stiffness and joint stiffness of the ankle, knee, hip, and lumbar spine. Results: Individuals with a previous history of LBP had lower vertical stiffness (P = .04), but not lower joint stiffness measures, compared with those without a previous history of LBP (P > .05). Overall, females had lower vertical (P = .01), ankle (P = .02), and hip stiffness (P = .04) compared with males among all participants. Males with a previous history of LBP had lower vertical stiffness compared with males without a previous history of LBP (P = .01). Among all individuals without a previous history of LBP, females had lower vertical (P < .01) and ankle stiffness measures (P = .04) compared with males. Conclusions: Landing stiffness may differ between males and females and between those with and without a previous history of LBP. Comparisons between individuals with and without previous LBP should be considered when assessing landing strategies, and future research should focus on how LBP impacts landing mechanics.
Clinical Scenario: Ice hockey and soccer are both dynamic sports that involve continuous, unpredictable play. These athletes consistently demonstrate higher rates of groin strains compared with other contact sports. Measuring the hip adductor/abductor ratio has the potential to identify at-risk players, reduce injury rates, and preserve groin health in players with chronic strains. Focused Clinical Question: What is the clinical utility of measuring the hip adductor/abductor ratio for preseason and in-season ice hockey and soccer players? Summary of Key Findings: Three studies, all of which were prospective cohort designs, were included. One study involved assessing preseason strength and flexibility as a risk factor for adductor strains in professional ice hockey players. Another study performed with the same professional hockey team used preseason hip adductor/abductor strength ratios to screen for those players who would benefit from a strengthening intervention aimed at reducing the incidence of adductor strains. The final study, which was performed in elite U17 soccer players, assessed the effectiveness of monthly in-season strength monitoring as a guide to trigger in-season interventions to decrease injury incidence. Clinical Bottom Line: Measuring the hip adductor/abductor strength ratio in hockey and soccer players can be a beneficial preseason and in-season tool to predict future groin strain risk and screen for athletes who might benefit from a strengthening intervention. Strength of Recommendation: Level 3 evidence exists to support monitoring the hip adductor/abductor strength ratio to assess and reduce the risk of adductor strains in ice hockey and soccer players.
Caroline Westwood, Carolyn Killelea, Mallory Faherty and Timothy Sell
Context: Concussions are a consequence of sports participation. Recent reports indicate an increased risk of lower-extremity musculoskeletal injury when returning to sport after concussion, suggesting that achieving “normal” balance may not fully indicate that the athlete is ready for competition. The increased risk of injury may indicate the need to refine a screening tool for clearance. Objective: To assess the between-session reliability and the effects of adding a cognitive task to static and dynamic postural stability testing in a healthy population. Setting: Clinical laboratory. Participants: Twelve healthy subjects (6 women; age 22.3 [2.9] y, height 174.4 [7.5] cm, weight 70.1 [12.7] kg) participated in this study. Design: Subjects underwent static and dynamic postural stability testing with and without the addition of a cognitive task (Stroop test). The test battery was repeated 10 days later. Dynamic postural stability testing consisted of a forward jump over a hurdle with a 1-legged landing, from which a stability index was calculated. Static postural stability was also assessed with and without the cognitive task during single-leg balance, and the variability of each ground reaction force component was averaged. Main Outcome Measures: Intraclass correlation coefficients (ICC2,1) were computed to determine reliability. Standard error of measurement, mean standard error, minimal detectable change, and 95% confidence intervals were all calculated. Results: Mean differences between sessions were low, with the majority of variables having moderate to excellent reliability (static .583–.877, dynamic .581–.939). The addition of the dual task did not have any significant effect on the reliability of the task; however, the ICC values generally improved (eyes open .583–.770, dual task .741–.808). Conclusions: Postural stability assessments with an added cognitive load had moderate to excellent reliability in a healthy population. These results provide initial evidence for the feasibility of dual-task postural stability testing when examining risk of lower-extremity musculoskeletal injury following return to sport in a concussed population.
Ryan Morrison, Kyle M. Petit, Chris Kuenze, Ryan N. Moran and Tracey Covassin
Context: Balance testing is a vital component in the evaluation and management of sport-related concussion. Few studies have examined the use of objective, low-cost, force-plate balance systems and changes in balance after a competitive season. Objective: To examine the extent of preseason versus postseason static balance changes using the Balance Tracking System (BTrackS) force plate in college athletes. Design: Pretest, posttest design. Setting: Athletic training facility. Participants: A total of 47 healthy, Division-I student-athletes (33 males and 14 females; age 18.4 [0.5] y, height 71.8 [10.8] cm, weight 85.6 [21.7] kg) participated in this study. Main Outcome Measures: Total center of pressure path length was measured preseason and postseason using the BTrackS force plate. A Wilcoxon signed-rank test was conducted to examine preseason and postseason changes. SEM and minimal detectable change were also calculated. Results: Center of pressure path length differed significantly between the preseason (24.6 [6.8] cm) and postseason (22.7 [5.4] cm) assessments (P = .03), with an SEM of 3.8 cm and a minimal detectable change of 10.5 cm. Conclusions: Significant improvements occurred for center of pressure path length after a competitive season, when assessed using the BTrackS in a sample of college athletes. Further research is warranted to determine the effectiveness of the BTrackS as a reliable, low-cost alternative to force-plate balance systems. In addition, clinicians may need to update baseline balance assessments more frequently to account for improvements.
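The SEM and minimal detectable change reported in this abstract follow standard psychometric formulas: SEM = SD × √(1 − ICC) and MDC95 = SEM × 1.96 × √2. A minimal sketch in Python (the function names are illustrative; the 95% confidence convention for the MDC is an assumption, chosen because it reproduces the reported values):

```python
import math

def sem(sd, icc):
    """Standard error of measurement from the between-subject SD and test-retest reliability (ICC)."""
    return sd * math.sqrt(1 - icc)

def mdc95(sem_value):
    """Minimal detectable change at 95% confidence: SEM x 1.96 x sqrt(2)."""
    return sem_value * 1.96 * math.sqrt(2)

# Starting from the SEM of 3.8 cm reported above, the MDC works out to
# ~10.5 cm, matching the abstract's reported minimal detectable change.
print(round(mdc95(3.8), 1))  # 10.5
```

In practice, a change in center of pressure path length smaller than the MDC (here, 10.5 cm) cannot be distinguished from measurement error, which is why the reported preseason-to-postseason difference of ~1.9 cm, although statistically significant at the group level, falls well below the threshold for a meaningful change in an individual athlete.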