Context: The sensory organization test (SOT) is a standard for quantifying sensory dependence via sway-referenced conditions (sway-referenced support and sway-referenced vision [SRV]). However, the SOT requires expensive equipment. Thus, a practical version of the SOT is more commonly employed—the clinical test for sensory integration in balance; however, it fails to induce postural instability to the level of SRV. Objective: To determine whether stroboscopic vision (SV), characterized by intermittent visual blocking, may provide an alternative to SRV for assessing postural stability. Design: Descriptive laboratory study. Setting: Research laboratory. Participants: Eighteen participants (9 males, 9 females; age = 22.1 [2.1] y, height = 169.8 [8.5] cm, weight = 66.5 [10.6] kg). Intervention: Participants completed the SOT conditions and then repeated SOT conditions 2 and 5 with SV created by specialized eyewear. Main Outcome Measures: A repeated-measures analysis of variance was completed on the time-to-boundary metrics of center-of-pressure excursion in the anteroposterior and mediolateral directions to determine the difference between the full-vision, SV, and SRV conditions. Results: Postural stability with either SRV or SV was significantly worse than with full vision (P < .05), with no significant difference between SV and SRV (P > .05). Limits of agreement analysis revealed similar effects of SV and SRV except for unstable surface mediolateral time-to-boundary. Conclusions: In general, SV induced a degree of postural instability similar to that induced by SRV, indicating that SV could be a portable and relatively inexpensive alternative for the assessment of sensory dependence and reweighting.
Kyung-Min Kim, Joo-Sung Kim, Jeonghoon Oh, and Dustin R. Grooms
Kelsey M. Rynkiewicz, Lauren A. Fry, and Lindsay J. DiStefano
Clinical Scenario: Chronic exertional compartment syndrome (CECS) is a condition involving ischemia of the body’s tissue due to increased intracompartmental pressure, which presents with, among other symptoms, pain with exertion. CECS is often overlooked or misdiagnosed due to an ambiguous presentation. Diagnostic accuracy of CECS and subsequent management can be improved when contributing factors are known. Research is lacking on the type of patient most likely to experience CECS, highlighting the need for identification of common demographic characteristics among affected individuals. Clinical Question: What are the common demographic characteristics among patients exhibiting CECS of the lower leg? Summary of Key Findings: Four studies were identified (1 prospective consecutive study, 2 retrospective reviews, and 1 retrospective cohort study) that examined common characteristics among patients with CECS. Conflicting evidence exists on whether CECS is more commonly seen in men or in women. CECS has often been reported in young, active individuals but may present in older populations as well. Soccer, field hockey, lacrosse, competitive running, and speed skating have been associated with an increased likelihood of CECS development. Clinical Bottom Line: Current evidence has identified commonalities in sex, age, and sport participation as characteristics often present among individuals experiencing lower leg CECS. Other factors, such as overuse, trauma, diabetes, and gait mechanics, have also been identified in association with CECS. Further data from future prospective studies will help confirm the type of patient most likely to experience CECS. Strength of Recommendation: Grade B evidence exists that certain sex, age, and sport participation demographic characteristics are common among patients with CECS of the lower leg.
Scott Benson Street and Thomas Kaminski
Clinical Scenario: Hamstring injuries are the most prevalent lower-extremity injury among soccer players. The Fédération Internationale de Football Association (FIFA) has addressed this issue by developing the FIFA 11+ program, which is focused on improving strength and decreasing the incidence of lower-extremity injuries in the sport. This critically appraised topic focuses on this program as well as one of its components, the Nordic hamstring exercise, in the prevention of hamstring injuries. Clinical Question: Does the FIFA 11+ program prevent hamstring injuries in college-aged male soccer players? Summary of Key Findings: Four studies were selected to be critically appraised. The PEDro checklist was used to score the articles on methodology and consistency. All 4 articles demonstrated support for the clinical question. Clinical Bottom Line: There is moderate evidence to support the use of the FIFA 11+ program and Nordic hamstring exercise as part of a college soccer team’s warm-up routine. Strength of Recommendation: Grade B evidence exists in support of incorporating the FIFA 11+ program to reduce the incidence of hamstring injuries in male college soccer players.
Cameron Haun, Cathleen N. Brown, Kimberly Hannigan, and Samuel T. Johnson
Clinical Scenario: Deformation of the arch, as measured by navicular drop (ND), is linked to lower-extremity musculoskeletal injuries. The short foot exercise (SFE) has been used to strengthen the intrinsic foot muscles that support the arch. Clinical Question: Does the SFE decrease ND in healthy adults? Summary of Key Findings: Three studies that examined the use of the SFE on ND were included. A randomized control trial that compared the SFE to a towel-curl exercise and a control group found no significant differences between the 3 groups. A randomized control trial compared the SFE to the use of arch support insoles in individuals with a flexible flatfoot and found a significant improvement in the SFE group. A prospective cohort study, without a control group, reported a significant decrease in ND following a 4-week SFE intervention without a regression at an 8-week follow-up. Overall, 2 of the 3 studies reported a significant reduction in ND following an SFE. Clinical Bottom Line: There are preliminary data supporting the use of the SFE to decrease ND—particularly in individuals with a flexible flatfoot. However, issues with the study designs make it difficult to interpret the data. Strength of Recommendation: Due to the limited evidence available, there is grade B evidence to support the use of the SFE to decrease ND.
Jordan Bettleyon and Thomas W. Kaminski
Clinical Scenario: Low-level laser therapy (LLLT) is a controversial topic for its use in athletic recovery, mainly due to inconsistency in research regarding the application of LLLT. Articles on LLLT have assessed its effectiveness in untrained humans through pain scales, functional scales, and blood draws, and it has been found effective in nonathletic rehabilitative use. The controversy lies with LLLT in the recovering athlete. Not only do athletes need to perform at high levels, but each sport is unique in the metabolic demands placed on the athletes’ bodies. This modality can alter chemical mediators of the inflammatory process, specifically blood lactate (BL) and creatine kinase (CK). During soccer contests, it is a common problem for athletes to have an average CK level of 800 U/L and BL of 8 mmol/L, increasing delayed-onset muscle soreness and fatigue. CK level elevation is associated with cellular membrane damage, localized hypoxia, and electrolyte imbalances, hindering the recovery process. Clinical Question: Does LLLT decrease muscle-damaging mediators affecting player fatigue and delayed-onset muscle soreness after performance in soccer athletes versus sham treatment? Summary of Key Findings: In 3 studies, LLLT was performed preperformance, postperformance, or both, and BL (2 of 3) and CK (2 of 3) were evaluated. In each article, BL and CK showed a significant decrease (P < .05) when LLLT was performed either preperformance or postperformance versus the control group. The greatest decrease in these mediators was noticed when postperformance laser therapy was performed. Clinical Bottom Line: LLLT at 10, 30, or 50 J performed at a minimum of 2 locations on the rectus femoris, vastus lateralis, and vastus medialis bilaterally for 10 seconds each significantly decreases blood serum levels of BL and CK when performed postexercise. Strength of Recommendations: All 3 articles obtained a Physiotherapy Evidence Database score of ≥8/10.
Gabriela Souza de Vasconcelos, Anelize Cini, and Cláudia Silveira Lima
Context: Fencing is a sport of agility with a high incidence of lower-limb injuries, of which the ankle sprain is the most prevalent. Injury prevention is very important to improve performance and decrease the withdrawal time of athletes. Proprioceptive training programs can be added to the training of athletes since, in addition to being easy to apply and low in cost, proprioceptive training helps stabilize the ankle joint to prevent injuries. Objective: To verify the influence of a 12-week proprioceptive training program on dynamic neuromuscular control in fencing athletes. Design: The study was a clinical trial, and the athletes were allocated, by convenience, to the intervention group or the control group. Setting: The study was developed in 4 stages (preintervention, intervention, postintervention, and follow-up of 3). Neuromuscular control was evaluated using the star excursion balance test. Participants: The participants were 19 fencing athletes (intervention group: 10, and control group: 9), aged 14–35 years, from a multisport club. Interventions: The athletes performed the proprioceptive training for 12 weeks, 3 times a week, with a duration of 30 minutes. Main Outcome Measures: Dynamic neuromuscular control. Results: The data and SE were considered for statistical analysis and submitted to the generalized estimating equations test with Bonferroni post hoc. The level of significance was .05. The distance reached in the star excursion balance test increased significantly in all 8 directions evaluated in both legs of the intervention group. Conclusions: The proprioceptive training program was able to improve dynamic neuromuscular control in fencing athletes.
Xin He, Hio Teng Leong, On Yue Lau, Michael Tim-Yun Ong, and Patrick Shu-Hang Yung
Context: Altered lower-limb biomechanics have been observed during landing tasks in patients with anterior cruciate ligament reconstruction (ACLR), which increases the risk of secondary anterior cruciate ligament injury. However, the alteration in neuromuscular activity of the lower-extremity during landing tasks is not clear. Objective: To compare the muscle activity pattern assessed by electromyography between the involved limb of patients with ACLR and the contralateral limb or control limb of matched healthy subjects during landing tasks. Evidence Acquisition: Databases of PubMed, Ovid, Scopus, and Web of Science were searched from the inception of the databases until July 2019, using a combination of keywords and their variations: (anterior cruciate ligament OR ACL) AND (electromyography OR EMG) AND (landing OR land). Studies that assessed lower-extremity muscle activity patterns during landing tasks in patients with ACLR and compared them either with the contralateral side or healthy controls were included. Evidence Synthesis: Of the 21 studies, 16 reported altered muscle activity patterns during landing tasks when compared with either the healthy controls or the contralateral side. For the specific muscle activity patterns, the majority of the studies showed no significant difference in reactive muscle activity, and comparisons across studies revealed a possible trend toward the early onset of quadriceps and hamstring activity and increased cocontraction of the involved limb. There are inconsistent findings regarding the alteration in muscle timing and preparatory muscle activity. Conclusions: Patients with ACLR displayed an altered muscle activity pattern during landing tasks, even though they were considered capable of returning to sport. Nevertheless, a firm conclusion could not be drawn due to the great heterogeneity in subject selection and study methods.
Christopher J. Burcal, Sunghoon Chung, Madison L. Johnston, and Adam B. Rosen
Background: Region-specific patient-reported outcomes (PROs) are commonly used in rehabilitation medicine. Digital versions of PROs may be implemented into electronic medical records and are also commonly used in research, but the validity of this method of administration (MOA) must be established. Purpose: To determine the agreement between and compare the test–retest reliability of a paper version (FAAM-P) and digital version (FAAM-D) of the Foot and Ankle Ability Measure (FAAM). Study Design: Randomized, nonblinded, crossover observational study. Methods: A total of 90 adults were randomized to complete the FAAM-P or FAAM-D first, and then completed the second MOA (first day [D1]). The FAAM-D was a digital adaptation of both FAAM-P subscales on Qualtrics. Identical test procedures were completed 1 week later (D2). Data were removed if a participant scored 100% on both MOA, reported injury between D1 and D2, or did not complete both MOA. Agreement between the 2 MOA was assessed at D1 on 46 participants using intraclass correlation coefficients (ICC). Results: The authors observed good agreement between the FAAM-P and FAAM-D for the activities of daily living (ICC = .88) and sport scales (ICC = .87). Test–retest reliability was good-to-excellent for the FAAM activities of daily living (FAAM-P: ICC = .87; FAAM-D: ICC = .89) and sport (FAAM-P: ICC = .71; FAAM-D: ICC = .91). Conclusions: The MOA does not appear to affect the responses on the FAAM; however, the authors observed slightly higher reliability on the FAAM-D. The FAAM-D is sufficient to be used for generating practice-based evidence in rehabilitation medicine.
Kazuo Saito and Hitoshi Kihara
Context: Many patients report poor therapeutic outcomes following mallet finger fracture surgery. A more reliable technique is urgently needed. Objective: To present a novel treatment for mallet finger fractures using a 2-step orthosis method. Design: Prospective, observational study. Setting: Hospital. Participants: Patients with mallet finger fractures. Interventions: The finger was fixed with splints for 6 weeks, including 3 weeks with the proximal interphalangeal joint in the flexion position and the distal interphalangeal joint in the hyperextension position (first splint) and 3 weeks with the distal interphalangeal joint in the hyperextension position (second splint). Up to week 8, the second splint was worn at night and during physical exertion. Main Outcome Measures: Crawford criteria, Abouna–Brown criteria, bone fusion, grip strength, Doyle classification, Ishiguro classification. Results: Sufficient bone fusion was achieved 12 weeks after fixation, at which time the ranges of motion with the distal interphalangeal joint flexed and extended in the 3 patients were 50° and 0°, 70° and −3°, and 60° and 0°, respectively. The right and left hand grip strengths in the 3 patients were 58 and 55 kg, 62 and 58 kg, and 31 and 29 kg, respectively; there were no problems with respect to function or work. The first 2 patients could start sports again with partial return after 1 week and complete return after 12 weeks and 8 weeks, respectively. For the third patient, rehabilitation was complete after 16 weeks. Evaluation of the fracture sites based on the Crawford criteria showed the condition to be perfect, and evaluation based on the Abouna–Brown criteria showed success. Conclusions: This method provides satisfactory fixation and can prevent proximal interphalangeal joint contracture. Favorable long-term outcomes were confirmed in all patients, suggesting that this method may be effective for previously untreated mallet finger fractures with little displacement.
Claudia G. Levenig, Michael Kellmann, Jens Kleinert, Johanna Belz, Tobias Hesselmann, Jahan Heidari, and Monika I. Hasenbring
Context: Low back pain (LBP) is a serious health problem, in the general population as well as in athletes. Research has shown that psychosocial aspects, such as dysfunctional pain responses, play a significant role in the chronification of LBP. Recent research supports the relevance of the multidisciplinary concept of body image in the interpretation of LBP. Objective: To examine the differences in 2 psychosocial aspects, body image and pain responses, between athletes and nonathletes with LBP. Design: Cross-sectional design. Setting: The questionnaires were distributed in the course of LBP treatment. Participants: Data from 163 athletes (mean age = 28.69 [9.6] y) and 75 nonathletes (mean age = 39.34 [12.63] y) were collected. Interventions: Data were collected by questionnaires assessing body image, pain behavior, training activity, and LBP. Main Outcome Measures: To examine group differences between athletes and nonathletes regarding body image and pain behavior, the authors performed 2-way analyses of variance with Bonferroni post hoc tests. Results: The results showed (1) a significant main effect regarding pain responses and body image, showing that participants with eustress endurance or adaptive pain behavior revealed a more positive body image in both groups compared with participants with distress endurance or fear-avoidance behavior, and (2) a significant main effect for the factor group in the body image dimension of physical efficacy, indicating a more positive body image for athletes. Conclusion: These results suggest that considering multiple risk factors for LBP, such as body image and dysfunctional pain behavior, as well as subgrouping, might be valuable for research and for broadening therapy options.