Matthew Hamilton and James R. Velasquez
Edited by Michael G. Dolan
Marta Stepien-Slodkowska, Krzysztof Ficek, Pawel Zietek, Mariusz Kaczmarczyk, Wioletta Lubkowska, Miroslawa Szark-Eckardt and Pawel Cieszczyk
The knee has been found to be the most commonly injured body part in skiing, with rupture of the anterior cruciate ligament (ACL) the most frequent diagnosis. ACL ruptures are influenced by several extrinsic and intrinsic risk factors, including hormonal, neuromuscular, anatomical, and genetic factors.
To examine the association of both COL1A1 rs1800012 (+1245G/T) and COL1A1 rs1107946 (–1997G/T) polymorphisms, individually and as haplotypes, with ACL ruptures in recreational Polish skiers.
Genomic DNA was extracted from buccal cells donated by the subjects, and genotyping was carried out using real-time polymerase chain reaction.
138 male recreational skiers with surgically diagnosed primary ruptures and 183 apparently healthy male recreational skiers not differing markedly in age or level of exposure to ACL injury.
Main Outcome Measures:
COL1A1 rs1800012 and COL1A1 rs1107946 polymorphisms.
There were significant differences in the genotype distribution of the COL1A1 rs1800012 polymorphism between the ACL-rupture group and the control group. The GG homozygotes were underrepresented in the ACL-rupture group compared with the control group. There were no significant differences in genotype distribution or allele frequency of the COL1A1 rs1107946 polymorphism between the ACL-rupture group and the control group. The G-G (COL1A1 rs1800012 G and COL1A1 rs1107946 G) haplotype was the most common, and there were no significant differences in haplotype distribution between the ACL-rupture and control groups.
The study showed that GG homozygotes were underrepresented in the ACL-rupture group compared with the control group, which suggests an association with reduced risk of ACL injury.
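Genotype-distribution comparisons such as the one above are conventionally tested with a chi-square test on a 2 (group) × 3 (genotype) contingency table. A minimal sketch follows; the per-genotype counts are hypothetical illustrations (the abstract reports only the group sizes of 138 and 183, not the cell counts):

```python
# Chi-square test of genotype distribution between ACL-rupture cases and
# controls. Cell counts below are HYPOTHETICAL; only the group totals
# (138 cases, 183 controls) come from the abstract.

def chi_square_2x3(cases, controls):
    """Return the chi-square statistic and degrees of freedom for a
    2 (group) x 3 (genotype) contingency table given as dicts."""
    genotypes = ["GG", "GT", "TT"]
    n_cases = sum(cases.values())
    n_controls = sum(controls.values())
    grand = n_cases + n_controls
    stat = 0.0
    for g in genotypes:
        col = cases[g] + controls[g]
        for observed, row_total in ((cases[g], n_cases), (controls[g], n_controls)):
            expected = row_total * col / grand
            stat += (observed - expected) ** 2 / expected
    df = (2 - 1) * (len(genotypes) - 1)
    return stat, df

cases = {"GG": 45, "GT": 72, "TT": 21}     # hypothetical ACL-rupture group (n = 138)
controls = {"GG": 95, "GT": 72, "TT": 16}  # hypothetical control group (n = 183)
stat, df = chi_square_2x3(cases, controls)
# With df = 2, the 5% critical value is 5.99; a statistic above it would
# indicate a significant difference in genotype distribution.
```

With these illustrative counts the statistic exceeds the 5.99 critical value, mirroring the kind of GG underrepresentation the study reports.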
Caroline Westwood, Carolyn Killelea, Mallory Faherty and Timothy Sell
Context: Concussions are a consequence of sports participation. Recent reports indicate an increased risk of lower-extremity musculoskeletal injury when returning to sport after concussion, suggesting that achieving “normal” balance may not fully indicate that the athlete is ready for competition. The increased risk of injury may indicate the need to refine a screening tool for clearance. Objective: To assess the between-session reliability and the effects of adding a cognitive task to static and dynamic postural stability testing in a healthy population. Setting: Clinical laboratory. Participants: Twelve healthy subjects (6 women; age 22.3 [2.9] y, height 174.4 [7.5] cm, weight 70.1 [12.7] kg) participated in this study. Design: Subjects underwent static and dynamic postural stability testing with and without the addition of a cognitive task (Stroop test). The test battery was repeated 10 days later. Dynamic postural stability testing consisted of a forward jump over a hurdle with a 1-legged landing, from which a stability index was calculated. Static postural stability was also assessed with and without the cognitive task during single-leg balance, and the variability of each ground reaction force component was averaged. Main Outcome Measures: Intraclass correlation coefficients (ICC2,1) were computed to determine reliability. Standard error of measurement, mean standard error, minimal detectable change, and 95% confidence intervals were all calculated. Results: Mean differences between sessions were low, with the majority of variables having moderate to excellent reliability (static .583–.877, dynamic .581–.939). The addition of the dual task did not have any significant effect on reliability; however, the ICC values generally improved (eyes open .583–.770, dual task .741–.808). Conclusions: The addition of a cognitive load to postural stability assessments had moderate to excellent reliability in a healthy population.
These results provide initial evidence on the feasibility of dual-task postural stability testing when examining risk of lower-extremity musculoskeletal injury following return to sport in a concussed population.
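The standard error of measurement and minimal detectable change reported above follow directly from the ICC and the between-subject standard deviation via the standard formulas SEM = SD·√(1 − ICC) and MDC95 = 1.96·√2·SEM. A minimal sketch, using hypothetical values (an SD of 2.0 and an ICC of .80, in the range of the reliabilities reported):

```python
import math

def sem(sd, icc):
    """Standard error of measurement from the between-subject SD and the ICC."""
    return sd * math.sqrt(1.0 - icc)

def mdc95(sd, icc):
    """Minimal detectable change at the 95% confidence level."""
    return 1.96 * math.sqrt(2.0) * sem(sd, icc)

# Hypothetical values: a stability index with SD = 2.0 and ICC = .80.
print(round(sem(2.0, 0.80), 3))    # → 0.894
print(round(mdc95(2.0, 0.80), 3))  # → 2.479
```

The √2 factor arises because a change score is the difference of two measurements, each carrying one SEM of error.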
Nicola Relph and Lee Herrington
Context: Knee joint-position sense (JPS) plays a critical role in controlled and stable joint movement. Poor ability to sense the position of the knee can therefore increase risk of injury. There is no agreed consensus on JPS measurement techniques and a lack of reliability statistics on methods. Objective: To identify the most reliable knee JPS measurement technique using image capture. Design: Interexaminer, intraexaminer, and test-retest reliability of knee JPS measurements. Setting: Biomechanics laboratory. Participants: 10 asymptomatic participants. Interventions: None. Main Outcome Measures: Relative and absolute error scores of knee JPS in 3 conditions (sitting, prone, active) through 3 ranges of movement (10-30°, 30-60°, 60-90°), into 2 directions (flexion and extension) using both legs (dominant and nondominant), collected during 15 trials and repeated 7 d after the first data collection. Results: Statistical analysis by intraclass correlations revealed excellent interexaminer reliability between researchers (.98) and intraexaminer reliability within 1 researcher (.96). Test-retest reliability was highest in the sitting condition from a starting angle of 0°, target angle through 60-90° of flexion, using the dominant leg and absolute-error-score variables (ICC = .92). However, it was noted that the smallest detectable differences were a high percentage of the mean values for all measures. Conclusions: The most reliable JPS measurement for asymptomatic participants has been identified. Practitioners should use this protocol when collecting JPS data during prescreening sessions. However, generalizability of findings to a class/group of clients exhibiting knee pathologies should be done with caution.
Carly May Green, Paul Comfort and Lee Herrington
A reduction in joint position sense (JPS) is sometimes a consequence of shoulder injury that may adversely affect the ability to maintain dynamic joint stability.
To compare shoulder JPS between previously injured and noninjured judokas.
Twenty-nine noninjured subjects (age 10.93 ± 3.45 y) and 11 injured subjects (age 15.09 ± 3.39 y).
Main Outcome Measures:
JPS was tested at 45° and 80° of shoulder external rotation at 90° of abduction.
No significant difference in JPS was found between previously injured and noninjured judokas at either joint position.
Despite evidence that JPS acuity decreases following shoulder injury, this study did not demonstrate a difference in average error between previously injured and noninjured judokas. Uncontrolled confounding factors, such as age and time since injury, may have affected the results. Sport-specific shoulder joint loading patterns may also be an important factor that affects JPS.
Jenna Ratka, Jamie Mansell and Anne Russ
Clinical Question: In rugby players, does using a mouthguard reduce the risk of concussion? Clinical Bottom Line: The results of the three studies examined in this critical appraisal are inconclusive, providing only limited evidence to support the use of a mouthguard to decrease concussion incidence in the sport of rugby.
Rochelle L. Nicholls, Bruce C. Elliott, Karol Miller and Michael Koh
Ball exit velocity (BEV) was measured from 17 experienced baseball hitters using wood and metal bats of similar length and mass but different moments of inertia. This research was conducted in response to safety concerns for defensive players related to the high BEV from metal baseball bats reported in the literature. Our purpose was to determine whether metal bats, with their lower swing moment of inertia, produce a higher linear bat tip velocity than wooden bats swung by the same players. Analysis using high-speed videography indicated significant differences in the x-component of velocity for both the proximal (metal = 5.4 m·s⁻¹; wood = 3.9 m·s⁻¹) and distal ends of the bats (metal = 37.2 m·s⁻¹; wood = 35.2 m·s⁻¹), p < 0.01. The orientation of the bats with respect to the horizontal plane 0.005 s prior to impact was also significantly closer to “square” (270°) for the metal bat (264.3°) than for the wood bat (251.5°), p < 0.01. Mean BEV from metal bats (44.3 m·s⁻¹) exceeded the 41 m·s⁻¹ velocity that corresponds to the minimum movement time for a pitcher to avoid a ball hit in his direction (Cassidy & Burton, 1989).
Darin A. Padua, Michelle C. Boling, Lindsay J. DiStefano, James A. Onate, Anthony I. Beutler and Stephen W. Marshall
There is a need for reliable clinical assessment tools that can be used to identify individuals who may be at risk for injury. The Landing Error Scoring System (LESS) is a reliable and valid clinical assessment tool that was developed to identify individuals at risk for lower extremity injuries. One limitation of this tool is that it cannot be assessed in real time and requires the use of video cameras.
To determine the interrater reliability of a real-time version of the LESS, the LESS-RT.
Controlled research laboratory.
43 healthy volunteers (24 women, 19 men) between the ages of 18 and 23.
The LESS-RT evaluates 10 jump-landing characteristics that may predispose an individual to lower extremity injuries. Two sets of raters used the LESS-RT to evaluate participants as they performed 4 trials of a jump-landing task.
Main Outcome Measures:
Intraclass correlation coefficient (ICC2,1) values for the final composite score of the LESS-RT were calculated to assess interrater reliability of the LESS-RT.
Interrater reliability (ICC2,1) for the LESS-RT ranged from .72 to .81 with standard error of measurements ranging from .69 to .79.
The LESS-RT is a quick, easy, and reliable clinical assessment tool that may be used by clinicians to identify individuals who may be at risk for lower extremity injuries.
Gary B. Wilkerson
Prevention of a lower extremity sprain or strain requires some basis for predicting that an individual athlete will sustain such an injury unless a modifiable risk factor is addressed.
To assess the possible existence of an association between reaction time measured during completion of a computerized neurocognitive test battery and subsequent occurrence of a lower extremity sprain or strain.
Prospective cohort study.
Preparticipation screening conducted in a computer laboratory on the day prior to initiation of preseason practice sessions.
76 NCAA Division I-FCS football players.
Main Outcome Measures:
Lower extremity sprains and strains sustained between initiation of preseason practice sessions and the end of an 11-game season. Receiver operating characteristic analysis identified the optimal reaction time cut-point for discrimination between injured versus noninjured status. Stratified analyses were performed to evaluate any differential influence of reaction time on injury incidence between starters and nonstarters.
A total of 29 lower extremity sprains and strains were sustained by 23 of the 76 players. A reaction time cut-point of ≥ .545 s provided good discrimination between injured and noninjured cases: 74% sensitivity, 51% specificity, relative risk = 2.17 (90% CI: 1.10, 4.30), and odds ratio = 2.94 (90% CI: 1.19, 7.25).
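The reported relative risk and odds ratio can be reproduced from a 2×2 table reconstructed from the abstract's figures (23 injured and 53 noninjured players; 74% sensitivity, 51% specificity). A minimal sketch; the whole-player cell counts are a rounded reconstruction on my part, not published data:

```python
# Reconstructed 2x2 table (cell counts rounded from 74% of 23 injured and
# 51% of 53 noninjured players; an assumption, not published data).
slow_injured, slow_noninjured = 17, 26   # reaction time >= .545 s
fast_injured, fast_noninjured = 6, 27    # reaction time < .545 s

risk_slow = slow_injured / (slow_injured + slow_noninjured)  # 17/43
risk_fast = fast_injured / (fast_injured + fast_noninjured)  # 6/33
relative_risk = risk_slow / risk_fast
odds_ratio = (slow_injured * fast_noninjured) / (slow_noninjured * fast_injured)

print(round(relative_risk, 2))  # → 2.17, matching the reported relative risk
print(round(odds_ratio, 2))     # → 2.94, matching the reported odds ratio
```

That the reconstructed table reproduces both reported effect sizes to two decimals supports the rounding used here.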
Neurocognitive reaction time appears to be an indicator of elevated risk for lower extremity sprains and strains among college football players, which may be modifiable through performance of exercises designed to accelerate neurocognitive processing of visual input.
David L. Carey, Justin Crow, Kok-Leong Ong, Peter Blanch, Meg E. Morris, Ben J. Dascombe and Kay M. Crossley
Training-load prescription in team-sport athletes is a balance between performance improvement¹,² and injury-risk reduction.³⁻⁶ The manipulation of training intensity, duration, and frequency to induce improvements in athletic performance is a fundamental objective of training.