Robert Ahmun, Steve McCaig, Jamie Tallent, Sean Williams and Tim Gabbett
Purpose: To examine the relationship between player internal workloads, daily wellness monitoring, and injury and illness in a group of elite adolescent cricketers during overseas competitions. Methods: A total of 39 male international adolescent cricketers (17.5 [0.8] y) took part in the study. Data were collected over 5 tours across a 3-y period (2014–2016). Measures of wellness were recorded and daily training loads were calculated using session rating of perceived exertion. The injury and illness status of each member of the squad was recorded daily. Acute and chronic workloads were calculated using 3-d and 14-d moving averages. Acute workloads, chronic workloads, and acute:chronic workload ratios were independently modeled as fixed-effects predictor variables. Results: In the subsequent week, a high 3-d workload was significantly associated with an increased risk of injury (relative risk = 2.51; CI = 1.70–3.70). Similarly, a high 14-d workload was also associated with an increased risk of injury (relative risk = 1.48; CI = 1.01–2.70). Individual differences in the load–injury relationship were also found. No clear relationship between the acute:chronic workload ratios and injury risk was found, but high chronic workloads combined with high or low acute:chronic workload ratios showed an increased probability of injury compared with moderate chronic workloads. There were also trends for sleep quality and cold symptoms worsening the week before an injury occurred. Conclusion: Although there is significant individual variation, short-term high workloads and changes in wellness status appear to be associated with injury risk.
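The acute (3-d) and chronic (14-d) moving averages and their ratio described above can be illustrated with a minimal Python sketch. This is a hypothetical illustration of the workload calculations only, not the study's actual analysis code (which used mixed-effects models); the daily loads are invented.

```python
def rolling_mean(loads, window):
    """Trailing moving average of daily loads; windows are truncated
    at the start of the series until enough days have accumulated."""
    out = []
    for i in range(len(loads)):
        chunk = loads[max(0, i - window + 1):i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def acwr(loads, acute_window=3, chronic_window=14):
    """Acute:chronic workload ratio from daily session-RPE loads,
    using the 3-d and 14-d windows named in the abstract."""
    acute = rolling_mean(loads, acute_window)
    chronic = rolling_mean(loads, chronic_window)
    return [a / c if c > 0 else None for a, c in zip(acute, chronic)]
```

For example, a fortnight of steady 400-AU days yields a ratio of 1.0 on day 14, whereas a 3-day spike pushes the ratio well above 1.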
Eva Piatrikova, Ana C. Sousa, Javier T. Gonzalez and Sean Williams
Purpose: To assess the concurrent and predictive validity of the 3-minute all-out test (3MT) against conventional methods (CM) of determining critical speed (CS) and curvature constant (D′) and to examine the test–retest reliability of the 3MT in highly trained swimmers. Methods: Thirteen highly trained swimmers (age 16 y, weight 64.7 [8.5] kg, height 1.76 [0.07] m) completed 4 time trials and two 3MTs over 2 wk. The distance–time (DT) and speed–1/time (1/T) models were used to determine CS and D′ from 4 time trials. CS3MT and
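The conventional distance–time model mentioned above fits total distance against time as d = CS·t + D′, so the slope of a least-squares fit gives CS (m/s) and the intercept gives D′ (m). A minimal Python sketch, with invented time-trial values for illustration:

```python
def critical_speed_dt(distances_m, times_s):
    """Fit the linear distance-time model d = CS*t + D' by ordinary
    least squares: slope = critical speed (m/s), intercept = D' (m)."""
    n = len(times_s)
    mean_t = sum(times_s) / n
    mean_d = sum(distances_m) / n
    sxx = sum((t - mean_t) ** 2 for t in times_s)
    sxy = sum((t - mean_t) * (d - mean_d)
              for t, d in zip(times_s, distances_m))
    cs = sxy / sxx
    d_prime = mean_d - cs * mean_t
    return cs, d_prime
```

With 4 time trials, as in the study design, the fit uses 4 (distance, time) pairs; the r² of the fit is commonly checked before accepting the estimates.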
Eva Piatrikova, Nicholas J. Willsmer, Ana C. Sousa, Javier T. Gonzalez and Sean Williams
Purpose: To monitor physiological, technical, and performance responses to individualized high-intensity interval training (HIIT) prescribed using the critical speed (CS) and critical stroke rate (CSR) concepts in swimmers completing a reduced training volume program (≤30 km·wk−1) for 15 weeks. Methods: Over the 15-week period, 12 highly trained swimmers (age 16 y, height 179 cm, weight 66 kg) completed four 3-minute all-out tests to determine CS and the finite capacity to work above CS (D′), and four 200-m tests at CS to establish a CSR estimate. Combining CS and D′, 2 HIIT sessions, designed as 5 × 3-minute intervals depleting 60% of D′ and 3 × 3.5-minute intervals depleting 80% of D′, were each prescribed once per week. An additional HIIT session was prescribed using CS and CSR as 10 × 150 m or 200 m at CS with 2 cycles per minute lower stroke rate than the CSR estimate. Additional monitored variables included peak speed, average speed for 150 seconds (speed150s) and 180 seconds (speed180s), competition performance and stroke length (SL), stroke count (SC), and stroke index (SI) adopted at CS. Results: At the end of the intervention, swimmers demonstrated faster CS (mean change ± 90% confidence limits: +5.4 ± 1.6%), speed150s (+2.5 ± 0.9%), speed180s (+3.0 ± 0.9%), and higher stroke rate (+6.4 ± 3.0%) and stroke index (+4.2 ± 3.6%). D′ was reduced (−25.2 ± 7.5%), whereas peak speed, SL, and SC changed only trivially. The change in the swimmers’ personal best times in the first and second main event was −1.2 ± 1.3% and −1.6 ± 0.9%, respectively. Conclusion: HIIT prescribed based on the CS and CSR concepts was associated with improvements in several physiological, technical, and performance parameters in highly trained swimmers while utilizing a time- and resource-efficient approach. This was achieved despite a ≥25% reduction in training volume.
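Under the two-parameter CS model, an interval that depletes a fraction f of D′ over duration t satisfies (s − CS)·t = f·D′, so the target speed is s = CS + f·D′/t. A small Python sketch of that prescription logic (the CS and D′ values below are invented for illustration):

```python
def interval_speed(cs, d_prime, duration_s, depletion_frac):
    """Target speed (m/s) that spends depletion_frac of D' over
    duration_s, from the two-parameter critical-speed model:
    (s - CS) * t = depletion_frac * D'."""
    return cs + depletion_frac * d_prime / duration_s

# Hypothetical swimmer: CS = 1.45 m/s, D' = 20 m.
# 3-min intervals depleting 60% of D', as in the first session design:
target = interval_speed(1.45, 20.0, 180.0, 0.6)
```

A 5 × 3-minute set at this speed would nominally deplete 60% of D′ per repetition, with recovery intervals allowing partial D′ reconstitution between efforts.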
Matthew J. Cross, Sean Williams, Grant Trewartha, Simon P.T. Kemp and Keith A. Stokes
Purpose: To explore the association between in-season training-load (TL) measures and injury risk in professional rugby union players.
Methods: This was a 1-season prospective cohort study of 173 professional rugby union players from 4 English Premiership teams. TL (duration × session-RPE) and time-loss injuries were recorded for all players for all pitch- and gym-based sessions. Generalized estimating equations were used to model the association between in-season TL measures and injury in the subsequent week.
Results: Injury risk increased linearly with 1-wk loads and week-to-week changes in loads, with a 2-SD increase in these variables (1245 AU and 1069 AU, respectively) associated with odds ratios of 1.68 (95% CI 1.05–2.68) and 1.58 (95% CI 0.98–2.54). When compared with the reference group (<3684 AU), a significant nonlinear effect was evident for 4-wk cumulative loads, with a likely beneficial reduction in injury risk associated with intermediate loads of 5932–8651 AU (OR 0.55, 95% CI 0.22–1.38) (this range equates to around 4 wk of average in-season TL) and a likely harmful effect evident for higher loads of >8651 AU (OR 1.39, 95% CI 0.98–1.98).
Conclusions: Players had an increased risk of injury if they had high 1-wk cumulative loads (1245 AU) or large week-to-week changes in TL (1069 AU). In addition, a U-shaped relationship was observed for 4-wk cumulative loads, with an apparent increase in risk associated with higher loads (>8651 AU). These measures should therefore be monitored to inform injury-risk-reduction strategies.
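The training-load measures examined in this study can be sketched in a few lines of Python. This is an illustrative reconstruction of the definitions only (session load = duration × session-RPE, week-to-week change, 4-wk cumulative load); the example values are invented:

```python
def session_load(duration_min, rpe):
    """Session training load in arbitrary units (AU):
    duration (min) multiplied by session RPE."""
    return duration_min * rpe

def week_to_week_change(weekly_loads):
    """Change in total load (AU) from each week to the next."""
    return [b - a for a, b in zip(weekly_loads, weekly_loads[1:])]

def cumulative_4wk(weekly_loads):
    """Rolling 4-week cumulative load, defined from week 4 onward."""
    return [sum(weekly_loads[i - 3:i + 1])
            for i in range(3, len(weekly_loads))]
```

In the study's terms, a player whose 4-wk cumulative load exceeded 8651 AU, or whose week-to-week change was large, fell into the higher-risk categories.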
Sean Williams, Grant Trewartha, Matthew J. Cross, Simon P.T. Kemp and Keith A. Stokes
Background: Numerous derivative measures can be calculated from the simple session rating of perceived exertion (sRPE), a tool for monitoring training loads (eg, acute:chronic workload and cumulative loads). The challenge from a practitioner’s perspective is to decide which measures to calculate and monitor in athletes for injury-prevention purposes. The aim of the current study was to outline a systematic process of data reduction and variable selection for such training-load measures.
Methods: Training loads were collected from 173 professional rugby union players during the 2013–14 English Premiership season, using the sRPE method, with injuries reported via an established surveillance system. Ten derivative measures of sRPE training load were identified from existing literature and subjected to principal-component analysis. A representative measure from each component was selected by identifying the variable that explained the largest amount of variance in injury risk from univariate generalized linear mixed-effects models.
Results: Three principal components were extracted, explaining 57%, 24%, and 9% of the variance. The training-load measures that were highly loaded on component 1 represented measures of the cumulative load placed on players, component 2 was associated with measures of changes in load, and component 3 represented a measure of acute load. Four-week cumulative load, acute:chronic workload, and daily training load were selected as the representative measures for each component.
Conclusions: The process outlined in the current study enables practitioners to monitor the most parsimonious set of variables while still retaining the variation and distinct aspects of “load” in the data.
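The variance-explained step of a principal-component analysis like the one described above can be sketched with NumPy. This is a generic illustration on synthetic data, not the study's code: columns stand in for derivative sRPE measures, and correlated columns load onto a shared component, as the cumulative-load measures did here.

```python
import numpy as np

def pca_variance_explained(X):
    """Proportion of variance explained by each principal component of
    the standardized columns of X (rows = observations, columns =
    training-load measures), via eigendecomposition of the covariance
    of the standardized data."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    eigvals = np.linalg.eigvalsh(np.cov(Z, rowvar=False))[::-1]
    return eigvals / eigvals.sum()
```

Components are then inspected for their loadings, and one representative measure per retained component is carried forward, as in the study's selection step.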
Robert McCunn, Hugh H.K. Fullagar, Sean Williams, Travis J. Halseth, John A. Sampson and Andrew Murray
Purpose: American football is widely played by college student-athletes throughout the United States; however, the associated injury risk is greater than in other team sports. Numerous factors likely contribute to this risk, yet research identifying these risk factors is limited. The present study sought to explore the effects of playing experience and playing position on injury risk in NCAA Division I college football players. Methods: Seventy-six male college student-athletes in the football program of an American NCAA Division I university participated. Injuries were recorded over 2 consecutive seasons. Players were characterized based on college year (freshman, sophomore, junior, or senior) and playing position. The effect of playing experience and position on injury incidence rates was analyzed using a generalized linear mixed-effects model, with a Poisson distribution, log-linear link function, and offset for hours of training exposure or number of in-game plays (for training and game injuries, respectively). Results: The overall rates of non-time-loss and time-loss game-related injuries were 2.1 (90% CI: 1.8–2.5) and 0.6 (90% CI: 0.4–0.8) per 1000 plays, respectively. The overall rates of non-time-loss and time-loss training-related injuries were 26.0 (90% CI: 22.6–29.9) and 7.1 (90% CI: 5.9–8.5) per 1000 h, respectively. During training, seniors and running backs displayed the greatest risk. During games, sophomores, juniors, and wide receivers were at greatest risk. Conclusions: Being aware of the elevated injury risk experienced by certain player groups may help coaches make considered decisions related to training design and player selection.
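The incidence rates reported above are injuries per 1000 exposure units (plays or hours). A minimal Python sketch of that calculation, with a simple log-scale Poisson approximation for the 90% CI; note the study itself derived CIs from a generalized linear mixed-effects model, so this is only a crude stand-in, and the counts below are invented:

```python
import math

def incidence_rate(n_injuries, exposure, per=1000.0, z=1.645):
    """Injury incidence rate per `per` exposure units, with an
    approximate 90% CI (z = 1.645) using the Poisson standard error
    1/sqrt(n) on the log-rate scale."""
    rate = n_injuries / exposure * per
    se_log = 1.0 / math.sqrt(n_injuries)
    return rate, rate * math.exp(-z * se_log), rate * math.exp(z * se_log)

# Hypothetical example: 50 injuries over 25,000 in-game plays.
rate, lower, upper = incidence_rate(50, 25000.0)
```

Using the offset for plays versus hours is what makes game and training rates comparable despite very different exposure scales.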
Katrina L. Piercy, Frances Bevington, Alison Vaux-Bjerke, Sandra Williams Hilfiker, Sean Arayasirikul and Elizabeth Y. Barnett
Background: The Office of Disease Prevention and Health Promotion completed research to understand factors that could encourage Americans to follow the Physical Activity Guidelines for Americans, second edition, released in 2018. This study describes survey research assessing demographic characteristics that might be related to knowledge and awareness of the guidelines. Methods: An online survey of 2050 adult physical activity contemplators assessed knowledge of physical activity, awareness of the guidelines, and knowledge of dosage recommendations. Univariate and bivariate analyses were performed, and demographic differences in knowledge and awareness were analyzed using Pearson chi-square tests and Fisher exact tests. Results: Respondents had medium to high knowledge of physical activity, although knowledge varied significantly by socioeconomic factors. Knowledge of dosage recommendations was very low, with 2% and 3% of respondents correctly identifying recommended moderate- and vigorous-intensity doses, respectively. Only 22% were aware of the guidelines; awareness was greater among those with a higher education or income and those without a disability. Conclusions: These findings guided the development of the Office of Disease Prevention and Health Promotion’s Move Your Way campaign and reinforced the need to raise awareness of the guidelines and promote behavior change among physical activity contemplators—particularly those from lower socioeconomic groups.
Leslie W. Podlog, John Heil, Ryan D. Burns, Sean Bergeson, Tom Iriye, Brad Fawver and A. Mark Williams
The authors used a quasi-experimental design to examine the efficacy of a cognitive-behavioral-therapy (CBT) intervention for enhancing psychological well-being (positive and negative affect, vitality, self-esteem), rehabilitation adherence, and clinical rehabilitation outcomes (pain, physical function) in 16 NCAA (National Collegiate Athletic Association) Division I athletes experiencing a range of severe injuries. ANCOVAs, with adjusted baseline scores, revealed significant differences between the experimental and control groups for positive affect at rehabilitation midpoint (T2; adjusted mean difference [AMD] = 0.41, p = .04, η2 = .34) and return to play (T3; AMD = 0.67, p < .001, η2 = .70), negative affect at T3 (AMD = −0.81, p = .01, η2 = .47), and vitality at T2 (AMD = 0.99, p = .01, η2 = .48) and T3 (AMD = 1.08, p = .02, η2 = .33). Given decrements in emotional functioning after injury, the data support the use of CBT-based interventions for facilitating the emotional well-being of athletes with severe injuries.
Dale B. Read, Ben Jones, Sean Williams, Padraic J. Phibbs, Josh D. Darrall-Jones, Greg A.B. Roe, Jonathon J.S. Weakley, Andrew Rock and Kevin Till
Purpose: To quantify the frequencies and timings of rugby union match-play phases (ie, attacking, defending, ball in play [BIP], and ball out of play [BOP]) and then compare the physical characteristics of attacking, defending, and BOP between forwards and backs. Methods: Data were analyzed from 59 male rugby union academy players (259 observations). Each player wore a microtechnology device (OptimEye S5; Catapult, Melbourne, Australia) with video footage analyzed for phase timings and frequencies. Dependent variables were analyzed using a linear mixed-effects model and assessed with magnitude-based inferences and Cohen d effect sizes (ES). Results: Attack, defense, BIP, and BOP times were 12.7 (3.1), 14.7 (2.5), 27.4 (2.9), and 47.4 (4.1) min, respectively. Mean attack (26 s), defense (26 s), and BIP (33 s) phases were shorter than BOP phases (59 s). The relative distance in attacking phases was similar (112.2 [48.4] vs 114.6 [52.3] m·min−1, ES = 0.00 ± 0.23) between forwards and backs but greater in forwards (114.5 [52.7] vs 109.0 [54.8] m·min−1, ES = 0.32 ± 0.23) during defense and greater in backs during BOP (ES = −0.66 ± 0.23). Conclusions: Total time in attack, defense, and therefore BIP was less than BOP. Relative distance was greater in forwards during defense, whereas it was greater in backs during BOP and similar between positions during attack. Players should be exposed to training intensities drawn from in-play phases (ie, attack and defense) rather than whole-match data, and should practice technical skills at these intensities.
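Two of the quantities above, relative distance (m·min−1) and the Cohen d effect size used to compare positions, reduce to short formulas. A generic Python sketch of the standard pooled-SD form of Cohen's d (the study's exact ES computation within its mixed-effects framework may differ, and the example inputs are invented):

```python
import math

def relative_distance(distance_m, duration_min):
    """Relative distance in m/min: distance covered over phase duration."""
    return distance_m / duration_min

def cohens_d(mean1, sd1, n1, mean2, sd2, n2):
    """Cohen's d between two groups using the pooled standard deviation."""
    pooled = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2)
                       / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled
```

With the large within-group SDs reported here (around 50 m·min−1), even a 5 m·min−1 positional difference yields only a small standardized effect, which is consistent with the modest ES values in the abstract.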