Franco M. Impellizzeri and Samuele M. Marcora
We propose that physiological and performance tests used in sport science research and professional practice should be developed following a rigorous validation process, as is done in other scientific fields such as clinimetrics, an area of research that focuses on the quality of clinical measurement and uses methods derived from psychometrics. In this commentary, we briefly review some of the attributes that must be explored when validating a test: the conceptual model, validity, reliability, and responsiveness. Examples from the sport science literature are provided.
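Of the attributes listed, reliability is the one most often quantified in sport science practice, commonly via the typical error of measurement from test-retest trials. A minimal sketch of that calculation, assuming paired trials stored in NumPy arrays (the data, units, and function name are illustrative, not from the commentary):

```python
import numpy as np

def typical_error(trial1, trial2):
    """Typical error of measurement: the SD of the test-retest
    differences divided by sqrt(2)."""
    diffs = np.asarray(trial2, dtype=float) - np.asarray(trial1, dtype=float)
    return np.std(diffs, ddof=1) / np.sqrt(2)

# Hypothetical peak power outputs (W) from a test-retest of 8 athletes.
trial1 = np.array([310, 295, 330, 301, 315, 288, 322, 307])
trial2 = np.array([314, 290, 335, 305, 311, 292, 326, 303])

te = typical_error(trial1, trial2)
cv = 100 * te / np.mean(np.concatenate([trial1, trial2]))
print(f"typical error: {te:.1f} W (CV ~ {cv:.1f}%)")
```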
Erik H. Arve, Emily Madrak, and Aric J. Warren
Focused Clinical Question: Is there evidence to suggest that blood flow restriction (BFR) training improves strength, cross-sectional area, and thigh girth of the quadriceps musculature in patients after arthroscopic surgical procedures of the knee? Clinical Bottom Line: There is moderate, consistent but low-level evidence supporting the use of BFR training to improve knee extensor muscular outcomes (strength, cross-sectional area, and/or thigh girth) immediately after arthroscopic knee surgery.
Billy T. Hulin, Tim J. Gabbett, Nathan J. Pickworth, Rich D. Johnston, and David G. Jenkins
Purpose: To examine relationships among physical performance, workload, and injury risk in professional rugby league players. Methods: Maximal-effort (n = 112) and submaximal (n = 1084) running performances of 45 players were recorded from 1 club over 2 consecutive seasons. Poorer and better submaximal running performance was determined by higher and lower exercise heart rates, respectively. Exponentially weighted moving averages and daily rolling averages were used to assess microtechnology-derived acute and chronic field-based workloads. The associations among within-individual submaximal running performance, workload, and noncontact lower-limb injury were then investigated. Results: The injury risk associated with poorer submaximal performance was “likely” greater than stable (relative risk = 1.8; 90% confidence interval, 0.9–3.7) and better submaximal performance (relative risk = 2.0; 90% confidence interval, 0.9–4.4). Compared with greater submaximal performance, poorer performance was associated with lower chronic workloads (effect size [d] = 0.82 [0.13], large) and higher acute:chronic workload ratios (d = 0.49 [0.14], small). Chronic workload demonstrated a “nearly perfect” positive relationship with maximal-effort running performance (exponentially weighted moving average, R² = .91 [.15]; rolling average, R² = .91 [.14]). At acute:chronic workload ratios >1.9, no differences in injury risk were found between rolling-average and exponentially weighted moving-average methods (relative risk = 1.1; 90% confidence interval, 0.3–3.8; unclear). Conclusions: Reductions in submaximal running performance are related to low chronic workloads, high acute:chronic workload ratios, and increased injury risk. These findings demonstrate that a submaximal running assessment can be used to provide information on physical performance and injury risk in professional rugby league players.
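For readers unfamiliar with the two workload models compared in this abstract, a minimal sketch of the acute:chronic workload ratio under both methods, assuming daily loads in a pandas Series with a 7-day acute and 28-day chronic window (the window lengths, decay constant, and data are illustrative assumptions, not taken from the study):

```python
import pandas as pd

def acwr(daily_load, acute_days=7, chronic_days=28, method="rolling"):
    """Acute:chronic workload ratio from a daily load series.

    method="rolling" uses simple rolling means; method="ewma" uses
    exponentially weighted moving averages with span = window length,
    which weights recent days more heavily.
    """
    if method == "rolling":
        acute = daily_load.rolling(acute_days, min_periods=1).mean()
        chronic = daily_load.rolling(chronic_days, min_periods=1).mean()
    else:  # "ewma"
        acute = daily_load.ewm(span=acute_days, adjust=False).mean()
        chronic = daily_load.ewm(span=chronic_days, adjust=False).mean()
    return acute / chronic

# Hypothetical daily session loads (arbitrary units) over six weeks.
load = pd.Series([400, 0, 550, 300, 0, 600, 250] * 6)
print(acwr(load, method="rolling").tail())
print(acwr(load, method="ewma").tail())
```

The two methods diverge most after abrupt load spikes, which is why comparisons at high ratios (such as the >1.9 cut-off reported above) are of particular interest.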
Carl Foster, Jos J. de Koning, Christian Thiel, Bram Versteeg, Daniel A. Boullosa, Daniel Bok, and John P. Porcari
Background: Pacing studies suggest the distribution of effort for optimizing performance. Cross-sectional studies of 1-mile world records (WRs) suggest that WR progression includes a smaller coefficient of variation of velocity. Purpose: This study evaluates whether the intraindividual pacing used by elite runners to break their own WR (1 mile, 5 km, and 10 km) is related to the evolution of pacing strategy. We provide supportive data from analysis in subelite runners. Methods: Men’s WR performances (with 400-m or 1-km splits) in 1 mile, 5 km, and 10 km were retrieved from the IAAF database (from 1924 to present). Data were analyzed relative to pacing pattern when a runner improved their own WR. Similar analyses are presented for 10-km performance in subelite runners before and after intensified training. Results: WR performance was improved in 1 mile (mean [SD]: 3:59.4 [11.2] to 3:57.2 [8.6]), 5 km (13:27 [0:33] to 13:21 [0:33]), and 10 km (28:35 [1:27] to 28:21 [1:21]). The average coefficient of variation did not change in the 1 mile (3.4% [1.8%] to 3.6% [1.6%]), 5 km (2.4% [0.9%] to 2.2% [0.8%]), or 10 km (1.4% [0.1%] to 1.5% [0.6%]) with improved WR. When velocity was normalized to the percentage of mean velocity for each race, the pacing pattern was almost identical. Very similar patterns were observed in subelite runners in the 10 km. When time improved from 49:20 (5:30) to 45:56 (4:58), normalized velocity was similar, terminal RPE increased (8.4 [1.6] to 9.1 [0.8]), the coefficient of variation was unchanged (4.4% [1.1%] to 4.8% [2.1%]), and VO2max increased (49.8 [7.4] to 55.3 [8.8] mL·min−1·kg−1). Conclusion: The results suggest that when runners break their own best performances, they employ the same pacing pattern, which differs from the pattern seen when WRs are improved across runners in cross-sectional data.
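The two pacing metrics used in this abstract, the coefficient of variation of split velocity and velocity normalized to the race's mean velocity, follow directly from split times. A minimal sketch, assuming equal-distance splits (the split data are hypothetical):

```python
import numpy as np

def pacing_profile(split_times_s, split_dist_m=400.0):
    """Split velocities, their coefficient of variation, and each
    split's velocity as a percentage of the mean split velocity."""
    t = np.asarray(split_times_s, dtype=float)
    v = split_dist_m / t                       # m/s per split
    cv = 100 * np.std(v, ddof=1) / np.mean(v)  # CV of velocity (%)
    normalized = 100 * v / np.mean(v)          # % of mean velocity
    return v, cv, normalized

# Hypothetical 400-m splits (s) for a mile-type effort.
splits = [58.2, 60.5, 61.3, 59.0]
_, cv, norm = pacing_profile(splits)
print(f"CV of velocity: {cv:.1f}%")
print("normalized pattern (%):", np.round(norm, 1))
```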
Hiroyuki Sagayama, Makiko Toguchi, Jun Yasukata, Kazunari Yonaha, Yasuki Higaki, and Hiroaki Tanaka
Prior studies have examined offshore sailing and energy strategies using accurate total energy expenditure (TEE) measurement in free-living conditions. However, no research has examined the energy and water requirements of dinghy-class sailing, such as Olympic-class events, during concentrated training. This study aimed to investigate the TEE, physical activity level (PAL), and water turnover (rH2O) of collegiate dinghy sailors in a training camp using the doubly labeled water method. Eleven collegiate dinghy sailors (nine males and two females) participated. The doubly labeled water method was used to determine the participants’ TEE and PAL over 8 days (six training and two nontraining days). Participants trained approximately 7 hr/day on the water. Body fat was measured using a stable isotope dilution method. The rH2O was estimated using deuterium turnover. The mean TEE, PAL, and rH2O were 17.30 ± 4.22 MJ/day (4,133 ± 1,009 kcal/day), 2.8 ± 0.3 (range: 2.1–4.1), and 3.3 ± 0.7 L/day (range: 2.6–4.5 L/day), respectively. To our knowledge, this was the first study to use the doubly labeled water method to determine TEE, PAL, and rH2O as references for competitive dinghy sailors in a spring training camp. Our results may serve as a reference to assist competitive dinghy sailors in determining their required nutritional support.
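The unit conversion and the PAL definition behind these figures are standard: 1 kcal = 4.184 kJ, and PAL is TEE divided by basal metabolic rate. A minimal sketch (the BMR value is a hypothetical chosen only to be consistent with the reported PAL; it is not reported in the abstract):

```python
def mj_to_kcal(mj_per_day):
    """Convert energy expenditure from MJ/day to kcal/day (1 kcal = 4.184 kJ)."""
    return mj_per_day * 1000 / 4.184

def physical_activity_level(tee_mj_per_day, bmr_mj_per_day):
    """PAL = total energy expenditure / basal metabolic rate."""
    return tee_mj_per_day / bmr_mj_per_day

# The reported mean TEE of 17.30 MJ/day converts to roughly 4,135 kcal/day
# (the abstract's 4,133 presumably reflects conversion of the unrounded TEE).
print(round(mj_to_kcal(17.30)))
# A hypothetical BMR of 6.2 MJ/day is consistent with the reported PAL of 2.8.
print(round(physical_activity_level(17.30, 6.2), 1))
```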
James A. Betts, Javier T. Gonzalez, Louise M. Burke, Graeme L. Close, Ina Garthe, Lewis J. James, Asker E. Jeukendrup, James P. Morton, David C. Nieman, Peter Peeling, Stuart M. Phillips, Trent Stellingwerff, Luc J.C. van Loon, Clyde Williams, Kathleen Woolf, Ron Maughan, and Greg Atkinson
Tiaki B. Smith, Will G. Hopkins, and Tim E. Lowe
There is a need for markers that would help determine when an athlete’s training load is either insufficient or excessive. In this study we examined the relationship between changes in performance and changes in physiological and psychological markers during and following a period of overload training in 10 female and 10 male elite rowers. Change in performance during a 4-wk overload was determined with a weekly 30-min time-trial on a rowing ergometer, whereas an incremental test provided change in lactate-threshold power between the beginning of the study and following a 1-wk taper after the overload. Various psychometric, steroid-hormone, muscle-damage, and inflammatory markers were assayed throughout the overload. Plots of change in performance versus the 4-wk change in each marker were examined for evidence of an inverted-U relationship that would characterize undertraining and excessive training. Linear modeling was also used to estimate the effect of changes in the marker on changes in performance. There was a suggestion of an inverted U only for performance in the incremental test versus some inflammatory markers, due to the relative underperformance of one rower. There were some clear linear relationships between changes in markers and changes in performance, but relationships were inconsistent within classes of markers. For some markers, changes considered to predict excessive training (eg, creatine kinase, several proinflammatory cytokines) had small to large positive linear relationships with performance. In conclusion, some of the markers investigated in this study may be useful for adjusting the training load in individual elite rowers.
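The inverted-U screening described here can be illustrated by fitting a quadratic to change-in-performance versus change-in-marker and checking the sign of the curvature term. A hedged sketch with synthetic data (this illustrates the idea only; it is not the authors' plotting or linear-modeling procedure, and all values are fabricated for the example):

```python
import numpy as np

def inverted_u_check(marker_change, performance_change):
    """Fit performance change = a*x^2 + b*x + c to marker change x.
    A clearly negative quadratic coefficient (a) with an interior
    vertex is consistent with an inverted-U (under/overtraining)."""
    x = np.asarray(marker_change, dtype=float)
    y = np.asarray(performance_change, dtype=float)
    a, b, c = np.polyfit(x, y, deg=2)
    vertex = -b / (2 * a) if a != 0 else np.nan
    return a, vertex

# Synthetic changes for 20 rowers: marker (%) vs performance (%).
rng = np.random.default_rng(1)
x = rng.uniform(-20, 40, 20)
y = -0.004 * (x - 10) ** 2 + rng.normal(0, 0.5, 20)
a, vertex = inverted_u_check(x, y)
print(f"quadratic coefficient: {a:.4f}, vertex at marker change ~ {vertex:.1f}%")
```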
Nathan Elsworthy and Ben J. Dascombe
Purpose:
The purpose of the present study was to quantify the match running demands and physiological intensities of Australian football (AF) field and boundary umpires during match play.
Methods:
Thirty-five AF umpires [20 field (age: 24.7 ± 7.7 y, body mass: 74.3 ± 7.1 kg, Σ7 skinfolds: 67.8 ± 18.8 mm); 15 boundary (age: 29.6 ± 13.6 y, body mass: 71.9 ± 3.1 kg, Σ7 skinfolds: 65.6 ± 8.8 mm)] volunteered to participate in the study. Movement characteristics [total distance (TD), average running speed, high-intensity activity (HIA; >14.4 km·h−1) distance] and physiological measures [heart rate, blood lactate concentration ([BLa−]), and rating of perceived exertion] were collected during 20 state-based AF matches.
Results:
The mean (± SD) TD covered by field umpires was 11,492 ± 1,729 m, with boundary umpires covering 15,061 ± 1,749 m. The average running speed was 103 ± 14 m·min−1 in field umpires and 134 ± 14 m·min−1 in boundary umpires. Field and boundary umpires covered 3,095 ± 752 m and 5,875 ± 1,590 m during HIA, respectively. In the first quarter, HIA distance (field: P = .004, η2 = 0.071; boundary: P < .001, η2 = 0.180) and average running speed (field: P = .002, η2 = 0.078; boundary: P < .001, η2 = 0.191) were significantly greater than in subsequent quarters.
Conclusions:
The results demonstrate that both AF field and boundary umpires complete running demands similar to those of elite AF players and are subject to physical fatigue. Further research is warranted to determine whether this physical fatigue affects the cognitive function of AF umpires during match play.
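The HIA classification used in this study reduces to a simple thresholding step over a GPS speed trace. A minimal sketch in Python (the 1-s sampling interval, function name, and example trace are illustrative assumptions, not details from the study):

```python
import numpy as np

def band_distances(speed_kmh, dt_s=1.0, hia_threshold_kmh=14.4):
    """Total distance and high-intensity-activity (HIA) distance from a
    GPS speed trace sampled every dt_s seconds."""
    v_kmh = np.asarray(speed_kmh, dtype=float)
    dist = v_kmh / 3.6 * dt_s          # metres covered per sample
    total = dist.sum()
    hia = dist[v_kmh > hia_threshold_kmh].sum()
    return total, hia

# Hypothetical 10-s trace (km/h): walking, striding, then a sprint.
trace = [5.2, 6.0, 9.8, 12.5, 15.1, 18.4, 20.2, 16.0, 11.3, 7.4]
total_m, hia_m = band_distances(trace)
print(f"total: {total_m:.0f} m, HIA: {hia_m:.0f} m")
```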
Robert J. Aughey
Global positioning system (GPS) technology was made possible after the invention of the atomic clock. The first suggestion that GPS could be used to assess the physical activity of humans followed some 40 y later. There was a rapid uptake of GPS technology, with the literature concentrating on validation studies and the measurement of steady-state movement. The first attempts to validate GPS for field-sport applications were made in 2006. While GPS has been validated for team-sport applications, some doubts remain about its appropriateness for measuring short, high-velocity movements. Nonetheless, GPS has been applied extensively in Australian football, cricket, hockey, rugby union and league, and soccer. The literature contains extensive GPS-derived information on the activity profile of field-sport athletes, including total distance covered by players and distance in velocity bands. GPS has also been applied to detect fatigue in matches, identify the most intense periods of play, and characterize activity profiles by position, competition level, and sport. More recent research has integrated GPS data with the physical capacity or fitness test scores of athletes, game-specific tasks, and tactical or strategic information. The future of GPS analysis will involve further miniaturization of devices, longer battery life, and integration of other inertial-sensor data to more effectively quantify the effort of athletes.
Robert J. Aughey
Background:
Australian football (AF) is a highly intermittent sport, requiring athletes to accelerate hundreds of times with repeated bouts of high-intensity running (HIR). Players aim to be in peak physical condition for finals, and anecdotal evidence suggests that these games are played at greater speed and with greater pressure.
Purpose:
However, no data exist on the running demands of finals games; therefore, the aim of this study was to compare the running demands of finals and regular-season games using matched players and opponents.
Methods:
Player movement was recorded by GPS at 5 Hz and expressed per period of the match (rotation) for total distance, high-intensity running (HIR; 4.17–10.00 m·s−1), and maximal accelerations (2.78–10.00 m·s−2). All data were compared between regular-season and finals games, and the magnitude of effects was analyzed with the effect-size (ES) statistic and expressed with confidence intervals.
Results:
Total distance (11%; ES: 0.78 ± 0.30), high-intensity running distance (9%; ES: 0.29 ± 0.25), and the number of maximal accelerations (97%; ES: 1.30 ± 0.20) all increased in finals games. The largest percentage increases in maximal accelerations occurred from commencement velocities of 3–4 m·s−1 (47%; ES: 0.56 ± 0.21) and 4–5 m·s−1 (51%; ES: 0.72 ± 0.26), and with <19 s between accelerations (53%; ES: 0.63 ± 0.27).
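The "ES ± half-width" reporting convention above is a standardized mean difference with a confidence-interval half-width. A minimal sketch of one common way to compute it, assuming a 90% interval and a large-sample standard error for d (the data and the specific CI method are assumptions; the paper's exact procedure is not stated in the abstract):

```python
import numpy as np

def cohens_d_with_ci(x1, x2, z=1.645):
    """Cohen's d with an approximate 90% CI half-width, using the
    Hedges-Olkin large-sample standard error of d."""
    x1, x2 = np.asarray(x1, dtype=float), np.asarray(x2, dtype=float)
    n1, n2 = len(x1), len(x2)
    sp = np.sqrt(((n1 - 1) * x1.var(ddof=1) + (n2 - 1) * x2.var(ddof=1))
                 / (n1 + n2 - 2))                  # pooled SD
    d = (x2.mean() - x1.mean()) / sp
    se = np.sqrt((n1 + n2) / (n1 * n2) + d**2 / (2 * (n1 + n2)))
    return d, z * se                               # d and CI half-width

# Hypothetical per-game total distances (m): regular season vs finals.
rng = np.random.default_rng(0)
regular = rng.normal(12000, 900, 40)
finals = rng.normal(12900, 900, 40)
d, half = cohens_d_with_ci(regular, finals)
print(f"ES: {d:.2f} ± {half:.2f}")
```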
Conclusion:
Elite AF players nearly double the number of maximal accelerations in finals compared with regular season games. This large increase is superimposed on requirements to cover a greater total distance and spend more time at high velocity during finals games. Players can be effectively conditioned to cope with these increased demands, even during a long competitive season.
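Counting maximal accelerations from a 5 Hz velocity trace, as analyzed in this study, reduces to differentiating velocity and grouping supra-threshold samples into efforts. A minimal sketch (the grouping rule, function name, and example trace are illustrative assumptions, not the authors' processing pipeline):

```python
import numpy as np

def count_max_accelerations(v_ms, hz=5, threshold=2.78):
    """Find maximal-acceleration efforts in a GPS velocity trace.

    Acceleration is the first difference of velocity times the sample
    rate. Consecutive supra-threshold samples are grouped into one
    effort; the commencement velocity and the time since the previous
    effort are recorded for each.
    """
    v = np.asarray(v_ms, dtype=float)
    acc = np.diff(v) * hz                    # m/s^2 at the given rate
    above = acc >= threshold
    efforts, prev_end, i = [], None, 0
    while i < len(above):
        if above[i]:
            start = i
            while i < len(above) and above[i]:
                i += 1
            gap_s = (start - prev_end) / hz if prev_end is not None else None
            efforts.append({"commencement_v": v[start], "gap_s": gap_s})
            prev_end = i
        else:
            i += 1
    return efforts

# Hypothetical 5 Hz velocity trace (m/s) with two hard accelerations.
trace = [3.0, 3.1, 3.2, 4.0, 4.8, 5.5, 5.6, 5.6, 5.5, 5.4,
         5.3, 5.2, 5.1, 5.0, 5.8, 6.6, 7.2, 7.3, 7.2, 7.0]
for effort in count_max_accelerations(trace):
    print(effort)
```

Binning each effort by its commencement velocity and inter-effort gap, as sketched above, is what allows comparisons like the 3–4 and 4–5 m·s−1 bands and the <19 s recovery window reported in the results.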