In an attempt to better identify and inform the energy requirements of elite soccer players, we quantified the energy expenditure (EE) of players from the English Premier League (n = 6) via the doubly labeled water (DLW) method over a 7-day in-season period. Energy intake (EI) was also assessed using food diaries, supported by the remote food photographic method and 24-hr recalls. The 7-day period consisted of 5 training days (TD) and 2 match days (MD). Although mean daily EI (3186 ± 367 kcal) was not different (p > .05) from daily EE (3566 ± 585 kcal), EI was greater (p < .05) on MD (3789 ± 532 kcal; 61.1 ± 11.4 kcal.kg-1 LBM) compared with TD (2956 ± 374 kcal; 45.2 ± 9.3 kcal.kg-1 LBM). Differences in EI were reflective of greater (p < .05) daily CHO intake on MD (6.4 ± 2.2 g.kg-1) compared with TD (4.2 ± 1.4 g.kg-1). Exogenous CHO intake was also different (p < .01) during training sessions (3.1 ± 4.4 g.h-1) versus matches (32.3 ± 21.9 g.h-1). In contrast, daily protein (205 ± 30 g, p = .29) and fat (101 ± 20 g, p = .16) intakes did not display any evidence of daily periodization. Although players readily achieved current guidelines for daily protein and fat intake, the data suggest that CHO intake on the day before and in recovery from match play was not in accordance with guidelines to promote muscle glycogen storage.
Liam Anderson, Patrick Orme, Robert J. Naughton, Graeme L. Close, Jordan Milsom, David Rydings, Andy O’Boyle, Rocco Di Michele, Julien Louis, Catherine Hambly, John Roger Speakman, Ryland Morgans, Barry Drust and James P. Morton
Nicola Furlan, Mark Waldron, Mark Osborne and Adrian J. Gray
To assess the ecological validity of the Rugby Sevens Simulation Protocol (R7SP) and to evaluate its interday reliability.
Ten male participants (20 ± 2 y, 74 ± 11 kg) completed 2 trials of the R7SP, separated by 7 d. The R7SP comprised typical running and collision activities, based on data recorded during international rugby sevens match play. Heart rate (HR) was monitored continuously during the R7SP, and the participants’ movements were recorded through a 20-Hz global positioning system unit. Blood lactate and rating of perceived exertion were collected before and immediately after the 1st and 2nd halves of the R7SP.
The average activity profile was 117 ± 5 m/min, of which 27 ± 2 m/min was covered at high speed, with a calculated energetic demand of 1037 ± 581 J/kg, of which ~40% was expended at a rate above 19 W/kg. Mean HR was 88% ± 4% of maximal HR. Participants spent ~45% ± 27% of time above 90% of maximal HR (t >90%HRmax). There were no significant differences between trials, except for lactate between the halves of the R7SP. The majority of the measured variables demonstrated a between-trials coefficient of variation (CV%) lower than 5%. Blood lactate measurements (14–20% CV) and t >90%HRmax (26% CV) were less reliable variables. In most cases, the calculated moderate worthwhile change was higher than the CV%.
The R7SP replicates the activity profile and HR responses of rugby sevens match play. It is a reliable simulation protocol that can be used in a research environment to detect systematic worthwhile changes in selected performance variables.
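As a point of reference for the reliability analysis described above, a between-trials coefficient of variation (CV%) is typically derived from the typical error of paired test-retest measurements (Hopkins' method). A minimal sketch, using illustrative placeholder values rather than the study's data:

```python
import math
import statistics

# Hypothetical activity-profile values (m/min) for 5 participants across
# 2 trials of a simulation protocol; NOT the study's data.
trial1 = [115, 120, 118, 112, 119]
trial2 = [117, 118, 121, 110, 120]

# Typical error = SD of the between-trial differences divided by sqrt(2).
diffs = [b - a for a, b in zip(trial1, trial2)]
typical_error = statistics.stdev(diffs) / math.sqrt(2)

# Express the typical error as a percentage of the grand mean to get CV%.
grand_mean = statistics.mean(trial1 + trial2)
cv_percent = 100 * typical_error / grand_mean
```

A CV% below the smallest change considered practically meaningful is what allows a protocol such as the R7SP to detect systematic worthwhile changes.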
Matthias W. Hoppe, Christian Baumgart, Jutta Bornefeld, Billy Sperlich, Jürgen Freiwald and Hans-Christer Holmberg
The aims of this study were (1) to assess the running activities of adolescent tennis players during match play with respect to velocity, acceleration, and deceleration; (2) to characterize changes in these activities during the course of a match; and (3) to identify potential differences between winners and losers. Twenty well-trained adolescent male athletes (13 ± 1 y) played one simulated match each (giving a total of 10 matches), during which distances covered at different velocity categories (0 to < 1, 1 to < 2, 2 to < 3, 3 to < 4, and ≥ 4 m·s−1) and number of running activities involving high velocity (≥ 3 m·s−1), acceleration (≥ 2 m·s−2), and deceleration (≤ −2 m·s−2) were monitored using a global positioning system (10 Hz). Heart rate was also assessed. The total match time, total distance covered, peak velocity, and mean heart rate were 81.2 ± 14.6 min, 3362 ± 869 m, 4.4 ± 0.8 m·s−1, and 159 ± 12 beats·min−1, respectively. Running activities involving high acceleration (0.6 ± 0.2 n·min−1) or deceleration (0.6 ± 0.2 n·min−1) were three times as frequent as those involving high velocity (0.2 ± 0.1 n·min−1). No change in the pattern of running activities (P ≥ .13, d ≤ 0.39) and no differences between winners and losers (P ≥ .22, d ≤ 0.53) were evident during match play. We conclude that training of well-trained adolescent male tennis players need not focus on further development of their running abilities, since this physical component of multifactorial tennis performance does not change during the course of a match and does not differ between winners and losers.
Jamie Highton, Thomas Mullen, Jonathan Norris, Chelsea Oxendale and Craig Twist
The aim of this study was to examine the validity of energy expenditure derived from microtechnology when measured during a repeated-effort rugby protocol. Sixteen male rugby players completed a repeated-effort protocol comprising 3 sets of 6 collisions, during which movement activity and energy expenditure (EEGPS) were measured using microtechnology. In addition, energy expenditure was estimated from open-circuit spirometry (EEVO2). While related (r = .63, 90% CI .08–.89), there was a systematic underestimation of energy expenditure during the protocol (–5.94 ± 0.67 kcal/min) for EEGPS (7.2 ± 1.0 kcal/min) compared with EEVO2 (13.2 ± 2.3 kcal/min). High-speed-running distance (r = .50, 95% CI –.66 to .84) was related to EEVO2, while PlayerLoad was not (r = .37, 95% CI –.81 to .68). While metabolic power might provide a different measure of external load than other typically used microtechnology metrics (eg, high-speed running, PlayerLoad), it underestimates energy expenditure during intermittent team sports that involve collisions.
Nicola Marsh, Nick Dobbin, Craig Twist and Chris Curtis
This study assessed energy intake and expenditure of international female touch players during an international tournament. Energy intake (food diary) and expenditure (accelerometer, global positioning system) were recorded for 16 female touch players during a four-day tournament, competing in 8.0 ± 1.0 matches; two on Days 1, 2, and 4, and three on Day 3. Total daily energy expenditure (43.6 ± 3.1 kcal·kg-1 body mass (BM)) was not different (p > .05) from energy intake (39.9 ± 9.4 kcal·kg-1 BM). Carbohydrate intakes were below current recommendations (6–10 g·kg-1 BM) on Days 1 (4.4 ± 0.6 g·kg-1 BM) and 3 (4.7 ± 1.0 g·kg-1 BM) and significantly below (p < .05) on Day 2 (4.1 ± 1.0 g·kg-1 BM). Protein and fat intakes were consistent with recommendations (protein, 1.2–2.0 g·kg-1 BM; fat, 20–35% total kcal) across Days 1–3 (protein, 1.9 ± 0.8, 2.2 ± 0.8, and 2.0 ± 0.7 g·kg-1 BM; fat, 35.6 ± 6.8, 38.5 ± 6.4, and 35.9 ± 5.4% total kcal). Saturated fat intakes were greater (p < .05) than recommendations (10% total kcal) on Days 1–3 (12.4 ± 2.9, 14.2 ± 5.1, and 12.7 ± 3.5% total kcal). On average, female touch players maintained energy balance. Carbohydrate intakes appeared insufficient and might have contributed to the reduction (p < .05) in high-intensity running on Day 3. Further research might investigate the applicability of current nutrition recommendations and the role of carbohydrate in multimatch, multiday tournaments.
Dean J. McNamara, Tim J. Gabbett, Geraldine Naughton, Patrick Farhart and Paul Chapman
This study investigated key fatigue and workload variables of cricket fast bowlers and nonfast bowlers during a 7-wk physical-preparation period and 10-d intensified competition period.
Twenty-six elite junior cricketers (mean ± SD age 17.7 ± 1.1 y) were classified as fast bowlers (n = 9) or nonfast bowlers (n = 17). Individual workloads were measured via global positioning system technology, and neuromuscular function (countermovement jump [relative power and flight time]), endocrine (salivary testosterone and cortisol concentrations), and perceptual well-being (soreness, mood, stress, sleep quality, and fatigue) markers were recorded.
Fast bowlers performed greater competition total distance (median [interquartile range] 7049 m vs 5062 m), including greater distances at low and high speeds, and more accelerations (40 vs 19) and had a higher player load (912 arbitrary units vs 697 arbitrary units) than nonfast bowlers. Cortisol concentrations were higher in the physical-preparation (mean ± 90% confidence intervals, % likelihood; d = –0.88 ± 0.39, 100%) and competition phases (d = –0.39 ± 0.30, 85%), and testosterone concentrations were lower (d = 0.56 ± 0.29, 98%) in the competition phase in fast bowlers. Perceptual well-being was poorer in nonfast bowlers during competition only (d = 0.36 ± 0.22, 88%). Differences in neuromuscular function between groups were unclear during physical preparation and competition.
These findings demonstrate differences in the physical demands of cricket fast bowlers and nonfast bowlers and suggest that these external workloads differentially affect the neuromuscular, endocrine, and perceptual fatigue responses of these players.
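The group comparisons above are reported as Cohen's d effect sizes. A minimal sketch of how a standardized mean difference of this kind is computed from two groups' raw values, using a pooled SD; the data are illustrative placeholders, not the study's measurements:

```python
import math
import statistics

def cohens_d(group_a, group_b):
    """Standardized mean difference (Cohen's d) using the pooled SD."""
    n_a, n_b = len(group_a), len(group_b)
    var_a = statistics.variance(group_a)  # sample variance (n - 1 denominator)
    var_b = statistics.variance(group_b)
    pooled_sd = math.sqrt(
        ((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)
    )
    return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled_sd

# Hypothetical hormone-like values (arbitrary units) for two player groups.
fast = [5.1, 5.8, 6.0, 5.5, 6.2]
non_fast = [4.4, 4.9, 4.6, 5.0, 4.7]
d = cohens_d(fast, non_fast)
```

By convention, |d| values around 0.2, 0.5, and 0.8 are read as small, moderate, and large effects, which is how magnitudes such as d = –0.88 above are interpreted.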
Dean J. McNamara, Tim J. Gabbett, Paul Chapman, Geraldine Naughton and Patrick Farhart
Bowling workload is linked to injury risk in cricket fast bowlers. This study investigated the validity of microtechnology in the automated detection of bowling counts and events, including run-up distance and velocity, in cricket fast bowlers.
Twelve highly skilled fast bowlers (mean ± SD age 23.5 ± 3.7 y) performed a series of bowling, throwing, and fielding activities in an outdoor environment during training and competition while wearing a microtechnology unit (MinimaxX). Sensitivity and specificity of a bowling-detection algorithm were determined by comparing the outputs from the device with manually recorded bowling counts. Run-up distance and run-up velocity were measured and compared with microtechnology outputs.
No significant differences were observed between direct measures of bowling and nonbowling events and true positive and true negative events recorded by the MinimaxX unit (P = .34, r = .99). The bowling-detection algorithm was shown to be sensitive in both training (99.0%) and competition (99.5%). Specificity was 98.1% during training and 74.0% during competition. Run-up distance was accurately recorded by the unit, with a percentage bias of 0.8% (r = .90). The final 10-m (–8.9%, r = .88) and 5-m (–7.3%, r = .90) run-up velocities were less accurate.
The bowling-detection algorithm from the MinimaxX device is sensitive to detect bowling counts in both cricket training and competition. Although specificity is high during training, the number of false positive events increased during competition. Additional bowling workload measures require further development.
Matthias W. Hoppe, Christian Baumgart and Jürgen Freiwald
To investigate differences in running activities between adolescent and adult tennis players during match play. Differences between winning and losing players within each age group were also examined.
Forty well-trained male players (20 adolescents, 13 ± 1 y; 20 adults, 25 ± 4 y) played a simulated singles match against an opponent of similar age and ability. Running activities were assessed using portable devices that sampled global positioning system (10 Hz) and inertial-sensor (accelerometer, gyroscope, and magnetometer; 100 Hz) data. Recorded data were examined in terms of velocity, acceleration, deceleration, metabolic power, PlayerLoad, and number of accelerations toward the net and the forehand and backhand corners.
Adult players spent more time at high velocity (≥4 m/s), acceleration (≥4 m/s2), deceleration (≤–4 m/s2), and metabolic power (≥20 W/kg) (P ≤ .009, ES = 0.9–1.5) and performed more accelerations (≥2 m/s2) toward the backhand corner (P < .001, ES = 2.6–2.7). No differences between adolescent winning and losing players were evident overall (P ≥ .198, ES = 0.0–0.6). Adult winning players performed more accelerations (2 to <4 m/s2) toward the forehand corner (P = .026, ES = 1.2), whereas adult losing players completed more accelerations (≥2 m/s2) toward the backhand corner (P ≤ .042, ES = 0.9).
This study shows that differences in running activities between adolescent and adult tennis players exist in high-intensity measures during simulated match play. Furthermore, differences between adolescent and adult players, and also between adult winning and losing players, are present in terms of movement directions. Our findings may help coaches design different training drills for the two age groups of players.
Cloe Cummins and Rhonda Orr
To investigate the impact forces of collision events during both attack and defense in elite rugby league match play and to compare the collision profiles between playing positions.
26 elite rugby league players.
Player collisions were recorded using an integrated accelerometer in global positioning system units (SPI-Pro X, GPSports). Impact forces of collisions in attack (hit-ups) and defense (tackles) were analyzed from 359 files from outside backs (n = 78), adjustables (n = 97), wide-running forwards (n = 136), and hit-up forwards (n = 48) over 1 National Rugby League season.
Hit-up forwards were involved in 0.8 collisions/min, significantly more than all other positional groups (wide-running forwards P = .050, adjustables P = .042, and outside backs P < .001). Outside backs experienced 25% fewer collisions per minute than hit-up forwards. Hit-up forwards experienced a collision within the 2 highest classifications of force (≥10 g) every 2.5 min of match play, compared with 1 every 5 and 9 min for adjustables and outside backs, respectively. Hit-up forwards performed 0.5 tackles per minute of match play, 5 times that of outside backs (ES = 1.90; 95% CI [0.26, 3.16]), and 0.2 hit-ups per minute of match play, twice as many as adjustables.
During a rugby league match, players are exposed to a significant number of collision events. Positional differences exist, with hit-up and wide-running forwards experiencing greater collision events than adjustables and outside backs. Although these results may be unique to the individual team’s defensive- and attacking-play strategies, they are indicative of the significant collision profiles in professional rugby league.
Paolo Menaspà, Franco M. Impellizzeri, Eric C. Haakonssen, David T. Martin and Chris R. Abbiss
To determine the consistency of commercially available devices used for measuring elevation gain in outdoor activities and sports.
Two separate observational validation studies were conducted. Garmin (Forerunner 310XT, Edge 500, Edge 750, and Edge 800; with and without elevation correction) and SRM (Power Control 7) devices were used to measure total elevation gain (TEG) over a 15.7-km mountain climb performed on 6 separate occasions (6 devices; study 1) and during a 138-km cycling event (164 devices; study 2).
TEG was significantly different between the Garmin and SRM devices (P < .05). The between-devices variability in TEG was lower when measured with the SRM than with the Garmin devices (study 1: 0.2% and 1.5%, respectively). The use of the Garmin elevation-correction option resulted in a 5–10% increase in the TEG.
While measurements of TEG were relatively consistent within each brand, the measurements differed between the SRM and Garmin devices by as much as 3%. Caution should be taken when comparing elevation-gain data recorded with different settings or with devices of different brands.
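The device comparisons above rest on two simple quantities: a percentage bias of each device against a reference value and a between-devices coefficient of variation. A minimal sketch of both calculations, using hypothetical total-elevation-gain readings rather than the study's data:

```python
import statistics

# Hypothetical total elevation gain (m) recorded by 4 units of one brand
# over the same climb, against a surveyed reference value; NOT study data.
reference_teg = 1200.0
device_tegs = [1210.0, 1195.0, 1215.0, 1208.0]

# Percentage bias of each device relative to the reference, then the mean.
biases = [100 * (teg - reference_teg) / reference_teg for teg in device_tegs]
mean_bias = statistics.mean(biases)

# Between-devices variability expressed as a coefficient of variation (%).
cv_between = 100 * statistics.stdev(device_tegs) / statistics.mean(device_tegs)
```

A small between-devices CV% with a nonzero mean bias matches the pattern reported above: readings consistent within a brand yet offset from another brand's.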