In an attempt to better identify and inform the energy requirements of elite soccer players, we quantified the energy expenditure (EE) of players from the English Premier League (n = 6) via the doubly labeled water (DLW) method over a 7-day in-season period. Energy intake (EI) was also assessed using food diaries, supported by the remote food photographic method and 24-hr recalls. The 7-day period consisted of 5 training days (TD) and 2 match days (MD). Although mean daily EI (3186 ± 367 kcal) was not different (p > .05) from daily EE (3566 ± 585 kcal), EI was greater (p < .05) on MD (3789 ± 532 kcal; 61.1 ± 11.4 kcal·kg-1 LBM) than on TD (2956 ± 374 kcal; 45.2 ± 9.3 kcal·kg-1 LBM). Differences in EI were reflective of greater (p < .05) daily CHO intake on MD (6.4 ± 2.2 g·kg-1) compared with TD (4.2 ± 1.4 g·kg-1). Exogenous CHO intake was also different (p < .01) during training sessions (3.1 ± 4.4 g·h-1) versus matches (32.3 ± 21.9 g·h-1). In contrast, daily protein (205 ± 30 g, p = .29) and fat (101 ± 20 g, p = .16) intakes did not display any evidence of daily periodization. Although players readily achieved current guidelines for daily protein and fat intake, data suggest that CHO intake on the day before and in recovery from match play was not in accordance with guidelines to promote muscle glycogen storage.
Liam Anderson, Patrick Orme, Robert J. Naughton, Graeme L. Close, Jordan Milsom, David Rydings, Andy O’Boyle, Rocco Di Michele, Julien Louis, Catherine Hambly, John Roger Speakman, Ryland Morgans, Barry Drust and James P. Morton
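The relative intakes reported above (e.g., 3789 kcal ≈ 61.1 kcal·kg-1 LBM on match days) are simply absolute intake divided by lean body mass. A minimal sketch of that arithmetic, using an assumed LBM of 62 kg rather than any value reported in the study:

```python
# Illustrative sketch: expressing absolute energy intake relative to lean
# body mass (LBM). The LBM below is a hypothetical assumption for a single
# player, not a figure taken from the study.

def relative_intake(kcal: float, lbm_kg: float) -> float:
    """Energy intake expressed per kilogram of lean body mass."""
    return kcal / lbm_kg

assumed_lbm = 62.0  # kg, hypothetical
print(round(relative_intake(3789, assumed_lbm), 1))  # ≈ 61.1 kcal·kg-1 LBM
```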
Luis Suarez-Arrones, Javier Núñez, Eduardo Sáez de Villareal, Javier Gálvez, Gabriel Suarez-Sanchez and Diego Munguía-Izquierdo
To describe the repeated-high-intensity activity and internal training load of rugby sevens players during international matches and to compare the differences between the 1st and 2nd halves.
Twelve international-level male rugby sevens players were monitored during international competitive matches (n = 30 match files) using global positioning system technology and heart-rate monitoring.
The relative total distance covered by the players throughout the match was 112.1 ± 8.4 m/min. As a percentage of total distance, 35.0% (39.2 ± 9.0 m/min) was covered at medium speed and 17.1% (19.2 ± 6.8 m/min) at high speed. A substantial decrease in the distance covered at >14.0 km/h and >18.0 km/h, the number of accelerations of >2.78 m/s2 and >4.0 m/s2, repeated-sprint sequences interspersed with ≤60 s rest, and repeated-acceleration sequences interspersed with ≤30 s or ≤60 s rest was observed in the 2nd half compared with the 1st half. A substantial increase in the mean heart rate (HR), HRmax, percentage of time at >80% HRmax and at >90% HRmax, and Edwards training load was observed in the 2nd half compared with the 1st half.
This study provides evidence of a pronounced reduction in high-intensity and repeated-high-intensity activities and an increase in internal training load in rugby sevens players during the 2nd half of international matches.
Kieran Cooke, Tom Outram, Raph Brandon, Mark Waldron, Will Vickery, James Keenan and Jamie Tallent
Purpose: First, to assess changes in neuromuscular function via alterations in countermovement-jump strategy after training and 2 forms of competition and, second, to compare the relationship between workloads and fatigue in seam bowlers and nonseam bowlers. Methods: Twenty-two professional cricketers’ neuromuscular function was assessed at baseline, immediately posttraining, and +24 h posttraining, and after multiday and 1-day cricket events. In addition, perceptual measures (rating of perceived exertion [RPE] and soreness) and external loads (PlayerLoad™, number of sprints, total distance, and overs) were monitored across all formats. Results: Seam bowlers covered more distance, completed more sprints, and had a higher RPE in training (P < .05), without any difference in soreness compared with nonseam bowlers. The nonseam bowlers’ peak force decreased at +24 h compared with baseline in 1-day cricket only (95% CI, 2.1–110.0 N; P < .04). There were no pre–post training or match differences in jump height or alterations in jump strategy (P > .05). Seam bowlers increased their peak jumping force from baseline to immediately posttraining or postgame (95% CI, 28.8–132.4 N; P < .01) but decreased from immediately postcricket to +24 h (95% CI, 48.9–148.0 N; P < .001). Conclusion: Seam bowlers were more accustomed to high workloads than nonseam bowlers and thus more fatigue resistant. Changes in jump height or strategy do not appear to be effective methods of assessing fatigue in professional cricketers. More common metrics, such as peak force, are more sensitive.
Liam Anderson, Patrick Orme, Rocco Di Michele, Graeme L. Close, Jordan Milsom, Ryland Morgans, Barry Drust and James P. Morton
To quantify the accumulative training and match load during an annual season in English Premier League soccer players classified as starters (n = 8, started ≥60% of games), fringe players (n = 7, started 30–60% of games) and nonstarters (n = 4, started <30% of games).
Players were monitored during all training sessions and games completed in the 2013–14 season with load quantified using global positioning system and Prozone technology, respectively.
When including both training and matches, total duration of activity (10,678 ± 916, 9955 ± 947, 10,136 ± 847 min; P = .50) and distance covered (816.2 ± 92.5, 733.8 ± 99.4, 691.2 ± 71.5 km; P = .16) were not different between starters, fringe players, and nonstarters, respectively. However, starters completed more (all P < .01) distance running at 14.4–19.8 km/h (91.8 ± 16.3 vs 58.0 ± 3.9 km; effect size [ES] = 2.5), high-speed running at 19.9–25.1 km/h (35.0 ± 8.2 vs 18.6 ± 4.3 km; ES = 2.3), and sprinting at >25.2 km/h (11.2 ± 4.2 vs 2.9 ± 1.2 km; ES = 2.3) than nonstarters. In addition, starters also completed more sprinting (P < .01, ES = 2.0) than fringe players, who accumulated 4.5 ± 1.8 km. Such differences in total high-intensity physical work done were reflective of differences in actual game time between playing groups as opposed to differences in high-intensity loading patterns during training sessions.
Unlike total seasonal volume of training (ie, total distance and duration), seasonal high-intensity loading patterns are dependent on players’ match starting status, thereby having potential implications for training program design.
Nicola Furlan, Mark Waldron, Mark Osborne and Adrian J. Gray
To assess the ecological validity of the Rugby Sevens Simulation Protocol (R7SP) and to evaluate its interday reliability.
Ten male participants (20 ± 2 y, 74 ± 11 kg) completed 2 trials of the R7SP, separated by 7 d. The R7SP comprised typical running and collision activities, based on data recorded during international rugby sevens match play. Heart rate (HR) was monitored continuously during the R7SP, and the participants’ movements were recorded through a 20-Hz global positioning system unit. Blood lactate and rating of perceived exertion were collected before and immediately after the 1st and 2nd halves of the R7SP.
The average activity profile was 117 ± 5 m/min, of which 27 ± 2 m/min was covered at high speed, with a calculated energetic demand of 1037 ± 581 J/kg, of which ~40% was expended at a rate above 19 W/kg. Mean HR was 88% ± 4% of maximal HR. Participants spent ~45% ± 27% of time above 90% of maximal HR (t >90%HRmax). There were no significant differences between trials, except for lactate between the halves of the R7SP. The majority of the measured variables demonstrated a between-trials coefficient of variation (CV%) lower than 5%. Blood lactate measurements (14–20% CV) and t >90%HRmax (26% CV) were less reliable variables. In most cases, the calculated moderate worthwhile change was higher than the CV%.
The R7SP replicates the activity profile and HR responses of rugby sevens match play. It is a reliable simulation protocol that can be used in a research environment to detect systematic worthwhile changes in selected performance variables.
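The between-trials coefficients of variation reported above can be obtained in more than one way; a common approach expresses the typical error (SD of trial-to-trial differences divided by √2) as a percentage of the grand mean. A hedged sketch of that calculation, using invented data rather than R7SP values:

```python
import statistics
from math import sqrt

# Sketch of one common between-trials CV% calculation: typical error
# (SD of differences / sqrt(2)) as a percentage of the grand mean.
# The trial values below are fabricated for illustration only.

def between_trials_cv(trial1, trial2):
    diffs = [b - a for a, b in zip(trial1, trial2)]
    typical_error = statistics.stdev(diffs) / sqrt(2)
    grand_mean = statistics.mean(trial1 + trial2)
    return 100 * typical_error / grand_mean

t1 = [115.0, 118.2, 112.9, 120.1]  # e.g. m/min in trial 1 (hypothetical)
t2 = [116.4, 117.0, 114.1, 121.5]  # same participants, trial 2
print(round(between_trials_cv(t1, t2), 2))
```

A CV% below the smallest change of interest is what allows a protocol such as the R7SP to be described as able to detect worthwhile changes.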
Paolo Menaspà, Franco M. Impellizzeri, Eric C. Haakonssen, David T. Martin and Chris R. Abbiss
To determine the consistency of commercially available devices used for measuring elevation gain in outdoor activities and sports.
Two separate observational validation studies were conducted. Garmin (Forerunner 310XT, Edge 500, Edge 750, and Edge 800; with and without elevation correction) and SRM (Power Control 7) devices were used to measure total elevation gain (TEG) over a 15.7-km mountain climb performed on 6 separate occasions (6 devices; study 1) and during a 138-km cycling event (164 devices; study 2).
TEG was significantly different between the Garmin and SRM devices (P < .05). The between-devices variability in TEG was lower when measured with the SRM than with the Garmin devices (study 1: 0.2% and 1.5%, respectively). The use of the Garmin elevation-correction option resulted in a 5–10% increase in the TEG.
While measurements of TEG were relatively consistent within each brand, the measurements differed between the SRM and Garmin devices by as much as 3%. Caution should be taken when comparing elevation-gain data recorded with different settings or with devices of different brands.
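The between-devices variability quoted above is a coefficient of variation of total elevation gain (TEG) across repeated measurements. A minimal sketch (SD/mean × 100), with invented TEG values rather than study data:

```python
import statistics

# Hedged sketch: between-devices variability as a simple coefficient of
# variation of total elevation gain (TEG). Values are hypothetical and
# chosen only to show a low-variability vs higher-variability device.

def cv_percent(values):
    return 100 * statistics.stdev(values) / statistics.mean(values)

srm_teg    = [1251, 1249, 1253, 1250, 1248, 1252]  # m, hypothetical repeats
garmin_teg = [1240, 1265, 1230, 1258, 1272, 1245]
print(round(cv_percent(srm_teg), 2), round(cv_percent(garmin_teg), 2))
```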
Cloe Cummins and Rhonda Orr
To investigate the impact forces of collision events during both attack and defense in elite rugby league match play and to compare the collision profiles between playing positions.
26 elite rugby league players.
Player collisions were recorded using an integrated accelerometer in global positioning system units (SPI-Pro X, GPSports). Impact forces of collisions in attack (hit-ups) and defense (tackles) were analyzed from 359 files from outside backs (n = 78), adjustables (n = 97), wide-running forwards (n = 136), and hit-up forwards (n = 48) over 1 National Rugby League season.
Hit-up forwards were involved in 0.8 collisions/min, significantly more than all other positional groups (wide-running forwards P = .050, adjustables P = .042, and outside backs P < .001). Outside backs experienced 25% fewer collisions per minute than hit-up forwards. Hit-up forwards experienced a collision within the 2 highest classifications of force (≥10 g) every 2.5 min of match play compared with 1 every 5 and 9 min for adjustables and outside backs, respectively. Hit-up forwards performed 0.5 tackles per minute of match play, 5 times that of outside backs (ES = 1.90; 95% CI, 0.26–3.16), and 0.2 hit-ups per minute of match play, twice as many as adjustables.
During a rugby league match, players are exposed to a significant number of collision events. Positional differences exist, with hit-up and wide-running forwards experiencing greater collision events than adjustables and outside backs. Although these results may be unique to the individual team’s defensive- and attacking-play strategies, they are indicative of the significant collision profiles in professional rugby league.
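The effect sizes reported above (e.g., ES = 1.90 for tackles per minute) are standardized mean differences. A sketch of the pooled-SD form (Cohen's d), computed on two fabricated samples for demonstration only:

```python
import statistics

# Illustrative Cohen's d: mean difference divided by the pooled SD.
# The tackle-rate samples below are invented, not study data.

def cohens_d(group_a, group_b):
    na, nb = len(group_a), len(group_b)
    va, vb = statistics.variance(group_a), statistics.variance(group_b)
    pooled_sd = (((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)) ** 0.5
    return (statistics.mean(group_a) - statistics.mean(group_b)) / pooled_sd

hit_up = [0.52, 0.48, 0.55, 0.45]  # tackles/min, hypothetical
backs  = [0.44, 0.36, 0.45, 0.35]
print(round(cohens_d(hit_up, backs), 2))
```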
Ademir F.S. Arruda, Christopher Carling, Vinicius Zanetti, Marcelo S. Aoki, Aaron J. Coutts and Alexandre Moreira
To analyze the effects of a very congested match schedule on the total distance (TD) covered, high-intensity-running (HIR) distance, and frequency of accelerations and body-load impacts (BLIs) performed in a team of under-15 soccer players (N = 10; 15.1 ± 0.2 y, 171.8 ± 4.7 cm, 61 ± 6.0 kg) during an international youth competition.
Using global positioning systems, player performances were repeatedly monitored in 5 matches performed over 3 successive days.
Significant differences were observed between matches (P < .05) for the frequency of accelerations per minute, BLIs, and BLIs per minute. No differences were observed for the TD covered, TD covered per minute, number of high-intensity runs, distance covered in HIR, peak running speed attained, or frequency of accelerations. The frequency of accelerations per minute decreased across the competition, while BLIs were higher during the final than in all other matches.
These results suggest that BLIs and acceleration might be used as an alternative means to represent the external load during congested match schedules rather than measures related to running speed or distance covered.
Nicola Marsh, Nick Dobbin, Craig Twist and Chris Curtis
This study assessed the energy intake and expenditure of international female touch players during an international tournament. Energy intake (food diary) and expenditure (accelerometer, global positioning system) were recorded for 16 female touch players during a four-day tournament, in which they competed in 8.0 ± 1.0 matches: two on Days 1, 2, and 4, and three on Day 3. Total daily energy expenditure (43.6 ± 3.1 kcal·kg-1 body mass (BM)) was not different (p > .05) from energy intake (39.9 ± 9.4 kcal·kg-1 BM). Carbohydrate intakes were below current recommendations (6–10 g·kg-1 BM) on Days 1 (4.4 ± 0.6 g·kg-1 BM) and 3 (4.7 ± 1.0 g·kg-1 BM) and significantly below (p < .05) on Day 2 (4.1 ± 1.0 g·kg-1 BM). Protein and fat intakes were consistent with recommendations (protein, 1.2–2.0 g·kg-1 BM; fat, 20–35% of total kcal) across Days 1–3 (protein, 1.9 ± 0.8, 2.2 ± 0.8, and 2.0 ± 0.7 g·kg-1 BM; fat, 35.6 ± 6.8, 38.5 ± 6.4, and 35.9 ± 5.4% of total kcal). Saturated fat intakes were greater (p < .05) than recommendations (10% of total kcal) on Days 1–3 (12.4 ± 2.9, 14.2 ± 5.1, and 12.7 ± 3.5% of total kcal). On average, female touch players maintained energy balance. Carbohydrate intakes appeared insufficient and might have contributed to the reduction (p < .05) in high-intensity running on Day 3. Further research might investigate the applicability of current nutrition recommendations and the role of carbohydrate in multimatch, multiday tournaments.
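Checking a daily carbohydrate intake against the 6–10 g·kg-1 BM recommendation cited above is a simple per-kilogram calculation. A sketch using a hypothetical intake and body mass, not values from the study:

```python
# Illustrative check of daily carbohydrate (CHO) intake against the
# 6-10 g per kg body mass recommendation. Intake and BM are hypothetical.

def cho_per_kg(cho_g: float, body_mass_kg: float) -> float:
    return cho_g / body_mass_kg

def meets_recommendation(rel_cho: float, low: float = 6.0, high: float = 10.0) -> bool:
    return low <= rel_cho <= high

rel = cho_per_kg(cho_g=280, body_mass_kg=65)  # ≈ 4.3 g·kg-1 BM, hypothetical
print(round(rel, 1), meets_recommendation(rel))
```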
Matthias W. Hoppe, Christian Baumgart and Jürgen Freiwald
To investigate differences in running activities between adolescent and adult tennis players during match play. Differences between winning and losing players within each age group were also examined.
Forty well-trained male players (20 adolescents, 13 ± 1 y; 20 adults, 25 ± 4 y) played a simulated singles match against an opponent of similar age and ability. Running activities were assessed using portable devices that sampled global positioning system (10 Hz) and inertial-sensor (accelerometer, gyroscope, and magnetometer; 100 Hz) data. Recorded data were examined in terms of velocity, acceleration, deceleration, metabolic power, PlayerLoad, and number of accelerations toward the net and the forehand and backhand corners.
Adult players spent more time at high velocity (≥4 m/s), acceleration (≥4 m/s2), deceleration (≤–4 m/s2), and metabolic power (≥20 W/kg) (P ≤ .009, ES = 0.9–1.5) and performed more accelerations (≥2 m/s2) toward the backhand corner (P < .001, ES = 2.6–2.7). No differences between adolescent winning and losing players were evident overall (P ≥ .198, ES = 0.0–0.6). Adult winning players performed more accelerations (2 to <4 m/s2) toward the forehand corner (P = .026, ES = 1.2), whereas adult losing players completed more accelerations (≥2 m/s2) toward the backhand corner (P ≤ .042, ES = 0.9).
This study shows that differences in running activities between adolescent and adult tennis players exist in high-intensity measures during simulated match play. Furthermore, differences between adolescent and adult players, and also between adult winning and losing players, are present in terms of movement directions. Our findings may be helpful for coaches to design different training drills for both age groups of players.