Daniele Conte, Nicholas Kolb, Aaron T. Scanlan, and Fabrizio Santolamazza
Purpose: To characterize the weekly training load (TL) and well-being of college basketball players during the in-season phase. Methods: Ten male basketball players (6 guards and 4 forwards; age 20.9 [0.9] y, stature 195.0 [8.2] cm, and body mass 91.3 [11.3] kg) from the same Division I National Collegiate Athletic Association team were recruited to participate in this study. Individualized training and game loads were assessed using the session rating of perceived exertion at the end of each training and game session, and well-being status was collected before each session. Weekly changes (%) in TL, acute-to-chronic workload ratio, and well-being were determined. Differences in TL and well-being between starting and bench players and between 1-game and 2-game weeks were calculated using magnitude-based statistics. Results: Total weekly TL and acute-to-chronic workload ratio demonstrated high week-to-week variation, with spikes up to 226% and 220%, respectively. Starting players experienced a higher (most likely negative) total weekly TL and similar (unclear) well-being status compared with bench players. Game scheduling influenced TL, with 1-game weeks demonstrating a higher (likely negative) total weekly TL and similar (most likely trivial) well-being status compared with 2-game weeks. Conclusions: These findings provide college basketball coaches with information to optimize training strategies during the in-season phase. Basketball coaches should concurrently consider the number of weekly games and player status (starting vs bench player) when creating individualized periodization plans, with increases in TL potentially needed in bench players, especially in 2-game weeks.
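The load metrics in the abstract above follow standard session-RPE arithmetic. A minimal sketch — not the authors' code; the function names, the 4-week rolling chronic window, and all example numbers are illustrative assumptions:

```python
# Minimal sketch of the session-RPE load arithmetic described above.
# Assumptions (not from the study): a 4-week rolling chronic window
# and made-up example loads.

def session_load(rpe, duration_min):
    """Session load (arbitrary units) = session RPE x duration (min)."""
    return rpe * duration_min

def weekly_tl(session_loads):
    """Total weekly training load: sum of all session loads in the week."""
    return sum(session_loads)

def acwr(weekly_loads):
    """Acute-to-chronic workload ratio for the most recent week.

    Acute load = latest weekly total; chronic load = rolling mean of
    the last 4 weekly totals (a common convention, assumed here).
    """
    window = weekly_loads[-4:]
    return weekly_loads[-1] / (sum(window) / len(window))

# Four hypothetical in-season weeks of total weekly TL (arbitrary units)
weeks = [2400, 2600, 2200, 3100]
print(round(acwr(weeks), 2))  # prints 1.2
```

A spike in the final week's total load relative to the 4-week mean shows up directly as an ACWR above 1.0.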
Alberto Franceschi, Daniele Conte, Marco Airale, and Jaime Sampaio
Purpose: To describe and identify individual trends and changes in training load, neuromuscular readiness, and perceptual fatigue measures in 2 youth elite long jumpers, finalists at the European Athletics U18 (Under 18) Championships (athlete A, female, age 16.5 y, long-jump record 6.25 m; athlete B, male, age 16.0 y, long-jump record 7.28 m). Methods: Data were collected from both training sessions and athletics competitions during a 16-week period, divided into a preparation (weeks 1–8) and a competitive phase (weeks 9–16). Training load was computed through training diaries (training time, sprint, jumping, and weights volume). The countermovement jump and the 10-to-5 repeated-jump test were executed on a weekly basis to assess neuromuscular readiness, and perceptual fatigue measures were collected through a wellness questionnaire. Statistical analysis was conducted using a magnitude-based decisions approach. Results: The results highlighted a decrease in training load during the competitive period with moderate to large differences for training time, sprint, and jump volume. Moreover, data showed an upward trend and very likely higher scores in vertical-jump performance across the competitive phase together with a very likely lower perceptual fatigue. Conclusions: This scenario seemed to be favorable to achieve competition performance very close to the personal record during the competitive season. This study provided an example of application of a comprehensive monitoring system with young athletes involved in track-and-field jumping events.
Montassar Tabben, Daniele Conte, Monoem Haddad, and Karim Chamari
Purpose: To assess the technical and tactical demands of elite karate athletes in relation to 3 match sequences (ie, advantage, disadvantage, and drawing) and match outcome (ie, win/defeat).Methods: One hundred twenty elite seniors’ (60 men and 60 women) World Karate Federation combats were analyzed during 2 World Championships (2012 and 2014). Specific karate attributes (strategy, technique, tactic, target, and effectiveness) were evaluated and classified into 3 sequences: advantage, disadvantage, and drawing. Results: Karatekas performed more combination techniques in disadvantage sequences than in drawing sequences (P = .011). A higher number of timed-attack actions were reported during advantage sequences than during drawing sequences (P = .048). Winners of the whole combat had higher lower-limb technique rate (1.0 [0.9] vs 0.1 [0.3]; P = .044) and less rate of timed attack (0.3 [0.5] vs 0.6 [1.0]; P = .030) than defeated karatekas during advantage and drawing sequences, respectively. Conclusions: Winners used higher lower-limb technique and less timed-attack rates than defeated karatekas in advantage and drawing sequences, respectively. Indeed, using lower-limb technique during advantageous situations could be a powerful strategy to increase the lead. Therefore, it seems fundamental for coaches of top elite karatekas to put their athletes in simulated situations and push them to increase their use of lower-limb techniques.
Henrikas Paulauskas, Rasa Kreivyte, Aaron T. Scanlan, Alexandre Moreira, Laimonas Siupsinskas, and Daniele Conte
Purpose: To assess the weekly fluctuations in workload and differences in workload according to playing time in elite female basketball players. Methods: A total of 29 female basketball players (mean [SD] age 21 y, stature 181 cm, body mass 71 kg, playing experience 12 y) belonging to the 7 women’s basketball teams competing in the first-division Lithuanian Women’s Basketball League were recruited. Individualized training loads (TLs) and game loads (GLs) were assessed using the session rating of perceived exertion after each training session and game during the entire in-season phase (24 wk). Percentage changes in total weekly TL (weekly TL + GL), weekly TL, weekly GL, chronic workload, acute:chronic workload ratio, training monotony, and training strain were calculated. Mixed linear models were used to assess differences for each dependent variable, with playing time (low vs high) used as fixed factor and subject, week, and team as random factors. Results: The highest changes in total weekly TL, weekly TL, and acute:chronic workload ratio were evident in week 13 (47%, 120%, and 49%, respectively). Chronic workload showed weekly changes ≤10%, whereas monotony and training strain registered highest fluctuations in weeks 17 (34%) and 15 (59%), respectively. A statistically significant difference in GL was evident between players completing low and high playing times (P = .026, moderate), whereas no significant differences (P > .05) were found for all other dependent variables. Conclusions: Coaches of elite women’s basketball teams should monitor weekly changes in workload during the in-season phase to identify weeks that may predispose players to unwanted spikes and adjust player workload according to playing time.
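Training monotony and strain, reported in the abstract above, are conventionally derived from daily session-RPE loads following Foster's approach. A small sketch, assuming population SD in the denominator and made-up daily loads — not the study's data:

```python
# Sketch of training monotony and strain from one week of daily
# session-RPE loads, following Foster's widely used definitions.
# Assumptions (not from the study): population SD, rest days entered
# as 0, and hypothetical example loads.
from statistics import mean, pstdev

def monotony(daily_loads):
    """Training monotony = mean daily load / SD of daily load."""
    return mean(daily_loads) / pstdev(daily_loads)

def strain(daily_loads):
    """Training strain = total weekly load x monotony."""
    return sum(daily_loads) * monotony(daily_loads)

# Seven hypothetical daily loads (arbitrary units); rest days count as 0
week = [400, 550, 0, 600, 500, 450, 0]
print(round(monotony(week), 2))  # prints 1.53
print(round(strain(week)))       # prints 3821
```

Because rest days enter the SD, an evenly loaded week with no rest inflates monotony (and hence strain) even at the same total load, which is why both metrics fluctuate independently of weekly TL.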
Davide Ferioli, Aaron T. Scanlan, Daniele Conte, Emanuele Tibiletti, and Ermanno Rampinini
Purpose: To quantify and compare the internal workloads experienced during the playoffs and regular season in basketball. Methods: A total of 10 professional male basketball players competing in the Italian first division were monitored during the final 6 weeks of the regular season and the entire 6-week playoff phase. Internal workload was quantified using the session rating of perceived exertion (s-RPE) method for all training sessions and games. A 2-way repeated-measures analysis of variance (day type × period) was utilized to assess differences in daily s-RPE between game days, days within 24 hours of games, and days >24 hours from games during the playoffs and regular season. Comparisons in weekly training, game, and total workloads were made between the playoffs and regular season using paired t tests and effect sizes. Results: A significant interaction between day and competitive period for s-RPE was found (P = .003, moderate). Lower s-RPE was apparent during playoff and regular-season days within 24 hours of games than all other days (P < .001, very large). Furthermore, s-RPE across days >24 hours from playoff games was different than all other days (P ≤ .01, moderate–very large). Weekly training (P = .009, very large) and total (P < .001, moderate) s-RPE were greater during the regular season than playoffs, whereas weekly game s-RPE was greater during the playoffs than the regular season (P < .001, very large). Conclusions: This study presents an exploratory investigation of internal workload during the playoffs in professional basketball. Players experienced greater training and total weekly workloads during the regular season than during the playoffs with similar daily game workloads between periods.
Davide Ferioli, Diego Rucco, Ermanno Rampinini, Antonio La Torre, Marco M. Manfredi, and Daniele Conte
Purpose: To examine the physiological, physical, and technical demands of game-based drills (GBDs) with regular dribble (RD) or no dribble (ND) involving a different number of players (3 vs 3, 4 vs 4, and 5 vs 5). Methods: Ten regional-level male basketball players performed 6 full-court GBD formats (each consisting of 3 bouts of 4 min and 2 min rest) on multiple occasions. The physiological and perceptual responses were measured through heart rate and rating of perceived exertion. Video-based time–motion analysis was performed to assess the GBD physical demands. The frequencies of occurrence and the duration were calculated for high-intensity, moderate-intensity, low-intensity, and recovery activities. Technical demands were assessed with a notational-analysis technique. A 2-way repeated-measures analysis of variance was used to assess statistical differences between GBD formats. Results: A greater perceptual response (rating of perceived exertion) was recorded during 3 versus 3 than 5 versus 5 formats (P = .005). Significant interactions were observed for the number of recovery activities (P = .021), low-intensity activities (P = .007), and all movements (P = .001) completed. Greater time was spent performing low-intensity and high-intensity activities during RD than ND format. Greater technical demands were observed for several variables during 3 versus 3 than 4 versus 4 or 5 versus 5. A greater number of turnovers (P = .027), total (P ≤ .001), and correct passes (P ≤ .001) were recorded during ND than RD format. Conclusions: The number of players predominantly affected the perceptual response to GBD, while both the number of players and rule modification (RD vs ND) affected activities performed during GBD. Reducing the number of players increases the GBD technical elements, while ND format promotes a greater number of turnovers and passes.
Oliver O. Badin, Mitchell R. Smith, Daniele Conte, and Aaron J. Coutts
Purpose: To assess the effects of mental fatigue on physical and technical performance in small-sided soccer games. Methods: Twenty soccer players (age 17.8 ± 1.0 y, height 179 ± 5 cm, body mass 72.4 ± 6.8 kg, playing experience 8.3 ± 1.4 y) from an Australian National Premier League soccer club volunteered to participate in this randomized crossover investigation. Participants played 15-min 5-vs-5 small-sided games (SSGs) without goalkeepers on 2 occasions separated by 1 wk. Before the SSG, 1 team watched a 30-min emotionally neutral documentary (control), while the other performed 30 min of a computer-based Stroop task (mental fatigue). Subjective ratings of mental and physical fatigue were recorded before and after treatment and after the SSG. Motivation was assessed before treatment and the SSG; mental effort was assessed after treatment and the SSG. Player activity profiles and heart rate (HR) were measured throughout the SSG, whereas ratings of perceived exertion (RPEs) were recorded before the SSG and immediately after each half. Video recordings of the SSG allowed for notational analysis of technical variables. Results: Subjective ratings of mental fatigue and effort were higher after the Stroop task, whereas motivation for the upcoming SSG was similar between conditions. HR during the SSG was possibly higher in the control condition, whereas RPE was likely higher in the mental-fatigue condition. Mental fatigue had an unclear effect on most physical-performance variables but impaired most technical-performance variables. Conclusions: Mental fatigue impairs technical but not physical performance in small-sided soccer games.
Daniele Conte, Ademir Felipe Schultz Arruda, Filipe M. Clemente, Daniel Castillo, Paulius Kamarauskas, and Aristide Guerriero
Purpose: To assess the relationship between external and internal load during official women’s rugby sevens matches. Methods: Six backs (age = 24.2 [3.2] y; height = 161.5 [7.3] cm; body mass = 59.5 [5.0] kg; playing experience = 5.3 [1.5] y) and 8 forwards (age = 22.4 [2.7] y; height = 167.0 [4.8] cm; body mass = 70.6 [5.6] kg; playing experience = 5.0 [1.5] y) belonging to the Brazilian women’s rugby sevens national team were monitored across 3 international tournaments during the 2019–20 season, with 2 players excluded from the analysis because they did not participate in any investigated match. Total distance (TD), distance covered at high-intensity running speeds (18.1–20.0 km·h−1) and sprinting (>20 km·h−1), and the number of accelerations (>1.8 m·s−2 [ACC]) and decelerations (<−1.8 m·s−2) were used as match load volume measures, while their relative values (TD per minute, high-intensity running per minute, sprinting per minute, ACC per minute, and decelerations per minute) were used as external load match intensity measures. Internal load intensity and volume were assessed using the session rating of perceived exertion (sRPE) and its value multiplied by match duration (sRPE-ML), respectively. Spearman correlations and linear mixed models were used to assess the relationships between internal and external load measures. Results: A very large relationship (ρ = .830, P < .001) was found between sRPE-ML and TD, with linear mixed models showing that TD statistically affected sRPE-ML (P < .001). Linear mixed models also showed that ACC per minute affected sRPE-ML (P = .017), while sprinting (P = .007) and ACC per minute (P = .005) were the only 2 measures statistically affecting sRPE. However, weak relationships (trivial to large) were found for these and all other measures. Conclusions: These results highlight that TD is the main external load measure able to anticipate the internal load responses measured via sRPE-ML in elite women’s rugby sevens.
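The volume and intensity measures in the abstract above reduce to simple per-match arithmetic: sRPE-ML is sRPE multiplied by match duration, and each intensity measure is a total divided by duration. A hypothetical sketch (function names and example numbers are illustrative, not data from the study):

```python
# Sketch of the load-volume vs load-intensity arithmetic described
# above. All names and numbers are illustrative assumptions.

def srpe_ml(srpe, duration_min):
    """Internal load volume: session RPE multiplied by match duration."""
    return srpe * duration_min

def per_minute(total, duration_min):
    """Relative (intensity) measure, e.g., TD per minute."""
    return total / duration_min

# Hypothetical match: RPE 8 over a 14-min match with 1500 m total distance
print(srpe_ml(8, 14))                  # prints 112
print(round(per_minute(1500, 14), 1))  # prints 107.1
```

Keeping volume (duration-multiplied) and intensity (duration-divided) measures distinct is what lets TD relate strongly to sRPE-ML while only the per-minute measures relate to sRPE.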
Paulius Kamarauskas, Inga Lukonaitienė, Aaron T. Scanlan, Davide Ferioli, Henrikas Paulauskas, and Daniele Conte
Purpose: To assess weekly fluctuations in hormonal responses and their relationships with load and well-being during a congested in-season phase in basketball players. Methods: Ten semiprofessional male basketball players were monitored during 4 congested in-season weeks consisting of 3 weekly matches. Salivary hormone variables (testosterone [T], cortisol [C], and T:C ratio) were measured weekly, while external load (PlayerLoad™ and PlayerLoad per minute), internal load (session rating of perceived exertion, percentage of maximum heart rate [HR], and summated HR zones), and well-being were assessed for each training session and match. Results: Significant (P < .05) moderate to large decreases in T were found in the third and fourth weeks compared with the first week. Nonsignificant moderate to large decreases in C were apparent in the last 2 weeks compared with previous weeks. Summated HR zones and perceived sleep significantly (P < .05) decreased in the fourth week compared with the first week, whereas percentage of maximum HR significantly (P < .05) decreased in the fourth week compared with the second week. No significant relationships were found between weekly changes in hormonal responses and weekly changes in load and overall wellness. Conclusions: A congested schedule during the in-season negatively impacted the hormonal responses of players, suggesting that T and C measurements may be useful to detect fluctuations in hormone balance in such scenarios. The nonsignificant relationships between weekly changes in hormonal responses and changes in load and well-being indicate that other factors might induce hormonal changes across congested periods in basketball players.
Alice Iannaccone, Andrea Fusco, Antanas Skarbalius, Audinga Kniubaite, Cristina Cortis, and Daniele Conte
Purpose: To assess the relationship between external load (EL) and internal load (IL) in youth male beach handball players. Methods: A total of 11 field players from the Lithuanian U17 beach handball team were monitored across 14 training sessions and 7 matches. The following EL variables were assessed by means of inertial movement units: PlayerLoad™, accelerations, decelerations, changes of direction, jumps, and total inertial movements. IL was assessed objectively and subjectively using the summated heart rate zones and training load calculated via session rating of perceived exertion, respectively. Spearman correlations (ρ) were used to assess the relationship between EL and IL. The interindividual variability was investigated using linear mixed models with random intercepts, with IL as the dependent variable, PlayerLoad™ as the independent variable, and player as a random effect. Results: The weakest significant (P < .05) relationship was for high jumps with objective (ρ = .56) and subjective (ρ = .49) IL. The strongest relationship was for PlayerLoad™ with objective (ρ = .90) and subjective (ρ = .84) IL. From the linear mixed models, the estimated SD of the random intercepts was 19.78 arbitrary units (95% confidence interval, 11.75–33.31; SE = 5.26; R² = .47) for the objective IL and 6.03 arbitrary units (95% confidence interval, 0.00–7330.6; SE = 21.87; R² = .71) for the subjective IL. Conclusions: Objective and subjective IL measures can be used as a monitoring tool when EL monitoring is not possible. Coaches can predict IL based on a given EL by using the equations proposed in this study.