Search Results

Showing 1–8 of 8 items for:

  • Author: Keith Tolfrey
  • Physical Education and Coaching
Restricted access

Victoria L. Goosey-Tolfrey, Nicholas J. Diaper, Jeanette Crosland, and Keith Tolfrey

Background:

Wheelchair tennis players competing in hot and humid environments face an increased risk of heat-related illness and impaired performance. This study examined the effects of head and neck cooling garments on ratings of perceived exertion (RPE), thermal sensation (TS), and water consumption during wheelchair exercise at 30.4 ± 0.6°C.

Methods:

Eight highly trained wheelchair tennis players (1 amputee and 7 spinal cord injured) completed two 60-min intermittent sprint trials: one with cooling (COOL) and one without (CON), in a balanced crossover design. Players could drink water ad libitum at five predetermined intervals during each trial. Heart rate, blood lactate concentration, peak speed, TS, and RPE were recorded during the trials. Body mass and water consumption were measured before and after each trial.

Results:

Water consumption was lower in COOL than in CON (700 ± 393 mL vs. 1198 ± 675 mL, respectively; P = 0.042). Trends in the data suggested lower RPE and TS under COOL conditions (not significant). Total sweat losses ranged from 200 to 1300 mL; averaged across all trials, this equated to ~1% dehydration once water consumption had been accounted for. Ad libitum drinking volumes matched, and in some cases exceeded, total sweat losses.

Conclusions:

These results suggest that head and neck cooling garments have a counterproductive effect on water consumption. However, despite athletes consuming volumes of water at least equivalent to their total sweat losses, changes in body mass indicate mild dehydration during wheelchair tennis in the heat.


Ben T. Stephenson, Sven P. Hoekstra, Keith Tolfrey, and Victoria L. Goosey-Tolfrey

Purpose: Paratriathletes may display impairments in autonomic (sudomotor and/or vasomotor function) or behavioral (drinking and/or pacing of effort) thermoregulation. As such, this study aimed to describe the thermoregulatory profile of athletes competing in the heat. Methods: Core temperature (Tc) was recorded at 30-second intervals in 28 mixed-impairment paratriathletes during competition in a hot environment (air temperature = 33°C, relative humidity = 35%–41%, and water temperature = 25°C–27°C) via an ingestible temperature sensor (BodyCap e-Celsius). Furthermore, in a subset of 9 athletes, skin temperature was measured. Athletes’ wetsuit use was noted, and heat illness symptoms were self-reported postrace. Results: In total, 22 athletes displayed a Tc ≥ 39.5°C, with 8 athletes ≥ 40.0°C. Average Tc increased across the swim, bike, and run sections (P ≤ .016). There was no change in skin temperature during the race (P ≥ .086). Visually impaired athletes displayed a significantly greater Tc during the run section than wheelchair athletes (P ≤ .021). Athletes wearing a wetsuit (57% of athletes) had a greater Tc when swimming (P ≤ .032), whereas those reporting heat illness symptoms (57% of athletes) displayed a greater Tc at various time points (P ≤ .046). Conclusions: Paratriathletes face significant thermal strain during competition in the heat, as evidenced by high Tc relative to previous research in able-bodied athletes and a high incidence of self-reported heat illness symptomatology. Differences in the Tc profile exist depending on athletes’ race category and wetsuit use.


Ben T. Stephenson, Christof A. Leicht, Keith Tolfrey, and Victoria L. Goosey-Tolfrey

Purpose: In able-bodied athletes, several hormonal, immunological, and psychological parameters are commonly assessed in response to intensified training due to their potential relationship to acute fatigue and training/nontraining stress. This has yet to be studied in Paralympic athletes. Methods: A total of 10 elite paratriathletes were studied for 5 wk around a 14-d overseas training camp during which training load was 137% of precamp levels. Athletes provided 6 saliva samples (1 precamp, 4 during camp, and 1 postcamp) for cortisol, testosterone, and secretory immunoglobulin A; weekly psychological questionnaires (Profile of Mood State [POMS] and Recovery-Stress Questionnaire for Athletes [RESTQ-Sport]); and daily resting heart rate and subjective wellness measures, including sleep quality and quantity. Results: There was no significant change in salivary cortisol, testosterone, cortisol:testosterone ratio, or secretory immunoglobulin A during intensified training (P ≥ .090). Likewise, there was no meaningful change in resting heart rate or most subjective wellness measures (P ≥ .079), although subjective sleep quality and quantity increased during intensified training (P ≤ .003). There was no significant effect on any POMS subscale other than lower anger (P = .049), whereas RESTQ-Sport indicated greater general recovery and lower sport and general stress (P ≤ .015). Conclusions: There was little to no change in parameters commonly associated with the fatigued state, which may relate to the training-camp setting minimizing external life stresses and the careful management of training loads by coaches. This is the first evidence of such responses in Paralympic athletes.


Ben T. Stephenson, Eleanor Hynes, Christof A. Leicht, Keith Tolfrey, and Victoria L. Goosey-Tolfrey

Purpose: To gain an exploratory insight into the relation between training load (TL), salivary secretory immunoglobulin A (sIgA), and upper respiratory tract illness (URI) in elite paratriathletes. Methods: Seven paratriathletes were recruited. Athletes provided weekly saliva samples for the measurement of sIgA over 23 consecutive weeks (February to July) and a further 11 consecutive weeks (November to January). sIgA was compared to individuals’ weekly training duration, external TL, and internal TL, using time spent in predetermined heart-rate zones. Correlations were assessed via regression analyses. URI was quantified via weekly self-report symptom questionnaire. Results: There was a significant negative relation between athletes’ individual weekly training duration and sIgA secretion rate (P = .028), with changes in training duration accounting for 12.7% of the variance (quartiles: 0.2%, 19.2%). There was, however, no significant relation between external or internal TL and sIgA parameters (P ≥ .104). There was no significant difference in sIgA when URI was present or not (101% vs 118% healthy median concentration; P ≥ .225); likewise, there was no difference in sIgA when URI occurred within 2 wk of sampling or not (83% vs 125% healthy median concentration; P ≥ .120). Conclusions: Paratriathletes’ weekly training duration significantly affects sIgA secretion rate, yet the authors did not find a relation between external or internal TL and sIgA parameters. Furthermore, it was not possible to detect any link between sIgA and URI occurrence, which throws into question the potential of using sIgA as a monitoring tool for early detection of illness.
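The 12.7% figure above is the share of variance in sIgA secretion rate explained by a simple regression on weekly training duration (the coefficient of determination). As a minimal sketch of that kind of calculation — illustrative only, with hypothetical data, not the study's actual analysis or dataset:

```python
import numpy as np

def r_squared(x, y):
    """Share of variance in y explained by a simple linear regression on x."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    slope, intercept = np.polyfit(x, y, 1)                # ordinary least-squares fit
    ss_res = np.square(y - (slope * x + intercept)).sum() # residual sum of squares
    ss_tot = np.square(y - y.mean()).sum()                # total sum of squares
    return 1 - ss_res / ss_tot

# Hypothetical weekly training durations (h) and sIgA secretion rates (ug/min)
duration = [8, 10, 12, 14]
siga_rate = [60, 55, 57, 48]
print(r_squared(duration, siga_rate))
```

A value of 0.127 from such a fit would correspond to the study's report that training duration accounted for 12.7% of the variance in sIgA secretion rate.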


Johanna S. Rosén, Victoria L. Goosey-Tolfrey, Keith Tolfrey, Anton Arndt, and Anna Bjerkefors

The purpose of this study was to examine the interrater reliability of a new evidence-based classification system for Para Va'a. Twelve Para Va'a athletes were classified by three classifier teams, each consisting of a medical and a technical classifier. Interrater reliability was assessed by calculating intraclass correlations for the overall class allocation and the total scores of the trunk, leg, and on-water test batteries, and by calculating Fleiss' kappa and the percentage of total agreement for the individual tests of each test battery. All classifier teams agreed on the overall class allocation of all athletes, and all three test batteries exhibited excellent interrater reliability. At the test level, agreement between classifiers was almost perfect in 14 tests, substantial in 4, moderate in 4, and fair in 1. The results suggest that a Para Va'a athlete can expect to be allocated to the same class regardless of which classifier team conducts the classification.
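Agreement among multiple raters of the kind reported here is commonly quantified with Fleiss' kappa. A minimal sketch of that statistic — not the authors' code, and with an invented count matrix purely for illustration:

```python
import numpy as np

def fleiss_kappa(ratings):
    """Fleiss' kappa for agreement among multiple raters.

    ratings: (n_subjects, n_categories) matrix of counts, i.e. how many
    raters assigned each subject to each category; every row sums to the
    (constant) number of raters.
    """
    ratings = np.asarray(ratings, float)
    n = ratings[0].sum()                              # raters per subject
    p_cat = ratings.sum(axis=0) / ratings.sum()       # category proportions
    p_subj = (np.square(ratings).sum(axis=1) - n) / (n * (n - 1))
    p_bar = p_subj.mean()                             # observed agreement
    p_e = np.square(p_cat).sum()                      # chance agreement
    return (p_bar - p_e) / (1 - p_e)

# 3 classifier teams sorting 3 athletes into 2 classes (hypothetical counts)
print(fleiss_kappa([[3, 0], [0, 3], [2, 1]]))  # ~0.55, "moderate" agreement
```

Kappa is 1 with perfect agreement and near 0 when agreement is no better than chance; the "almost perfect/substantial/moderate/fair" labels in the abstract follow the conventional Landis–Koch bands for such coefficients.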


Vicky L. Goosey-Tolfrey, Sonja de Groot, Keith Tolfrey, and Tom A.W. Paulson

Purpose: To confirm whether peak aerobic capacity determined during laboratory testing could be replicated during an on-court field-based test in wheelchair rugby players. Methods: Sixteen wheelchair rugby players performed an incremental speed-based peak oxygen uptake (V̇O2peak) test on a motorized treadmill (TM) and completed a multistage fitness test (MFT) on a basketball court in a counterbalanced order, while spirometric data were recorded. A paired t test was performed to check for systematic error between tests. A Bland–Altman plot for V̇O2peak illustrated the agreement between the TM and MFT results and how this related to the boundaries of practical equivalence. Results: No significant differences between mean V̇O2peak were reported (TM: 1.85 [0.63] vs MFT: 1.81 [0.63] L·min−1; P = .33). The Bland–Altman plot for V̇O2peak suggests that the mean values are in good agreement at the group level; that is, the exact 95% confidence limits for the ratio systematic error (0.95–1.02) fall within the boundaries of practical equivalence (0.88–1.13), showing that the group-average TM and MFT values are interchangeable. However, consideration of the data at the level of the individual athlete suggests that the TM and MFT results were not interchangeable, because the 95% ratio limits of agreement either coincide with the boundaries of practical equivalence (upper limit) or fall outside them (lower limit). Conclusions: Results suggest that the MFT provides a suitable test at a group level with this cohort of wheelchair rugby players for the assessment of V̇O2peak (range 0.97–3.64 L·min−1), yet caution is noted for interchangeable use of values between tests for individual players.
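A ratio-scale Bland–Altman analysis of this kind can be sketched as follows. This is an illustrative reconstruction under stated assumptions (log-transformed ratios, 1.96 SD limits of agreement), not the authors' actual code, which also computed exact confidence limits for the systematic error:

```python
import numpy as np

def bland_altman_ratio(tm, mft, equivalence=(0.88, 1.13)):
    """Ratio-scale Bland-Altman agreement between two VO2peak measures.

    Returns the systematic error as a geometric mean ratio, the 95% ratio
    limits of agreement, and whether those limits fall inside the
    boundaries of practical equivalence.
    """
    log_ratio = np.log(np.asarray(mft, float) / np.asarray(tm, float))
    bias, sd = log_ratio.mean(), log_ratio.std(ddof=1)
    ratio_bias = np.exp(bias)                                # systematic error
    loa = (np.exp(bias - 1.96 * sd), np.exp(bias + 1.96 * sd))
    interchangeable = equivalence[0] <= loa[0] and loa[1] <= equivalence[1]
    return ratio_bias, loa, interchangeable
```

The distinction the abstract draws — group-level agreement (a ratio systematic error near 1) versus individual-level interchangeability (limits of agreement inside the equivalence zone) — maps onto `ratio_bias` and `interchangeable`, respectively.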


Paul Sindall, John P. Lenton, Keith Tolfrey, Rory A. Cooper, Michelle Oyster, and Victoria L. Goosey-Tolfrey

Purpose:

To examine the heart-rate (HR) response and court-movement variables during wheelchair tennis match play for high- (HIGH) and low- (LOW) performance-ranked players. Analysis of physiological and movement-based responses during match play offers insight into the demands of wheelchair tennis, allowing practical recommendations to be made.

Methods:

Fourteen male open-class players were monitored during tournament match play. A data logger was used to record distance and speed. HR was recorded during match play.

Results:

Significant rank-by-result interactions revealed that HIGH winners covered more forward distance than HIGH losers (P < .05) and had higher average (P < .05) and minimum (P < .01) HRs than LOW winners. LOW losers had higher average (P < .01) and minimum (P < .001) HRs than LOW winners. Independent of result, a significant main effect for rank was identified for maximum (P < .001) and average (P < .001) speed and total (P < .001), reverse (P < .001), and forward-to-reverse (P < .001) distance, with higher values for HIGH. Independent of rank, losing players experienced higher minimum HRs (P < .05). Main effects for maximum HR and actual playing time were not significant. Average playing time was 52.0 (9.1) min.

Conclusions:

These data suggest that, independent of rank, tennis players were active for sufficient time to confer health-enhancing effects. While relative playing intensity is similar, HIGH players push faster and farther than LOW players and are therefore more capable of responding to ball movement and the challenges of competitive match play. Adjustments to the sport may be required to encourage skill development in LOW players, who move at significantly lower speeds and cover less distance.


Thomas J. O’Brien, Simon J. Briley, Barry S. Mason, Christof A. Leicht, Keith Tolfrey, and Victoria L. Goosey-Tolfrey

Purpose: To compare the effects of a typical competition warm-up (WU) versus a high-intensity intermittent WU on thermoregulatory responses and repeated sprint performance during wheelchair rugby game play. Methods: An intermittent sprint protocol (ISP) simulating the demands of wheelchair rugby was performed by male wheelchair rugby players (7 with cervical spinal cord injury [SCI] and 8 without SCI [NON-SCI]) following 2 WU protocols: a typical competition WU (control) and a WU consisting of high-intensity efforts (INT). Core temperature (Tcore), thermal sensation, and thermal comfort were recorded. Wheelchair performance variables associated with power, speed, and fatigue were also calculated. Results: During the WU, Tcore was similar between conditions for both groups. During the ISP, a higher Tcore was found for SCI compared with NON-SCI (38.1 [0.3] vs 37.7 [0.3] °C; P = .036, d = 0.75), and the SCI group experienced a higher peak Tcore for INT compared with control (39.0 [0.4] vs 38.6 [0.6] °C; P = .004). Peak Tcore occurred later in the ISP for players with SCI (96 [5.8] vs 48 [2.7] min; P < .001). All players reported higher thermal sensation and thermal comfort following INT (P < .001), with no differences between conditions throughout the ISP. No significant differences were found in wheelchair performance variables during the ISP between conditions (P ≥ .143). Conclusions: The INT WU increased thermal strain in the SCI group during the ISP, potentially owing to increased metabolic heat production and impaired thermoregulation, while not affecting repeated sprint performance. It may be advisable to limit high-intensity bouts during a WU in players with SCI to mitigate issues related to hyperthermia in subsequent performance.