How much is enough in rehabilitation? High running workloads following lower limb muscle injury delay return to play but protect against subsequent injury
- Authors: Stares, Jordan , Dawson, Brian , Peeling, Peter , Drew, Michael , Heasman, Jarryd , Rogalski, Brent , Colby, Marcus
- Date: 2018
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 21, no. 10 (2018), p. 1019-1024
- Full Text:
- Reviewed:
- Description: Objectives: Examine the influence of rehabilitation training loads on return to play (RTP) time and subsequent injury in elite Australian footballers. Design: Prospective cohort study. Methods: Internal (sessional rating of perceived exertion: sRPE) and external (distance, sprint distance) workload and lower limb non-contact muscle injury data were collected from 58 players over 5 seasons. Rehabilitation periods were analysed for running workloads, and time spent in 3 rehabilitation stages (1: off-legs training, 2: non-football running, 3: group football training) was calculated. Multi-level survival analyses with random effects accounting for player and season were performed. Hazard ratios (HR) and 95% confidence intervals (CI) for each variable were produced for RTP time and time to subsequent injury. Results: Of 85 lower limb muscle injuries, 70 were rehabilitated to RTP, with 30 cases of subsequent injury recorded (recurrence rate = 11.8%, new site injury rate = 31.4%). Completion of high rehabilitation workloads delayed RTP (distance: >49,775 m [reference: 34,613–49,775 m]: HR 0.12, 95%CI 0.04–0.36; sRPE: >1266 AU [reference: 852–1266 AU]: HR 0.09, 95%CI 0.03–0.32). Return to running within 4 days increased subsequent injury risk (3–4 days [reference: 5–6 days]: HR 25.88, 95%CI 2.06–324.4). Attaining moderate-high sprint distance (427–710 m) was protective against subsequent injury (154–426 m [reference: 427–710 m]: HR 37.41, 95%CI 2.70–518.64). Conclusions: Training load monitoring can inform player rehabilitation programs. Higher rehabilitation training loads delayed RTP; however, moderate-high sprint running loads can protect against subsequent injury. Shared decision-making regarding RTP should include accumulated training loads and consider the trade-off between expedited RTP and lower subsequent injury risk.
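The sprint-distance cut-offs reported in the abstract above (154–426 m vs. the protective 427–710 m reference band) could be applied in a simple monitoring script. A minimal sketch: only the cut-off values come from the study; the function name and band labels are hypothetical.

```python
# Classify accumulated rehabilitation sprint distance into the bands
# reported by Stares et al. (2018). Cut-offs are from the abstract;
# band labels ("low", etc.) are illustrative, not the study's terms.

def sprint_band(total_sprint_m):
    """Return an illustrative band label for total sprint metres."""
    if total_sprint_m < 154:
        return "low"
    elif total_sprint_m <= 426:
        return "low-moderate"    # higher subsequent-injury risk vs. reference
    elif total_sprint_m <= 710:
        return "moderate-high"   # reference band, protective in the study
    else:
        return "high"

print(sprint_band(500))  # prints "moderate-high"
```

Such a categorisation only mirrors the reported bands; it is not a substitute for the study's survival model.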
Identifying high risk loading conditions for in-season injury in elite Australian football players
- Authors: Stares, Jordan , Dawson, Brian , Peeling, Peter , Heasman, Jarryd , Rogalski, Brent , Drew, Michael , Colby, Marcus , Dupont, Gregory , Lester, Leanne
- Date: 2018
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 21, no. 1 (2018), p. 46-51
- Full Text: false
- Reviewed:
- Description: Objectives: To examine different timeframes for calculating acute to chronic workload ratio (ACWR) and whether this variable is associated with intrinsic injury risk in elite Australian football players. Design: Prospective cohort study. Methods: Internal (session rating of perceived exertion: sRPE) and external (GPS distance and sprint distance) workload and injury data were collected from 70 players from one AFL club over 4 seasons. Various acute (1–2 weeks) and chronic (3–8 weeks) timeframes were used to calculate ACWRs: these and chronic load categories were then analysed to determine the injury risk in the subsequent month. Poisson regression with robust errors within a generalised estimating equation were utilised to determine incidence rate ratios (IRR). Results: Altering acute and/or chronic timeframes did not improve the ability to detect high injury risk conditions above the commonly used 1:4 week ACWR. Twenty-seven ACWR/chronic load combinations were found to be “high risk conditions” (IRR > 1, p < 0.05) for injury within 7 days. Most (93%) of these conditions occurred when chronic load was low or very low and ACWR was either low (<0.6) or high (>1.5). Once a high injury risk condition was entered, the elevated risk persisted for up to 28 days. Conclusions: Injury risk was greatest when chronic load was low and ACWR was either low or high. This heightened risk remained for up to 4 weeks. There was no improvement in the ability to identify high injury risk situations by altering acute or chronic time periods from 1:4 weeks.
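The 1:4-week ACWR used above (and defined in the later Colby et al. abstract as 1-week load divided by the average 4-weekly load) can be sketched as follows. A minimal sketch, assuming the function name and example loads are hypothetical; loads could be sRPE (AU) or GPS distance (m):

```python
# Acute:chronic workload ratio (ACWR), 1:4-week version: most recent
# week's load divided by the rolling 4-week average. Data are invented.

def acwr(weekly_loads):
    """Return the ACWR for the most recent week.

    weekly_loads: weekly totals, ordered oldest to newest;
    requires at least 4 weeks of data.
    """
    if len(weekly_loads) < 4:
        raise ValueError("need at least 4 weeks of data")
    acute = weekly_loads[-1]              # most recent week
    chronic = sum(weekly_loads[-4:]) / 4  # rolling 4-week average
    return acute / chronic

# Example: a spike in the latest week pushes the ratio into the
# >1.5 region the study flags as higher risk when chronic load is low.
loads = [1800, 1900, 2000, 3600]
print(round(acwr(loads), 2))  # prints 1.55
```

Note the coupled-variable caveat in both abstracts: the ratio is interpreted alongside the absolute chronic load, not in isolation.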
Multivariate modelling of subjective and objective monitoring data improve the detection of non-contact injury risk in elite Australian footballers
- Authors: Colby, Marcus , Dawson, Brian , Peeling, Peter , Heasman, Jarryd , Rogalski, Brent , Drew, Michael , Stares, Jordan , Zouhal, Hassane , Lester, Leanne
- Date: 2017
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 20, no. 12 (2017), p. 1068-1074
- Full Text:
- Reviewed:
- Description: Objectives: To assess the association between workload, subjective wellness, musculoskeletal screening measures and non-contact injury risk in elite Australian footballers. Design: Prospective cohort study. Methods: Across 4 seasons in 70 players from one club, cumulative weekly workloads (acute: 1-week; chronic: 2-, 3-, 4-week) and acute:chronic workload ratios (ACWR: 1-week load/average 4-weekly load) for session-Rating of Perceived Exertion (sRPE) and GPS-derived distance and sprint distance were calculated. Wellness, screening and non-contact injury data were also documented. Univariate and multivariate regression models determined injury incidence rate ratios (IRR) while accounting for interaction/moderating effects. Receiver operating characteristics determined model predictive accuracy (area under curve: AUC). Results: Very low cumulative chronic (2-, 3-, 4-week) workloads were associated with the greatest injury risk (univariate IRR = 1.71–2.16, 95% CI = 1.10–4.52) in the subsequent week. In multivariate analysis, the interaction between a low chronic load and a very high distance (adj-IRR = 2.60, 95% CI = 1.07–6.34) or low sRPE ACWR (adj-IRR = 2.52, 95% CI = 1.01–6.29) was associated with increased injury risk. Subjectively reporting “yes” (vs. “no”) for old lower limb pain and heavy non-football activity in the previous 7 days (multivariate adj-IRR = 2.01–2.25, 95% CI = 1.02–4.95) and playing experience (>9 years) (multivariate adj-IRR = 2.05, 95% CI = 1.03–4.06) were also associated with increased injury risk, but screening data were not. Predictive capacity of multivariate models was significantly better than univariate (AUCmultivariate = 0.70, 95% CI 0.64–0.75; AUCunivariate range = 0.51–0.60). Conclusions: Chronic load is an important moderating factor in the workload–injury relationship. Low chronic loads coupled with low or very high ACWR are associated with increased injury risk.
- Authors: Brown, Henry , Dawson, Brian , Binnie, Martyn , Pinnington, Hugh , Sim, Marc , Clemons, Tristan , Peeling, Peter
- Date: 2017
- Type: Text , Journal article
- Relation: European Journal of Sport Science Vol. 17, no. 6 (2017), p. 741-747
- Full Text: false
- Reviewed:
- Description: This study compared markers of muscle damage and inflammation elevated by a matched-intensity interval running session on soft sand and grass surfaces. In a counterbalanced, repeated-measures and crossover design, 10 well-trained female athletes completed 2 interval-based running sessions 1 week apart on either a grass or a sand surface. Exercise heart rate (HR) was fixed at 83–88% of HR maximum. Venous blood samples were collected pre-, post- and 24 h post-exercise, and analysed for myoglobin (Mb) and C-reactive protein (CRP). Perceptual ratings of exertion (RPE) and muscle soreness (DOMS) were recorded immediately post- and 24 h post-exercise. A significant time effect showed that Mb increased from pre- to post-exercise on grass (p =.008) but not on sand (p =.611). Furthermore, there was a greater relative increase in Mb on grass compared with that on sand (p =.026). No differences in CRP were reported between surfaces (p >.05). The HR, RPE and DOMS scores were not significantly different between conditions (p >.05). These results suggest that in response to a matched-intensity exercise bout, markers of post-exercise muscle damage may be reduced by running on softer ground surfaces. Such a training strategy may be used to minimize musculoskeletal strain while still incurring an equivalent cardiovascular training stimulus. © 2017 European College of Sport Science.
Effect of tart cherry juice on recovery and next day performance in well-trained Water Polo players
- Authors: McCormick, Rachel , Peeling, Peter , Binnie, Martyn , Dawson, Brian , Sim, Marc
- Date: 2016
- Type: Text , Journal article
- Relation: Journal of the International Society of Sports Nutrition Vol. 13, no. 1 (2016), p. 1-8
- Full Text:
- Reviewed:
- Description: Background: Tart Montmorency cherries contain high concentrations of phytochemicals and anthocyanins, which have recently been linked to improved athletic recovery and subsequent performance. To date however, previous work reporting promising results has focused on land-based endurance sports, with any potential benefits to team sports remaining unknown. As such, this investigation set out to examine the effect of supplemental tart cherry juice (CJ) on recovery and next day athletic performance in highly-trained water-based team sport athletes over seven days. Methods: In a randomised, double-blind, repeated measures, crossover design, nine male Water Polo athletes were supplemented with CJ or a placebo equivalent (PLA) for six consecutive days. Prior to, and at the completion of the supplementation period, water-based performance testing was conducted. On day 6, participants also undertook a fatiguing simulated team game activity. Venous blood samples were collected (Pre-exercise: day 1, 6 and 7; Post-exercise: day 6) to investigate markers of inflammation [Interleukin-6 (IL-6); C-reactive protein (CRP)] and oxidative stress [Uric Acid (UA); F2-Isoprostane (F2-IsoP)]. A daily diary was also completed (total quality of recovery, delayed onset muscle soreness) as a measure of perceptual recovery. Results: In both conditions, day 6 post-exercise IL-6 was significantly higher than pre-exercise and day 7 (p<0.05); CRP was greater on day 7 as compared to day 6 pre- and post-exercise (p<0.05); F2-IsoP was significantly lower on day 7 as compared to day 1 and day 6 (p<0.05); UA remained unchanged (p>0.05). No differences were found for any performance or recovery measures.
Conclusions: The lack of difference observed in the blood markers between groups may reflect the intermittent, non-weight bearing demands of Water Polo, with such activity possibly unable to create a substantial inflammatory response or oxidative stress (over 7 days) to impede performance; thereby negating any potential beneficial effects associated with CJ supplementation. Trial registration: This trial was registered with the Australian and New Zealand Clinical Trials Registry (ANZCTR). Registration number: ACTRN12616001080415. Date registered: 11/08/2016, retrospectively registered. © 2016 The Author(s).
Relationships between reactive agility movement time and unilateral vertical, horizontal, and lateral jumps
- Authors: Henry, Greg , Dawson, Brian , Lay, Brendan , Young, Warren
- Date: 2016
- Type: Text , Journal article
- Relation: Journal of Strength and Conditioning Research Vol. 30, no. 9 (2016), p. 2514-2521
- Full Text:
- Reviewed:
- Description: Henry, GJ, Dawson, B, Lay, BS, and Young, WB. Relationships between reactive agility movement time and unilateral vertical, horizontal, and lateral jumps. J Strength Cond Res 30(9): 2514-2521, 2016 - This study compared reactive agility movement time and unilateral (vertical, horizontal, and lateral) jump performance and kinetics between dominant and nondominant legs in Australian rules footballers (n = 31) to investigate the role of leg strength characteristics in reactive agility performance. Jumps involved jumping forward on 1 leg, then for maximum height or horizontal or lateral distance. Agility and movement time components of reactive agility were assessed using a video-based test. Correlations between each of the jumps were strong (r = -0.62 to -0.77), but relationships between the jumps and agility movement time were weak (r = -0.25 to -0.33). Dominant leg performance was superior to the nondominant leg in reactive agility movement time (4.5%; p = 0.04), lateral jump distance (3%; p = 0.008), and lateral reactive strength index (4.4%; p = 0.03). However, when the subjects were divided into faster and slower performers (based on their agility movement times), movement time was significantly quicker in the faster group (n = 15; 12%; p < 0.001), but no differences in jump performance or kinetics were observed. Therefore, although the capacity for jumps to predict agility performance seems limited, factors involved in producing superior lateral jump performance in the dominant leg may also be associated with advantages in agility performance in that leg. However, reactive strength as measured by unilateral jumps seems to play a limited role in reactive agility performance; other factors such as skill, balance, and coordination, as well as cognitive and decision-making factors, are likely to be more important. © 2013 National Strength and Conditioning Association.
Seven days of high carbohydrate ingestion does not attenuate post-exercise IL-6 and hepcidin levels
- Authors: Badenhorst, Claire , Dawson, Brian , Cox, Gregory , Sim, Marc , Laarakkers, Coby , Swinkels, Dorine , Peeling, Peter
- Date: 2016
- Type: Text , Journal article
- Relation: European Journal of Applied Physiology Vol. 116, no. 9 (2016), p. 1715-1724
- Full Text: false
- Reviewed:
- Description: PURPOSE: This investigation examined if a high carbohydrate (CHO) diet, maintained across a seven-day training period, could attenuate post-exercise interleukin-6 (IL-6) and serum hepcidin levels. METHODS: Twelve endurance-trained male athletes completed two seven-day running training blocks whilst consuming either a high (8 g·kg⁻¹) versus a low (3 g·kg⁻¹) CHO isoenergetic diet. Each training block consisted of five running sessions performed on days 1, 2, 4, 5, and 7, with the intensity and duration of each session matched between training weeks. Serum levels of Interleukin-6 (IL-6) and hepcidin were measured pre- and either immediately (IL-6) or 3-h (hepcidin) post-exercise on days 1 and 7 of each training week. RESULTS: During each training week, the immediate post-exercise IL-6 and 3-h post-exercise serum hepcidin levels were significantly elevated (both p = 0.001) from pre-exercise on days 1 and 7. These increases were not different between trials. CONCLUSIONS: These results suggest that the ingestion of a high (compared to low) CHO diet over a seven-day training period is ineffective in attenuating post-exercise IL-6 and hepcidin responses. Such results may be due to the modest training load, the increased protein intake in the low-CHO trial, and a 48 h recovery period prior to sample collection on day 7, allowing a full recovery of muscle glycogen status between exercise sessions.
Oral contraception does not alter typical post-exercise interleukin-6 and hepcidin levels in females
- Authors: Sim, Marc , Dawson, Brian , Landers, Grant , Swinkels, Dorine , Tjalsma, Harold , Yeap, Bu , Trinder, Debbie , Peeling, Peter
- Date: 2015
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 18, no. 1 (2015), p. 8-12
- Full Text: false
- Reviewed:
- Description: OBJECTIVES: The post-exercise interleukin-6 (IL-6) and hepcidin response was investigated during the hormone-deplete and hormone-replete phases of an estradiol and progestogen regulated oral contraceptive cycle (OCC). DESIGN: Counterbalanced, repeated measures cross-over study. METHODS: Ten active female monophasic oral contraceptive pill (OCP) users completed two 40 min treadmill running trials at 75% of their pre-determined peak oxygen uptake velocity (vVO2peak). These trials were randomly performed in two specific phases of the OCC: (a) Day 2-4, representing a hormone-free withdrawal period (D-0); (b) Day 12-14, representing the end of the first week of active hormone therapy (D+7). Venous blood samples were drawn pre-, post- and 3h post-exercise. RESULTS: In both trials, serum IL-6 was significantly elevated (p<0.05) immediately post-exercise, while serum hepcidin was significantly elevated (p<0.05) 3h post-exercise, with no significant differences recorded between trials. CONCLUSIONS: These findings suggest that exercise performed during the different phases (D-0 vs. D+7) of a monophasic OCP regulated cycle does not alter exercise induced IL-6 or hepcidin production. As such, future studies looking to investigate similar variables post-exercise, may not need to 'control' for different phases of the OCC, provided participants are current monophasic OCP users.
A seven day running training period increases basal urinary hepcidin levels as compared to cycling
- Authors: Sim, Marc , Dawson, Brian , Landers, Grant , Swinkels, Dorine , Tjalsma, Harold , Wiegerinck, Erwin , Trinder, Debbie , Peeling, Peter
- Date: 2014
- Type: Text , Journal article
- Relation: Journal of the International Society of Sports Nutrition Vol. 11, no. 1 (2014), p. 1-9
- Full Text:
- Reviewed:
- Description: BACKGROUND: This investigation compared the effects of an extended period of weight-bearing (running) vs. non-weight-bearing (cycling) exercise on hepcidin production and its implications for iron status. METHODS: Ten active males performed two separate exercise training blocks with either running (RTB) or cycling (CTB) as the exercise mode. Each block consisted of five training sessions (Day 1, 2, 4, 5, 6) performed over a seven day period that were matched for exercise intensity. Basal venous blood samples were obtained on Day 1 (D1), and on Recovery Days 3 (R3) and 7 (R7) to assess iron status, while basal and 3 h post-exercise urinary hepcidin levels were measured on D1, D2, D6, as well as R3 and R7 (basal levels only) for each condition. RESULTS: Basal urinary hepcidin levels were significantly elevated (p = 0.05) at D2, R3 and R7 as compared to D1 in RTB. Furthermore, 3 h post-exercise urinary hepcidin levels on D1 were also significantly higher in RTB compared to CTB (p = 0.05). In CTB, urinary hepcidin levels were not statistically different on D1 as compared to R7. Iron parameters were not significantly different at D1 compared to R3 and R7 during both conditions. CONCLUSIONS: These results suggest that basal hepcidin levels may increase over the course of an extended training program, especially if a weight-bearing exercise modality is undertaken. However, despite any variations in hepcidin production, serum iron parameters in both RTB and CTB were unaffected, possibly due to the short duration of each training block. In comparing running to cycling, non-weight-bearing activity may require more training sessions, or sessions of extended duration, before any significant changes in basal hepcidin levels appear. Chronic elevations in hepcidin levels may help to explain the high incidence of iron deficiency in athletes.
Influence of post-exercise hypoxic exposure on hepcidin response in athletes
- Authors: Badenhorst, Claire , Dawson, Brian , Goodman, Carmel , Sim, Marc , Cox, Gregory , Gore, Christopher , Tjalsma, Harold , Swinkels, Dorine , Peeling, Peter
- Date: 2014
- Type: Text , Journal article
- Relation: European Journal of Applied Physiology Vol. 114, no. 5 (2014), p. 951-959
- Full Text: false
- Reviewed:
- Description: PURPOSE: To assess the influence of a simulated altitude exposure (~2,900 m above sea level) for a 3 h recovery period following intense interval running on post-exercise inflammation, serum iron, ferritin, erythropoietin, and hepcidin response. METHODS: In a cross-over design, ten well-trained male endurance athletes completed two 8 × 3 min interval running sessions at 85% of their maximal aerobic velocity on a motorized treadmill, before being randomly assigned to either a hypoxic (HYP: FIO2 ~0.1513) or a normoxic (NORM: FIO2 0.2093) 3 h recovery period. Venous blood was collected pre- and immediately post-exercise, and after 3 and 24 h of recovery. Blood was analyzed for interleukin-6, serum iron, ferritin, erythropoietin, and hepcidin. RESULTS: Interleukin-6 was significantly elevated (p < 0.01) immediately post-exercise compared to baseline (NORM: 1.08 ± 0.61 to 3.12 ± 1.80; HYP: 1.32 ± 0.86 to 2.99 ± 2.02), but was not different between conditions. Hepcidin levels were significantly elevated (p < 0.01) at 3 h post-exercise for both conditions when compared to baseline (NORM: 3.25 ± 1.23 to 7.40 ± 4.00; HYP: 3.24 ± 1.94 to 5.42 ± 3.20), but were significantly lower (p < 0.05) in the HYP trial compared to NORM. No significant differences existed between HYP and NORM for erythropoietin, serum iron, or ferritin. CONCLUSION: Simulated altitude exposure (~2,900 m) for 3 h following intense interval running attenuates the peak hepcidin levels recorded at 3 h post-exercise. Consequently, a hypoxic recovery after exercise may be useful for athletes with compromised iron status to potentially increase acute dietary iron absorption.
Effect of exercise modality and intensity on post-exercise interleukin-6 and hepcidin levels
- Sim, Marc, Dawson, Brian, Landers, Grant, Swinkels, Dorine, Tjalsma, Harold, Trinder, Debbie, Peeling, Peter
- Authors: Sim, Marc , Dawson, Brian , Landers, Grant , Swinkels, Dorine , Tjalsma, Harold , Trinder, Debbie , Peeling, Peter
- Date: 2013
- Type: Text , Journal article
- Relation: International Journal of Sport Nutrition and Exercise Metabolism Vol. 23, no. 2 (2013), p. 178-186
- Full Text: false
- Reviewed:
- Description: The effect of exercise modality and intensity on Interleukin-6 (IL-6), iron status, and hepcidin levels was investigated. Ten trained male triathletes performed 4 exercise trials including low-intensity continuous running (L-R), low-intensity continuous cycling (L-C), high-intensity interval running (H-R), and high-intensity interval cycling (H-C). Both L-R and L-C consisted of 40 min continuous exercise performed at 65% of peak running velocity (vVO2peak) and cycling power output (pVO2peak), while H-R and H-C consisted of 8 × 3-min intervals performed at 85% vVO2peak and pVO2peak. Venous blood samples were drawn pre-, post-, and 3 hr postexercise. Significant increases in postexercise IL-6 were seen within each trial (p < .05) and were significantly greater in H-R than L-R (p < .05). Hepcidin levels were significantly elevated at 3 hr postexercise within each trial (p < .05). Serum iron levels were significantly elevated (p < .05) immediately postexercise in all trials except L-C. These results suggest that, regardless of exercise mode or intensity, postexercise increases in IL-6 may be expected, likely influencing a subsequent elevation in hepcidin. Regardless, the lack of change in postexercise serum iron levels in L-C may indicate that reduced hemolysis occurs during weight-supported, low-intensity activity.
Effects of a feint on reactive agility performance
- Henry, Greg, Dawson, Brian, Lay, Brendan, Young, Warren
- Authors: Henry, Greg , Dawson, Brian , Lay, Brendan , Young, Warren
- Date: 2012
- Type: Text , Journal article
- Relation: Journal of Sports Sciences Vol. 30, no. 8 (2012), p. 787-795
- Full Text: false
- Reviewed:
- Description: This study compared reactive agility between higher-standard (n = 14) and lower-standard (n = 14) Australian footballers using a reactive agility test incorporating a life-size video image of another player changing direction, including and excluding a feint. Mean agility time in the feint trials was 34% (509 ± 243 ms; p < 0.001; effect size 3.06) longer than non-feint trials. In higher-standard players, agility time was shorter than for lower-standard players in both feint (114 ± 140 ms; p = 0.18; effect size 0.52; likely beneficial) and non-feint (32 ± 44 ms; p = 0.22; effect size 0.47; possibly beneficial) trials. Additionally, the inclusion of a feint resulted in movement time increasing over three times more in the lower-standard group (197 ± 91 ms; p = 0.001; effect size 1.07; almost certainly detrimental) than the higher-standard group (62 ± 86 ms; p = 0.23; effect size 0.66; likely detrimental). There were weak correlations between the feint and non-feint trials (r = -0.13 to 0.14; p > 0.05), suggesting that reactive agility involving a feint is a unique skill. Also, higher-standard players are more agile than their lower-standard peers, whose movement speed deteriorates more as task complexity increases with the inclusion of a feint. These results support the need for specific training in multi-turn reactive agility tasks.
- Sim, Marc, Dawson, Brian, Landers, Grant, Wiegerinck, Erwin, Swinkels, Dorine, Townsend, Mary-Anne, Trinder, Debbie, Peeling, Peter
- Authors: Sim, Marc , Dawson, Brian , Landers, Grant , Wiegerinck, Erwin , Swinkels, Dorine , Townsend, Mary-Anne , Trinder, Debbie , Peeling, Peter
- Date: 2012
- Type: Text , Journal article
- Relation: European Journal of Applied Physiology Vol. 112, no. 5 (2012), p. 1889-1898
- Full Text: false
- Reviewed:
- Description: The effect of carbohydrate (CHO) consumption during prolonged endurance running on post-exercise inflammation and hepcidin levels was investigated. Eleven well-trained male endurance athletes completed a graded exercise test, followed by two experimental running trials in a randomized order. The two experimental trials consisted of a 90 min run at 75% of the peak oxygen uptake velocity (vVO2peak), while consuming a solution with either 6% CHO or a placebo (PLA) equivalent at 3 ml·kg⁻¹ every 20 min. Serum interleukin-6 (IL-6), free hemoglobin (Hb), haptoglobin (Hp), hepcidin and iron parameters were assessed throughout the post-run recovery period. Serum iron and IL-6 were significantly elevated immediately post-run in both CHO and PLA (p = 0.05), with no differences between trials. Serum-free Hb increased and Hp decreased significantly immediately post-run in both conditions (p = 0.05). Serum soluble transferrin receptor levels were significantly below the baseline at 3 and 24 h post-run in both conditions (p = 0.05). Serum hepcidin concentration recorded 3 h post-run in both conditions was significantly elevated (p = 0.05), and had returned to the baseline by 24 h post-run (p = 0.05). The use of a 6% CHO solution at 3 ml·kg⁻¹ per 20 min during endurance running did not attenuate the inflammatory response and subsequent increase in serum hepcidin levels during the post-run recovery period.
Physiological and anthropometric characteristics of starters and non-starters and playing positions in elite Australian Rules football : A case study
- Young, Warren, Newton, Robert, Doyle, Tim, Chapman, Dale, Cormack, Stuart, Stewart, Glenn, Dawson, Brian
- Authors: Young, Warren , Newton, Robert , Doyle, Tim , Chapman, Dale , Cormack, Stuart , Stewart, Glenn , Dawson, Brian
- Date: 2005
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 8, no. 3 (Sep 2005), p. 333-345
- Full Text:
- Reviewed:
- Description: A purpose of this study was to determine if pre-season anthropometric and physiological measures were significantly different for the players from one Australian Football League (AFL) club selected to play in the first game of the season compared to the players not selected. Another purpose was to compare fitness test results for defenders, forwards and mid-fielders in the same AFL club. Thirty-four players were tested for isolated quadriceps and hamstrings strength, leg extensor muscle strength and power, upper body strength, sprinting speed, vertical jump (VJ), endurance, skinfolds and hamstring flexibility. The starters who were selected to play the first game were a significantly older and more experienced playing group, and were significantly better (p < 0.05) in measures of leg power, sprinting speed and the distance covered in the Yo-Yo intermittent recovery test compared to the non-starters. Although there were trends for the superiority of the starters, the differences in lower and upper body strength, VJ and predicted VO2max were nonsignificant. The forwards generally produced the worst fitness scores of the playing positions, with the midfielders having significantly lower skinfolds and the defenders possessing better hamstring strength and VJ compared to the forwards. It was concluded that some fitness qualities can differentiate between starters and non-starters, at least in one AFL club. Comparisons of playing positions and the development of fitness norms for AFL players require further research.