How much is enough in rehabilitation? High running workloads following lower limb muscle injury delay return to play but protect against subsequent injury
- Authors: Stares, Jordan, Dawson, Brian, Peeling, Peter, Drew, Michael, Heasman, Jarryd, Rogalski, Brent, Colby, Marcus
- Date: 2018
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 21, no. 10 (2018), p. 1019-1024
- Description: Objectives: Examine the influence of rehabilitation training loads on return to play (RTP) time and subsequent injury in elite Australian footballers. Design: Prospective cohort study. Methods: Internal (sessional rating of perceived exertion: sRPE) and external (distance, sprint distance) workload and lower limb non-contact muscle injury data were collected from 58 players over 5 seasons. Rehabilitation periods were analysed for running workloads, and time spent in 3 rehabilitation stages (1: off-legs training, 2: non-football running, 3: group football training) was calculated. Multi-level survival analyses with random effects accounting for player and season were performed. Hazard ratios (HR) and 95% confidence intervals (CI) for each variable were produced for RTP time and time to subsequent injury. Results: Of 85 lower limb muscle injuries, 70 were rehabilitated to RTP, with 30 cases of subsequent injury recorded (recurrence rate = 11.8%, new site injury rate = 31.4%). Completion of high rehabilitation workloads delayed RTP (distance: >49,775 m [reference: 34,613–49,775 m]: HR 0.12, 95%CI 0.04–0.36; sRPE: >1266 AU [reference: 852–1266 AU]: HR 0.09, 95%CI 0.03–0.32). Return to running within 4 days increased subsequent injury risk (3–4 days [reference: 5–6 days]: HR 25.88, 95%CI 2.06–324.4). Attaining moderate-high sprint distance (427–710 m) was protective against subsequent injury (154–426 m [reference: 427–710 m]: HR 37.41, 95%CI 2.70–518.64). Conclusions: Training load monitoring can inform player rehabilitation programs. Higher rehabilitation training loads delayed RTP; however, moderate-high sprint running loads can protect against subsequent injury. Shared decision-making regarding RTP should include accumulated training loads and consider the trade-off between expedited RTP and lower subsequent injury risk.
The efficacy of an iterative “sequence of prevention” approach to injury prevention by a multidisciplinary team in professional rugby union
- Authors: Tee, Jason, Bekker, Sheree, Collins, Rob, Klingbiel, Jannie, van Rooyen, Ivan, van Wyk, David, Till, Kevin, Jones, Ben
- Date: 2018
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 21, no. 9 (2018), p. 899-904
- Description: Objectives: Due to the complex systems nature of injuries, the responsibility for injury risk management cannot lie solely within a single domain of professional practice. Interdisciplinary collaboration between technical/tactical coaches, strength and conditioning coaches, team doctors, physical therapists and sport scientists is likely to have a meaningful impact on injury risk. This study describes the application and efficacy of a multidisciplinary approach to reducing team injury risk in professional rugby union. Design: Observational longitudinal cohort study. Methods: Epidemiological injury data were collected from a professional rugby union team for 5 consecutive seasons. Following each season, these data informed multidisciplinary intervention strategies to reduce injury risk. The effectiveness of these strategies was iteratively assessed to inform future interventions. Specific examples of intervention strategies are provided. Results: Overall team injury burden displayed a likely beneficial decrease (−8%; injury rate ratio (IRR) 0.9, 95%CI 0.9–1.0) from 2012 to 2016. This was achieved through a most likely beneficial improvement in non-contact injury burden (−39%; IRR 0.6, 95%CI 0.6–0.7). Contact injury burden increased, but to a lesser extent (+18%; IRR 1.2, 95%CI 1.1–1.3, most likely harmful) during the same period. Conclusions: The range of skills required to effectively manage complex injury phenomena in professional collision sport crosses disciplinary boundaries. The evidence presented here points to the effectiveness of a multidisciplinary approach to reducing injury risk. This model will likely be applicable across a range of team and individual sports.
Multivariate modelling of subjective and objective monitoring data improve the detection of non-contact injury risk in elite Australian footballers
- Authors: Colby, Marcus, Dawson, Brian, Peeling, Peter, Heasman, Jarryd, Rogalski, Brent, Drew, Michael, Stares, Jordan, Zouhal, Hassane, Lester, Leanne
- Date: 2017
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 20, no. 12 (2017), p. 1068-1074
- Description: Objectives: To assess the association between workload, subjective wellness, musculoskeletal screening measures and non-contact injury risk in elite Australian footballers. Design: Prospective cohort study. Methods: Across 4 seasons in 70 players from one club, cumulative weekly workloads (acute: 1-week; chronic: 2-, 3-, 4-week) and acute:chronic workload ratios (ACWR: 1-week load/average 4-weekly load) for session-Rating of Perceived Exertion (sRPE) and GPS-derived distance and sprint distance were calculated. Wellness, screening and non-contact injury data were also documented. Univariate and multivariate regression models determined injury incidence rate ratios (IRR) while accounting for interaction/moderating effects. Receiver operating characteristics determined model predictive accuracy (area under curve: AUC). Results: Very low cumulative chronic (2-, 3-, 4-week) workloads were associated with the greatest injury risk (univariate IRR = 1.71–2.16, 95% CI = 1.10–4.52) in the subsequent week. In multivariate analysis, the interaction between a low chronic load and a very high distance (adj-IRR = 2.60, 95% CI = 1.07–6.34) or low sRPE ACWR (adj-IRR = 2.52, 95% CI = 1.01–6.29) was associated with increased injury risk. Subjectively reporting “yes” (vs. “no”) for old lower limb pain and heavy non-football activity in the previous 7 days (multivariate adj-IRR = 2.01–2.25, 95% CI = 1.02–4.95) and playing experience (>9 years) (multivariate adj-IRR = 2.05, 95% CI = 1.03–4.06) was also associated with increased injury risk, but screening data were not. Predictive capacity of multivariate models was significantly better than univariate (AUC multivariate = 0.70, 95% CI 0.64–0.75; AUC univariate range = 0.51–0.60). Conclusions: Chronic load is an important moderating factor in the workload–injury relationship. Low chronic loads coupled with low or very high ACWR are associated with increased injury risk.
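The Colby et al. abstract defines the ACWR as the 1-week (acute) load divided by the average of the most recent 4 weekly (chronic) loads. A minimal sketch of that calculation; the example sRPE loads are hypothetical, and the coupled 4-week rolling form shown here is one common reading of the abstract's definition:

```python
def acwr(weekly_loads):
    """Acute:chronic workload ratio: last week's load divided by
    the mean of the most recent 4 weekly loads (coupled form)."""
    if len(weekly_loads) < 4:
        raise ValueError("need at least 4 weekly loads")
    recent = weekly_loads[-4:]
    acute = recent[-1]
    chronic = sum(recent) / 4
    return acute / chronic

# Hypothetical sRPE loads (AU) for the last 4 weeks:
# chronic = (2000 + 2100 + 1900 + 1500) / 4 = 1875; acute = 1500
print(acwr([2000, 2100, 1900, 1500]))  # → 0.8
```

An ACWR well below 1 flags a week of substantially reduced load relative to the preceding month, the "low ACWR" condition the abstract associates with elevated risk when chronic load is also low.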
Seasonal time-loss match injury rates and burden in South African under-16 rugby teams
- Authors: Sewry, Nicola, Verhagen, Evert, Lambert, Mike, van Mechelen, Willem, Readhead, Clint, Viljoen, Wayne, Brown, James
- Date: 2019
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 22, no. 1 (2019), p. 54-58
- Description: Objectives: Youth rugby union is a popular sport with a high injury incidence density (IID) and burden. This high risk has called for further research into the factors affecting injuries in youth rugby. The aim of the study was to analyse time-loss IID and burden in multiple schoolboy rugby teams over a season and the potential factors associated with injury. Design: Prospective cohort. Methods: All time-loss injuries were recorded from three schools for the whole season. Overall IID and injury burden were calculated, as well as for injury event, type, location and the match quarter in which they occurred, and Poisson regression analyses were performed to determine differences. Results: IID was 28.8 (18.9–38.6) injuries per 1000 player hours over the season, with an injury burden of 379.2 (343.6–414.9) days lost per 1000 player hours. The ball-carrier had a significantly higher IID (11.3 (5.2–17.5) per 1000 player hours) compared to other events, and joint (non-bone)/ligament injuries were the most common (IID of 12.2 (5.8–18.6) per 1000 player hours) and most severe type of injury (burden of 172.6 (148.5–196.6) days lost per 1000 player hours). Conclusions: The IID was similar to previous youth rugby studies; however, the injury burden was much lower. The South African youth cohort showed similar factors associated with injury for inciting event (the tackle), injury type (joint (non-bone)/ligament) and location (lower limb) as seen in other studies in both youth and senior players.
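The Sewry et al. abstract reports both rates on a per-1000-player-hours basis: incidence density (injuries per exposure) and burden (days lost per exposure). A minimal sketch of those two calculations; the injury counts and exposure hours in the example are hypothetical, not the study's data:

```python
def incidence_density(injuries, exposure_hours, per=1000):
    """Injury incidence density: injuries per `per` player hours."""
    return injuries / exposure_hours * per

def injury_burden(days_lost, exposure_hours, per=1000):
    """Injury burden: days lost to injury per `per` player hours."""
    return days_lost / exposure_hours * per

# Hypothetical season: 36 time-loss injuries and 450 days lost
# across 1250 player hours of match exposure.
print(incidence_density(36, 1250))  # → 28.8 injuries per 1000 player hours
print(injury_burden(450, 1250))     # → 360.0 days lost per 1000 player hours
```

Expressing both measures per 1000 player hours is what lets the abstract compare a season with a fairly typical incidence against a much lower burden: the same exposure denominator normalises counts and severity alike.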
Abrasion injuries on artificial turf: A systematic review
- Authors: Twomey, Dara, Petrass, Lauren, Fleming, Paul, Lenehan, Kurt
- Date: 2019
- Type: Text , Journal article , Review
- Relation: Journal of Science and Medicine in Sport Vol. 22, no. 5 (2019), p. 550-556
- Description: Objectives: To review the incidence of abrasion injuries sustained on artificial turf playing fields and the level of evidence existing on player perceptions of abrasion injuries on these surfaces. Design: Systematic review. Method: A systematic search was performed using SPORTDiscus, Medline, Web of Science, Scopus and Science Direct databases. Inclusion criteria included: abrasion-type injuries measured; conducted on artificial/synthetic turf; type of sport reported; peer-reviewed original research; English language search terms, but no language restrictions. A quality assessment was conducted using the Newcastle-Ottawa quality scale. Results: The search yielded 76 potential articles, with 25 meeting all inclusion criteria. Twenty articles were injury-based and five were perception-based. The differences in injury definition and the lack of detail on the playing surfaces produced varying results on the rate of injuries on artificial turf. Regardless of the condition of the surface, the level of play, or the sport, players perceived the fear of abrasion injuries as a major disadvantage of artificial turf surfaces. Conclusions: The review highlighted the current disparity between players’ perceptions of abrasion injuries and the level of evidence of abrasion injury risk on artificial turf playing surfaces. There is a need for greater detail on playing surfaces’ specifications and condition, and an injury definition sufficiently sensitive to better measure abrasion injury incidence and severity. Without this more detailed information, it is likely that the strongly perceived risk of abrasion injuries will continue as a barrier to the adoption of artificial playing surfaces.