Measuring children's self-reported sport participation, risk perception and injury history: Development and validation of a survey instrument
- Authors: Siesmaa, Emma , Blitvich, Jennifer , White, Peta , Finch, Caroline
- Date: 2011
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 14, no. 1 (2011), p. 22-26
- Relation: http://purl.org/au-research/grants/nhmrc/565900
- Full Text:
- Reviewed:
- Description: Despite the health benefits associated with children's sport participation, the occurrence of injury in this context is common. The extent to which sport injuries impact children's ongoing involvement in sport is largely unknown. Surveys have been shown to be useful for collecting children's injury and sport participation data; however, there are currently no published instruments which investigate the impact of injury on children's sport participation. This study describes the processes undertaken to assess the validity of two survey instruments for collecting self-reported information about child cricket- and netball-related participation, injury history and injury risk perceptions, as well as the reliability of the cricket-specific version. Face and content validity were assessed through expert feedback from primary- and secondary-level teachers and from representatives of peak sporting bodies for cricket and netball. Test-retest reliability was measured using a sample of 59 child cricketers who completed the survey on two occasions, 3-4 weeks apart. Based on expert feedback relating to face and content validity, modification and/or deletion of some survey items was undertaken. Survey items with low test-retest reliability (κ ≤ 0.40) were modified or deleted, items with moderate reliability (κ = 0.41-0.60) were modified slightly and items with higher reliability (κ ≥ 0.61) were retained, with some undergoing minor modifications. This is the first survey of its kind which has been successfully administered to cricketers aged 10-16 years to collect information about injury risk perceptions and intentions for continued sport participation. Implications for its generalisation to other child sport participants are discussed. © 2010 Sports Medicine Australia.
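The κ-based triage of survey items described in this abstract can be sketched as a simple decision rule. This is an illustrative sketch only: the thresholds come from the abstract, while the function name and the example items are hypothetical.

```python
def triage_item(kappa: float) -> str:
    """Classify a survey item by its test-retest kappa,
    using the thresholds reported in the abstract."""
    if kappa <= 0.40:
        return "modify or delete"   # low reliability
    if kappa <= 0.60:
        return "modify slightly"    # moderate reliability
    return "retain"                 # higher reliability (minor edits allowed)

# Hypothetical items and their test-retest kappa values
items = {"Q1": 0.35, "Q2": 0.55, "Q3": 0.72}
print({name: triage_item(k) for name, k in items.items()})
```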
Injury risk associated with ground hardness in junior cricket
- Authors: Twomey, Dara , White, Peta , Finch, Caroline
- Date: 2011
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 15, no. 2 (2011), p. 110-115
- Relation: http://purl.org/au-research/grants/nhmrc/565900
- Full Text:
- Reviewed:
- Description: To establish if there is an association between ground hardness and injury risk in junior cricket. Nested case-series of players who played matches on specific grounds with objective ground hardness measures, within a prospective cohort study of junior community club cricket players. Monitoring of injuries and playing exposure occurred during 434 matches over the 2007/2008 playing season. Objective assessment of ground hardness was undertaken with a Clegg hammer at 13 sites on 19 different junior cricket grounds, on the eve of matches across the season, giving 38 ground assessments in total. Hardness readings were classified from unacceptably low (<30 g) to unacceptably high (>120 g), and two independent raters assessed the likelihood of each injury being related to ground hardness. Injuries sustained on tested grounds were related to the ground hardness measures. Overall, 31 match injuries were reported; 6.5% were rated as likely to be related to ground hardness, 16.1% as possibly related, 74.2% as unlikely to be related and 3.2% as unknown. The two injuries likely to be related to ground hardness were sustained while diving to catch a ball, resulting in a graze/laceration from contact with hard ground. Overall, 31/38 (82%) ground assessments were rated as having 'unacceptably high' hardness and all others as 'high/normal' hardness. Only one injury occurred on an objectively tested ground. It remains unclear if ground hardness is a contributing factor to the most common injury mechanism of being struck by the ball, and this needs to be confirmed in future larger-scale studies. © 2011 Sports Medicine Australia.
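The hardness banding used in this abstract can be sketched as a small classification helper. Only the outer thresholds (<30 g, >120 g) are given in the abstract; treating everything in between as a single 'high/normal' band is an assumption for illustration.

```python
def classify_hardness(reading_g: float) -> str:
    """Map a Clegg hammer reading (in gravities) to a hardness band.
    Thresholds <30 g and >120 g come from the abstract; the single
    intermediate 'high/normal' band is an assumption."""
    if reading_g < 30:
        return "unacceptably low"
    if reading_g > 120:
        return "unacceptably high"
    return "high/normal"

# Example readings from three hypothetical test sites
print([classify_hardness(g) for g in (25, 90, 140)])
```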
Immunohistochemical analysis of pancreatic islets of platypus (Ornithorhynchus anatinus) and echidna (Tachyglossus aculeatus ssp.)
- Authors: He, Chuan , Myers, Mark , Forbes, Briony , Grützner, Frank
- Date: 2015
- Type: Text , Journal article
- Relation: Journal of Anatomy Vol. 226, no. 4 (2015), p. 373-380
- Full Text:
- Reviewed:
- Description: Monotremes have undergone remarkable changes to their digestive and metabolic control system; however, the monotreme pancreas remains poorly characterized. Previous work in echidna demonstrated the presence of pancreatic islets, but no information is available for platypus and the fine structure has not been described for either monotreme. Based on our recent finding that monotremes lack the ghrelin gene, which is expressed in mouse and human pancreatic islets, we investigated the structure of monotreme islets in more detail. Generally, as in birds, the islets of monotremes were smaller but greater in number compared with mouse. β-cells were the most abundant endocrine cell population in platypus islets and were located peripherally, while α-cells were observed both in the interior and periphery of the islets. δ-cells and pancreatic polypeptide (PP)-cells were mainly found in the islet periphery. Distinct PP-rich (PP-lobe) and PP-poor areas (non-PP-lobe) are present in therian mammals, and we identified these areas in echidna but not platypus pancreas. Interestingly, in some of the echidna islets, α- and β-cells tended to form two poles within the islets, which to our knowledge is the first time this has been observed in any species. Overall, monotreme pancreata share with other mammals the feature of distinct PP-poor and PP-rich islets. A higher number of islets and α- or β-cell-only islets are shared between monotremes and birds. The islets of monotremes were larger than those of birds but smaller compared with therian mammals. This may indicate a trend during mammalian evolution towards fewer, larger islets comprising several endocrine cell types. © 2015 Anatomical Society.
Thermodynamic analysis questions claims of improved cardiac efficiency by dietary fish oil
- Authors: Loiselle, Denis , Han, June-Chiew , Goo, Eden , Chapman, Brian , Barclay, Christopher , Hickey, Anthony , Taberner, Andrew
- Date: 2016
- Type: Text , Journal article
- Relation: Journal of General Physiology Vol. 148, no. 3 (2016), p. 183-193
- Full Text:
- Reviewed:
- Description: Studies in the literature describe the ability of dietary supplementation by omega-3 fish oil to increase the pumping efficiency of the left ventricle. Here we attempt to reconcile such studies with our own null results. We undertake a quantitative analysis of the improvement that could be expected theoretically, subject to physiological constraints, by posing the following question: By how much could efficiency be expected to increase if inefficiencies could be eliminated? Our approach utilizes thermodynamic analyses to investigate the contributions, both singly and collectively, of the major components of cardiac energetics to total cardiac efficiency. We conclude that it is unlikely that fish oils could achieve the required diminution of inefficiencies without greatly compromising cardiac performance.
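The bounding question posed in this abstract — by how much could efficiency rise if an inefficiency were eliminated — can be illustrated with a one-line energy-balance sketch. The function and the numbers below are hypothetical for illustration, not taken from the paper.

```python
def efficiency_if_loss_removed(eta: float, loss_fraction: float) -> float:
    """Upper bound on total efficiency if a process dissipating
    `loss_fraction` of the total energy input were eliminated:
    the same external work would then cost (1 - loss_fraction)
    of the original energy input."""
    return eta / (1 - loss_fraction)

# Hypothetical: 15% baseline efficiency; a loss consuming 30% of the input
print(round(efficiency_if_loss_removed(0.15, 0.30), 3))
```

The point of such a bound is that even complete elimination of a sizeable loss yields only a modest absolute gain in total efficiency, which is the kind of constraint the paper's thermodynamic analysis exploits.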
Psychometric evaluation of the revised Michigan Diabetes Knowledge Test (V.2016) in Arabic: Translation and validation
- Authors: Alhaiti, Ali , Alotaibi, Alanod , Jones, Linda , Dacosta, Cliff , Lenon, George
- Date: 2016
- Type: Text , Journal article
- Relation: Journal of Diabetes Research Vol. 2016 (2016), p. 1-7
- Full Text:
- Reviewed:
- Description: Objective. To translate the revised Michigan Diabetes Knowledge Test into the Arabic language and examine its psychometric properties. Setting. Of the 139 participants recruited through King Fahad Medical City in Riyadh, Saudi Arabia, 34 agreed to a second round for retesting purposes. Methods. The translation process followed the World Health Organization's guidelines for the translation and adaptation of instruments. All translations were examined for their validity and reliability. Results. The translation process yielded excellent results throughout all stages. The Arabic version achieved an internal consistency of 0.75 (Cronbach's alpha) and excellent test-retest reliability, with a mean intraclass correlation coefficient of 0.90. It also received positive content validity index scores: the item-level content validity index for all instrument scales fell between 0.83 and 1, with a mean scale-level index of 0.96. Conclusion. The Arabic version proved to be a reliable and valid measure of patients' knowledge and is ready for use in clinical practice. © 2016 Ali Hassan Alhaiti et al.
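The content validity indices reported in this abstract follow a standard construction: the item-level CVI is the proportion of experts rating an item relevant (3 or 4 on a 4-point scale), and the scale-level index (averaging method) is the mean of the item-level values. A minimal sketch, with a hypothetical expert panel:

```python
def item_cvi(ratings):
    """Item-level content validity index: proportion of experts
    rating the item 3 or 4 on a 4-point relevance scale."""
    return sum(r >= 3 for r in ratings) / len(ratings)

def scale_cvi(all_item_ratings):
    """Scale-level CVI (averaging method): mean of the item-level CVIs."""
    icvis = [item_cvi(r) for r in all_item_ratings]
    return sum(icvis) / len(icvis)

# Two hypothetical items, each rated by a panel of six experts
panel = [[4, 4, 3, 4, 2, 4],
         [3, 4, 4, 4, 4, 4]]
print([round(item_cvi(r), 2) for r in panel], round(scale_cvi(panel), 2))
```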
Facilitators to support the implementation of injury prevention training in youth handball: A concept mapping approach
- Authors: Ageberg, Eva , Bunke, Sofia , Lucander, Karolina , Nilsen, Per , Donaldson, Alex
- Date: 2019
- Type: Text , Journal article
- Relation: Scandinavian Journal of Medicine and Science in Sports Vol. 29, no. 2 (2019), p. 275-285
- Full Text:
- Reviewed:
- Description: There is a need for research to identify effective implementation strategies for injury prevention training within real-world community sports. The aim of this ecological participatory study was to identify facilitators, among stakeholders at multiple levels, that could help injury prevention training become part of regular training routines in youth team handball. Concept mapping, a mixed-method approach for qualitative data collection and quantitative data analysis, was used. Stakeholders (n = 196) of two community team handball clubs (29% players, 13% coaches, 38% caregivers, 11% club, district and national handball administrators, 9% unknown) participated in a brainstorming process. After the research team synthesized the 235 generated statements, 50 stakeholders (34% players, 22% coaches, 24% caregivers, 20% administrators) sorted 89 unique facilitator statements into clusters and rated them for importance and feasibility. Multidimensional scaling and hierarchical cluster analysis yielded five clusters (stress value 0.231): “Understanding and applying knowledge,” “Education, knowledge, and consistency,” “Set-up and exercises,” “Inspiration, motivation, and routines,” and “Club policy and expert collaboration.” The cluster “Understanding and applying knowledge” had the highest mean importance (3.17 out of 4) and feasibility (2.93) ratings. The 32 statements rated as both highly important and feasible (Go-zone) indicate action is required at the individual (end-users) and organizational (policymakers) levels to implement injury prevention training. Results suggest that developing evidence-based context-specific injury prevention training, incorporating physiological, biomechanical and psychological components, and an associated context-specific implementation plan in partnership with all stakeholders should be a high priority to facilitate the implementation of injury prevention training in youth team handball.
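The sorting step of the concept-mapping procedure described above produces a pairwise dissimilarity matrix — the fraction of sorters who placed each pair of statements in different piles — which then feeds the multidimensional scaling (yielding the reported stress value) and hierarchical cluster analysis. A minimal sketch of that matrix construction, with hypothetical sorter data:

```python
from itertools import combinations

def sort_dissimilarity(sorts, n_statements):
    """Pairwise dissimilarity for concept mapping: the fraction of
    sorters who placed statements i and j in *different* piles.
    Each sort maps statement id -> pile label (labels are arbitrary
    and need not agree across sorters)."""
    return {
        (i, j): sum(s[i] != s[j] for s in sorts) / len(sorts)
        for i, j in combinations(range(n_statements), 2)
    }

# Three hypothetical sorters grouping four statements
sorts = [{0: "A", 1: "A", 2: "B", 3: "B"},
         {0: "A", 1: "A", 2: "A", 3: "B"},
         {0: "X", 1: "Y", 2: "Y", 3: "Y"}]
d = sort_dissimilarity(sorts, 4)
print(d[(0, 1)], d[(0, 3)])
```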
The reliability and sensitivity of performance measures in a novel pace-bowling test
- Authors: Feros, Simon , Young, Warren , O’Brien, Brendan
- Date: 2018
- Type: Text , Journal article
- Relation: International Journal of Sports Physiology and Performance Vol. 13, no. 2 (2018), p. 151-155
- Full Text:
- Reviewed:
- Description: Objectives: To evaluate the reliability and sensitivity of performance measures in a novel pace-bowling test. Methods: Thirteen male amateur-club fast bowlers completed a novel pace-bowling test on 2 separate occasions, 4–7 d apart. Participants delivered 48 balls (8 overs) at 5 targets on a suspended sheet situated behind a live batter, who stood in a right-handed and left-handed stance for an equal number of deliveries. Delivery instruction was frequently changed, with all deliveries executed in a preplanned sequence. Data on ball-release speed were captured by radar gun. A high-speed camera captured the moment of ball impact on the target sheet for assessment of radial error and bivariate variable error. Delivery rating of perceived exertion (0–100%) was collected as a measure of intensity. Results: Intraclass correlation coefficients and coefficients of variation revealed excellent reliability for peak and mean ball-release speed, acceptable reliability for delivery rating of perceived exertion, and poor reliability for mean radial error, bivariate variable error, and variability of ball-release speed. The smallest worthwhile change indicated high sensitivity with peak and mean ball-release speed and lower sensitivity with mean radial error and bivariate variable error. Conclusions: The novel pace-bowling test incorporates improvements in ecological validity compared with its predecessors and can be used to provide a more comprehensive evaluation of pace-bowling performance. Data on the smallest worthwhile change can improve interpretation of pace-bowling research findings and may therefore influence recommendations for applied practice. © 2018 Human Kinetics, Inc.
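The smallest worthwhile change used to judge sensitivity above is conventionally taken as a fraction (often 0.2) of the between-subject standard deviation. A minimal sketch under that assumption; the speeds below are hypothetical, not data from the study:

```python
from statistics import stdev

def smallest_worthwhile_change(scores, factor=0.2):
    """Smallest worthwhile change: a fraction (default 0.2)
    of the between-subject standard deviation."""
    return factor * stdev(scores)

# Hypothetical mean ball-release speeds (km/h) for six bowlers
speeds = [118.4, 121.0, 115.2, 124.3, 119.8, 122.6]
print(round(smallest_worthwhile_change(speeds), 2))
```

A measure is considered sensitive when its typical error is smaller than this threshold, which is why peak and mean ball-release speed rated well and the error measures did not.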
Telomere dynamics during aging in polygenic left ventricular hypertrophy
- Authors: Marques, Francine , Booth, Scott , Prestes, Priscilla , Curl, Claire , Delbridge, Lea , Lewandowski, Paul , Harrap, Stephen , Charchar, Fadi
- Date: 2016
- Type: Text , Journal article
- Relation: Physiological Genomics Vol. 48, no. 1 (2016), p. 42-49
- Full Text:
- Reviewed:
- Description: Short telomeres are associated with increased risk of cardiovascular disease. Here we studied cardiomyocyte telomere length at key ages during the ontogeny of cardiac hypertrophy and failure in the hypertrophic heart rat (HHR) and compared these with the normal heart rat (NHR) control strain. Key ages corresponded with the pathophysiological sequence beginning with fewer cardiomyocytes (2 days), leading to left ventricular hypertrophy (LVH) (13 wk) and subsequently progression to heart failure (38 wk). We measured telomere length, tissue activity of telomerase, mRNA levels of telomerase reverse transcriptase (Tert) and telomerase RNA component (Terc), and expression of the telomeric regulator microRNA miR-34a. Cardiac telomere length was longer in the HHR compared with the control strain at 2 days and 38 wk, but shorter at 13 wk. Neonatal HHR had higher cardiac telomerase activity and expression of Tert and miR-34a. Telomerase activity was not different at 13 or 38 wk. Tert mRNA and Terc RNA were overexpressed at 38 wk, while miR-34a was overexpressed at 13 wk but downregulated at 38 wk. Circulating leukocytes were strongly correlated with cardiac telomere length in the HHR only. The longer neonatal telomeres in HHR are likely to reflect fewer fetal and early postnatal cardiomyocyte cell divisions and explain the reduced total cardiomyocyte complement that predisposes to later hypertrophy and failure. Although shorter telomeres were a feature of cardiac hypertrophy at 13 wk, they were not present at the progression to heart failure at 38 wk. © 2016 the American Physiological Society.
Involvement of human monogenic cardiomyopathy genes in experimental polygenic cardiac hypertrophy
- Authors: Prestes, Priscilla , Marques, Francine , Lopez-Campos, Guillermo , Lewandowski, Paul , Delbridge, Lea , Charchar, Fadi , Harrap, Stephen
- Date: 2018
- Type: Text , Journal article
- Relation: Physiological Genomics Vol. 50, no. 9 (2018), p. 680-687
- Full Text:
- Reviewed:
- Description: Hypertrophic cardiomyopathy thickens the heart muscle, reducing function and increasing the risk of cardiac disease and morbidity. Genetic factors are involved, but their contribution is poorly understood. We used the hypertrophic heart rat (HHR), a unique normotensive polygenic model of cardiac hypertrophy and heart failure, to investigate the role of genes associated with monogenic human cardiomyopathy. We selected 42 genes involved in monogenic human cardiomyopathies to study: 1) DNA variants, by sequencing the whole genome of 13-wk-old HHR and age-matched normal heart rat (NHR), its genetic control strain; 2) mRNA expression, by targeted RNA-sequencing in left ventricles of HHR and NHR at 5 ages (2 days old and 4, 13, 33, and 50 wk old) compared with human idiopathic dilated cardiomyopathy data; and 3) microRNA expression, with rat microRNA microarrays in left ventricles of 2-day-old HHR and age-matched NHR. We also investigated experimentally validated microRNA-mRNA interactions. Whole-genome sequencing revealed unique variants mostly located in noncoding regions of HHR and NHR. We found 29 genes differentially expressed at one or more ages. Genes encoding desmoglein 2 (Dsg2) and transthyretin (Ttr) were significantly differentially expressed at all ages in the HHR, but only Ttr was also differentially expressed in human idiopathic cardiomyopathy. Lastly, only two microRNAs differentially expressed in the HHR were present in our comparison of validated microRNA-mRNA interactions. These two microRNAs interact with five of the genes studied. Our study shows that genes involved in monogenic forms of human cardiomyopathies may also influence polygenic forms of the disease.
Long-term aerobic exercise improves vascular function into old age : A systematic review, meta-analysis and meta regression of observational and interventional studies
- Campbell, Amy, Grace, Fergal, Ritchie, Louise, Beaumont, Alexander, Sculthorpe, Nicholas
- Authors: Campbell, Amy , Grace, Fergal , Ritchie, Louise , Beaumont, Alexander , Sculthorpe, Nicholas
- Date: 2019
- Type: Text , Journal article , Review
- Relation: Frontiers in Physiology Vol. 10, no. FEB (2019), p. 1-16
- Full Text:
- Reviewed:
- Description: There is an emerging body of literature relating to the effectiveness of frequent aerobic exercise as a prophylactic for age-associated dysfunction of large arteries, yet a systematic evaluation and precise estimate of this effect is lacking. We conducted a systematic review and meta-analysis of controlled studies examining flow-mediated dilatation (FMD) of athletic older persons and otherwise healthy sedentary counterparts to (i) compare FMD as a determinant of endothelial function between athletes and sedentary individuals and (ii) summarize the effect of exercise training on FMD in studies of sedentary aging persons. Studies were identified from a systematic search of major electronic databases from inception to January 2018. Study quality was assessed before conducting a random-effects meta-analysis to calculate a pooled ES (mean difference) with 95% CIs. Thirteen studies [4 interventional (n = 125); 10 cross-sectional, including one study from the interventional analysis (n = 485)] with age ranges from 62 to 75 years underwent quantitative pooling of data. The majority of study participants were male. Older athletes had more favorable FMD compared with sedentary controls (2.1%; CI: 1.4, 2.8%; P < 0.001). There was no significant improvement in the vascular function of sedentary cohorts following a period of exercise training (0.7%; CI: −0.675, 2.09%; P = 0.316). However, there was a significant increase in baseline diameter from pre to post intervention (0.1 mm; CI: 0.07, 0.13 mm; P < 0.001). In addition, there was no significant difference in endothelium-independent vasodilation between the trained and sedentary older adults (1.57%; CI: −0.13, 3.27%; P = 0.07), or from pre to post exercise intervention (1.48%; CI: −1.34, 4.3%; P = 0.3). In conclusion, long-term aerobic exercise appears to attenuate the decline in endothelial vascular function, a benefit which is maintained during chronological aging. However, there is currently not enough evidence to suggest that exercise interventions improve vascular function in previously sedentary healthy older adults.
How much is enough in rehabilitation? High running workloads following lower limb muscle injury delay return to play but protect against subsequent injury
- Stares, Jordan, Dawson, Brian, Peeling, Peter, Drew, Michael, Heasman, Jarryd, Rogalski, Brent, Colby, Marcus
- Authors: Stares, Jordan , Dawson, Brian , Peeling, Peter , Drew, Michael , Heasman, Jarryd , Rogalski, Brent , Colby, Marcus
- Date: 2018
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 21, no. 10 (2018), p. 1019-1024
- Full Text:
- Reviewed:
- Description: Objectives: Examine the influence of rehabilitation training loads on return to play (RTP) time and subsequent injury in elite Australian footballers. Design: Prospective cohort study. Methods: Internal (sessional rating of perceived exertion: sRPE) and external (distance, sprint distance) workload and lower limb non-contact muscle injury data were collected from 58 players over 5 seasons. Rehabilitation periods were analysed for running workloads, and time spent in 3 rehabilitation stages (1: off-legs training, 2: non-football running, 3: group football training) was calculated. Multi-level survival analyses with random effects accounting for player and season were performed. Hazard ratios (HR) and 95% confidence intervals (CI) for each variable were produced for RTP time and time to subsequent injury. Results: Of 85 lower limb muscle injuries, 70 were rehabilitated to RTP, with 30 cases of subsequent injury recorded (recurrence rate = 11.8%, new site injury rate = 31.4%). Completion of high rehabilitation workloads delayed RTP (distance: >49,775 m [reference: 34,613–49,775 m]: HR 0.12, 95%CI 0.04–0.36, sRPE: >1266 AU [reference: 852–1266 AU]: HR 0.09, 95%CI 0.03–0.32). Return to running within 4 days increased subsequent injury risk (3–4 days [reference: 5–6 days]: HR 25.88, 95%CI 2.06–324.4). Attaining moderate-high sprint distance (427–710 m) was protective against subsequent injury (154–426 m: [reference: 427–710 m]: HR 37.41, 95%CI 2.70–518.64). Conclusions: Training load monitoring can inform player rehabilitation programs. Higher rehabilitation training loads delayed RTP; however, moderate-high sprint running loads can protect against subsequent injury. Shared decision-making regarding RTP should include accumulated training loads and consider the trade-off between expedited RTP and lower subsequent injury risk.
Implementation of concussion guidelines in community Australian Football and Rugby League - The experiences and challenges faced by coaches and sports trainers
- Kemp, Joanne, Newton, Joshua, White, Peta, Finch, Caroline
- Authors: Kemp, Joanne , Newton, Joshua , White, Peta , Finch, Caroline
- Date: 2015
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 19, no. 4 (2015), p. 305-310
- Relation: http://purl.org/au-research/grants/nhmrc/1058737
- Full Text:
- Reviewed:
- Description: Objectives: While guidelines outlining the appropriate management of sport-related concussion have been developed and adapted for use within community sport, it remains unknown how they are experienced by those responsible for implementing them. Design: Longitudinal study. Methods: 111 coaches and sports trainers from community-level Australian Football and Rugby League teams completed pre- and post-season surveys assessing their attitudes towards using concussion guidelines. Participants also provided post-season feedback regarding their experiences in using the guidelines. Results: 71% of participants reported using the guidelines in the preceding season. Post-season attitude was related to pre-season attitude (p = 0.002), football code (p = 0.015), and team role (p = 0.045). An interaction between team role and guideline use (p = 0.012) was also found, with coaches who had used the guidelines, and sports trainers who had not, reporting more positive post-season attitudes towards using the concussion guidelines. Implementation challenges included the disputing of return-to-play decisions by players, parents, and coaches, and a perceived lack of time. Recommendations for improved guideline materials included using larger fonts and providing for witnessing of advice given to players. Conclusions: This is the first study to examine the implementation of concussion guidelines in community sport. Training of coaches/sports trainers needs enhancement. In addition, new education should be developed for parents/players about the importance of the return-to-play advice given to them by those who follow these guidelines. Information provided by those who attempted to use the guidelines will assist the refinement of implementation and dissemination processes around concussion guidelines across sports. © 2015 Sports Medicine Australia
Inflammation and Oral Contraceptive Use in Female Athletes Before the Rio Olympic Games
- Larsen, Brianna, Cox, Amanda, Colbey, Candice, Drew, Michael, McGuire, Helen, Fazekas de St Groth, Barbara, Hughes, David, Vlahovich, Nicole, Waddington, Gordon, Burke, Louise, Lundy, Bronwen, West, Nicholas, Minahan, Clare
- Authors: Larsen, Brianna , Cox, Amanda , Colbey, Candice , Drew, Michael , McGuire, Helen , Fazekas de St Groth, Barbara , Hughes, David , Vlahovich, Nicole , Waddington, Gordon , Burke, Louise , Lundy, Bronwen , West, Nicholas , Minahan, Clare
- Date: 2020
- Type: Text , Journal article
- Relation: Frontiers in Physiology Vol. 11, no. (2020), p.
- Full Text:
- Reviewed:
- Description: This study investigated the association between synthetic ovarian hormone use [i.e., the oral contraceptive (OC) pill] and basal C-reactive protein (CRP), peripheral blood immune cell subsets, and circulating pro- and anti-inflammatory cytokine concentrations in elite female athletes. Elite female athletes (n = 53) selected in Rio Summer Olympic squads participated in this study; 25 were taking an OC (AthletesOC) and 28 were naturally hormonally cycling (AthletesNC). Venous blood samples were collected at rest for the determination of sex hormones, cortisol, CRP, peripheral blood mononuclear memory and naïve CD4+ T-cells, CD8+ T-cells and natural killer cells, as well as pro- and anti-inflammatory cytokine concentrations. C-reactive protein concentrations were elevated (p < 0.001) in AthletesOC (median = 2.02, IQR = 3.15) compared to AthletesNC (median = 0.57, IQR = 1.07). No differences were reported for cortisol, cytokines, or PBMC immune cell subsets, although there was a trend (p = 0.062) for higher IL-6 concentrations in AthletesNC. Female Olympians had substantially higher CRP concentrations, a marker of inflammation and tissue damage, before the Rio Olympic Games if they used an OC. Future research should examine the potential consequences for athlete performance/recovery so that, if necessary, practitioners can implement prevention programs. © Copyright © 2020 Larsen, Cox, Colbey, Drew, McGuire, Fazekas de St Groth, Hughes, Vlahovich, Waddington, Burke, Lundy, West and Minahan.
The efficacy of an iterative “sequence of prevention” approach to injury prevention by a multidisciplinary team in professional rugby union
- Tee, Jason, Bekker, Sheree, Collins, Rob, Klingbiel, Jannie, van Rooyen, Ivan, van Wyk, David, Till, Kevin, Jones, Ben
- Authors: Tee, Jason , Bekker, Sheree , Collins, Rob , Klingbiel, Jannie , van Rooyen, Ivan , van Wyk, David , Till, Kevin , Jones, Ben
- Date: 2018
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 21, no. 9 (2018), p. 899-904
- Full Text:
- Reviewed:
- Description: Objectives: Due to the complex systems nature of injuries, the responsibility for injury risk management cannot lie solely within a single domain of professional practice. Interdisciplinary collaboration between technical/tactical coaches, strength and conditioning coaches, team doctors, physical therapists and sport scientists is likely to have a meaningful impact on injury risk. This study describes the application and efficacy of a multidisciplinary approach to reducing team injury risk in professional rugby union. Design: Observational longitudinal cohort study. Methods: Epidemiological injury data was collected from a professional rugby union team for 5 consecutive seasons. Following each season, these data informed multidisciplinary intervention strategies to reduce injury risk. The effectiveness of these strategies was iteratively assessed to inform future interventions. Specific examples of intervention strategies are provided. Results: Overall team injury burden displayed a likely beneficial decrease (−8%; injury rate ratio (IRR) 0.9, 95%CI 0.9–1.0) from 2012 to 2016. This was achieved through a most likely beneficial improvement in non-contact injury burden (−39%; IRR 0.6, 95%CI 0.6–0.7). Contact injury burden was increased, but to a lesser extent (+18%; IRR 1.2, 95%CI 1.1–1.3, most likely harmful) during the same period. Conclusions: The range of skills required to effectively manage complex injury phenomena in professional collision sport crosses disciplinary boundaries. The evidence presented here points to the effectiveness of a multidisciplinary approach to reducing injury risk. This model will likely be applicable across a range of team and individual sports.
Multivariate modelling of subjective and objective monitoring data improve the detection of non-contact injury risk in elite Australian footballers
- Colby, Marcus, Dawson, Brian, Peeling, Peter, Heasman, Jarryd, Rogalski, Brent, Drew, Michael, Stares, Jordan, Zouhal, Hassane, Lester, Leanne
- Authors: Colby, Marcus , Dawson, Brian , Peeling, Peter , Heasman, Jarryd , Rogalski, Brent , Drew, Michael , Stares, Jordan , Zouhal, Hassane , Lester, Leanne
- Date: 2017
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 20, no. 12 (2017), p. 1068-1074
- Full Text:
- Reviewed:
- Description: Objectives: To assess the association between workload, subjective wellness, musculoskeletal screening measures and non-contact injury risk in elite Australian footballers. Design: Prospective cohort study. Methods: Across 4 seasons in 70 players from one club, cumulative weekly workloads (acute; 1 week, chronic; 2-, 3-, 4-week) and acute:chronic workload ratios (ACWR: 1-week load/average 4-weekly load) for session-Rating of Perceived Exertion (sRPE) and GPS-derived distance and sprint distance were calculated. Wellness, screening and non-contact injury data were also documented. Univariate and multivariate regression models determined injury incidence rate ratios (IRR) while accounting for interaction/moderating effects. Receiver operating characteristics determined model predictive accuracy (area under curve: AUC). Results: Very low cumulative chronic (2-, 3-, 4-week) workloads were associated with the greatest injury risk (univariate IRR = 1.71–2.16, 95% CI = 1.10–4.52) in the subsequent week. In multivariate analysis, the interaction between a low chronic load and a very high distance (adj-IRR = 2.60, 95% CI = 1.07–6.34) or low sRPE ACWR (adj-IRR = 2.52, 95% CI = 1.01–6.29) was associated with increased injury risk. Subjectively reporting “yes” (vs. “no”) for old lower limb pain and heavy non-football activity in the previous 7 days (multivariate adj-IRR = 2.01–2.25, 95% CI = 1.02–4.95) and playing experience (>9 years) (multivariate adj-IRR = 2.05, 95% CI = 1.03–4.06) was also associated with increased injury risk, but screening data were not. Predictive capacity of multivariate models was significantly better than univariate (AUCmultivariate = 0.70, 95% CI 0.64–0.75; AUCunivariate range = 0.51–0.60). Conclusions: Chronic load is an important moderating factor in the workload–injury relationship. Low chronic loads coupled with low or very high ACWR are associated with increased injury risk.
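The ACWR defined in the abstract (1-week load divided by the average 4-weekly load) is a simple ratio; a minimal sketch of that calculation, with illustrative sRPE values that are not from the study:

```python
# Illustrative sketch of the ACWR as defined in the abstract above:
# ACWR = acute load (most recent week) / chronic load (rolling 4-week average).
# The example loads are hypothetical, not data from the study.

def acwr(weekly_loads):
    """weekly_loads: at least 4 weekly load values (e.g. sRPE in AU), most recent last."""
    if len(weekly_loads) < 4:
        raise ValueError("need at least 4 weeks of load data")
    acute = weekly_loads[-1]                # most recent week
    chronic = sum(weekly_loads[-4:]) / 4.0  # rolling 4-week average
    return acute / chronic

# Hypothetical weekly sRPE loads (AU) for the last four weeks
print(round(acwr([1000, 1100, 900, 1200]), 2))  # → 1.14
```

A ratio above 1 indicates the most recent week's load exceeded the recent 4-week average, the situation the "very high ACWR" risk bands in the abstract describe.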
Seasonal time-loss match injury rates and burden in South African under-16 rugby teams
- Sewry, Nicola, Verhagen, Evert, Lambert, Mike, van Mechelen, Willem, Readhead, Clint, Viljoen, Wayne, Brown, James
- Authors: Sewry, Nicola , Verhagen, Evert , Lambert, Mike , van Mechelen, Willem , Readhead, Clint , Viljoen, Wayne , Brown, James
- Date: 2019
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 22, no. 1 (2019), p. 54-58
- Full Text:
- Reviewed:
- Description: Objectives: Youth rugby union is a popular sport with a high injury incidence density (IID) and burden. This high risk has called for further research into the factors affecting injuries in youth rugby. The aim of the study was to analyse time-loss IID and burden in multiple schoolboy rugby teams over a season and the potential factors associated with injury. Design: Prospective cohort. Methods: All time-loss injuries were recorded from three schools for the whole season. Overall IID and injury burden were calculated, as well as by injury event, type, location and the match quarter in which injuries occurred, and Poisson regression analyses were performed to determine differences. Results: IID was 28.8 (18.9–38.6) injuries per 1000 player hours over the season, with an injury burden of 379.2 (343.6–414.9) days lost per 1000 player hours. The ball-carrier had a significantly higher IID (11.3 (5.2–17.5) per 1000 player hours) compared to other events, and joint (non-bone)/ligament injuries were the most common (IID of 12.2 (5.8–18.6) per 1000 player hours) and severe type of injury (burden of 172.6 (148.5–196.6) days lost per 1000 player hours). Conclusions: The IID was similar to previous youth rugby studies; however, the injury burden was much lower. The South African youth cohort showed similar factors associated with injury for inciting event (the tackle), injury type (joint (non-bone)/ligament) and location (lower limb) as seen in other studies in both youth and senior players.
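The per-1000-player-hour rates quoted in the abstract follow the standard exposure-normalised formulas used in injury epidemiology; a minimal sketch, with injury counts and exposure hours that are hypothetical (chosen only to reproduce rates of the same form, not the study's raw data):

```python
# Illustrative sketch of exposure-normalised injury rates:
# incidence density = injuries per 1000 player-hours;
# burden = days lost per 1000 player-hours.
# The counts and exposure below are hypothetical, not the study's data.

def incidence_density(n_injuries, exposure_hours):
    """Time-loss injuries per 1000 player-hours of match exposure."""
    return 1000.0 * n_injuries / exposure_hours

def injury_burden(days_lost, exposure_hours):
    """Days lost to injury per 1000 player-hours of match exposure."""
    return 1000.0 * days_lost / exposure_hours

# e.g. 36 injuries and 474 days lost over 1250 player-hours
print(incidence_density(36, 1250))  # → 28.8
print(injury_burden(474, 1250))     # → 379.2
```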
PCA based population generation for genetic network optimization
- Youseph, Ahammed, Chetty, Madhu, Karmakar, Gour
- Authors: Youseph, Ahammed , Chetty, Madhu , Karmakar, Gour
- Date: 2018
- Type: Text , Journal article
- Relation: Cognitive Neurodynamics Vol. 12, no. 4 (2018), p. 417-429
- Full Text:
- Reviewed:
- Description: A gene regulatory network (GRN) represents a set of genes and its regulatory interactions. The inference of the regulatory interactions between genes is usually carried out using an appropriate mathematical model and the available gene expression profile. Among the various models proposed for GRN inference, our recently proposed Michaelis–Menten based ODE model provides a good trade-off between the computational complexity and biological relevance. This model, like other known GRN models, also uses an evolutionary algorithm for parameter estimation. Considering various issues associated with such population based stochastic optimization approaches (e.g. diversity, premature convergence due to local optima, accuracy, etc.), it becomes important to seed the initial population with good individuals which are closer to the optimal solution. In this paper, we exploit the inherent strength of principal component analysis (PCA) in a novel manner to initialize the population for GRN optimization. The benefit of the proposed method is validated by reconstructing in silico and in vivo networks of various sizes. For the same level of accuracy, the approach with PCA based initialization shows improved convergence speed.
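The seeding idea described in this abstract can be sketched generically: compute the leading principal components of the expression matrix, then scatter the initial candidate vectors around that low-dimensional structure instead of sampling uniformly. A minimal, hypothetical sketch (not the authors' implementation; dimensions, noise level and names are illustrative):

```python
import numpy as np

def pca_seed_population(expression, pop_size, n_components=2, noise=0.1, seed=0):
    """Seed an initial population for an evolutionary optimiser by sampling
    random mixtures of the leading principal components of the data.

    expression: (samples, genes) array; individuals are gene-space vectors.
    """
    rng = np.random.default_rng(seed)
    X = expression - expression.mean(axis=0)          # centre each gene
    _, _, Vt = np.linalg.svd(X, full_matrices=False)  # rows of Vt = components
    components = Vt[:n_components]                    # (n_components, genes)
    # Random weights over the leading components, plus small Gaussian noise
    weights = rng.standard_normal((pop_size, n_components))
    population = weights @ components + noise * rng.standard_normal(
        (pop_size, expression.shape[1]))
    return population

data = np.random.default_rng(1).standard_normal((20, 5))  # toy expression matrix
pop = pca_seed_population(data, pop_size=30)
print(pop.shape)  # (30, 5)
```

Because each individual starts near the dominant directions of variation in the data, the optimiser begins closer to plausible solutions, which is the convergence-speed benefit the abstract reports.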
The impact of strength level on adaptations to combined weightlifting, plyometric, and ballistic training
- James, Lachlan, Haff, Gregory, Vincent, Kelly, Connick, Mark, Hoffman, Ben, Beckman, Emma
- Authors: James, Lachlan , Haff, Gregory , Vincent, Kelly , Connick, Mark , Hoffman, Ben , Beckman, Emma
- Date: 2018
- Type: Text , Journal article
- Relation: Scandinavian Journal of Medicine & Science in Sports Vol. 28, no. 5 (2018), p. 1494-1505
- Full Text:
- Reviewed:
- Description: The purpose of this investigation was to determine whether the magnitude of adaptation to integrated ballistic training is influenced by initial strength level. Such information is needed to inform resistance training guidelines for both higher-and lower-level athlete populations. To this end, two groups of distinctly different strength levels (stronger: one-repetition-maximum (1RM) squat = 2.01 ± 0.15 kg·BM
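The group definition above rests on relative strength, i.e. squat 1RM normalised to body mass. A trivial sketch with hypothetical athlete data (the study's actual cutoff is not reproduced here):

```python
def relative_strength(one_rm_kg, body_mass_kg):
    """Squat 1RM expressed relative to body mass (kg per kg of body mass)."""
    return one_rm_kg / body_mass_kg

# Hypothetical athletes: (1RM squat in kg, body mass in kg)
athletes = [(170, 85), (120, 80), (150, 70), (100, 90)]
ratios = [relative_strength(r, m) for r, m in athletes]
# Illustrative threshold only; the study's grouping criterion may differ
stronger = [a for a, ratio in zip(athletes, ratios) if ratio >= 1.9]
print(stronger)
```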
Association between preseason training and performance in elite Australian football
- McCaskie, Callum, Young, Warren, Fahrner, Brendan, Sim, Marc
- Authors: McCaskie, Callum , Young, Warren , Fahrner, Brendan , Sim, Marc
- Date: 2019
- Type: Text , Journal article
- Relation: International Journal of Sports Physiology and Performance Vol. 14, no. 1 (2019), p. 68-75
- Full Text:
- Reviewed:
- Description: Purpose: To examine the association between preseason training variables and subsequent in-season performance in an elite Australian football team. Methods: Data from 41 elite male Australian footballers (mean [SD] age = 23.4 [3.1] y, height =188.4 [7.1] cm, and mass = 86.7 [7.9] kg) were collected from 1 Australian Football League (AFL) club. Preseason training data (external load, internal load, fitness testing, and session participation) were collected across the 17-wk preseason phase (6 and 11 wk post-Christmas). Champion Data© Player Rank (CDPR), coaches’ ratings, and round 1 selection were used as in-season performance measures. CDPR and coaches’ ratings were examined over the entire season, first half of the season, and the first 4 games. Both Pearson and partial (controlling for AFL age) correlations were calculated to assess if any associations existed between preseason training variables and in-season performance measures. A median split was also employed to differentiate between higher- and lower-performing players for each performance measure. Results: Preseason training activities appeared to have almost no association with performance measured across the entire season and the first half of the season. However, many preseason training variables were significantly linked with performance measured across the first 4 games. Preseason training variables that were measured post-Christmas were the most strongly associated with in-season performance measures. Specifically, total on-field session rating of perceived exertion post-Christmas, a measurement of internal load, displayed the greatest association with performance. Conclusion: Late preseason training (especially on-field match-specific training) is associated with better performance in the early season.
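The analysis described above combines Pearson correlations, partial correlations controlling for AFL age, and a median split into higher- and lower-performing players. A minimal pure-Python sketch of those three steps with hypothetical numbers (illustrative only; a partial correlation is here computed as the correlation of the residuals after regressing each variable on the covariate):

```python
import statistics as st

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = st.mean(x), st.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def residuals(y, z):
    """Residuals of a simple least-squares regression of y on z."""
    mz, my = st.mean(z), st.mean(y)
    beta = sum((a - mz) * (b - my) for a, b in zip(z, y)) / sum((a - mz) ** 2 for a in z)
    return [b - (my + beta * (a - mz)) for a, b in zip(z, y)]

def partial_corr(x, y, z):
    """Correlation between x and y with the linear effect of z removed."""
    return pearson(residuals(x, z), residuals(y, z))

def median_split(values):
    """Labels: True = at or above the median (the 'higher' group)."""
    med = st.median(values)
    return [v >= med for v in values]

# Hypothetical: in-season rating vs preseason load, controlling for AFL age
load = [310, 280, 355, 300, 265]
rating = [72, 60, 80, 69, 55]
age = [3, 1, 6, 4, 2]
print(partial_corr(load, rating, age))
print(median_split(rating))
```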
Do neurocognitive SCAT3 baseline test scores differ between footballers (soccer) living with and without disability? A cross-sectional study
- Weiler, Richard, van Mechelen, Willem, Fuller, Colin, Ahmed, Osman, Verhagen, Evert
- Authors: Weiler, Richard , van Mechelen, Willem , Fuller, Colin , Ahmed, Osman , Verhagen, Evert
- Date: 2018
- Type: Text , Journal article
- Relation: Clinical Journal of Sport Medicine Vol. 28, no. 1 (2018), p. 43-50
- Full Text:
- Reviewed:
- Description: Objective: To determine whether baseline Sport Concussion Assessment Tool, third edition (SCAT3) scores differ between athletes with and without disability. Design: Cross-sectional comparison of preseason baseline SCAT3 scores for a range of England international footballers. Setting: Team doctors and physiotherapists supporting England football teams recorded players' SCAT3 baseline tests from August 1, 2013 to July 31, 2014. Participants: A convenience sample of 249 England footballers, of whom 185 were players without disability (male: 119; female: 66) and 64 were players with disability (male learning disability: 17; male cerebral palsy: 28; male blind: 10; female deaf: 9). Assessment and outcome measures: Between-group comparisons of median SCAT3 total and section scores were made using the nonparametric Mann–Whitney–Wilcoxon rank-sum test. Main results: All footballers with disability recorded higher symptom severity scores than male players without disability. Male footballers with learning disability showed no significant difference in the total number of symptoms, but recorded significantly lower scores on immediate memory and delayed recall than male players without disability. Male blind footballers scored significantly higher for total concentration and delayed recall, and male footballers with cerebral palsy scored significantly higher on balance testing and immediate memory, compared with male players without disability. Female footballers with deafness scored significantly higher for total concentration and balance testing than female footballers without disability. Conclusions: This study suggests that significant differences exist between SCAT3 baseline section scores for footballers with and without disability. Concussion consensus guidelines should recognize these differences and provide guidance specific to the growing number of athletes living with disability.
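The between-group comparisons in this study rely on the nonparametric rank-sum test. A minimal pure-Python sketch of the Mann-Whitney U statistic, using midranks for tied values (hypothetical section scores; no normal-approximation p-value is computed):

```python
def mann_whitney_u(a, b):
    """Mann-Whitney U statistics for two independent samples,
    assigning midranks (average rank) across tied values."""
    combined = sorted((v, grp) for grp, vals in ((0, a), (1, b)) for v in vals)
    rank_sum_a = 0.0
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j][0] == combined[i][0]:
            j += 1                           # j = index past the tie run
        midrank = (i + 1 + j) / 2.0          # average of ranks i+1 .. j
        for k in range(i, j):
            if combined[k][1] == 0:          # value belongs to group a
                rank_sum_a += midrank
        i = j
    n1, n2 = len(a), len(b)
    u1 = rank_sum_a - n1 * (n1 + 1) / 2.0
    return u1, n1 * n2 - u1                  # U for group a, U for group b

# Hypothetical SCAT3 section scores for two groups
u1, u2 = mann_whitney_u([23, 25, 21, 26], [20, 19, 22, 18])
print(u1, u2)
```

In practice a library routine such as `scipy.stats.mannwhitneyu` would be used, which also supplies the p-value; the sketch above only shows what the statistic measures.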