Inattentional blindness and pattern-matching failure : The case of failure to recognize clinical cues
- Authors: Al-Moteri, Modi , Symmons, Mark , Cooper, Simon J. , Plummer, Virginia
- Date: 2018
- Type: Text , Journal article
- Relation: Applied Ergonomics Vol. 73, no. (2018), p. 174-182
- Full Text: false
- Reviewed:
- Description: Eye-tracking methodology was used to investigate lapses in the appropriate treatment of ward patients due to not noticing critical cues of deterioration. Forty nursing participants with different levels of experience participated in an interactive screen-based simulation of hypovolemic shock. The results show that 65% of the participants exhibited at least one episode of non-fixation on clinically relevant cues that were in plain sight. Thirty-five percent of participants dwelt for sufficient time (>200 ms) on important cues for perception to take place, but no action followed, indicating pattern-matching failure. When participants fail to notice what they should in a patient's status until it is too late, the consequences can be serious. Much work remains to be done, since these human perceptual limitations can affect patient safety in general wards.
Assessment and monitoring practices of Australian fitness professionals
- Authors: Bennie, Jason , Wiesner, Glen , van Uffelen, Jannique , Harvey, Jack , Craike, Melinda , Biddle, Stuart
- Date: 2018
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 21, no. 4 (2018), p. 433-438
- Full Text: false
- Reviewed:
- Description: Objectives: Assessment and monitoring of client health and fitness is a key part of fitness professionals’ practices. However, little is known about the prevalence of this practice. This study describes the assessment/monitoring practices of a large sample of Australian fitness professionals. Design: Cross-sectional. Methods: In 2014, 1206 fitness professionals completed an online survey. Respondents reported their frequency (4-point scale: [1] ‘never’ to [4] ‘always’) of assessment/monitoring of eight health and fitness constructs (e.g. body composition, aerobic fitness). This was classified as: (i) ‘high’ (‘always’ assessing/monitoring ≥5 constructs); (ii) ‘medium’ (1–4 constructs); (iii) ‘low’ (0 constructs). Classifications are reported by demographic and fitness industry characteristics. The odds of being classified as a ‘high assessor/monitor’ according to social ecological correlates were examined using a multiple-factor logistic regression model. Results: Mean age of respondents was 39.3 (±11.6) years and 71.6% were female. A total of 15.8% (95% CI: 13.7%–17.9%) were classified as a ‘high’ assessor/monitor. Constructs with the largest proportion of being ‘always’ assessed were body composition (47.7%; 95% CI: 45.0%–50.1%) and aerobic fitness (42.5%; 95% CI: 39.6%–45.3%). Those with the lowest proportion of being ‘always’ assessed were balance (24.0%; 95% CI: 24.7%–26.5%) and mental health (20.2%; 95% CI: 18.1%–29.6%). A perceived lack of client interest and fitness professionals not considering assessment to be their responsibility were associated with lower odds of being classified as a ‘high assessor/monitor’. Conclusions: Most fitness professionals do not routinely assess/monitor client fitness and health. Key factors limiting client health assessment and monitoring include a perceived lack of client interest and professionals not considering this their role. © 2017
A framework for the etiology of running-related injuries
- Authors: Bertelsen, Michael , Hulme, Adam , Petersen, Jesper , Brund, Rene , Sørensen, Henrik , Finch, Caroline , Parner, Erik , Nielsen, Rasmus
- Date: 2017
- Type: Text , Journal article , Review
- Relation: Scandinavian Journal of Medicine and Science in Sports Vol. 27, no. 11 (2017), p. 1170-1180
- Full Text: false
- Reviewed:
- Description: The etiology of running-related injury is important to consider as the effectiveness of a given running-related injury prevention intervention is dependent on whether etiologic factors are readily modifiable and consistent with a biologically plausible causal mechanism. Therefore, the purpose of the present article was to present an evidence-informed conceptual framework outlining the multifactorial nature of running-related injury etiology. In the framework, four mutually exclusive parts are presented: (a) Structure-specific capacity when entering a running session; (b) structure-specific cumulative load per running session; (c) reduction in the structure-specific capacity during a running session; and (d) exceeding the structure-specific capacity. The framework can then be used to inform the design of future running-related injury prevention studies, including the formation of research questions and hypotheses, as well as the monitoring of participation-related and non-participation-related exposures. In addition, future research applications should focus on addressing how changes in one or more exposures influence the risk of running-related injury. This necessitates the investigation of how different factors affect the structure-specific load and/or the load capacity, and the dose-response relationship between running participation and injury risk. Ultimately, this direction allows researchers to move beyond traditional risk factor identification to produce research findings that are not only reliably reported in terms of the observed cause-effect association, but also translatable in practice. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd
Cardiac telomere length in heart development, function, and disease
- Authors: Booth, Scott , Charchar, Fadi
- Date: 2017
- Type: Text , Journal article , Review
- Relation: Physiological Genomics Vol. 49, no. 7 (2017), p. 368-384
- Relation: http://purl.org/au-research/grants/nhmrc/1034371
- Full Text: false
- Reviewed:
- Description: Telomeres are repetitive nucleoprotein structures at chromosome ends, and a decrease in the number of these repeats, known as a reduction in telomere length (TL), triggers cellular senescence and apoptosis. Heart disease, the worldwide leading cause of death, often results from the loss of cardiac cells, which could be explained by decreases in TL. Due to the cell-specific regulation of TL, this review focuses on studies that have measured telomeres in heart cells and critically assesses the relationship between cardiac TL and heart function. There are several lines of evidence that have identified rapid changes in cardiac TL during the onset and progression of heart disease as well as at critical stages of development. There are also many factors, such as the loss of telomeric proteins, oxidative stress, and hypoxia, that decrease cardiac TL and heart function. In contrast, antioxidants, calorie restriction, and exercise can prevent both cardiac telomere attrition and the progression of heart disease. TL in the heart is also indicative of proliferative potential and could facilitate the identification of cells suitable for cardiac rejuvenation. Although these findings highlight the involvement of TL in heart function, there are important questions regarding the validity of animal models, as well as several confounding factors, that need to be considered when interpreting results and planning future research. With these in mind, elucidating the telomeric mechanisms involved in heart development and the transition to disease holds promise to prevent cardiac dysfunction and potentiate regeneration after injury. © 2017 the American Physiological Society.
Rating of perceived exertion is a stable and appropriate measure of workload in judo
- Authors: Bromley, Sally , Drew, Michael , McIntosh, Andrew , Talpey, Scott
- Date: 2018
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 21, no. 10 (2018), p. 1008-1012
- Full Text: false
- Reviewed:
- Description: Objectives: Heart rate (HR), blood lactate concentration [La] and/or rating of perceived exertion (RPE) have been utilised to monitor judo training load in technical and randori (competition training) sessions, but are yet to be investigated in mixed sessions containing both elements. Therefore the purpose of this study was to: (1) determine the stability of these variables, and (2) to assess the efficacy of RPE as a load variable for mixed judo sessions. Design: Cross-sectional study. Methods: Twenty-nine athletes attended two mixed training sessions at an international training camp. Bout and session characteristics, including RPE, physical and mental effort, heart rate (HR) and post-session [La] were recorded. A two-way random-effects intra-class correlation assessed variable stability. Multilevel mixed-effects ordered logistic regression investigated relationships between RPE and other variables for bouts and sessions. Results: Average and minimum HR across sessions correlated highly (ICC = 0.95 and 0.94, respectively). Good correlations existed between [La], session-RPE and mental effort, and fair correlation of max HR and physical effort. No relationships existed between [La]/HR and session-RPE. A unit increase in bout-RPE resulted in a 2.09 unit increase in physical, or a 1.36 unit increase in mental, effort holding all other bout variables constant. Gender and competitive level did not influence statistical models. Conclusions: Results provide further evidence that RPE can be used across a range of competitive levels and genders to monitor workload of mixed sessions and individual randori in judo. Physical effort may play a larger role than mental effort when athletes reflect on exertion during training. © 2018
Telomeres, exercise and cardiovascular disease : Finding the means to justify the ends
- Authors: Chilton, Warrick , O'Brien, Brendan , Grace, Fergal , Charchar, Fadi
- Date: 2017
- Type: Text , Journal article
- Relation: Acta Physiologica Vol. 220, no. 2 (2017), p. 186-188
- Full Text: false
- Reviewed:
Evaluating influence of microRNA in reconstructing gene regulatory networks
- Authors: Chowdhury, Ahsan , Chetty, Madhu , Nguyen, Vinh
- Date: 2015
- Type: Text , Journal article
- Relation: Cognitive Neurodynamics Vol. 8, no. 3 (2015), p. 251-9
- Full Text: false
- Reviewed:
- Description: A gene regulatory network (GRN) consists of interactions between transcription factors (TFs) and target genes (TGs). Recently, it has been observed that microRNAs (miRNAs) play a significant part in genetic interactions. However, current microarray technologies do not capture miRNA expression levels. To overcome this, we propose a new technique to reverse engineer GRNs from the available partial microarray data, which contains expression levels of TFs and TGs only. Using the S-System model, the approach is adapted to cope with the unavailability of information about the expression levels of miRNAs. The versatile Differential Evolution algorithm is used for optimization and parameter estimation. Experimental studies on four in silico networks, and a real network of Saccharomyces cerevisiae called the IRMA network, show significant improvement compared to the traditional S-System approach.
Stochastic S-system modeling of gene regulatory network
- Authors: Chowdhury, Ahsan , Chetty, Madhu , Evans, Rob
- Date: 2015
- Type: Text , Journal article
- Relation: Cognitive Neurodynamics Vol. 9, no. 5 (2015), p. 535-547
- Full Text: false
- Reviewed:
- Description: Microarray gene expression data can provide insights into biological processes at a system-wide level and is commonly used for reverse engineering gene regulatory networks (GRN). Due to the amalgamation of noise from different sources, microarray expression profiles become inherently noisy leading to significant impact on the GRN reconstruction process. Microarray replicates (both biological and technical), generated to increase the reliability of data obtained under noisy conditions, have limited influence in enhancing the accuracy of reconstruction. Therefore, instead of the conventional GRN modeling approaches which are deterministic, stochastic techniques are becoming increasingly necessary for inferring GRN from noisy microarray data. In this paper, we propose a new stochastic GRN model by investigating incorporation of various standard noise measurements in the deterministic S-system model. Experimental evaluations performed for varying sizes of synthetic network, representing different stochastic processes, demonstrate the effect of noise on the accuracy of genetic network modeling and the significance of stochastic modeling for GRN reconstruction. The proposed stochastic model is subsequently applied to infer the regulations among genes in two real life networks: (1) the well-studied IRMA network, a real-life in-vivo synthetic network constructed within the Saccharomyces cerevisiae yeast, and (2) the SOS DNA repair network in Escherichia coli. © 2015, Springer Science+Business Media Dordrecht.
Achilles tendon structure improves on UTC imaging over a 5-month pre-season in elite Australian football players
- Authors: Docking, Sean , Rosengarten, Samuel , Cook, Jill
- Date: 2016
- Type: Text , Journal article
- Relation: Scandinavian Journal of Medicine and Science in Sports Vol. 26, no. 5 (2016), p. 557-563
- Full Text: false
- Reviewed:
- Description: Pre-season injuries are common and may be due to a reintroduction of training loads. Tendons are sensitive to changes in load, making them vulnerable to injury in the pre-season. This study investigated changes in Achilles tendon structure on ultrasound tissue characterization (UTC) over the course of a 5-month pre-season in elite male Australian football players. Eighteen elite male Australian football players with no history of Achilles tendinopathy and normal Achilles tendons were recruited. The left Achilles tendon was scanned with UTC to quantify the stability of the echopattern. Participants were scanned at the start and completion of a 5-month pre-season. Fifteen players remained asymptomatic over the course of the pre-season. All four echo-types were significantly different at the end of the pre-season, with the overall echopattern suggesting an improvement in Achilles tendon structure. Three of the 18 participants developed Achilles tendon pain that coincided with a change in the UTC echopattern. This study demonstrates that the UTC echopattern of the Achilles tendon improves over a 5-month pre-season training period, representing increased fibrillar alignment. However, further investigation is needed to elucidate whether this alteration in the UTC echopattern results in improved tendon resilience and load capacity. © 2016 John Wiley & Sons A/S.
Quantification of Achilles and patellar tendon structure on imaging does not enhance ability to predict self-reported symptoms beyond grey-scale ultrasound and previous history
- Authors: Docking, Sean , Rio, Ebonie , Cook, Jill , Carey, David , Fortington, Lauren
- Date: 2019
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 22, no. 2 (2019), p. 145-150
- Full Text: false
- Reviewed:
- Description: Background: Tendon pathology on imaging has been associated with an increased risk of developing symptoms. This evidence is based on classifying the tendon as normal or pathological. It is unclear whether the extent of tendon pathology is associated with the development or severity of symptoms. Objectives: To investigate whether the presence and extent of tendon pathology on ultrasound tissue characterisation (UTC), or a previous history of symptoms, were associated with the development of symptoms over a football season. Methods: 179 male Australian football players underwent UTC imaging of their Achilles and/or patellar tendon at the start of the pre-season. Players completed monthly OSTRC overuse questionnaires to quantify the presence and severity of Achilles and/or patellar tendon symptoms. Risk factor analysis was performed to identify associations between imaging and the development of symptoms. Results: A pathological Achilles tendon increased the risk of developing symptoms (RR = 3.2, 95%CI 1.7–5.9). Conversely, a pathological patellar tendon was not significantly associated with the development of symptoms (RR = 1.8, 95%CI 0.9–3.7). Quantification of tendon structure using UTC did not enhance the ability to identify athletes who developed symptoms. Previous history of symptoms was the strongest predictor for the development of symptoms (Achilles RR = 3.0 95%CI 1.8–4.8; patellar RR = 3.7 95%CI 2.2–6.1). Conclusion: Tendon pathology was associated with the development of self-reported symptoms; however, previous history of symptoms was a stronger risk factor. The extent of disorganisation quantified by UTC should not be used as a marker for the presence or severity of current and future symptoms.
Normative MRI, ultrasound and muscle functional MRI findings in the forearms of asymptomatic elite rowers
- Authors: Drew, Michael , Trease, Larissa , Caneiro, J. P. , Hooper, Ivan , Ooi, Chin-Chin , Counsel, Peter , Connell, David , Rice, Anthony , Knight, Emma , Hoy, Gregory , Lovell, Gregory
- Date: 2016
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 19, no. 2 (2016), p. 103-108
- Full Text: false
- Reviewed:
- Description: Objectives: Forearm injuries are common and debilitating to elite rowers. Chronic exertional compartment syndrome, intersection syndrome and proximal radial bone stress injuries have been documented in this population. This paper explores the imaging findings related to these conditions in asymptomatic elite rowers. Design: Observational study. Methods: 19 asymptomatic senior elite and under-23 rowers currently competing at National level or above underwent ultrasound (US), Magnetic Resonance Imaging (MRI) and muscle functional MRI evaluation of their forearms. A comprehensive evaluation sheet identifying characteristics of bone stress, intersection syndrome and chronic exertional compartment syndrome was utilised based on a literature search and review by senior clinicians working with this population. Results: Peritendinous fluid of Extensor Carpi Radialis Longus (n=10, 53%) or Extensor Carpi Radialis Brevis (n=6, 32%) was a common finding on US. MRI had a higher rate of identification than US. Extensor Digitorum (Coeff=−1.76, 95%CI −3.04 to −0.49), Flexor Carpi Radialis (Coeff=−2.86, 95%CI −5.35 to −0.38) and Flexor Carpi Ulnaris (Coeff=−3.31, 95%CI −5.30 to −1.32), Pronator Teres (Coeff=−3.94, 95%CI −6.89 to −0.99), and Supinator (Coeff=−1.68, 95%CI −3.28 to −0.02) showed statistically significant changes immediately post-exercise. Mild proximal radial marrow hyperintensity was present (n=15, 78.9%) with three participants (15.8%) also having mild periosteal oedema of the radius. Conclusions: Imaging findings commonly seen in symptomatic populations are observed in elite, asymptomatic rowers. Care should be taken when diagnosing bone stress injuries, intersection syndrome and compartment syndrome on imaging findings alone. Data presented can be utilised as a normative dataset for future case studies.
Quantifying cricket fast-bowling skill
- Authors: Feros, Simon , Young, Warren , O’Brien, Brendan
- Date: 2018
- Type: Text , Journal article , Review
- Relation: International Journal of Sports Physiology and Performance Vol. 13, no. 7 (2018), p. 830-838
- Full Text: false
- Reviewed:
- Description: Objectives: To evaluate the current evidence regarding the quantification of cricket fast-bowling skill. Methods: Studies that assessed fast-bowling skill (bowling speed and accuracy) were identified from searches in SPORTDiscus (EBSCO) in June 2017. The reference lists of identified papers were also examined for relevant investigations. Results: A total of 16 papers matched the inclusion criteria, and discrepancies in assessment procedures were evident. Differences in test environment, pitch, and cricket ball characteristics; the warm-up prior to test; test familiarization procedures; permitted run-up lengths; bowling spell length; delivery sequence; test instructions; collection of bowling speed data; and collection and reporting of bowling accuracy data were apparent throughout the literature. The reliability and sensitivity of fast-bowling skill measures have rarely been reported across the literature. Only 1 study has attempted to assess the construct validity of its skill measures. Conclusions: There are several discrepancies in how fast-bowling skill has been assessed and subsequently quantified in the literature to date. This is a problem, because comparisons between studies are often difficult. Therefore, a strong rationale exists for the creation of match-specific standardized fast-bowling assessments that offer greater ecological validity while maintaining acceptable reliability and sensitivity of the skill measures. If prospective research can act on the proposed recommendations from this review, then coaches will be able to make more informed decisions surrounding player selection, talent identification, return to skill following injury, and the efficacy of short- and long-term training interventions for fast bowlers.
Relationship between selected physical qualities, bowling kinematics, and pace bowling skill in club-standard cricketers
- Authors: Feros, Simon , Young, Warren , O'Brien, Brendan
- Date: 2019
- Type: Text , Journal article
- Relation: Journal of Strength and Conditioning Research Vol. 33, no. 10 (Oct 2019), p. 2812-2825
- Full Text: false
- Reviewed:
- Description: Although strength and conditioning of cricket pace bowlers has become more specialized in recent times, little is understood about the interplay between physical capacities, pace bowling kinematics, and pace bowling skill measures. This study sought to determine these interrelationships. Thirty-one male club-standard pace bowlers completed 3 test sessions on separate occasions 4-7 days apart. The first testing session comprised an 8-over pace bowling assessment, where bowling skill and selected bowling kinematics were measured. A physical test battery was completed over the remaining 2 sessions. Peak and mean ball release (BR) speed were associated with 1 repetition maximum pull-up strength (r(s) = 0.56, p = 0.005) and correlated with 20-m sprint time (r(s) = -0.42, p = 0.022; r(s) = -0.37, p = 0.044, respectively). Mean radial error was associated with 10-m and 20-m sprint times (r(s) = 0.41, p = 0.030; r(s) = 0.38, p = 0.037, respectively), and correlated with height and peak power from 3 countermovement jumps (CMJs) (r(s) = -0.39, p = 0.036; r(s) = -0.41, p = 0.031, respectively), and mean peak power from 20 CMJs (r(s) = -0.45, p = 0.020). Bivariate variable error was correlated with front-leg extension angle at BR (r(s) = 0.41, p = 0.036), and also with approach speed (r(s) = -0.36, p = 0.050). These relationships may assist strength and conditioning coaches in designing more effective training programs to enhance bowling speed and accuracy. Training interventions are warranted, however, to validate these associations.
Efficacy of combined general, special, and specific resistance training on pace bowling skill in club-standard cricketers
- Authors: Feros, Simon , Young, Warren , OʼBrien, Brendan
- Date: 2020
- Type: Text , Journal article
- Relation: Journal of strength and conditioning research Vol. 34, no. 9 (2020), p. 2596-2607
- Full Text: false
- Reviewed:
- Description: Feros, SA, Young, WB, and O'Brien, BJ. Efficacy of combined general, special, and specific resistance training on pace bowling skill in club-standard cricketers. J Strength Cond Res 34(9): 2596-2607, 2020-This study investigated the efficacy of combined "general," "special," and "specific" resistance training on pace bowling skill. Twelve male, club-standard pace bowlers were randomly allocated to a combined resistance training (CRT) program or traditional cricket training (TCT) program for 8 weeks. The CRT group (n = 6) trained with 300, 250-g, and standard cricket balls; performed 20-m sprints with +20% and +15% body mass resistance (but also unresisted); and completed chin-up and pull-up training. The TCT group (n = 6) trained with standard balls and performed unresisted 20-m sprints. No statistically significant GROUP × TIME interactions were identified. The CRT group demonstrated a "clear moderate" enhancement in peak ball release speed (mean ±95% confidence limits [CLs]: 1.2 ± 1.5 m·s⁻¹, d = 0.66 ± 0.83), a "clear large" increase in mean radial error (mean ±95% CLs: 7.1 ± 6.5 cm, d = 0.94 ± 0.87), and a "clear large" rise in bivariate variable error (mean ±95% CLs: 7.2 ± 7.8 cm, d = 0.97 ± 1.05). The TCT group exhibited "unclear" changes across all pace bowling skill measures. Both groups displayed "unclear" changes in approach speed, 20-m sprint time, and 1 repetition maximum pull-up strength. In 8 weeks, the CRT program improved peak ball release speed, but at the cost of poorer bowling accuracy and consistency of bowling accuracy. These findings could be attributed to bowling with the heavier balls. The inclusion of "specific" resistance training does not seem to be effective in enhancing all-round pace bowling skill in club-standard cricketers.
"It Doesn't Make Sense for Us Not to Have One" - Understanding reasons why community sports organizations chose to participate in a funded automated external defibrillator program
- Authors: Fortington, Lauren , Bekker, Sheree , Morgan, Damian , Finch, Caroline
- Date: 2019
- Type: Text , Journal article
- Relation: Clinical Journal of Sport Medicine Vol. 29, no. 4 (2019), p. 324-328
- Full Text: false
- Reviewed:
- Description: Objective: Implementation of automated external defibrillators (AEDs) in community sports settings is an important component of emergency medical planning. This study aimed to understand motivations for why sports organizations participated in a government-funded program that provided AEDs and associated first-aid training. Design: Face-to-face interviews. Setting: Community sports organizations in Victoria, Australia. Participants: Representatives from 14 organizations who participated in a government-funded AED program. Main Outcome Measures: Motivations to participate in the AED program were explored using a qualitative descriptive approach. Results: Two overarching themes emerged: awareness of the program and decision to apply. Awareness was gained indirectly through grant advertising in newsletters/emails/web sites and directly through their sporting associations. For most organizations, there was no decision process per se, rather, the opportunity to apply was the key determinant for participating in the program. A duty of care also emerged as a key driving factor, with recognition of AEDs as a valuable asset to communities broadly, not just the participants' immediate sports setting. Reflecting on participation in the program, these participants identified that it was important to increase awareness about AED ownership and use. The program benefits were clearly summed up as being best prepared for a worst-case scenario. Discussion: This study provides new understanding of why community sports organizations apply for an AED and training. The strongest reason was simply the opportunity to acquire this at no cost. Therefore, for wider implementation of AEDs, additional funding opportunities, targeted awareness of these opportunities, and continued promotion of AED importance are recommended.
Match injuries in Sri Lankan junior cricket : A prospective, longitudinal study
- Authors: Gamage, Prasanna , Fortington, Lauren , Kountouris, Alex , Finch, Caroline
- Date: 2019
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 22, no. 6 (2019), p. 647-652
- Full Text: false
- Reviewed:
- Description: Objectives: Understanding the nature of injuries in cricket is key to mitigate injury risks and prioritise preventive measures. This study aimed to identify the incidence and nature of match injuries among Sri Lankan junior cricketers. Design: Longitudinal follow-up study with prospective in-season data collection. Methods: A national survey of schoolboy, division-1 cricket teams in under-15 and under-17 age groups. Using a paper-based questionnaire, distributed to school-teams at the start of the 2016 cricket season, respondents recorded any injuries, including the site, type and mechanism. Match injury incidence rates (match-IIR) (injuries/100 match-player-days) were calculated overall, by position and for match time loss (MTL) and non-MTL injuries. Results: From 59 school-teams, 573 players responded, with 404 players reporting 744 injuries in 648 matches. The match-IIR was 28.0 injuries/100 match-player-days (95% CI = 26.0–30.2). The highest match-IIR was reported among fielders (46.0% of all injuries sustained; match-IIR = 12.9) compared with batters (25.4%; match-IIR = 7.1) and bowlers (20.3%; match-IIR = 5.7). Abrasions and bruises to the knee or elbow were the most common injuries among fielders, with the majority being non-MTL injuries. Conclusions: Almost half (46.0%) of all injuries were to fielders, and more research into their severity and mechanisms is needed to identify the need for, and design of, preventive measures. Batters sustained a relatively large number of facial-organ injuries from being struck by the ball, presenting a need to evaluate the use and appropriateness of helmets by Sri Lankan junior cricketers. Similar to other junior cricket studies, the most common injuries among bowlers were strains and sprains, mainly affecting the lower limbs and lower back. © 2018
Sport and leisure activities in the heat: What safety resources exist?
- Authors: Gonsalves, Marlon , O'Brien, Brendan , Twomey, Dara
- Date: 2021
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 24, no. 8 (2021), p. 781-786
- Full Text: false
- Reviewed:
- Description: Objectives: To conduct a document analysis of sports and leisure activity heat-related injury prevention resources in Australia and develop an understanding of the content within those resources. Design & Methods: Heat resources were included if they dealt specifically with, or could be extrapolated to, prevention of heat-related injuries. Collating strategies for the catalogue included: (1) a detailed search of the organisation's website and (2) an online search for sport specific heat resources. A content analysis of each resource was first performed, and descriptive codes were assigned to the data using qualitative data analysis software. Every coded text was recorded as an individual data point (n). Common sub-categories were identified by thematic analysis and collated under three broader categories. Results: A total of 468 data points were identified within the 64 heat resources found. Guidelines (n = 20) and policies (n = 18) were the most common type of resources followed by factsheets (n = 9), webpages (n = 8), laws and by-laws (n = 2). Three overarching categories emerged through the data analysis process: preventive strategies (n = 299, 63.9%), risk factors (n = 94, 20.1%), treatment (n = 75, 16.0%). Activity modification, which included information on rescheduling games and extra breaks, was the most common intervention. Cricket, soccer, swimming and triathlon had the most complete set of heat resources. Conclusions: The findings of this study provide an insight into the composition of heat-related sports injury prevention resources within Australia and identify areas for development. As the resources were incomplete for many sports, the development of more comprehensive heat safety resources is required to ensure the safety of participants. © 2021 Elsevier Ltd
Developing a contributing factor classification scheme for Rasmussen's AcciMap : Reliability and validity evaluation
- Authors: Goode, Natassia , Salmon, Paul , Taylor, Natalie , Lenné, Michael , Finch, Caroline
- Date: 2017
- Type: Text , Journal article
- Relation: Applied Ergonomics Vol. 64, no. (2017), p. 14-26
- Relation: http://purl.org/au-research/grants/nhmrc/565900
- Full Text: false
- Reviewed:
- Description: One factor potentially limiting the uptake of Rasmussen's (1997) AcciMap method by practitioners is the lack of a contributing factor classification scheme to guide accident analyses. This article evaluates the intra- and inter-rater reliability and criterion-referenced validity of a classification scheme developed to support the use of AcciMap by led outdoor activity (LOA) practitioners. The classification scheme has two levels: the system level describes the actors, artefacts and activity context in terms of 14 codes; the descriptor level breaks the system level codes down into 107 specific contributing factors. The study involved 11 LOA practitioners using the scheme on two separate occasions to code a pre-determined list of contributing factors identified from four incident reports. Criterion-referenced validity was assessed by comparing the codes selected by LOA practitioners to those selected by the method creators. Mean intra-rater reliability scores at the system (M = 83.6%) and descriptor (M = 74%) levels were acceptable. Mean inter-rater reliability scores were not consistently acceptable across both coding attempts at the system level (MT1 = 68.8%; MT2 = 73.9%), and were poor at the descriptor level (MT1 = 58.5%; MT2 = 64.1%). Mean criterion-referenced validity scores at the system level were acceptable (MT1 = 73.9%; MT2 = 75.3%); however, they were not consistently acceptable at the descriptor level (MT1 = 67.6%; MT2 = 70.8%). Overall, the results indicate that the classification scheme does not currently satisfy reliability and validity requirements, and that further work is required. The implications for the design and development of contributing factor classification schemes are discussed. © 2017 Elsevier Ltd
A 12-month prospective cohort study of symptoms of common mental disorders among European professional footballers
- Authors: Gouttebarge, Vincent , Aoki, Haruhito , Verhagen, Evert , Kerkhoffs, Gino
- Date: 2017
- Type: Text , Journal article
- Relation: Clinical Journal of Sport Medicine Vol. 27, no. 5 (2017), p. 487-492
- Full Text: false
- Reviewed:
- Description: Objective: To determine the 12-month incidence and comorbidity of symptoms of common mental disorders (CMD) among European professional footballers and to explore the association of potential stressors with these health conditions. Design: Observational prospective cohort study with a follow-up period of 12 months. Participants: Male professional footballers from 5 European countries (n = 384 at baseline). Assessment of Risk Factors: Adverse life events, conflicts with the trainer/coach, and career dissatisfaction were explored using validated questionnaires. Main Outcome Measures: Symptoms of distress, anxiety/depression, sleep disturbance, and adverse alcohol use were assessed using validated questionnaires. Results: A total of 384 players (mean age 27 years; mean career duration 8 years) were enrolled, of whom 262 completed the follow-up period. The incidence of symptoms of CMD was 12% for distress, 37% for anxiety/depression, 19% for sleep disturbance, and 14% for adverse alcohol use. Over the 12-month follow-up period, approximately 13% of the participants reported two symptoms, 5% three symptoms, and 3% four symptoms. Professional footballers reporting recent adverse life events, a conflict with the trainer/coach, or career dissatisfaction were more likely to report symptoms of CMD, but the associations were not statistically significant. Conclusions: The 12-month incidence of symptoms of CMD among European professional footballers ranged from 12% for symptoms of distress to 37% for symptoms of anxiety/depression. A professional football team typically drawn from a squad of 25 players can expect symptoms of CMD to occur in at least 3 players in one season.
From control to causation : Validating a ‘complex systems model’ of running-related injury development and prevention
- Authors: Hulme, Adam , Salmon, Paul , Nielsen, Rasmus , Read, Gemma , Finch, Caroline
- Date: 2017
- Type: Text , Journal article
- Relation: Applied Ergonomics Vol. 65, no. (2017), p. 345-354
- Full Text: false
- Reviewed:
- Description: Introduction: There is a need for an ecological and complex systems approach to better understand the development and prevention of running-related injury (RRI). In a previous article, we proposed a prototype model of the Australian recreational distance running system based on the Systems-Theoretic Accident Model and Processes (STAMP) method. That model included the influence of political, organisational, managerial, and sociocultural determinants alongside individual-level factors in relation to RRI development. The purpose of this study was to validate that prototype model by drawing on the expertise of both systems thinking and distance running experts. Materials and methods: This study used a modified Delphi technique involving a series of online surveys (December 2016–March 2017). The initial survey was divided into four sections containing a total of seven questions pertaining to different features of the prototype model. Consensus about the validity of the prototype model was reached when the number of experts who agreed or disagreed with a survey statement was ≥75% of the total number of respondents. Results: Two Delphi rounds were needed to validate the prototype model. Of the 51 experts initially contacted, 50.9% (n = 26) completed the first round of the Delphi, and 92.3% (n = 24) of those in the first round participated in the second. Most of the 24 full participants considered themselves to be running experts (66.7%), and approximately a third indicated expertise as systems thinkers (33.3%). After the second round, 91.7% of the experts agreed that the prototype model was a valid description of the Australian distance running system. Conclusion: This is the first study to formally examine the development and prevention of RRI from an ecological and complex systems perspective. The validated model of the Australian distance running system facilitates theoretical advancement by identifying practical system-wide opportunities for the implementation of sustainable RRI prevention interventions. This ‘big picture’ perspective represents the first step required when thinking about the range of contributory causal factors that affect other system elements, as well as runners' behaviours in relation to RRI risk. © 2017 Elsevier Ltd