Understanding perceptions of injury risk associated with playing junior cricket
- White, Peta, Finch, Caroline, Dennis, Rebecca, Siesmaa, Emma
- Authors: White, Peta , Finch, Caroline , Dennis, Rebecca , Siesmaa, Emma
- Date: 2010
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 14, no. 2 (2010), p. 115-120
- Relation: http://purl.org/au-research/grants/nhmrc/565900
- Full Text:
- Reviewed:
- Description: Preventing sports injuries in children is important, but there is limited information about children's perceptions of injury risk or their injury beliefs and attitudes. This study investigated injury risk perceptions in a sample of junior sports participants across different age levels of play. Junior cricket players (n = 284, aged 8-16) completed a survey about their injury risk perceptions. Survey questions asked about players' perceived injury risk to themselves compared to cricketers in general, as well as their perceived injury risk across different playing position, ground condition, and protective equipment use scenarios. Chi-square analysis found that risk perceptions were significantly higher in U12 and U14 players for both batting and fielding compared to U16 players and that U16 players had a higher risk perception associated with bowling. Players tended to see themselves as less likely to be injured than cricketers in general and perceived there to be a high risk of injury when fielding close to the batter and a comparatively low risk of injury when fielding in the outfield. Junior players also perceived there to be a high injury risk associated with playing on hard and bumpy grounds. Despite their relatively accurate perceptions of risk and appreciation for the importance of protective equipment, junior players need continual reminding of the importance of safety strategies by coaches and others. Coaches need to inform players that fielding injuries can occur anywhere on the ground, and include skills practice accordingly. © 2010 Sports Medicine Australia.
Challenges in the development of standards for synthetic turf for Australian football and cricket
- Twomey, Dara, Otago, Leonie, Saunders, Natalie
- Authors: Twomey, Dara , Otago, Leonie , Saunders, Natalie
- Date: 2010
- Type: Text , Journal article
- Relation: Proceedings of the Institution of Mechanical Engineers, Part P: Journal of Sports Engineering and Technology Vol. , no. (2010), p. 9
- Full Text:
- Reviewed:
- Description: Given the escalating drought conditions in Australia, synthetic surfaces have recently been explored as a viable surface option for community-level Australian football–cricket ovals. The vast majority of Australian football ovals are transformed into cricket pitches during the football off-season, so the characteristics of both sports had to be considered in developing standards, testable in a laboratory setting, for a synthetic turf surface. This paper describes the data collection and test methods undertaken in developing the standards for synthetic surface use in Australian football and cricket. It also discusses the issues and challenges encountered in developing standards for multi-sport synthetic surfaces that ensure player safety while maintaining the performance characteristics of both sports. Surface property and ball interaction tests were undertaken on natural playing surfaces, both in situ and in the laboratory, to determine the properties of the current playing surface for each sport. The paper highlights the importance of carefully considering the characteristics of both games, and the use of equipment from both sports, in the testing methods. The standards described in this paper have now been accepted by the Australian Football League and Cricket Australia, and the product approval process and use of synthetic surfaces for Australian football and cricket are imminent.
The reliability of musculoskeletal screening tests used in cricket
- Dennis, Rebecca, Finch, Caroline, Elliott, Bruce, Farhart, Patrick
- Authors: Dennis, Rebecca , Finch, Caroline , Elliott, Bruce , Farhart, Patrick
- Date: 2008
- Type: Text , Journal article
- Relation: Physical Therapy in Sport Vol. 9, no. 1 (2008), p. 25-33
- Full Text:
- Reviewed:
- Description: Objectives: To determine the inter- and intra-observer reliability of a field-based musculoskeletal screening protocol used to measure potential injury risk factors in cricket fast bowlers. Design: Test-retest reliability study. Setting: High-performance Australian cricket. Participants: Ten volunteers. Two sports physiotherapists conducted the testing. Main outcome measures: Participants completed the following tests: knee extension; modified Thomas test (hip extension and abduction); hip internal and external rotation; combined elevation; ankle dorsiflexion lunge; bridging hold; prone four point hold; and calf heel raises. Methods: For each of the tests, the participants were tested by each physiotherapist twice, and inter- and intra-observer reliability were assessed concurrently. Results: The inter-observer reliability of the tests was generally poor, with only four of the ten tests having an intraclass correlation coefficient (ICC) greater than 0.80 (range of ICCs 0.27-0.99). The intra-observer reliability of the tests was considerably higher, with nine tests having an ICC greater than 0.80 (range of ICCs 0.56-0.99). Conclusions: With the exception of the bridging hold, all tests would be considered acceptable where only one observer was conducting the testing. However, only the ankle dorsiflexion lunge, combined elevation test, calf heel raise test and prone four point hold have acceptable reliability when there are multiple physiotherapists recording measurements. © 2007 Elsevier Ltd. All rights reserved.
- Description: C1
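The 0.80 ICC threshold cited in the abstract above can be made concrete. As an illustrative sketch only (the record does not state which ICC model the authors used), a one-way random-effects ICC(1,1) for test-retest scores can be computed in pure Python:

```python
def icc_1_1(scores):
    """One-way random-effects ICC(1,1) for test-retest data.

    scores: list of per-subject measurement lists, e.g. [[trial1, trial2], ...]
    Illustrative only; the study's exact ICC model is not stated in this record.
    """
    n, k = len(scores), len(scores[0])
    grand = sum(sum(row) for row in scores) / (n * k)
    subject_means = [sum(row) / k for row in scores]
    # Between-subjects and within-subject mean squares from a one-way ANOVA
    msb = k * sum((m - grand) ** 2 for m in subject_means) / (n - 1)
    msw = sum((x - m) ** 2
              for row, m in zip(scores, subject_means)
              for x in row) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# Hypothetical paired measurements from two sessions (degrees of knee extension)
data = [[41.0, 41.5], [35.0, 34.6], [48.0, 48.4], [30.0, 29.5]]
print(icc_1_1(data))  # near 1.0, since the repeat scores are almost identical
```

Values above 0.80 on this scale correspond to the "acceptable" tests in the abstract; near-zero or negative values indicate that within-subject noise swamps between-subject differences.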
The determinants and development of fast bowling performance in cricket
- Authors: Feros, Simon
- Date: 2015
- Type: Text , Thesis , PhD
- Full Text:
- Description: This thesis sought to reveal the physical and kinematic determinants of pace bowling performance and, drawing on these determinants, to investigate whether pace bowling performance could be enhanced with chronic resistance training and warm-up strategies. Before the physical and kinematic determinants of pace bowling performance could be identified, and before the effects of two training interventions and warm-up strategies could be evaluated, a new pace bowling test was created and the test-retest reliability of its performance and kinematic measures was established. Knowledge of a variable's test-retest reliability is important both for interpreting the validity of correlations and for determining whether a change following a training intervention is meaningful. Only one published study to date has explored the test-retest reliability of a pace bowling assessment, and that test measured only bowling accuracy (1). Previous research has also not comprehensively examined the relationships between physical qualities and pace bowling performance: several physical qualities that may be crucial for optimal pace bowling performance (e.g., power, speed-acceleration, flexibility, repeat-sprint ability) have been excluded from correlational research. Furthermore, there is only one published training intervention study in pace bowling research (2), so there is scant evidence from which coaches can design training programs proven to enhance pace bowling performance. Baseball pitching studies have trialled the effects of heavy-ball throwing in the warm-up on subsequent throwing velocity and accuracy, but this approach has not been studied in cricket pace bowling, especially after several weeks of training. Therefore, four studies were conducted in this PhD project to address these deficiencies in the literature.
The purpose of Study 1 (Chapter 3) was to ascertain the test-retest reliability of bowling performance measures (i.e., bowling speed, bowling accuracy, consistency of bowling speed, and consistency of bowling accuracy) and selected bowling kinematics (i.e., approach speed, step length, step-length phase duration, power phase duration, and knee extension angle at front-foot contact and at ball release) in a novel eight-over test, and for the first four overs of this test. The intraclass correlation coefficient (ICC), standard error of measurement (SEM), and coefficient of variation (CV) were used as measures of test-retest reliability (3). Following a three-week familiarisation period of bowling, 13 participants completed the novel eight-over bowling test on two separate days, 4–7 days apart. The most reliable performance measures in the bowling test were peak bowling speed (ICC = 0.948–0.975, CV = 1.3–1.9%) and mean bowling speed (ICC = 0.981–0.987, CV = 1.0–1.3%). Perceived effort was partially reliable (ICC = 0.650–0.659, CV = 3.8–3.9%). However, mean bowling accuracy (ICC = 0.491–0.685, CV = 12.5–16.8%) and consistency of bowling accuracy (ICC = 0.434–0.454, CV = 15.3–19.3%) failed to meet the pre-set standard for acceptable reliability. All bowling kinematic variables except approach speed exhibited acceptable reliability (i.e., ICC > 0.8, CV < 10%). The first four overs of the bowling test exhibited slightly poorer test-retest reliability for all measures than the entire eight-over test. No systematic biases (i.e., p > 0.05) were detected for any variable between bowling tests, indicating no learning or fatigue effects. The smallest worthwhile change was established for all bowling performance and kinematic variables by multiplying the SEM by 1.5 (4). 
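The reliability statistics in Study 1 relate through simple formulas: SEM = SD × √(1 − ICC), CV = 100 × SD / mean, and the smallest worthwhile change is taken as 1.5 × SEM. A minimal Python sketch with hypothetical numbers (not the thesis data):

```python
import math

def sem(between_subject_sd, icc):
    """Standard error of measurement: SD * sqrt(1 - ICC)."""
    return between_subject_sd * math.sqrt(1.0 - icc)

def cv_percent(sd, mean):
    """Coefficient of variation as a percentage of the mean."""
    return 100.0 * sd / mean

def smallest_worthwhile_change(between_subject_sd, icc, factor=1.5):
    """SWC taken as 1.5 * SEM, following the thesis's approach (4)."""
    return factor * sem(between_subject_sd, icc)

# Hypothetical example: mean bowling speed 30.0 m/s, SD 2.0 m/s, ICC 0.98
print(round(sem(2.0, 0.98), 3))                         # measurement error, m/s
print(round(cv_percent(2.0, 30.0), 2))                  # variation as % of mean
print(round(smallest_worthwhile_change(2.0, 0.98), 3))  # change worth acting on
```

The higher the ICC, the smaller the SEM for the same spread of scores, which is why the highly reliable speed measures support detecting small training effects while the accuracy measures do not.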
It is recommended that the eight-over pace bowling test be used as a more comprehensive measure of consistency of bowling speed and consistency of bowling accuracy, as bowlers are more likely to be fatigued over eight overs. However, if coaches seek to assess pace bowlers in a shorter time, delimiting the test to the first four overs is recommended. Both versions of the pace bowling test reliably measure only bowling performance outcomes such as peak and mean bowling speed and perceived effort. The second study of this PhD project examined the relationships between selected physical qualities, bowling kinematics, and bowling performance measures. A further purpose of this novel study was to determine whether delivery instructions (i.e., maximal-effort, match-intensity, slower-ball) influenced the strength of the relationships between physical qualities and bowling performance measures. Given that there were three delivery instructions in the bowling test, an additional objective was to explore the relationship between bowling speed and bowling accuracy (i.e., the speed-accuracy trade-off). Thirty-one participants completed an eight-over bowling test in the first session and a series of physical tests spread over two further sessions, with four to seven days between sessions. Mean bowling speed (of all pooled deliveries) was significantly correlated with 1-RM pull-up strength (rs [24] = 0.55, p = 0.01) and 20-m sprint time (rs [30] = -0.37, p = 0.04), and these correlations increased marginally as delivery effort increased (i.e., for the maximal-effort ball). Greater hamstring flexibility was associated with better consistency of bowling speed, but only for the match-intensity delivery (rs [29] = -0.49, p = 0.01). Repeat-sprint ability (i.e., percent decrement on 10 × 20-m sprints, one every 20 s) displayed a stronger correlation with consistency of bowling speed (rs [21] = -0.42, p = 0.06) than with mean bowling speed (rs [21] = 0.15, p = 0.53). 
Bench press strength was moderately related to bowling accuracy for the maximal-effort delivery (rs [26] = -0.42, p = 0.03), with weaker, non-significant (p > 0.05) correlations for the match-intensity and slower-ball deliveries. Bowling accuracy was also significantly related to peak concentric countermovement jump power (rs [28] = -0.41, p = 0.03) and mean peak concentric countermovement jump power (rs [27] = -0.45, p = 0.02), with both physical qualities displaying stronger correlations as delivery effort increased. Greater reactive strength was negatively associated with mean bowling accuracy (rs [30] = 0.38, p = 0.04) and consistency of bowling accuracy (rs [30] = 0.43, p = 0.02) for maximal-effort deliveries only. Faster bowling speeds were correlated with a longer step length (rs [31] = 0.51, p < 0.01) and a shorter power phase duration (rs [31] = -0.45, p = 0.01). Better consistency of bowling accuracy was associated with a faster approach speed (rs [31] = -0.36, p = 0.05) and a greater knee flexion angle at ball release (rs [27] = -0.42, p = 0.03). No speed-accuracy trade-off was observed for the group (rs [31] = -0.28, p = 0.12), indicating that most bowlers could be instructed to train at maximal effort without compromising bowling accuracy. Pull-up strength training and speed-acceleration training were therefore chosen for the "evidence-based" training program (Study 3). Heavy-ball bowling was also included in the evidence-based training program, as it is a specific form of training used previously and because there was a shortage of significant relationships (p < 0.05) between physical qualities and bowling performance measures in Study 2. The third investigation of this PhD project compared the effects of an eight-week evidence-based training program with a normal training program (not a control group) on pace bowling performance, approach speed, speed-acceleration, and pull-up strength. 
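The rs statistics reported throughout this abstract are Spearman rank correlations. A self-contained pure-Python sketch (average ranks for ties; in practice a library routine such as scipy.stats.spearmanr would typically be used):

```python
def _ranks(values):
    """1-based ranks, with tied values sharing their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the tied positions, 1-based
        for idx in order[i:j + 1]:
            ranks[idx] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rs: Pearson correlation of the rank-transformed data."""
    rx, ry = _ranks(x), _ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

print(spearman([1, 2, 3, 4, 5], [2, 1, 4, 3, 5]))  # 0.8
```

Because it operates on ranks, rs captures any monotonic relationship, which suits physical-quality data that need not be linearly related to bowling outcomes.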
Participants were matched for bowling speed and then randomly split into two training groups of six. After an initial two-week familiarisation period of bowling, sprint, and pull-up training, participants completed two training sessions per week and were tested before and after the training intervention. Testing comprised the four-over pace bowling test (Study 1), the 20-m sprint test (Study 2), and the 1-RM pull-up test (Study 2). In training, the volume of bowling and sprinting was constant between groups; the only differences were that the evidence-based training group bowled with heavy balls (250 g and 300 g) as well as a regular ball (156 g), sprinted both with a weighted vest (15% and 20% of body mass) and without one, and performed pull-up training. Participants were instructed to deliver each ball with maximal effort in training, as no speed-accuracy trade-off was observed for the sample in Study 2. The evidence-based training group bowled with poorer accuracy and consistency of accuracy, with only a small improvement in peak and mean bowling speed; heavy-ball bowling may have transferred negatively to regular-ball bowling. Although speculative, a longer evidence-based program may have significantly enhanced bowling speed. Coaches could use either training program to develop performance but should be aware that bowling accuracy may suffer with the evidence-based program. The evidence-based training group displayed slower 20-m sprint times following training (by 0.08 ± 0.05 s). However, the normal training group was also slower (by 0.10 ± 0.09 s), indicating that the potential for speed-acceleration improvement is compromised if speed training is performed immediately after bowling training, most likely because of residual fatigue. Consequently, it is recommended that speed-acceleration training be conducted when bowlers are not fatigued, either in a separate session or at the beginning of a session. 
The evidence-based training group improved their 1-RM pull-up strength by 5.8 ± 6.8 kg (d = 0.68), compared with 0.2 ± 1.7 kg (d = 0.01) for the normal training group; the difference arose because the normal training group was not prescribed pull-up training. As many participants could not complete the pull-up exercise because of insufficient strength, the dumbbell pullover may be a suitable alternative that is more specific to the motion of the bowling arm (i.e., an extended arm). The fourth study of this PhD project explored the acute effects of a heavy-ball bowling warm-up on pace bowling performance and determined whether these acute effects could be enhanced or negated following an evidence-based training program. This study involved the same participants who completed the evidence-based training program in Study 3. These participants performed two different bowling warm-ups (heavy-ball or regular-ball) in the pre- and post-test periods, each followed by the four-over pace bowling test (Study 1). In the pre-test period, bowling accuracy was 8.8 ± 7.4 cm worse for the heavy-ball warm-up than for the regular-ball warm-up (d = 1.19). In the post-test period, however, bowling accuracy was 5.5 ± 6.4 cm better for the heavy-ball warm-up than for the regular-ball warm-up (d = -0.90). A similar trend was observed for consistency of bowling accuracy. These findings indicate that pace bowlers adapt to heavy-ball bowling and bowl more accurately with a regular ball if they warm up with a heavy ball first, but only after eight weeks of heavy-ball training. Coaches could employ a heavy-ball warm-up prior to training or a match, but only after eight weeks of evidence-based training. 
It is hypothesised that an exercise less biomechanically similar to the pace bowling motion, such as resisted push-ups or bench press throws, could be more effective in eliciting potentiation by activating higher-order motor units without transferring negatively to bowling performance. From the studies presented in this thesis, it is concluded that peak and mean bowling speed are the most reliable bowling performance measures and that all kinematic variables apart from approach speed possess excellent reliability. Furthermore, 1-RM pull-up strength and 20-m speed are significantly correlated with bowling speed. An evidence-based training program can develop peak and mean bowling speed, but the cost to bowling accuracy and consistency of bowling accuracy does not make this training program worthwhile for enhancing pace bowling performance. A heavy-ball warm-up impairs bowling accuracy and consistency of bowling accuracy compared with a regular-ball warm-up, but only prior to training with the heavier balls. Pace bowlers adapt to heavy-ball bowling after eight weeks of training, but must use the heavy balls in the warm-up to bowl more accurately with a regular ball; otherwise pace bowling performance remains below optimal.
- Description: Doctor of Philosophy
- Authors: Feros, Simon
- Date: 2015
- Type: Text , Thesis , PhD
- Full Text:
- Description: This thesis sought to reveal the physical and kinematic determinants of pace bowling performance. After drawing on these determinants, a secondary aim was to investigate whether pace bowling performance could be enhanced with chronic resistance training and warm-up strategies. However, before the physical and kinematic determinants of pace bowling performance could be identified, and the effects of two training interventions and warm-ups on pace bowling performance, a new pace bowling test was created, and the test-retest reliability of its performance and kinematic measures were evaluated. Knowledge of a variables’ test-retest reliability is important for interpreting the validity of correlations, but also for the determination of a meaningful change following a training intervention. Only one published study to date has explored the test-retest reliability of a pace bowling assessment, and this test only measured bowling accuracy (1). Previous research has not comprehensively examined the relationships between physical qualities and pace bowling performance. Several important physical qualities (e.g., power, speed-acceleration, flexibility, repeat-sprint ability) have been excluded in correlational research, which may be crucial for optimal pace bowling performance. Furthermore, there is only one published training intervention study on pace bowling research (2). Consequently there is scant evidence for coaches to design training programs proven to enhance pace bowling performance. Baseball pitching studies have trialled the effects of heavy-ball throwing in the warm-up on subsequent throwing velocity and accuracy, but this approach has not been studied in cricket pace bowling, especially after several weeks of training. Therefore, four studies were conducted in this PhD project to address these deficiencies in the literature. 
The purpose of Study 1 (Chapter 3) was to ascertain the test-retest reliability of bowling performance measures (i.e., bowling speed, bowling accuracy, consistency of bowling speed, and consistency of bowling accuracy) and selected bowling kinematics (i.e., approach speed, step length, step-length phase duration, power phase duration, and knee extension angle at front-foot contact and at ball release) in a novel eight-over test, and for the first four overs of this test. The intraclass correlation coefficient (ICC), standard error of measurement (SEM), and coefficient of variation (CV) were used as measures of test-retest reliability (3). Following a three week familiarisation period of bowling, 13 participants completed a novel eight-over bowling test on two separate days with 4–7 days apart. The most reliable performance measures in the bowling test were peak bowling speed (ICC = 0.948–0.975, CV = 1.3–1.9%) and mean bowling speed (ICC = 0.981–0.987, CV = 1.0–1.3%). Perceived effort was partially reliable (ICC = 0.650– 0.659, CV = 3.8–3.9%). However, mean bowling accuracy (ICC = 0.491–0.685, CV = 12.5–16.8%) and consistency of bowling accuracy failed to meet the pre-set standard for acceptable reliability (ICC = 0.434–0.454, CV = 15.3–19.3%). All bowling kinematic variables except approach speed exhibited acceptable reliability (i.e., ICC > 0.8, CV < 10%). The first four overs of the bowling test exhibited slightly poorer test-retest reliability for all measures, compared to the entire eight-over test. There were no systematic biases (i.e., p > 0.05) detected with all variables between bowling tests, indicating there was no learning or fatigue effects. The smallest worthwhile change was established for all bowling performance and kinematic variables, by multiplying the SEM by 1.5 (4). 
It is recommended that the eight-over pace bowling test be used as a more comprehensive measure of consistency of bowling speed and consistency of bowling accuracy, as bowlers are more likely to be fatigued. However, if coaches seek to assess pace bowlers in shorter time, delimiting the test to the first four overs is recommended. Both versions of the pace bowling test are only capable of reliably measuring bowling performance outcomes such as peak and mean bowling speed, and perceived effort. The second study of this PhD project examined the relationships between selected physical qualities, bowling kinematics, and bowling performance measures. Another purpose of this novel study was to determine if delivery instructions (i.e., maximal-effort, match-intensity, slower-ball) influenced the strength of the relationships between physical qualities and bowling performance measures. Given that there were three delivery instructions in the bowling test, an objective of this study was to explore the relationship between bowling speed and bowling accuracy (i.e., speed-accuracy trade-off). Thirty-one participants completed an eight-over bowling test in the first session, and a series of physical tests, spread over two separate sessions. Each session was separated by four to seven days. Mean bowling speed (of all pooled deliveries) was significantly correlated to 1-RM pull-up strength (rs [24] = 0.55, p = 0.01) and 20-m sprint time (rs [30] = -0.37, p = 0.04), but the correlations marginally increased as delivery effort increased (i.e., maximal-effort ball). Greater hamstring flexibility was associated with a better consistency of bowling speed, but only for a match-intensity delivery (rs [29] = -0.49, p = 0.01). Repeat-sprint ability (i.e., percent decrement on 10 × 20-m sprints, on every 20 s) displayed a stronger correlation to consistency of bowling speed (rs [21] = -0.42, p = 0.06) than for mean bowling speed (rs [21] = 0.15, p = 0.53). 
Bench press strength was moderately related to bowling accuracy for a maximal-effort delivery (rs [26] = -0.42, p = 0.03), with weaker but non-significant (p > 0.05) correlations for match-intensity and slower-ball deliveries. Bowling accuracy was also significantly related to peak concentric countermovement jump power (rs [28] = -0.41, p = 0.03) and mean peak concentric countermovement jump power (rs [27] = -0.45, p = 0.02), with both physical qualities displaying stronger correlations as delivery effort increased. Greater reactive strength was negatively associated with mean bowling accuracy (rs [30] = 0.38, p = 0.04) and consistency of bowling accuracy (rs [30] = 0.43, p = 0.02) for maximal-effort deliveries only. Faster bowling speeds were correlated to a longer step length (rs [31] = 0.51, p < 0.01) and quicker power phase duration (rs [31] = -0.45, p = 0.01). A better consistency of bowling accuracy was associated with a faster approach speed (rs [31] = -0.36, p = 0.05) and greater knee flexion angle at ball release (rs [27] = -0.42, p = 0.03). No speedaccuracy trade-off was observed for the group (rs [31] = -0.28, p = 0.12), indicating that most bowlers could be instructed to train at maximal-effort without compromising bowling accuracy. Pull-up strength training and speed-acceleration training were chosen for the “evidence-based” training program (Study 3). Heavy-ball bowling was also considered as part of the evidence-based training program, as it is a specific form of training used previously, and because there was a shortage of significant relationships (p < 0.05) between physical qualities and bowling performance measures in Study 2. The third investigation of this PhD project compared the effects of an eight-week evidence-based training program or normal training program (not a control group) on pace bowling performance, approach speed, speed-acceleration, and pull-up strength. 
Participants were matched for bowling speed and then randomly split into two training groups, with six participants in each group. After an initial two-week familiarisation period of bowling training, sprint training, and pull-up training, participants completed two training sessions per week, and were tested before and after the training intervention. Testing comprised the four-over pace bowling test (Study 1), 20-m sprint test (Study 2), and 1-RM pull-up test (Study 2). In training, the volume of bowling and sprinting was constant between both groups; the only differences were that the evidence-based training group bowled with heavy balls (250 g and 300 g) as well as a regular ball (156 g), sprinted with a weighted-vest (15% and 20% body mass) and without a weighted-vest, and performed pull-up training. Participants were instructed to deliver each ball with maximal effort in training, as no speed-accuracy trade-off was observed for the sample in Study 2. The evidence-based training group bowled with poorer accuracy and consistency of accuracy, with only a small improvement in peak and mean bowling speed. Heavy-ball bowling may have had a negative transfer to regular-ball bowling. Although speculative, a longer evidence-based program may have significantly enhanced bowling speed. Coaches could use both training programs to develop performance but should be aware that bowling accuracy may suffer with the evidence-based program. The evidence-based training group displayed slower 20-m sprint times following training (0.08 ± 0.05 s). However, the normal training group was also slower (0.10 ± 0.09 s), indicating the potential for speed-acceleration improvement is compromised if speed training is performed immediately after bowling training; most likely due to residual fatigue. Consequently it is recommended that speed-acceleration training be conducted when bowlers are not fatigued, in a separate session, or at the beginning of a session. 
The evidence-based training group improved their 1-RM pull-up strength by 5.8 ± 6.8 kg (d = 0.68), compared with 0.2 ± 1.7 kg (d = 0.01) for the normal training group. The difference between training groups is because the normal training group was not prescribed pull-up training. As many participants could not complete the pull-up exercise due to insufficient strength, the dumbbell pullover may be a suitable alternative that is more specific to the motion of the bowling arm (i.e., extended arm). The fourth study of this PhD project explored the acute effects of a heavy-ball bowling warm-up on pace bowling performance, and determined whether these acute effects could be enhanced or negated following an evidence-based training program. This study involved the same participants who completed the evidence-based training program in Study 3. These participants performed two different bowling warm-ups (heavy-ball or regular-ball) in both the pre- and post-test periods, each followed by the four-over pace bowling test (Study 1). In the pre-test period, bowling accuracy was 8.8 ± 7.4 cm worse for the heavy-ball warm-up compared to the regular-ball warm-up (d = 1.19). In the post-test period, however, bowling accuracy was 5.5 ± 6.4 cm better for the heavy-ball warm-up compared to the regular-ball warm-up (d = -0.90). A similar trend was observed for consistency of bowling accuracy. These findings indicate that pace bowlers adapt to heavy-ball bowling, and bowl more accurately with a regular ball if they warm up with a heavy ball first (but only after eight weeks of heavy-ball training). Coaches could employ a heavy-ball warm-up prior to training or a match, but only after eight weeks of evidence-based training.
It is hypothesised that an exercise less biomechanically similar to the pace bowling motion, such as resisted push-ups or bench press throws, could be more effective in eliciting potentiation by activating higher-order motor units without negatively transferring to bowling performance. From the studies presented in this thesis, it is concluded that peak and mean bowling speed are the most reliable bowling performance measures, and all kinematic variables apart from approach speed possess excellent reliability. Furthermore, 1-RM pull-up strength and 20-m speed are significantly correlated with bowling speed. An evidence-based training program can develop peak and mean bowling speed, but the accompanying cost to bowling accuracy and consistency of bowling accuracy means this program is not worthwhile for enhancing overall pace bowling performance. A heavy-ball warm-up impairs bowling accuracy and consistency of bowling accuracy compared to a regular-ball warm-up, but only prior to training with the heavier balls. Pace bowlers adapt to heavy-ball bowling after eight weeks of training, but must use the heavy balls in the warm-up to bowl more accurately with a regular ball; otherwise, pace bowling performance is below optimal.
- Description: Doctor of Philosophy
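The physical-quality and kinematic relationships reported in the thesis summary above are Spearman rank correlations (rs). A minimal, dependency-free sketch of that statistic follows; it is a generic textbook implementation (average ranks for ties, then Pearson correlation of the ranks), not the author's analysis code.

```python
def ranks(values):
    """Assign 1-based average ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over any run of tied values.
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i+1..j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho: Pearson correlation computed on the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

Because the statistic works on ranks, any monotonically increasing relationship (not just a linear one) yields rs = 1, which suits physical-quality data that are rarely linear.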
Injury rate and patterns of Sydney grade cricketers: A prospective study of injuries in 408 cricketers
- Soomro, Najeebullah, Redrup, Daniel, Evens, Chris, Strasiotto, Luke, Singh, Shekhar, Lyle, David, Singh, Himalaya, Ferdinands, Rene, Sanders, Ross
- Authors: Soomro, Najeebullah , Redrup, Daniel , Evens, Chris , Strasiotto, Luke , Singh, Shekhar , Lyle, David , Singh, Himalaya , Ferdinands, Rene , Sanders, Ross
- Date: 2018
- Type: Text , Journal article
- Relation: Postgraduate Medical Journal Vol. 94, no. 1114 (2018), p. 425-431
- Full Text:
- Reviewed:
- Description: Background The grade cricket competition, also known as premier cricket
Adaptation, translation and reliability of the Australian 'Juniors Enjoying Cricket Safely' injury risk perception questionnaire for Sri Lanka
- Gamage, Prasanna, Fortington, Lauren, Finch, Caroline
- Authors: Gamage, Prasanna , Fortington, Lauren , Finch, Caroline
- Date: 2018
- Type: Text , Journal article
- Relation: BMJ Open Sport and Exercise Medicine Vol. 4, no. 1 (2018), p. 1-9
- Full Text:
- Reviewed:
- Description: Objectives Cricket is a very popular sport in Sri Lanka. In this setting there has been limited research; specifically, there is little knowledge of cricket injuries. To support future research possibilities, the aim of this study was to cross-culturally adapt, translate and test the reliability of an Australian-developed questionnaire for the Sri Lankan context. Methods The Australian 'Juniors Enjoying Cricket Safely' (JECS-Aus) injury risk perception questionnaire was cross-culturally adapted to suit the Sri Lankan context and subsequently translated into the two main languages (Sinhala and Tamil) based on standard forward-back translation. The translated questionnaires were examined for content validity by two language schoolteachers. The questionnaires were completed twice, 2 weeks apart, by two groups of school cricketers (males) aged 11-15 years (Sinhala (n=24), Tamil (n=30)) to assess reliability. Test-retest scores were evaluated for agreement. Where responses were <100% agreement, Cohen's kappa (κ) statistics were calculated. Questions with moderate-to-poor test-retest reliability (κ <0.6) were reconsidered for modification. Results Both the Sinhala and Tamil questionnaires had 100% agreement for questions on demographic data, and 88%-100% agreement for questions on participation in cricket and injury history. Of the injury risk perception questions, 72% (Sinhala) and 90% (Tamil) questions showed a substantial (κ =0.61-0.8) and almost perfect (κ =0.81-1.0) test-retest agreement. Conclusion The adapted and translated JECS-SL questionnaire demonstrated strong reliability. This is the first study to adapt the JECS-Aus questionnaire for use in a different population, providing an outcome measure for assessing injury risk perceptions in Sri Lankan junior cricketers.
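The test-retest evaluation in the questionnaire study above relies on Cohen's kappa (κ < 0.6 flagged for modification, 0.61-0.8 substantial, 0.81-1.0 almost perfect). This is a minimal sketch of the standard two-rating kappa formula, not the study's own code:

```python
def cohens_kappa(test, retest):
    """Cohen's kappa: chance-corrected agreement between two categorical
    ratings of the same items (here, test and retest questionnaire answers)."""
    assert len(test) == len(retest) and test
    n = len(test)
    categories = set(test) | set(retest)
    # Observed proportion of exact agreement.
    p_obs = sum(a == b for a, b in zip(test, retest)) / n
    # Agreement expected by chance, from each rating's marginal distribution.
    p_exp = sum((test.count(c) / n) * (retest.count(c) / n) for c in categories)
    if p_exp == 1.0:  # degenerate case: only one shared category
        return 1.0
    return (p_obs - p_exp) / (1 - p_exp)
```

Note why the study computed κ only where raw agreement fell below 100%: kappa discounts the agreement that two ratings would show by chance alone, so a high percent agreement can still yield a modest κ when one response dominates.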
The reliability and sensitivity of performance measures in a novel pace-bowling test
- Feros, Simon, Young, Warren, O’Brien, Brendan
- Authors: Feros, Simon , Young, Warren , O’Brien, Brendan
- Date: 2018
- Type: Text , Journal article
- Relation: International Journal of Sports Physiology and Performance Vol. 13, no. 2 (2018), p. 151-155
- Full Text:
- Reviewed:
- Description: Objectives: To evaluate the reliability and sensitivity of performance measures in a novel pace-bowling test. Methods: Thirteen male amateur-club fast bowlers completed a novel pace-bowling test on 2 separate occasions, 4–7 d apart. Participants delivered 48 balls (8 overs) at 5 targets on a suspended sheet situated behind a live batter, who stood in a right-handed and left-handed stance for an equal number of deliveries. Delivery instruction was frequently changed, with all deliveries executed in a preplanned sequence. Data on ball-release speed were captured by radar gun. A high-speed camera captured the moment of ball impact on the target sheet for assessment of radial error and bivariate variable error. Delivery rating of perceived exertion (0–100%) was collected as a measure of intensity. Results: Intraclass correlation coefficients and coefficients of variation revealed excellent reliability for peak and mean ball-release speed, acceptable reliability for delivery rating of perceived exertion, and poor reliability for mean radial error, bivariate variable error, and variability of ball-release speed. The smallest worthwhile change indicated high sensitivity with peak and mean ball-release speed and lower sensitivity with mean radial error and bivariate variable error. Conclusions: The novel pace-bowling test incorporates improvements in ecological validity compared with its predecessors and can be used to provide a more comprehensive evaluation of pace-bowling performance. Data on the smallest worthwhile change can improve interpretation of pace-bowling research findings and may therefore influence recommendations for applied practice. © 2018 Human Kinetics, Inc.
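The reliability and sensitivity measures in the pace-bowling test abstract above rest on two simple statistics: the coefficient of variation (typical error as a percentage of the mean) and the smallest worthwhile change. A minimal sketch follows, assuming Hopkins' common convention of SWC = 0.2 × the between-athlete standard deviation; the paper's exact computation is not restated here.

```python
import statistics

def coefficient_of_variation(values):
    """CV (%) = sample SD / mean * 100; a lower CV indicates
    better test-retest reliability of the measure."""
    return statistics.stdev(values) / statistics.mean(values) * 100.0

def smallest_worthwhile_change(values, factor=0.2):
    """SWC as a fraction of the between-athlete SD.
    factor=0.2 is Hopkins' default (an assumption here)."""
    return factor * statistics.stdev(values)

# Hypothetical (illustrative, not from the paper) peak ball-release speeds, km/h:
speeds = [118.0, 122.0, 126.0]
```

A measure is then called "sensitive" when its typical error is smaller than its SWC, which is the comparison behind the abstract's conclusion that ball-release speed is sensitive while radial error is less so.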
Progressive rebels or Boys' Own adventure? The 1935 Australian cricket tour of India; breaking down social and racial barriers
- Authors: Ponsford, Megan
- Date: 2016
- Type: Text , Thesis , PhD
- Full Text:
- Description: In October 1935, a touring party embarked on the inaugural tour of India by an Australian cricket team. To a great, and somewhat stereotypical, extent popular representations of Indian-Australian relations are viewed through the lens of cricket, the national game in both countries. This dissertation about a significant, yet overlooked, chapter in sporting history examines the Australian cricketers’ response to the social, racial and political hierarchies of late-colonial India. The experience of the touring party encouraged a re-imagining of ideological perspectives, and this thesis identifies a uniquely Australian subjectivity to the British colonisation of India. The tour between the colony (India) and the dominion (Australia) can be interpreted as an anti-imperial gesture. Both countries were attempting to forge relationships that would be independent from Britain. The role of cricket, itself experiencing a renaissance during the 1930s as it transformed from a largely amateur pursuit to an increasingly professional occupation, is interrogated. As part of this transformation, international cricket positioned itself as an increasingly politicised global entity within the broader turbulence of the first half of the twentieth century. All those involved in the tour are now dead. However, a close historical analysis of previously lost, highly personalised primary material (letters, manuscripts, photographs and cricket ephemera) enables an interpretation of the players’ experience. This thesis argues that sporting events can be interpreted as cultural ciphers, yet scholars and the wider sports-writing community have neglected the historical significance of the 1935/36 tour. The unofficial status of the tour and its highly professional emphasis alienated it from the amateur ideals of Australian cricket. This transnational, multi-disciplinary approach addresses a lacuna in the professional trajectory of cricket. It also provides a new understanding and historical counter-narrative of mid-twentieth-century Indian-Australian sporting history and cultural exchange.
- Description: Doctor of Philosophy
Injury epidemiology among Australian female cricketers
- Authors: Perera, Nirmala
- Date: 2016
- Type: Text , Thesis , PhD
- Full Text:
- Description: Cricket is a male-dominated sport; however, its popularity among females is increasing. Like other sports, participation in cricket poses the risk of injury to players. Injury problems for female cricketers are virtually unknown, as studies examining cricket injuries include almost exclusively male participants. In other sports, the types of injuries experienced by men and women are known to be different. Therefore, a clear understanding of the extent and types of injuries sustained by female cricket players is required, to underpin appropriately targeted injury prevention strategies. This thesis provides the first detailed epidemiological profile of cricket injuries sustained by women, by: 1. conducting a systematic review describing injuries in competitive team bat or stick sports in women, to enable cricket injuries to be viewed within the perspective of wider, but relevant, injury data; 2. evaluating existing data sources relating to hospital admissions from Victoria and Queensland and successful insurance claims across Australia; 3. examining the nature and incidence of cricket injuries in elite female players using Cricket Australia’s Athlete Management System; and 4. conducting a nationwide self-report survey of injuries during the 2014–15 season. This PhD research represents participants from different levels of play, across age groups and across Australia. The findings indicate that the incidence of injuries among female cricketers was higher than expected based on previous findings in comparable sports, except when considered in relation to insurance claims. Cricket injury data from hospital presentations, insurance claims, the AMS (Fair Play AMS 2016) and self-reported surveys, each of which represents a different level of the sports injury pyramid, identified all-rounders and pace bowlers as having a higher incidence of injury than players in other positions.
The most frequently reported injuries were to the head, hands, knees and ankles. The most common injury types were dislocations/sprains/strains, fractures, muscle injuries, joint injuries and gradual-onset injuries. At the elite level, lumbar spine stress fractures accounted for a significant amount of time lost from the sport. In this thesis, findings from the insurance claims, self-reported survey and AMS (Fair Play AMS 2016) data indicated that most injuries were of low severity and were more likely to be treated outside of healthcare facilities such as hospitals. In summary, patterns of the most common injuries, in terms of anatomical location and nature, were consistent across community-level players, with some similarities to elite-level players. However, injury mechanisms and risk factors may differ depending on the level of competition and players’ skill. Recommendations are that ongoing injury surveillance should be conducted at all levels of the sport, and that surveillance methodology should be tailored to the specific setting, personnel and available resources. Therefore, before implementing an injury surveillance system at the community level of the sport, more research is needed to fully understand what type of injury surveillance system might be feasible and suitable in this context.
- Description: Doctor of Philosophy
Too many rib ticklers? Injuries in Australian women's cricket (PhD Academy Award)
- Authors: Perera, Nirmala
- Date: 2019
- Type: Text , Journal article , Editorial Material
- Relation: British Journal of Sports Medicine Vol. 53, no. 22 (Nov 2019), p. 1436-1437
- Full Text:
- Reviewed:
Medical-attention injuries in community cricket : a systematic review
- McLeod, Geordie, O'Connor, Siobhan, Morgan, Damian, Kountouris, Alex, Finch, Caroline, Fortington, Lauren
- Authors: McLeod, Geordie , O'Connor, Siobhan , Morgan, Damian , Kountouris, Alex , Finch, Caroline , Fortington, Lauren
- Date: 2020
- Type: Text , Journal article , Review
- Relation: BMJ Open Sport and Exercise Medicine Vol. 6, no. 1 (2020), p.
- Full Text:
- Reviewed:
- Description: Objectives The aim was to identify and describe outcomes from original published studies that present the number, nature, mechanism and severity of medically treated injuries sustained in community-level cricket. Design Systematic review. Methods Nine databases were systematically searched to December 2019 using terms "cricket
Concussion assessment and management — what do community-level cricket participants know?
- Kodikara, Dulan, Plumb, Mandy, Twomey, Dara
- Authors: Kodikara, Dulan , Plumb, Mandy , Twomey, Dara
- Date: 2023
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 26, no. 9 (2023), p. 448-453
- Full Text:
- Reviewed:
- Description: Objectives: To explore Australian cricket participants' knowledge of concussion assessment and management, and awareness of current concussion guidelines. Design: Cross-sectional survey. Methods: Novel and validated surveys were disseminated online among Australian cricket players and officials aged over 16 years at the end of the 2018/19 cricket season. Data were collected on knowledge and awareness of concussion and analysed using descriptive statistics and cross-tabulations. Further comparisons were made for the players between injured and non-injured, and helmet wearers and non-helmet wearers, using Fisher's exact test. Results: Both players (n = 224, 93 %) and officials (n = 36, 100 %) demonstrated strong knowledge of the importance of immediately evaluating suspected concussions. In comparison with players without helmets (n = 11), those using helmets (n = 135) considered replacing their helmets after a concussion to be vital to concussion assessment (p = 0.02). Overall, 80–97 % of players and 81–97 % of officials understood the importance of many factors regarding concussion management. When concussion management knowledge was compared by injury status, injured players (n = 17, 94 %) believed someone with a concussion should be hospitalised immediately, in contrast to non-injured players (n = 154, 69 %) (p = 0.04). Players (63 %) were less aware of concussion guidelines than officials (81 %). Conclusions: Overall, knowledge of concussion assessment and management was satisfactory. However, there were discrepancies among players on some aspects of awareness of concussion guidelines. Increasing players' familiarity and experience in using the concussion guidelines is warranted. Targeted campaigns are needed to further improve concussion recognition and treatment at community-level cricket, so all participants play a role in making cricket a safe sport. © 2023
Head, neck, and facial injuries in Australian cricket
- Authors: Kodikara, Dulan
- Date: 2023
- Type: Text , Thesis , PhD
- Full Text:
- Description: Head, neck and facial (HNF) injuries are a significant concern in cricket due to the nature of the game and the potential impact of fast-moving balls and collisions. These types of injuries occur as a result of direct hits from the cricket ball, accidental collisions between players or falls during fielding or batting. HNF injuries can range from minor cuts and bruises to more severe concussions, fractures, or dental trauma. While some HNF injuries in cricket can be career-ending and severe, others may not be as catastrophic. Over the past decade, there has been a noticeable increase in the incidence of HNF injuries in elite-level cricket, and the tragic death of an Australian test cricketer in 2014 from a head injury heightened awareness of the seriousness and potential fatality of such injuries in the sport. To mitigate the risk of serious injuries, cricket players are encouraged to wear protective equipment such as helmets and neck guards. At the elite level of the sport, stringent safety protocols and regulations are enforced to prioritise player wellbeing, ensuring that immediate medical attention is available during training or games. Further, routine injury surveillance at the elite level has proven effective in monitoring and reducing the likelihood of serious HNF injuries. Nevertheless, there is a noticeable lack of research investigating HNF injuries among cricket participants, particularly at the community level. This lack of reporting hampers the identification and implementation of effective strategies to minimise the risk of such injuries. This thesis seeks to bridge this research gap by examining HNF injuries in community-level cricket under two broad objectives, providing valuable insights for injury prevention and risk mitigation strategies. The first objective of this thesis was to develop a comprehensive understanding of HNF cricket injury epidemiology and the reporting of helmet usage. 
A systematic review was conducted, analysing 29 studies to determine the incidence, nature, and mechanisms of HNF injuries in cricket, the reported use of helmets and ‘gold standard’ definitions. Facial fractures and concussions were the most frequently specified types of injuries, and the impact of the ball was reported as the most common mechanism for sustaining HNF injuries in cricket. Only three studies (10%) reported the use of helmets. The systematic review highlighted the lack of evidence regarding the reporting of HNF cricket injuries according to international cricket consensus injury definitions, as well as the limited data on helmet usage at the time of injury. Additionally, the review identified gaps in evidence concerning HNF injuries across different age groups, levels of play and diverse populations, along with discrepancies in reporting injury-specific mechanisms. Community-level HNF cricket injuries that required hospitalisation in Victoria, Australia, over a decade spanning 2007/08 to 2016/17, were also reviewed under the first objective. During this period, Victorian hospitals treated 3,907 HNF cricket injuries. Male participants accounted for a higher number of injuries than female participants, and the age group most commonly requiring hospital treatment was 10–14 years. Open wounds were the most frequent type of injury (30%), and the primary mechanism for HNF cricket injuries during this decade was being hit, struck, or crushed (86%). Our literature review and the hospital study form the ideal platform for injury prevention efforts by establishing HNF injury prevalence and common injury mechanisms. The second broad objective of this thesis was to investigate the use of cricket helmets among cricket participants, to study the ability of Australian cricket participants to perceive injury risk and to explore their knowledge and awareness of concussion assessment and management. 
An online survey was conducted to address each facet of our second objective. Over 90% of the players and 50% of the officials reported wearing a helmet during the 2018/19 cricket season, but most did not use a neck protector. Most of the helmets used met the recommended British Standards, and the most common brand used was Masuri. For most of the players and officials who participated in our survey, comfort and the ability to prevent HNF injuries were the two most important factors affecting their decision to purchase a cricket helmet. More than 80% of players and almost 50% of officials expressed the belief that helmets were not necessary for activities such as bowling and fielding at a distance from the batter. Yet, the fact that more than 80% of all participants expressed their willingness to keep using helmets under compulsory regulations indicates that implementing mandatory helmet rules might result in a significant increase in helmet adoption and enhance the overall safety of the sport. Over 70% of our survey participants demonstrated satisfactory levels of knowledge regarding concussion assessment and management. These findings suggest that the potential for severe complications stemming from cricket-related concussions could be reduced, particularly in light of the limited availability of qualified medical professionals at the community level. The strong understanding of concussion guidelines among our survey participants implies that they would be inclined to prioritise safety and choose helmets that align with the recommended safety standards. In summary, this PhD research has achieved its objective of making the first large-scale scientific contribution to enhancing safety and preventing HNF injuries among participants in community-level cricket in Australia. 
Additionally, this research effectively assessed the participants’ knowledge, comprehension and attitudes regarding utilising protective helmets and the importance of following Cricket Australia’s concussion guidelines.
- Description: Doctor of Philosophy
- Authors: Kodikara, Dulan
A systematic review of head, neck and facial injuries in cricket
- Kodikara, Dulan, Twomey, Dara, Plumb, Mandy
- Authors: Kodikara, Dulan , Twomey, Dara , Plumb, Mandy
- Date: 2022
- Type: Text , Journal article
- Relation: International Journal of Sports Medicine Vol. 43, no. 6 (2022), p. 496-504
- Full Text:
- Reviewed:
- Description: This systematic review was conducted to identify the incidence, nature and mechanisms of head, neck and facial (HNF) injuries in cricket and the reported use of helmets. Five databases were searched up to 30th November 2020. From peer-reviewed cricket injury studies published in English, studies reporting on HNF cricket injuries as per the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were selected. Twenty-nine studies were included. HNF injuries had a cumulative total of 794/5,886 injuries, equating to 13% of all injuries. Non-specified HNF injuries (n=210, 26%) were the most prevalent type of injury, followed by non-specified head injuries (n=130, 16%), other non-specified fractures (n=119, 15%) and concussions (n=60, 8%). The impact of the ball was reported as the most common mechanism for sustaining HNF injuries in cricket. The use of helmets was reported in only three studies (10%). From studies reporting on HNF cricket injuries, facial fractures and concussions were the most common specified types of injury. There is little evidence on reporting of HNF cricket injuries as per the international cricket consensus injury definitions, as well as the use of helmets at the time of injury. © 2022 American Institute of Physics Inc. All rights reserved.