The reliability of musculoskeletal screening tests used in cricket
- Authors: Dennis, Rebecca , Finch, Caroline , Elliott, Bruce , Farhart, Patrick
- Date: 2008
- Type: Text , Journal article
- Relation: Physical Therapy in Sport Vol. 9, no. 1 (2008), p. 25-33
- Full Text:
- Reviewed:
- Description: Objectives: To determine the inter- and intra-observer reliability of a field-based musculoskeletal screening protocol used to measure potential injury risk factors in cricket fast bowlers. Design: Test-retest reliability study. Setting: High performance Australian cricket. Participants: Ten volunteers. Two sports physiotherapists conducted the testing. Main outcome measures: Participants completed the following tests: knee extension; modified Thomas test (hip extension and abduction); hip internal and external rotation; combined elevation; ankle dorsiflexion lunge; bridging hold; prone four point hold; and calf heel raises. Methods: For each of the tests, the participants were tested by each physiotherapist twice, and the inter- and intra-observer reliability were concurrently assessed. Results: The inter-observer reliability of the tests was generally poor, with only four of the ten tests having an intraclass correlation coefficient (ICC) greater than 0.80 (range of ICCs 0.27-0.99). The intra-observer reliability of the tests was considerably higher, with nine tests having an ICC greater than 0.80 (range of ICCs 0.56-0.99). Conclusions: With the exception of the bridging hold, all tests would be considered acceptable where only one observer was conducting the testing. However, only the ankle dorsiflexion lunge, combined elevation test, calf heel raise test and prone four point hold have acceptable reliability when there are multiple physiotherapists recording measurements. © 2007 Elsevier Ltd. All rights reserved.
- Description: C1
The determinants and development of fast bowling performance in cricket
- Authors: Feros, Simon
- Date: 2015
- Type: Text , Thesis , PhD
- Full Text:
- Description: This thesis sought to reveal the physical and kinematic determinants of pace bowling performance. After drawing on these determinants, a secondary aim was to investigate whether pace bowling performance could be enhanced with chronic resistance training and warm-up strategies. However, before the physical and kinematic determinants of pace bowling performance could be identified, and the effects of two training interventions and warm-up strategies on pace bowling performance examined, a new pace bowling test had to be created and the test-retest reliability of its performance and kinematic measures evaluated. Knowledge of a variable's test-retest reliability is important for interpreting the validity of correlations, and also for determining a meaningful change following a training intervention. Only one published study to date has explored the test-retest reliability of a pace bowling assessment, and that test measured only bowling accuracy (1). Previous research has not comprehensively examined the relationships between physical qualities and pace bowling performance. Several important physical qualities (e.g., power, speed-acceleration, flexibility, repeat-sprint ability) that may be crucial for optimal pace bowling performance have been excluded from correlational research. Furthermore, there is only one published training intervention study in pace bowling research (2). Consequently, there is scant evidence for coaches to design training programs proven to enhance pace bowling performance. Baseball pitching studies have trialled the effects of heavy-ball throwing in the warm-up on subsequent throwing velocity and accuracy, but this approach has not been studied in cricket pace bowling, especially after several weeks of training. Therefore, four studies were conducted in this PhD project to address these deficiencies in the literature. 
The purpose of Study 1 (Chapter 3) was to ascertain the test-retest reliability of bowling performance measures (i.e., bowling speed, bowling accuracy, consistency of bowling speed, and consistency of bowling accuracy) and selected bowling kinematics (i.e., approach speed, step length, step-length phase duration, power phase duration, and knee extension angle at front-foot contact and at ball release) in a novel eight-over test, and for the first four overs of this test. The intraclass correlation coefficient (ICC), standard error of measurement (SEM), and coefficient of variation (CV) were used as measures of test-retest reliability (3). Following a three-week familiarisation period of bowling, 13 participants completed a novel eight-over bowling test on two separate days, 4–7 days apart. The most reliable performance measures in the bowling test were peak bowling speed (ICC = 0.948–0.975, CV = 1.3–1.9%) and mean bowling speed (ICC = 0.981–0.987, CV = 1.0–1.3%). Perceived effort was partially reliable (ICC = 0.650–0.659, CV = 3.8–3.9%). However, mean bowling accuracy (ICC = 0.491–0.685, CV = 12.5–16.8%) and consistency of bowling accuracy (ICC = 0.434–0.454, CV = 15.3–19.3%) failed to meet the pre-set standard for acceptable reliability. All bowling kinematic variables except approach speed exhibited acceptable reliability (i.e., ICC > 0.8, CV < 10%). The first four overs of the bowling test exhibited slightly poorer test-retest reliability for all measures compared with the entire eight-over test. No systematic biases (i.e., p > 0.05) were detected for any variable between bowling tests, indicating no learning or fatigue effects. The smallest worthwhile change was established for all bowling performance and kinematic variables by multiplying the SEM by 1.5 (4). 
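The reliability statistics named above combine in a simple way: the SEM can be derived from the between-bowler standard deviation and the ICC, and the smallest worthwhile change is then a multiple of the SEM (1.5 in this thesis). A minimal sketch, using illustrative numbers rather than values from the study:

```python
import math


def sem(between_subject_sd: float, icc: float) -> float:
    """Standard error of measurement: SD * sqrt(1 - ICC)."""
    return between_subject_sd * math.sqrt(1.0 - icc)


def smallest_worthwhile_change(between_subject_sd: float, icc: float,
                               multiplier: float = 1.5) -> float:
    """SWC as a multiple of the SEM (the thesis multiplies SEM by 1.5)."""
    return multiplier * sem(between_subject_sd, icc)


# Hypothetical example: bowling-speed SD of 3.0 km/h and ICC of 0.96
# give SEM = 0.6 km/h and SWC = 0.9 km/h.
print(sem(3.0, 0.96))
print(smallest_worthwhile_change(3.0, 0.96))
```

A change smaller than the SWC on retest would then be indistinguishable from measurement noise under these assumptions.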
It is recommended that the eight-over pace bowling test be used as a more comprehensive measure of consistency of bowling speed and consistency of bowling accuracy, as bowlers are more likely to be fatigued. However, if coaches seek to assess pace bowlers in a shorter time, delimiting the test to the first four overs is recommended. Both versions of the pace bowling test can only reliably measure bowling performance outcomes such as peak and mean bowling speed, and perceived effort. The second study of this PhD project examined the relationships between selected physical qualities, bowling kinematics, and bowling performance measures. Another purpose of this novel study was to determine whether delivery instructions (i.e., maximal-effort, match-intensity, slower-ball) influenced the strength of the relationships between physical qualities and bowling performance measures. Given that there were three delivery instructions in the bowling test, an objective of this study was to explore the relationship between bowling speed and bowling accuracy (i.e., speed-accuracy trade-off). Thirty-one participants completed an eight-over bowling test in the first session, and a series of physical tests spread over two separate sessions. Each session was separated by four to seven days. Mean bowling speed (of all pooled deliveries) was significantly correlated to 1-RM pull-up strength (rs [24] = 0.55, p = 0.01) and 20-m sprint time (rs [30] = -0.37, p = 0.04), but the correlations marginally increased as delivery effort increased (i.e., maximal-effort ball). Greater hamstring flexibility was associated with a better consistency of bowling speed, but only for a match-intensity delivery (rs [29] = -0.49, p = 0.01). Repeat-sprint ability (i.e., percent decrement on 10 × 20-m sprints, one every 20 s) displayed a stronger correlation to consistency of bowling speed (rs [21] = -0.42, p = 0.06) than to mean bowling speed (rs [21] = 0.15, p = 0.53). 
Bench press strength was moderately related to bowling accuracy for a maximal-effort delivery (rs [26] = -0.42, p = 0.03), with weaker but non-significant (p > 0.05) correlations for match-intensity and slower-ball deliveries. Bowling accuracy was also significantly related to peak concentric countermovement jump power (rs [28] = -0.41, p = 0.03) and mean peak concentric countermovement jump power (rs [27] = -0.45, p = 0.02), with both physical qualities displaying stronger correlations as delivery effort increased. Greater reactive strength was negatively associated with mean bowling accuracy (rs [30] = 0.38, p = 0.04) and consistency of bowling accuracy (rs [30] = 0.43, p = 0.02) for maximal-effort deliveries only. Faster bowling speeds were correlated to a longer step length (rs [31] = 0.51, p < 0.01) and a quicker power phase duration (rs [31] = -0.45, p = 0.01). A better consistency of bowling accuracy was associated with a faster approach speed (rs [31] = -0.36, p = 0.05) and a greater knee flexion angle at ball release (rs [27] = -0.42, p = 0.03). No speed-accuracy trade-off was observed for the group (rs [31] = -0.28, p = 0.12), indicating that most bowlers could be instructed to train at maximal effort without compromising bowling accuracy. Pull-up strength training and speed-acceleration training were chosen for the "evidence-based" training program (Study 3). Heavy-ball bowling was also considered as part of the evidence-based training program, as it is a specific form of training used previously, and because there was a shortage of significant relationships (p < 0.05) between physical qualities and bowling performance measures in Study 2. The third investigation of this PhD project compared the effects of an eight-week evidence-based training program with a normal training program (not a control group) on pace bowling performance, approach speed, speed-acceleration, and pull-up strength. 
Participants were matched for bowling speed and then randomly split into two training groups, with six participants in each group. After an initial two-week familiarisation period of bowling training, sprint training, and pull-up training, participants completed two training sessions per week, and were tested before and after the training intervention. Testing comprised the four-over pace bowling test (Study 1), 20-m sprint test (Study 2), and 1-RM pull-up test (Study 2). In training, the volume of bowling and sprinting was constant between the groups; the only differences were that the evidence-based training group bowled with heavy balls (250 g and 300 g) as well as a regular ball (156 g), sprinted with a weighted vest (15% and 20% body mass) and without one, and performed pull-up training. Participants were instructed to deliver each ball with maximal effort in training, as no speed-accuracy trade-off was observed for the sample in Study 2. The evidence-based training group bowled with poorer accuracy and consistency of accuracy, with only a small improvement in peak and mean bowling speed. Heavy-ball bowling may have had a negative transfer to regular-ball bowling. Although speculative, a longer evidence-based program may have significantly enhanced bowling speed. Coaches could use both training programs to develop performance but should be aware that bowling accuracy may suffer with the evidence-based program. The evidence-based training group displayed slower 20-m sprint times following training (0.08 ± 0.05 s). However, the normal training group was also slower (0.10 ± 0.09 s), indicating that the potential for speed-acceleration improvement is compromised if speed training is performed immediately after bowling training, most likely due to residual fatigue. Consequently, it is recommended that speed-acceleration training be conducted when bowlers are not fatigued, either in a separate session or at the beginning of a session. 
The evidence-based training group improved their 1-RM pull-up strength by 5.8 ± 6.8 kg (d = 0.68), compared with the normal training group's 0.2 ± 1.7 kg (d = 0.01). This difference arises because the normal training group was not prescribed pull-up training. As many participants could not complete the pull-up exercise due to insufficient strength, the dumbbell pullover may be a suitable alternative that is more specific to the motion of the bowling arm (i.e., extended arm). The fourth study of this PhD project explored the acute effects of a heavy-ball bowling warm-up on pace bowling performance, and determined whether these acute effects could be enhanced or negated following an evidence-based training program. This study involved the same participants who completed the evidence-based training program in Study 3. These participants performed two different bowling warm-ups (heavy-ball or regular-ball) in the pre- and post-test periods, each followed by the four-over pace bowling test (Study 1). In the pre-test period, bowling accuracy was 8.8 ± 7.4 cm worse for the heavy-ball warm-up compared with the regular-ball warm-up (d = 1.19). In the post-test period, however, bowling accuracy was 5.5 ± 6.4 cm better after the heavy-ball warm-up than after the regular-ball warm-up (d = -0.90). A similar trend was observed for consistency of bowling accuracy. These findings indicate that pace bowlers adapt to heavy-ball bowling, and bowl more accurately with a regular ball if they warm up with a heavy ball first (but only after eight weeks of heavy-ball training). Coaches could employ a heavy-ball warm-up prior to training or a match, but only after eight weeks of evidence-based training. 
It is hypothesised that an exercise less biomechanically similar to the pace bowling motion, such as resisted push-ups or bench press throws, could be more effective in eliciting potentiation by activating higher-order motor units without negatively transferring to bowling performance. From the studies presented in this thesis, it is concluded that peak and mean bowling speed are the most reliable bowling performance measures, and that all kinematic variables apart from approach speed possess excellent reliability. Furthermore, 1-RM pull-up strength and 20-m speed are significantly correlated with bowling speed. An evidence-based training program can develop peak and mean bowling speed, but the cost to bowling accuracy and consistency of bowling accuracy does not make this training program worthwhile for enhancing pace bowling performance. A heavy-ball warm-up impairs bowling accuracy and consistency of bowling accuracy compared with a regular-ball warm-up, but only prior to training with the heavier balls. Pace bowlers adapt to heavy-ball bowling after eight weeks of training, but must use the heavy balls in the warm-up to bowl more accurately with a regular ball; otherwise pace bowling performance remains below optimal.
- Description: Doctor of Philosophy
The reliability and sensitivity of performance measures in a novel pace-bowling test
- Authors: Feros, Simon , Young, Warren , O’Brien, Brendan
- Date: 2018
- Type: Text , Journal article
- Relation: International Journal of Sports Physiology and Performance Vol. 13, no. 2 (2018), p. 151-155
- Full Text:
- Reviewed:
- Description: Objectives: To evaluate the reliability and sensitivity of performance measures in a novel pace-bowling test. Methods: Thirteen male amateur-club fast bowlers completed a novel pace-bowling test on 2 separate occasions, 4–7 d apart. Participants delivered 48 balls (8 overs) at 5 targets on a suspended sheet situated behind a live batter, who stood in a right-handed and left-handed stance for an equal number of deliveries. Delivery instruction was frequently changed, with all deliveries executed in a preplanned sequence. Data on ball-release speed were captured by radar gun. A high-speed camera captured the moment of ball impact on the target sheet for assessment of radial error and bivariate variable error. Delivery rating of perceived exertion (0–100%) was collected as a measure of intensity. Results: Intraclass correlation coefficients and coefficients of variation revealed excellent reliability for peak and mean ball-release speed, acceptable reliability for delivery rating of perceived exertion, and poor reliability for mean radial error, bivariate variable error, and variability of ball-release speed. The smallest worthwhile change indicated high sensitivity with peak and mean ball-release speed and lower sensitivity with mean radial error and bivariate variable error. Conclusions: The novel pace-bowling test incorporates improvements in ecological validity compared with its predecessors and can be used to provide a more comprehensive evaluation of pace-bowling performance. Data on the smallest worthwhile change can improve interpretation of pace-bowling research findings and may therefore influence recommendations for applied practice. © 2018 Human Kinetics, Inc.
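The two accuracy measures in this abstract can be computed directly from ball-impact coordinates: radial error is the distance of each impact from the intended target, and bivariate variable error captures the spread of impacts about their own centroid (consistency rather than closeness). A minimal sketch, assuming target-relative (x, y) impact coordinates in centimetres — the coordinate convention and numbers are illustrative, not taken from the study:

```python
import math


def mean_radial_error(impacts, target=(0.0, 0.0)):
    """Mean straight-line distance of each impact from the target point."""
    return sum(math.dist(p, target) for p in impacts) / len(impacts)


def bivariate_variable_error(impacts):
    """Root mean squared distance of impacts from their own centroid."""
    cx = sum(x for x, _ in impacts) / len(impacts)
    cy = sum(y for _, y in impacts) / len(impacts)
    return math.sqrt(
        sum((x - cx) ** 2 + (y - cy) ** 2 for x, y in impacts) / len(impacts)
    )


# Hypothetical impacts: both 5 cm from the target but on opposite sides,
# so the centroid sits on the target and the spread is also 5 cm.
impacts = [(3.0, 4.0), (-3.0, -4.0)]
print(mean_radial_error(impacts))
print(bivariate_variable_error(impacts))
```

Under this convention a bowler could have a small bivariate variable error (a tight cluster) yet a large radial error (the cluster is off-target), which is why the test reports both.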
Fielders and batters are injured too : A prospective cohort study of injuries in junior club cricket
- Finch, Caroline, White, Peta, Dennis, Rebecca, Twomey, Dara, Hayen, Andrew
- Authors: Finch, Caroline , White, Peta , Dennis, Rebecca , Twomey, Dara , Hayen, Andrew
- Date: 2010
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 13, no. 5 (2010), p. 489-495
- Relation: http://purl.org/au-research/grants/nhmrc/565900
- Full Text: false
- Reviewed:
- Description: Internationally, there is a lack of good quality, prospectively collected injury data reported for junior club cricketers. This study describes injury rates according to age level of play and playing positions in junior community-level club cricketers to identify priorities for prevention. A prospective cohort study was used to monitor injuries in 88 under 12 years (U12), 203 U14 and 120 U16 players from the Ballarat Junior Cricket Association, Australia over the 2007/2008 playing season. Injury rates were calculated per 1000 participations when batting, bowling or fielding in matches and training sessions. Injury rate ratios were used to compare rates across age levels of play and position of play. Overall, 47 injuries were reported. Injury rates increased with age level of play with only one U12 player injured. Match injury rates were 3.57 per 1000 U14 participations versus 4.80 per 1000 U16 participations. Training injury rates were 4.20 per 1000 U14 participations versus 5.11 per 1000 U16 participations. On a proportionate basis, injuries occurred equally to fielders, batters and bowlers. There was a trend towards more injuries occurring while batting and fielding in matches, and more injuries occurring while bowling and batting during training sessions. In conclusion, injury rates in junior cricket players are low, but increase with age level of play. Unlike adult forms of the game, injuries occur to fielders and batters at least as frequently as to bowlers, indicating that preventive strategies need to be developed for all junior players and not just bowlers, as has been the focus previously. © 2009 Sports Medicine Australia.
- Description: 2003008120
Barriers to adolescent female participation in cricket
- Fowlie, J., Eime, Rochelle, Griffiths, K.
- Authors: Fowlie, J. , Eime, Rochelle , Griffiths, K.
- Date: 2021
- Type: Text , Journal article
- Relation: Annals of Leisure Research Vol. 24, no. 4 (2021), p. 513-531
- Full Text: false
- Reviewed:
- Description: With the ever-growing number of opportunities for females to participate in a range of sports without the previously associated gender norms, females are becoming more involved in traditionally male-dominated sports; however, we know little about their barriers to participation. In this qualitative study, we investigated the barriers to adolescent female participation in cricket in a regional city in Victoria, Australia. The socio-ecological model was utilized to help guide semi-structured focus-group interviews with 20 adolescent females aged 10–12 years. The participants identified the following as key barriers to their participation in cricket: lack of confidence in skills, having to play cricket with males, an absence of pathway opportunities, and no female-only cricket competitions. These findings highlight the importance of adequate coaching specifically for females, female-only teams and competitions, and the importance of developing a player pathway for adolescent girls. © 2020 Australia and New Zealand Association of Leisure Studies.
Ground sharing between cricket and football in Australia
- Frost, Lionel, Lightbody, Margaret, Halabi, Abdel, Carter, Amanda, Borrowman, Luc
- Authors: Frost, Lionel , Lightbody, Margaret , Halabi, Abdel , Carter, Amanda , Borrowman, Luc
- Date: 2016
- Type: Text , Book chapter
- Relation: Sports Through the Lens of Economic History Chapter 6 p. 89-105
- Full Text: false
- Reviewed:
- Description: Shared use of grounds allowed Australian cricket and football to subsidize each other, but cartel arrangements that determined the use of stadiums and the distribution of benefits and costs between sports may have been less than optimal. Estimation of deadweight losses from the use of stadiums is not possible in the absence of a counterfactual specifying the level of demand if the behaviour of cartel members had been coordinated more effectively. Archival, financial and attendance report data can be used to estimate increases in actual demand under alternative scenarios. In Melbourne and Adelaide, the controlling bodies of cricket and football incurred significant losses in welfare from joint use of their cities’ major stadiums, due to the importance they attached to non-monetary aspects of utility.
- Frost, Lionel, Lightbody, Margaret, Carter, Amanda, Halabi, Abdel
- Authors: Frost, Lionel , Lightbody, Margaret , Carter, Amanda , Halabi, Abdel
- Date: 2016
- Type: Text , Journal article
- Relation: Business History Vol. 58, no. 8 (2016), p. 1164-1182
- Full Text: false
- Reviewed:
- Description: Before 1973, cricket and Australian Football used the Adelaide Oval for major games during their respective seasons. Football's popularity as a spectator sport prompted its organising body to seek to build an improved stadium, but cricket authorities controlled the asset and acted to maintain its specialised character as a cricket ground. A case study of how the gains from a shared capital good are negotiated when asset controllers and users have different objectives is provided. A series of counterfactual scenarios based on football remaining at the Oval is constructed from archival sources and their outcomes projected based on data in financial reports.
Adaptation, translation and reliability of the Australian 'Juniors Enjoying Cricket Safely' injury risk perception questionnaire for Sri Lanka
- Gamage, Prasanna, Fortington, Lauren, Finch, Caroline
- Authors: Gamage, Prasanna , Fortington, Lauren , Finch, Caroline
- Date: 2018
- Type: Text , Journal article
- Relation: BMJ Open Sport and Exercise Medicine Vol. 4, no. 1 (2018), p. 1-9
- Full Text:
- Reviewed:
- Description: Objectives Cricket is a very popular sport in Sri Lanka. In this setting there has been limited research; specifically, there is little knowledge of cricket injuries. To support future research possibilities, the aim of this study was to cross-culturally adapt, translate and test the reliability of an Australian-developed questionnaire for the Sri Lankan context. Methods The Australian 'Juniors Enjoying Cricket Safely' (JECS-Aus) injury risk perception questionnaire was cross-culturally adapted to suit the Sri Lankan context and subsequently translated into the two main languages (Sinhala and Tamil) based on standard forward-back translation. The translated questionnaires were examined for content validity by two language schoolteachers. The questionnaires were completed twice, 2 weeks apart, by two groups of school cricketers (males) aged 11-15 years (Sinhala (n=24), Tamil (n=30)) to assess reliability. Test-retest scores were evaluated for agreement. Where responses were <100% agreement, Cohen's kappa (κ) statistics were calculated. Questions with moderate-to-poor test-retest reliability (κ <0.6) were reconsidered for modification. Results Both the Sinhala and Tamil questionnaires had 100% agreement for questions on demographic data, and 88%-100% agreement for questions on participation in cricket and injury history. Of the injury risk perception questions, 72% (Sinhala) and 90% (Tamil) questions showed a substantial (κ =0.61-0.8) and almost perfect (κ =0.81-1.0) test-retest agreement. Conclusion The adapted and translated JECS-SL questionnaire demonstrated strong reliability. This is the first study to adapt the JECS-Aus questionnaire for use in a different population, providing an outcome measure for assessing injury risk perceptions in Sri Lankan junior cricketers.
- Kodikara, Dulan, Plumb, Mandy, Twomey, Dara
- Authors: Kodikara, Dulan , Plumb, Mandy , Twomey, Dara
- Date: 2020
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 23, no. 12 (2020), p. 1161-1165
- Full Text: false
- Reviewed:
- Description: Objectives: To present an epidemiological profile of hospital-treated head, neck and facial cricket injuries from 2007/08 to 2016/17 in Victoria, Australia. Design: Retrospective analysis of emergency department and hospital admission data. Methods: An analysis of Victorian hospital-treated head, neck and facial cricket injuries among all cricket participants aged over 5 years between July 2007 and June 2017. Results: Over the decade, 3907 head, neck, facial (HNF) cricket injuries were treated in Victorian hospitals. The number of HNF cricket injuries substantially increased in the 2014/15 season from 367 to 435 injuries and remained over 400 in the subsequent years. More injuries were reported for male compared to female participants, 3583 compared to 324 injuries. When adjusted for participation in competitive cricket, the injury incidence rate was 1.3 per 1000 participants for males and 0.4 per 1000 participants for females. The 10–14 year age group most frequently required hospital treatment. Open wounds were the most common type of injury (1166, 29.8%) and the main mechanism for HNF cricket injury for this decade was hit/struck/crush (3361, 86.0%). Conclusions: This study provides a novel and current insight into the incidence and details of HNF injuries among cricket participants in Victoria over a decade. It is evident that males and younger participants, regardless of gender, have a higher risk of sustaining a HNF injury. This study provides a solid evidence base for stakeholders in developing strategies to minimise head, neck and facial injuries and make cricket a safe sport for all. © 2020 Sports Medicine Australia
Concussion assessment and management — what do community-level cricket participants know?
- Kodikara, Dulan, Plumb, Mandy, Twomey, Dara
- Authors: Kodikara, Dulan , Plumb, Mandy , Twomey, Dara
- Date: 2023
- Type: Text , Journal article
- Relation: Journal of Science and Medicine in Sport Vol. 26, no. 9 (2023), p. 448-453
- Full Text:
- Reviewed:
- Description: Objectives: To explore Australian cricket participants' knowledge of concussion assessment and management, and awareness of current concussion guidelines. Design: Cross-sectional survey. Methods: Novel and validated surveys were disseminated online among Australian cricket players and officials aged over 16 years at the end of the 2018/19 cricket season. Data were collected on knowledge and awareness of concussion and analysed using descriptive statistics and cross-tabulations. Further comparisons were made for the players between injured and non-injured, and helmet wearers and non-helmet wearers using Fisher's exact statistical test. Results: Both players (n = 224, 93 %) and officials (n = 36, 100 %) demonstrated strong knowledge of the importance of immediately evaluating suspected concussions. In comparison with players without helmets (n = 11), those using helmets (n = 135) considered replacing their helmets after a concussion to be vital to concussion assessment (p = 0.02). Overall, 80–97 % of players and 81–97 % of officials understood the importance of many factors regarding concussion management. When concussion management knowledge was compared by injury status, injured players (n = 17, 94 %) believed someone with a concussion should be hospitalised immediately, in contrast to non-injured players (n = 154, 69 %) (p = 0.04). Players (63 %) were less aware of concussion guidelines than officials (81 %). Conclusions: Overall, the knowledge of concussion assessment and management was satisfactory. However, there were discrepancies among players on some aspects of awareness of concussion guidelines. Increasing players' familiarity and experience in using the concussion guidelines is warranted. Targeted campaigns are needed to further improve concussion recognition and treatment at community-level cricket, so all participants play a role in making cricket a safe sport. © 2023
Head, neck, and facial injuries in Australian cricket
- Authors: Kodikara, Dulan
- Date: 2023
- Type: Text , Thesis , PhD
- Full Text:
- Description: Head, neck and facial (HNF) injuries are a significant concern in cricket due to the nature of the game and the potential impact of fast-moving balls and collisions. These types of injuries occur as a result of direct hits from the cricket ball, accidental collisions between players or falls during fielding or batting. HNF injuries can range from minor cuts and bruises to more severe concussions, fractures, or dental trauma. While some HNF injuries in cricket can be career-ending and severe, others may not be as catastrophic. Over the past decade, there has been a noticeable increase in the incidence of HNF injuries in elite-level cricket, and the tragic death of an Australian test cricketer in 2014 from a head injury heightened awareness of the seriousness and potential fatality of such injuries in the sport. To mitigate the risk of serious injuries, cricket players are encouraged to wear protective equipment such as helmets and neck guards. At the elite level of the sport, stringent safety protocols and regulations are enforced to prioritise player wellbeing, ensuring that immediate medical attention is available during training or games. Further, routine injury surveillance at the elite level has proven effective in monitoring and reducing the likelihood of serious HNF injuries. Nevertheless, there is a noticeable lack of research investigating HNF injuries among cricket participants, particularly at the community level. This lack of reporting hampers the identification and implementation of effective strategies to minimise the risk of such injuries. This thesis seeks to bridge this research gap by examining HNF injuries in community-level cricket under two broad objectives, providing valuable insights for injury prevention and risk mitigation strategies. The first objective of this thesis was to develop a comprehensive understanding of HNF cricket injury epidemiology and the reporting of helmet usage. 
A systematic review was conducted, analysing 29 studies to determine the incidence, nature, and mechanisms of HNF injuries in cricket, the reported use of helmets and ‘gold standard’ definitions. Facial fractures and concussions were the most frequently specified types of injuries, and the impact of the ball was reported as the most common mechanism for sustaining HNF injuries in cricket. Only three studies (10%) reported the use of helmets. The systematic review highlighted the lack of evidence regarding the reporting of HNF cricket injuries according to international cricket consensus injury definitions, as well as the limited data on helmet usage at the time of injury. Additionally, the review identified gaps in evidence concerning HNF injuries across different age groups, levels of play and diverse populations, along with discrepancies in reporting injury-specific mechanisms. Community-level HNF cricket injuries that required hospitalisation in Victoria, Australia, over a decade, spanning from 2007/8 to 2016/17 were also reviewed under the first objective. During this period, Victorian hospitals treated 3,907 HNF cricket injuries. Male participants accounted for a higher number of injuries than female participants, and the age group most commonly requiring hospital treatment was 10–14 years. Open wounds were the most frequent type of injury (30%), and the primary mechanism for HNF cricket injuries during this decade was being hit, struck, or crushed (86%). Our literature review and the hospital study form the ideal platform for injury prevention efforts by establishing HNF injury prevalence and common injury mechanisms. The second broad objective of this thesis was to investigate the use of cricket helmets among cricket participants, to study the ability of Australian cricket participants to perceive injury risk and to explore the knowledge and awareness of concussion assessment and management. 
An online survey was conducted to address each facet of our second objective. Over 90% of the players and 50% of the officials reported wearing a helmet during the 2018/19 cricket season, but most did not use a neck protector. Most of the helmets used met the recommended British Standards, and the most common brand used was Masuri. For most of the players and officials who participated in our survey, comfort and the ability to prevent HNF injuries were the two most important factors affecting their decision to purchase a cricket helmet. More than 80% of players and almost 50% of officials expressed the belief that helmets were not necessary for activities such as bowling and fielding at a distance from the batter. Yet, the fact that more than 80% of all participants expressed their willingness to keep using helmets under compulsory regulations indicates that implementing mandatory helmet rules might result in a significant increase in helmet adoption and enhance the overall safety of the sport. Over 70% of our survey participants demonstrated satisfactory levels of knowledge regarding concussion assessment and management. These findings suggest that the potential for severe complications stemming from concussions related to cricket could be reduced, particularly in light of the limited availability of qualified medical professionals at the community level. The strong understanding of concussion guidelines among our survey participants implies that they would be inclined to prioritise safety and choose helmets that align with the recommended safety standards. In summary, this PhD research has achieved its objective of making the first large-scale scientific contribution to enhance safety and prevent HNF injuries among participants of community-level cricket in Australia. 
Additionally, this research effectively assessed the participants’ knowledge, comprehension and attitudes regarding utilising protective helmets and the importance of following Cricket Australia’s concussion guidelines.
- Description: Doctor of Philosophy
A systematic review of head, neck and facial injuries in cricket
- Kodikara, Dulan, Twomey, Dara, Plumb, Mandy
- Authors: Kodikara, Dulan , Twomey, Dara , Plumb, Mandy
- Date: 2022
- Type: Text , Journal article
- Relation: International Journal of Sports Medicine Vol. 43, no. 6 (2022), p. 496-504
- Full Text:
- Reviewed:
- Description: This systematic review was conducted to identify the incidence, nature and mechanisms of head, neck and facial (HNF) injuries in cricket and the reported use of helmets. Five databases were searched up to 30 November 2020. Peer-reviewed cricket injury studies published in English that reported on HNF cricket injuries were selected as per the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Twenty-nine studies were included. HNF injuries had a cumulative total of 794/5,886 injuries, equating to 13% of all injuries. Non-specified HNF injuries (n=210, 26%) were the most prevalent type of injury, followed by non-specified head injuries (n=130, 16%), other non-specified fractures (n=119, 15%) and concussions (n=60, 8%). The impact of the ball was reported as the most common mechanism for sustaining HNF injuries in cricket. The use of helmets was reported in only three studies (10%). From studies reporting on HNF cricket injuries, facial fractures and concussions were the most common specified types of injury. There is little evidence on reporting of HNF cricket injuries as per the international cricket consensus injury definitions, as well as the use of helmets at the time of injury. © 2022 American Institute of Physics Inc. All rights reserved.
Cricket ground, Walhalla [picture].
- Authors: Lee, William Harrison
- Date: 1907
- Type: Still Image
- Full Text: false
- Description: Two views of the cricket ground, seven hundred feet above the town. Melbourne Cricket Club are playing a Walhalla team in 1907.
- Description: Item held by Gippsland and Regional Studies Collection, Federation University Australia.
- Description: Record generated from title list.
- Description: The Walhalla story - POT 29|Lee, W.H. - The Switzerland of Australia
- Description: 03-Feb-92
Medical-attention injuries in community cricket : a systematic review
- McLeod, Geordie, O'Connor, Siobhan, Morgan, Damian, Kountouris, Alex, Finch, Caroline, Fortington, Lauren
- Authors: McLeod, Geordie , O'Connor, Siobhan , Morgan, Damian , Kountouris, Alex , Finch, Caroline , Fortington, Lauren
- Date: 2020
- Type: Text , Journal article , Review
- Relation: BMJ Open Sport and Exercise Medicine Vol. 6, no. 1 (2020), p.
- Full Text:
- Reviewed:
- Description: Objectives The aim was to identify and describe outcomes from original published studies that present the number, nature, mechanism and severity of medically treated injuries sustained in community-level cricket. Design Systematic review. Methods Nine databases were systematically searched to December 2019 using terms "cricket
Injury epidemiology among Australian female cricketers
- Authors: Perera, Nirmala
- Date: 2016
- Type: Text , Thesis , PhD
- Full Text:
- Description: Cricket is a male-dominated sport; however, its popularity among females is increasing. Like other sports, participation in cricket poses the risk of injury to players. Injury problems for female cricketers are virtually unknown, as studies examining cricket injuries include almost exclusively male participants. In other sports, the types of injuries experienced by men and women are known to differ. Therefore, a clear understanding of the extent and types of injuries sustained by female cricket players is required, to underpin appropriately targeted injury prevention strategies. This thesis provides the first detailed epidemiological profile of cricket injuries sustained by women, by: 1. conducting a systematic review describing injuries in women's competitive team bat or stick sports, to enable cricket injuries to be viewed within the perspective of wider, but relevant, injury data, 2. evaluating existing data sources relating to hospital admissions from Victoria and Queensland and successful insurance claims across Australia, 3. examining the nature and incidence of cricket injuries in elite female players using Cricket Australia's Athlete Management System, and 4. conducting a nationwide self-report survey of injuries during the 2014–15 season. This PhD research represents participants from different levels of play, across age groups and across Australia. The findings indicate that the incidence of injuries for female cricketers was higher than expected based on previous findings in comparable sports, except when considered in relation to insurance claims. Cricket injury data across hospital presentations, insurance claims, the AMS (Fair Play AMS 2016) and self-reported survey data, each of which represents a different level of the sports injury pyramid, identified all-rounders and pace bowlers as having a higher incidence of injury than players in other positions.
Injuries were most frequently reported in the head, hands, knees and ankles. The most common injuries were dislocations/sprains/strains, fractures, muscle injuries, joint injuries and gradual onset injuries. At the elite level, lumbar spine stress fractures accounted for a significant amount of time lost from the sport. In this thesis, findings from the insurance claims, self-reported survey and AMS (Fair Play AMS 2016) data indicated that most injuries were of low severity and were more likely to be treated outside of healthcare facilities such as hospitals. In summary, the anatomical location and nature of the most common injuries were consistent across community-level players, with some similarities to elite-level players. However, injury mechanisms and risk factors may differ depending on the level of competition and the player's skill. It is recommended that ongoing injury surveillance be conducted at all levels of the sport, and that surveillance methodology be tailored to the specific setting, personnel and available resources. Therefore, before implementing an injury surveillance system at the community level of the sport, more research is needed to fully understand what type of injury surveillance system might be feasible and suitable in this context.
- Description: Doctor of Philosophy
Too many rib ticklers? Injuries in Australian women's cricket (PhD Academy Award)
- Authors: Perera, Nirmala
- Date: 2019
- Type: Text , Journal article , Editorial Material
- Relation: British Journal of Sports Medicine Vol. 53, no. 22 (Nov 2019), p. 1436-1437
- Full Text:
- Reviewed:
- Authors: Ponsford, Megan
- Date: 2019
- Type: Text , Journal article
- Relation: Sport in Society Vol. 22, no. 1 (2019), p. 97-112
- Full Text: false
- Reviewed:
- Description: This article critiques the symbolism of the journey as a team of Australian cricketers voyaged to India in 1935 embarking on the first Australia cricket tour to the subcontinent. Travel and tourism theories explicate the reactions of the cricketers to the ambivalence of being neither home nor away. This article asks: what did the Australians learn about themselves, their home and their destination whilst in transit? The theme of transition, both physical and emotional, is the central focus of this study. The journey on the ship signifies the team’s last immersion (for the duration of the tour) within exclusively English structures and customs. The cricketers’ insecurity when faced with the looming unknown upon descending the gangplank into India is extrapolated from available sources. The influence of Frank Tarrant as leader and educator intensified in the artificial hermetic vacuum of the ship’s environment. The unceremonious departure scenes in Melbourne, Adelaide and Fremantle are described and contrasted with the formality of the arrival in Bombay; such contrasts epitomize and underpin the cultural differences encountered throughout the tour.
The atmosphere vibrated with triumphant joy
- Authors: Ponsford, Megan
- Date: 2019
- Type: Text , Journal article
- Relation: Sport in Society Vol. 22, no. 1 (2019), p. 185-196
- Full Text: false
- Reviewed:
- Description: This article critiques the Indian material culture located in present-day Pakistan pertaining to the inaugural Australian cricket tour to colonial India in 1935/36. The historical voice of the Indians is evident in the images and it is over the shoulders of the hosts of the tour that new perspectives emerge. It is culturally inappropriate to assume and evaluate how the locals felt about the visit of the Australian cricketers and the raison d’être of the tour. However, archives located in Pakistan provide a deeply subjective perspective. Goodwill and amicability reverberate through the photographs challenging conventional scholarship, which argues that Australian-Indian cricket is based on acrimony. The article concludes that despite the obvious and significant differences between the competing teams the tour experience minimized the racial divide between the Australian and the Indian cricketers.
The launch of Indian-Australian cricket
- Authors: Ponsford, Megan
- Date: 2019
- Type: Text , Journal article
- Relation: Sport in Society Vol. 22, no. 1 (2019), p. 113-142
- Full Text: false
- Reviewed:
- Description: An analysis of the Australian cricket team’s experiences in India in 1935/36 reveals that many elements fulfilled clichéd colonial expectations of extravagance, privilege and hedonism. Yet, the cricketers simultaneously grew tired of conforming to the role of the colonial class. The team’s immersion in India encouraged the players to re-evaluate their attitudes towards racial inclusion, the legitimacy of colonization, Indian sovereignty and the pedagogical role of the white cricketer in the Orient. A close textual analysis of the writings of participating cricketers Wendell Bill, Ron Oxenham, Charlie Macartney and Jack Ryder details their responses to the social and racial codification they encountered, which it is argued was at times unexpectedly liberal. Australian batsman Hunter Hendry’s manuscript has not been critiqued elsewhere and provides valuable insight into his ambiguity towards the role of the white cricketer in India.
The has-beens and never will-bes
- Authors: Ponsford, Megan
- Date: 2019
- Type: Text , Journal article
- Relation: Sport in Society Vol. 22, no. 1 (2019), p. 53-72
- Full Text: false
- Reviewed:
- Description: The Australian team that toured India in 1935/36 comprised atypical cricket personnel. Their cultural and social unorthodoxy contributed to the tour being shunned by cricket officialdom in Australia. Tour manager, Frank Tarrant’s method of team selection was meritocratic unlike that of customary cricket practice where social and cultural hierarchy informed team composition. This article outlines the unorthodox team composition and argues that the official cricket body objected to the exercise because of the professional nature of the tour, social (particularly class) discrimination and preconceptions of racial prejudices. The Maharaja of Patiala’s generous financing of the tour identified it as a definitively professional exercise and encouraged participation considering the precarious status of the global economy following the Great Depression. The goodwill between Australia and India evidenced on tour challenged cricket protocol and reflects a pragmatic and growing recognition that diplomatic and economic unity was desirable in light of the imminent dissolution of the British Empire.