Head, neck, and facial injuries in Australian cricket
- Authors: Kodikara, Dulan
- Date: 2023
- Type: Text, Thesis, PhD
- Full Text:
- Description: Head, neck and facial (HNF) injuries are a significant concern in cricket due to the nature of the game and the potential impact of fast-moving balls and collisions. These types of injuries occur as a result of direct hits from the cricket ball, accidental collisions between players or falls during fielding or batting. HNF injuries can range from minor cuts and bruises to more severe concussions, fractures, or dental trauma. While some HNF injuries in cricket are severe or even career-ending, others are less serious. Over the past decade, there has been a noticeable increase in the incidence of HNF injuries in elite-level cricket, and the tragic death of an Australian test cricketer in 2014 from a head injury heightened awareness of the seriousness and potential fatality of such injuries in the sport. To mitigate the risk of serious injuries, cricket players are encouraged to wear protective equipment such as helmets and neck guards. At the elite level of the sport, stringent safety protocols and regulations are enforced to prioritise player wellbeing, ensuring that immediate medical attention is available during training or games. Further, routine injury surveillance at the elite level has proven effective in monitoring and reducing the likelihood of serious HNF injuries. Nevertheless, there is a noticeable lack of research investigating HNF injuries among cricket participants, particularly at the community level. This lack of reporting hampers the identification and implementation of effective strategies to minimise the risk of such injuries. This thesis seeks to bridge this research gap by examining HNF injuries in community-level cricket under two broad objectives, providing valuable insights for injury prevention and risk mitigation strategies. The first objective of this thesis was to develop a comprehensive understanding of HNF cricket injury epidemiology and the reporting of helmet usage.
A systematic review was conducted, analysing 29 studies to determine the incidence, nature, and mechanisms of HNF injuries in cricket, the reported use of helmets and ‘gold standard’ definitions. Facial fractures and concussions were the most frequently specified types of injuries, and the impact of the ball was reported as the most common mechanism for sustaining HNF injuries in cricket. Only three studies (10%) reported the use of helmets. The systematic review highlighted the lack of evidence regarding the reporting of HNF cricket injuries according to international cricket consensus injury definitions, as well as the limited data on helmet usage at the time of injury. Additionally, the review identified gaps in evidence concerning HNF injuries across different age groups, levels of play and diverse populations, along with discrepancies in reporting injury-specific mechanisms. Community-level HNF cricket injuries that required hospitalisation in Victoria, Australia, over a decade, spanning from 2007/08 to 2016/17, were also reviewed under the first objective. During this period, Victorian hospitals treated 3,907 HNF cricket injuries. Male participants accounted for a higher number of injuries than female participants, and the age group most commonly requiring hospital treatment was 10–14 years. Open wounds were the most frequent type of injury (30%), and the primary mechanism for HNF cricket injuries during this decade was being hit, struck, or crushed (86%). Our literature review and the hospital study form an ideal platform for injury prevention efforts by establishing HNF injury prevalence and common injury mechanisms. The second broad objective of this thesis was to investigate the use of cricket helmets among cricket participants, to study the ability of Australian cricket participants to perceive injury risk and to explore the knowledge and awareness of concussion assessment and management.
An online survey was conducted to address each facet of our second objective. Over 90% of the players and 50% of the officials reported wearing a helmet during the 2018/19 cricket season, but most did not use a neck protector. Most of the helmets used met the recommended British Standards, and the most common brand used was Masuri. For most of the players and officials who participated in our survey, comfort and the ability to prevent HNF injuries were the two most important factors affecting their decision to purchase a cricket helmet. More than 80% of players and almost 50% of officials expressed the belief that helmets were not necessary for activities such as bowling and fielding at a distance from the batter. Yet, the fact that more than 80% of all participants expressed their willingness to keep using helmets under compulsory regulations indicates that implementing mandatory helmet rules might result in a significant increase in helmet adoption and enhance the overall safety of the sport. Over 70% of our survey participants demonstrated satisfactory levels of knowledge regarding concussion assessment and management. These findings suggest that the potential for severe complications stemming from concussions related to cricket could be reduced, particularly in light of the limited availability of qualified medical professionals at the community level. The strong understanding of concussion guidelines among our survey participants implies that they would be inclined to prioritise safety and choose helmets that align with the recommended safety standards. In summary, this PhD research has achieved its objective of making the first large-scale scientific contribution to enhance safety and prevent HNF injuries among participants of community-level cricket in Australia.
Additionally, this research effectively assessed the participants’ knowledge, comprehension and attitudes regarding utilising protective helmets and the importance of following Cricket Australia’s concussion guidelines.
- Description: Doctor of Philosophy
Injury epidemiology among Australian female cricketers
- Authors: Perera, Nirmala
- Date: 2016
- Type: Text, Thesis, PhD
- Full Text:
- Description: Cricket is a male-dominated sport; however, its popularity among females is increasing. Like other sports, participation in cricket poses the risk of injury to players. Injury problems for female cricketers are virtually unknown, as studies examining cricket injuries include almost exclusively male participants. In other sports, the types of injuries experienced by men and women are known to be different. Therefore, a clear understanding of the extent and types of injuries sustained by female cricket players is required, to underpin appropriately targeted injury prevention strategies. This thesis provides the first detailed epidemiological profile of cricket injuries sustained by women, by: 1. conducting a systematic review describing injuries in competitive team bat or stick sports in women, to enable cricket injuries to be viewed within the perspective of wider, but relevant, injury data, 2. evaluating existing data sources relating to hospital admissions from Victoria and Queensland and successful insurance claims across Australia, 3. examining the nature and incidence of cricket injuries in elite female players using Cricket Australia’s Athlete Management System, and 4. conducting a nationwide self-report survey of injuries during the 2014–15 season. This PhD research represents participants from different levels of play, across age groups and across Australia. The findings indicate that the incidence of injuries for female cricketers was higher than expected based on previous findings in comparable sports, except when considered in relation to insurance claims. The cricket injury rate across hospital presentations, insurance claims, the AMS (Fair Play AMS 2016) and self-reported survey data, each of which represents a different level of the sports injury pyramid, identified all-rounders and pace bowlers as having a higher incidence of injury than players in other positions.
Injuries were most frequently reported in the head, hands, knees and ankles. The most common injury types were dislocations/sprains/strains, fractures, muscle injuries, joint injuries and gradual-onset injuries. At the elite level, lumbar spine stress fractures accounted for a significant amount of time loss from the sport. In this thesis, findings from the insurance claims, self-reported survey and AMS (Fair Play AMS 2016) data indicated that most injuries were of a low severity and were more likely to be treated outside of healthcare facilities such as hospitals. In summary, patterns of the most common injuries, in terms of anatomical location and nature of the injuries, were consistent throughout community-level players with some similarities to elite-level players. However, the injury mechanisms and risk factors may differ depending on the level of competition and players’ skill. Recommendations are that ongoing injury surveillance should be conducted at all levels of the sport, and surveillance methodology should be tailored to the specific setting, personnel and available resources. Therefore, before implementing an injury surveillance system at the community level of the sport, more research is needed to fully understand what type of injury surveillance system might be feasible and suitable in this context.
- Description: Doctor of Philosophy
Progressive rebels or Boys' Own adventure? The 1935 Australian cricket tour of India; breaking down social and racial barriers
- Authors: Ponsford, Megan
- Date: 2016
- Type: Text, Thesis, PhD
- Full Text:
- Description: In October 1935, a touring party embarked on the inaugural tour of India by an Australian cricket team. To a great, and somewhat stereotypical, extent popular representations of Indian–Australian relations are viewed through the lens of cricket – the national game in both countries. This dissertation about a significant, yet overlooked, chapter in sporting history examines the Australian cricketers’ response to the social, racial and political hierarchies of late-colonial India. The experience of the touring party encouraged a re-imagining of ideological perspectives, and this thesis identifies a uniquely Australian subjectivity to the British colonisation of India. The tour between the colony (India) and the dominion (Australia) can be interpreted as an anti-imperial gesture. Both countries were attempting to forge relationships that would be independent from Britain. The role of cricket, itself experiencing a renaissance during the 1930s as it transformed from a largely amateur pursuit to an increasingly professional occupation, is interrogated. As part of this transformation, international cricket positioned itself as an increasingly politicised global entity within the broader turbulence of the first half of the twentieth century. All those involved in the tour are now dead. However, a close historical analysis of previously lost, highly personalised, primary material (letters, manuscripts, photographs and cricket ephemera) enables an interpretation of the players’ experience. This thesis argues that sporting events can be interpreted as cultural ciphers, yet scholars and the wider sports-writing community have neglected the historical significance of the 1935/36 tour. The unofficial status of the tour and its highly professional emphasis alienated it from the amateur ideals of Australian cricket. This transnational, multi-disciplinary approach addresses a lacuna in the professional trajectory of cricket.
It also provides a new understanding and historical counter-narrative of mid-twentieth century Indian–Australian sporting history and cultural exchange.
- Description: Doctor of Philosophy
The determinants and development of fast bowling performance in cricket
- Authors: Feros, Simon
- Date: 2015
- Type: Text, Thesis, PhD
- Full Text:
- Description: This thesis sought to reveal the physical and kinematic determinants of pace bowling performance. After drawing on these determinants, a secondary aim was to investigate whether pace bowling performance could be enhanced with chronic resistance training and warm-up strategies. However, before the physical and kinematic determinants of pace bowling performance could be identified, and the effects of the two training interventions and warm-ups on pace bowling performance examined, a new pace bowling test was created, and the test-retest reliability of its performance and kinematic measures was evaluated. Knowledge of a variable’s test-retest reliability is important for interpreting the validity of correlations, but also for the determination of a meaningful change following a training intervention. Only one published study to date has explored the test-retest reliability of a pace bowling assessment, and this test only measured bowling accuracy (1). Previous research has not comprehensively examined the relationships between physical qualities and pace bowling performance. Several important physical qualities (e.g., power, speed-acceleration, flexibility, repeat-sprint ability) have been excluded in correlational research, which may be crucial for optimal pace bowling performance. Furthermore, there is only one published training intervention study in pace bowling research (2). Consequently, there is scant evidence for coaches to design training programs proven to enhance pace bowling performance. Baseball pitching studies have trialled the effects of heavy-ball throwing in the warm-up on subsequent throwing velocity and accuracy, but this approach has not been studied in cricket pace bowling, especially after several weeks of training. Therefore, four studies were conducted in this PhD project to address these deficiencies in the literature.
The purpose of Study 1 (Chapter 3) was to ascertain the test-retest reliability of bowling performance measures (i.e., bowling speed, bowling accuracy, consistency of bowling speed, and consistency of bowling accuracy) and selected bowling kinematics (i.e., approach speed, step length, step-length phase duration, power phase duration, and knee extension angle at front-foot contact and at ball release) in a novel eight-over test, and for the first four overs of this test. The intraclass correlation coefficient (ICC), standard error of measurement (SEM), and coefficient of variation (CV) were used as measures of test-retest reliability (3). Following a three-week familiarisation period of bowling, 13 participants completed a novel eight-over bowling test on two separate days, 4–7 days apart. The most reliable performance measures in the bowling test were peak bowling speed (ICC = 0.948–0.975, CV = 1.3–1.9%) and mean bowling speed (ICC = 0.981–0.987, CV = 1.0–1.3%). Perceived effort was partially reliable (ICC = 0.650–0.659, CV = 3.8–3.9%). However, mean bowling accuracy (ICC = 0.491–0.685, CV = 12.5–16.8%) and consistency of bowling accuracy failed to meet the pre-set standard for acceptable reliability (ICC = 0.434–0.454, CV = 15.3–19.3%). All bowling kinematic variables except approach speed exhibited acceptable reliability (i.e., ICC > 0.8, CV < 10%). The first four overs of the bowling test exhibited slightly poorer test-retest reliability for all measures, compared to the entire eight-over test. There were no systematic biases (i.e., p > 0.05) detected with any variables between bowling tests, indicating no learning or fatigue effects. The smallest worthwhile change was established for all bowling performance and kinematic variables, by multiplying the SEM by 1.5 (4).
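The reliability statistics above can be illustrated with a short calculation. The sketch below uses purely hypothetical numbers (the thesis data are not reproduced here) and the classical formula SEM = SD × √(1 − ICC), with the smallest worthwhile change taken as 1.5 × SEM as the abstract describes:

```python
import math

def sem_and_swc(sd, icc, multiplier=1.5):
    """Standard error of measurement (SEM) and smallest worthwhile change (SWC).

    sd         : between-subject standard deviation of the measure
    icc        : test-retest intraclass correlation coefficient
    multiplier : SWC = multiplier * SEM (1.5 per the approach described above)
    """
    sem = sd * math.sqrt(1.0 - icc)  # classical SEM formula
    swc = multiplier * sem
    return sem, swc

# Hypothetical example: mean bowling speed with SD = 2.0 km/h and ICC = 0.98
sem, swc = sem_and_swc(2.0, 0.98)
```

A high ICC shrinks the SEM, so a smaller observed change can be treated as meaningful rather than measurement noise.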
It is recommended that the eight-over pace bowling test be used as a more comprehensive measure of consistency of bowling speed and consistency of bowling accuracy, as bowlers are more likely to be fatigued. However, if coaches seek to assess pace bowlers in a shorter time, delimiting the test to the first four overs is recommended. Both versions of the pace bowling test are only capable of reliably measuring bowling performance outcomes such as peak and mean bowling speed, and perceived effort. The second study of this PhD project examined the relationships between selected physical qualities, bowling kinematics, and bowling performance measures. Another purpose of this novel study was to determine if delivery instructions (i.e., maximal-effort, match-intensity, slower-ball) influenced the strength of the relationships between physical qualities and bowling performance measures. Given that there were three delivery instructions in the bowling test, an objective of this study was to explore the relationship between bowling speed and bowling accuracy (i.e., speed-accuracy trade-off). Thirty-one participants completed an eight-over bowling test in the first session, and a series of physical tests, spread over two separate sessions. Each session was separated by four to seven days. Mean bowling speed (of all pooled deliveries) was significantly correlated to 1-RM pull-up strength (rs [24] = 0.55, p = 0.01) and 20-m sprint time (rs [30] = -0.37, p = 0.04), but the correlations marginally increased as delivery effort increased (i.e., maximal-effort ball). Greater hamstring flexibility was associated with a better consistency of bowling speed, but only for a match-intensity delivery (rs [29] = -0.49, p = 0.01). Repeat-sprint ability (i.e., percent decrement on 10 × 20-m sprints, on every 20 s) displayed a stronger correlation to consistency of bowling speed (rs [21] = -0.42, p = 0.06) than for mean bowling speed (rs [21] = 0.15, p = 0.53).
Bench press strength was moderately related to bowling accuracy for a maximal-effort delivery (rs [26] = -0.42, p = 0.03), with weaker but non-significant (p > 0.05) correlations for match-intensity and slower-ball deliveries. Bowling accuracy was also significantly related to peak concentric countermovement jump power (rs [28] = -0.41, p = 0.03) and mean peak concentric countermovement jump power (rs [27] = -0.45, p = 0.02), with both physical qualities displaying stronger correlations as delivery effort increased. Greater reactive strength was negatively associated with mean bowling accuracy (rs [30] = 0.38, p = 0.04) and consistency of bowling accuracy (rs [30] = 0.43, p = 0.02) for maximal-effort deliveries only. Faster bowling speeds were correlated to a longer step length (rs [31] = 0.51, p < 0.01) and quicker power phase duration (rs [31] = -0.45, p = 0.01). A better consistency of bowling accuracy was associated with a faster approach speed (rs [31] = -0.36, p = 0.05) and greater knee flexion angle at ball release (rs [27] = -0.42, p = 0.03). No speed-accuracy trade-off was observed for the group (rs [31] = -0.28, p = 0.12), indicating that most bowlers could be instructed to train at maximal-effort without compromising bowling accuracy. Pull-up strength training and speed-acceleration training were chosen for the “evidence-based” training program (Study 3). Heavy-ball bowling was also considered as part of the evidence-based training program, as it is a specific form of training used previously, and because there was a shortage of significant relationships (p < 0.05) between physical qualities and bowling performance measures in Study 2. The third investigation of this PhD project compared the effects of an eight-week evidence-based training program or normal training program (not a control group) on pace bowling performance, approach speed, speed-acceleration, and pull-up strength.
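The correlations reported above are Spearman rank correlations (rs), i.e., a Pearson correlation computed on average ranks. A minimal pure-Python sketch of the statistic follows, with hypothetical data; a real analysis would typically use a statistics package rather than this hand-rolled version:

```python
def _ranks(xs):
    """Average (mid) ranks, 1-based, with ties sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        # extend j over a run of tied values
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0  # mean of positions i..j, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the rank vectors."""
    rx, ry = _ranks(x), _ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

Because only ranks enter the calculation, rs captures any monotonic association (not just linear), which suits skewed performance measures such as sprint times.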
Participants were matched for bowling speed and then randomly split into two training groups, with six participants in each group. After an initial two-week familiarisation period of bowling training, sprint training, and pull-up training, participants completed two training sessions per week, and were tested before and after the training intervention. Testing comprised the four-over pace bowling test (Study 1), 20-m sprint test (Study 2), and 1-RM pull-up test (Study 2). In training, the volume of bowling and sprinting was constant between both groups; the only differences were that the evidence-based training group bowled with heavy balls (250 g and 300 g) as well as a regular ball (156 g), sprinted with a weighted vest (15% and 20% body mass) and without a weighted vest, and performed pull-up training. Participants were instructed to deliver each ball with maximal effort in training, as no speed-accuracy trade-off was observed for the sample in Study 2. The evidence-based training group bowled with poorer accuracy and consistency of accuracy, with only a small improvement in peak and mean bowling speed. Heavy-ball bowling may have had a negative transfer to regular-ball bowling. Although speculative, a longer evidence-based program may have significantly enhanced bowling speed. Coaches could use both training programs to develop performance but should be aware that bowling accuracy may suffer with the evidence-based program. The evidence-based training group displayed slower 20-m sprint times following training (0.08 ± 0.05 s). However, the normal training group was also slower (0.10 ± 0.09 s), indicating that the potential for speed-acceleration improvement is compromised if speed training is performed immediately after bowling training, most likely due to residual fatigue. Consequently, it is recommended that speed-acceleration training be conducted when bowlers are not fatigued, in a separate session, or at the beginning of a session.
The evidence-based training group improved their 1-RM pull-up strength by 5.8 ± 6.8 kg (d = 0.68), compared to the normal training group of 0.2 ± 1.7 kg (d = 0.01). The difference between training groups is due to the fact that the normal training group were not prescribed pull-up training. As many participants could not complete the pull-up exercise due to insufficient strength, the dumbbell pullover may be a suitable alternative that is more specific to the motion of the bowling arm (i.e., extended arm). The fourth study of this PhD project explored the acute effects of a heavy-ball bowling warm-up on pace bowling performance, and determined if these acute effects could be enhanced or negated following an evidence-based training program. This study involved the same participants who completed the evidence-based training program in Study 3. These participants were required to perform two different bowling warm-ups (heavy-ball or regular-ball) in pre and post-test period, followed by the four-over pace bowling test (Study 1). In pre-test period, bowling accuracy was 8.8 ± 7.4 cm worse for the heavy-ball warm-up compared to the regular-ball warm-up (d = 1.19). In post-test period however, bowling accuracy was 5.5 ± 6.4 cm better in the heavy-ball warm-up compared to the regular-ball warm-up (d = -0.90). A similar trend was observed for consistency of bowling accuracy. These findings indicate that pace bowlers adapt to heavy-ball bowling, and bowl more accurately with a regular ball if they warm-up with a heavy ball first (but only after eight weeks of heavy-ball training). Coaches could employ a heavy-ball warm-up prior to training or a match, but only after eight weeks of evidence based training. 
It is hypothesised that a less biomechanically similar exercise to the pace bowling motion such as resisted push-ups / bench press throws could be more effective in eliciting potentiation by activating higher order motor units without negatively transferring to bowling performance. From the studies presented in this thesis, it is concluded that peak and mean bowling speed are the most reliable bowling performance measures, and all kinematic variables apart from approach speed possess excellent reliability. Furthermore, 1-RM pull-up strength and 20-m speed are significantly correlated to bowling speed. An evidence-based training program can develop peak and mean bowling speed, but the cost to bowling accuracy and consistency of bowling accuracy does not make this training program worthwhile in enhancing pace bowling performance. A heavy-ball warm-up impairs bowling accuracy and consistency of bowling accuracy compared to the regular-ball warm-up, but only prior to training with the heavier balls. Pace bowlers adapt to heavyball bowling after eight weeks of training, but must use the heavy balls in the warm-up to bowl more accurately with a regular ball, otherwise pace bowling performance is below optimal.
- Description: Doctor of Philosophy
- Authors: Feros, Simon
- Date: 2015
- Type: Text , Thesis , PhD
- Full Text:
- Description: This thesis sought to reveal the physical and kinematic determinants of pace bowling performance. Drawing on these determinants, a secondary aim was to investigate whether pace bowling performance could be enhanced with chronic resistance training and warm-up strategies. However, before the physical and kinematic determinants of pace bowling performance could be identified, and before the effects of two training interventions and warm-up strategies could be assessed, a new pace bowling test had to be created and the test-retest reliability of its performance and kinematic measures evaluated. Knowledge of a variable's test-retest reliability is important for interpreting the validity of correlations, and also for determining a meaningful change following a training intervention. Only one published study to date has explored the test-retest reliability of a pace bowling assessment, and that test measured only bowling accuracy (1). Previous research has not comprehensively examined the relationships between physical qualities and pace bowling performance. Several important physical qualities (e.g., power, speed-acceleration, flexibility, repeat-sprint ability) have been excluded from correlational research, yet they may be crucial for optimal pace bowling performance. Furthermore, there is only one published training intervention study in pace bowling research (2). Consequently, there is scant evidence for coaches to design training programs proven to enhance pace bowling performance. Baseball pitching studies have trialled the effects of heavy-ball throwing in the warm-up on subsequent throwing velocity and accuracy, but this approach has not been studied in cricket pace bowling, especially after several weeks of training. Therefore, four studies were conducted in this PhD project to address these deficiencies in the literature. 
The purpose of Study 1 (Chapter 3) was to ascertain the test-retest reliability of bowling performance measures (i.e., bowling speed, bowling accuracy, consistency of bowling speed, and consistency of bowling accuracy) and selected bowling kinematics (i.e., approach speed, step length, step-length phase duration, power phase duration, and knee extension angle at front-foot contact and at ball release) in a novel eight-over test, and for the first four overs of this test. The intraclass correlation coefficient (ICC), standard error of measurement (SEM), and coefficient of variation (CV) were used as measures of test-retest reliability (3). Following a three-week familiarisation period of bowling, 13 participants completed the novel eight-over bowling test on two separate days, 4–7 days apart. The most reliable performance measures in the bowling test were peak bowling speed (ICC = 0.948–0.975, CV = 1.3–1.9%) and mean bowling speed (ICC = 0.981–0.987, CV = 1.0–1.3%). Perceived effort was only partially reliable (ICC = 0.650–0.659, CV = 3.8–3.9%). However, mean bowling accuracy (ICC = 0.491–0.685, CV = 12.5–16.8%) and consistency of bowling accuracy (ICC = 0.434–0.454, CV = 15.3–19.3%) failed to meet the pre-set standard for acceptable reliability. All bowling kinematic variables except approach speed exhibited acceptable reliability (i.e., ICC > 0.8, CV < 10%). The first four overs of the bowling test exhibited slightly poorer test-retest reliability for all measures compared to the entire eight-over test. No systematic bias (i.e., p > 0.05) was detected for any variable between bowling tests, indicating no learning or fatigue effects. The smallest worthwhile change was established for all bowling performance and kinematic variables by multiplying the SEM by 1.5 (4). 
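The reliability indices reported above combine in a standard way: the SEM scales the between-subject standard deviation by the square root of (1 − ICC), the CV expresses that spread relative to the mean, and the smallest worthwhile change multiplies the SEM by 1.5. A minimal Python sketch with hypothetical bowling-speed data (the function name and values are illustrative, not from the thesis):

```python
import numpy as np

def reliability_stats(day1, day2, icc):
    """Derive SEM, CV%, and smallest worthwhile change (SWC) for a
    test-retest variable, given scores from two sessions and an ICC.
    Standard formulas: SEM = SD * sqrt(1 - ICC), CV% = 100 * SD / mean,
    SWC = 1.5 * SEM (the multiplier used in the thesis)."""
    scores = np.concatenate([day1, day2])
    sd, mean = scores.std(ddof=1), scores.mean()
    sem = sd * np.sqrt(1 - icc)
    cv = 100 * sd / mean
    swc = 1.5 * sem
    return sem, cv, swc

# Hypothetical mean bowling speeds (km/h) for five bowlers on two test days
day1 = np.array([118.2, 125.4, 121.0, 130.1, 127.3])
day2 = np.array([117.8, 126.0, 120.5, 129.6, 128.0])
sem, cv, swc = reliability_stats(day1, day2, icc=0.98)
```

A between-test change larger than `swc` would then exceed measurement noise and count as a meaningful change.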
It is recommended that the eight-over pace bowling test be used as a more comprehensive measure of consistency of bowling speed and consistency of bowling accuracy, as bowlers are more likely to be fatigued over the longer test. However, if coaches seek to assess pace bowlers in a shorter time, delimiting the test to the first four overs is recommended. Both versions of the pace bowling test can reliably measure only bowling performance outcomes such as peak and mean bowling speed, and perceived effort. The second study of this PhD project examined the relationships between selected physical qualities, bowling kinematics, and bowling performance measures. Another purpose of this novel study was to determine whether delivery instructions (i.e., maximal-effort, match-intensity, slower-ball) influenced the strength of the relationships between physical qualities and bowling performance measures. Given that there were three delivery instructions in the bowling test, a further objective was to explore the relationship between bowling speed and bowling accuracy (i.e., the speed-accuracy trade-off). Thirty-one participants completed an eight-over bowling test in the first session, and a series of physical tests spread over two further sessions. Sessions were separated by four to seven days. Mean bowling speed (of all pooled deliveries) was significantly correlated with 1-RM pull-up strength (rs [24] = 0.55, p = 0.01) and 20-m sprint time (rs [30] = -0.37, p = 0.04), and the correlations marginally increased as delivery effort increased (i.e., strongest for the maximal-effort ball). Greater hamstring flexibility was associated with a better consistency of bowling speed, but only for a match-intensity delivery (rs [29] = -0.49, p = 0.01). Repeat-sprint ability (i.e., percent decrement across 10 × 20-m sprints, one every 20 s) displayed a stronger correlation with consistency of bowling speed (rs [21] = -0.42, p = 0.06) than with mean bowling speed (rs [21] = 0.15, p = 0.53). 
Bench press strength was moderately related to bowling accuracy for a maximal-effort delivery (rs [26] = -0.42, p = 0.03), with weaker, non-significant (p > 0.05) correlations for match-intensity and slower-ball deliveries. Bowling accuracy was also significantly related to peak concentric countermovement jump power (rs [28] = -0.41, p = 0.03) and mean peak concentric countermovement jump power (rs [27] = -0.45, p = 0.02), with both physical qualities displaying stronger correlations as delivery effort increased. Greater reactive strength was negatively associated with mean bowling accuracy (rs [30] = 0.38, p = 0.04) and consistency of bowling accuracy (rs [30] = 0.43, p = 0.02) for maximal-effort deliveries only. Faster bowling speeds were correlated with a longer step length (rs [31] = 0.51, p < 0.01) and a quicker power phase duration (rs [31] = -0.45, p = 0.01). A better consistency of bowling accuracy was associated with a faster approach speed (rs [31] = -0.36, p = 0.05) and a greater knee flexion angle at ball release (rs [27] = -0.42, p = 0.03). No speed-accuracy trade-off was observed for the group (rs [31] = -0.28, p = 0.12), indicating that most bowlers could be instructed to train at maximal effort without compromising bowling accuracy. Pull-up strength training and speed-acceleration training were chosen for the "evidence-based" training program (Study 3). Heavy-ball bowling was also included in the evidence-based training program, as it is a specific form of training used previously, and because there were few significant relationships (p < 0.05) between physical qualities and bowling performance measures in Study 2. The third investigation of this PhD project compared the effects of an eight-week evidence-based training program with a normal training program (there was no control group) on pace bowling performance, approach speed, speed-acceleration, and pull-up strength. 
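The bracketed figures in these correlations are degrees of freedom (sample size minus two). As a sketch of how such a coefficient is obtained, Spearman's rho is the Pearson correlation of the ranked data; the variable names and values below are hypothetical, not the thesis data:

```python
import numpy as np

def spearman_rho(x, y):
    # Spearman's rho: Pearson correlation computed on the ranks.
    # (No tie correction; adequate for continuous measurements.)
    rank = lambda a: np.argsort(np.argsort(a))
    return np.corrcoef(rank(x), rank(y))[0, 1]

# Hypothetical sample: 1-RM pull-up strength (kg) vs. mean bowling speed (km/h)
rng = np.random.default_rng(0)
pullup = rng.uniform(60, 110, size=26)
speed = 100 + 0.3 * pullup + rng.normal(0, 4, size=26)

rho = spearman_rho(pullup, speed)
df = len(pullup) - 2  # degrees of freedom, reported in brackets, e.g. rs [24]
```

The p-values quoted alongside each rs test whether the coefficient differs from zero at that df.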
Participants were matched for bowling speed and then randomly split into two training groups of six. After an initial two-week familiarisation period of bowling, sprint, and pull-up training, participants completed two training sessions per week and were tested before and after the training intervention. Testing comprised the four-over pace bowling test (Study 1), the 20-m sprint test (Study 2), and the 1-RM pull-up test (Study 2). In training, the volume of bowling and sprinting was constant between groups; the only differences were that the evidence-based training group bowled with heavy balls (250 g and 300 g) as well as a regular ball (156 g), sprinted both with a weighted vest (15% and 20% body mass) and without, and performed pull-up training. Participants were instructed to deliver each ball with maximal effort in training, as no speed-accuracy trade-off was observed for the sample in Study 2. The evidence-based training group bowled with poorer accuracy and consistency of accuracy, and achieved only a small improvement in peak and mean bowling speed. Heavy-ball bowling may have transferred negatively to regular-ball bowling. Although speculative, a longer evidence-based program may have significantly enhanced bowling speed. Coaches could use both training programs to develop performance, but should be aware that bowling accuracy may suffer with the evidence-based program. The evidence-based training group displayed slower 20-m sprint times following training (0.08 ± 0.05 s). However, the normal training group was also slower (0.10 ± 0.09 s), indicating that the potential for speed-acceleration improvement is compromised if speed training is performed immediately after bowling training, most likely due to residual fatigue. Consequently, it is recommended that speed-acceleration training be conducted when bowlers are not fatigued, either in a separate session or at the beginning of a session. 
The evidence-based training group improved their 1-RM pull-up strength by 5.8 ± 6.8 kg (d = 0.68), compared with 0.2 ± 1.7 kg (d = 0.01) for the normal training group. This between-group difference reflects the fact that the normal training group was not prescribed pull-up training. As many participants could not complete the pull-up exercise due to insufficient strength, the dumbbell pullover may be a suitable alternative that is more specific to the motion of the bowling arm (i.e., an extended arm). The fourth study of this PhD project explored the acute effects of a heavy-ball bowling warm-up on pace bowling performance, and determined whether these acute effects could be enhanced or negated following an evidence-based training program. This study involved the same participants who completed the evidence-based training program in Study 3. These participants performed two different bowling warm-ups (heavy-ball or regular-ball) in both the pre- and post-test periods, each followed by the four-over pace bowling test (Study 1). In the pre-test period, bowling accuracy was 8.8 ± 7.4 cm worse for the heavy-ball warm-up than for the regular-ball warm-up (d = 1.19). In the post-test period, however, bowling accuracy was 5.5 ± 6.4 cm better for the heavy-ball warm-up than for the regular-ball warm-up (d = -0.90). A similar trend was observed for consistency of bowling accuracy. These findings indicate that pace bowlers adapt to heavy-ball bowling and bowl more accurately with a regular ball if they warm up with a heavy ball first (but only after eight weeks of heavy-ball training). Coaches could therefore employ a heavy-ball warm-up prior to training or a match, but only after eight weeks of evidence-based training. 
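The d values quoted are Cohen's d effect sizes for the within-group change. One common convention divides the mean change by the standard deviation of the change scores; a minimal sketch with hypothetical pull-up gains (not the thesis data, and the thesis may use a different denominator):

```python
import numpy as np

def cohens_d(change_scores):
    """Within-group effect size for a pre-to-post change:
    mean change divided by the SD of the change scores.
    (One common convention among several for Cohen's d.)"""
    change = np.asarray(change_scores, dtype=float)
    return change.mean() / change.std(ddof=1)

# Hypothetical 1-RM pull-up gains (kg) for six bowlers
gains = [12.0, 1.5, 8.0, 2.5, 6.0, 4.8]
d = cohens_d(gains)
```

By the usual thresholds, d around 0.2 is a small effect, 0.5 moderate, and 0.8 large.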
It is hypothesised that an exercise less biomechanically similar to the pace bowling motion, such as resisted push-ups or bench press throws, could be more effective in eliciting potentiation by activating higher-order motor units without transferring negatively to bowling performance. From the studies presented in this thesis, it is concluded that peak and mean bowling speed are the most reliable bowling performance measures, and that all kinematic variables apart from approach speed possess excellent reliability. Furthermore, 1-RM pull-up strength and 20-m speed are significantly correlated with bowling speed. An evidence-based training program can develop peak and mean bowling speed, but the cost to bowling accuracy and consistency of bowling accuracy does not make this program worthwhile for enhancing pace bowling performance. A heavy-ball warm-up impairs bowling accuracy and consistency of bowling accuracy compared with a regular-ball warm-up, but only prior to training with the heavier balls. Pace bowlers adapt to heavy-ball bowling after eight weeks of training, but must then use the heavy balls in the warm-up to bowl more accurately with a regular ball; otherwise, pace bowling performance remains suboptimal.
- Description: Doctor of Philosophy