Bridging the research-practice gap: Validity of a software tool designed to support systemic accident analysis by risk managers
- Authors: Goode, Natassia; Salmon, Paul; Taylor, Natalie; Lenné, Michael; Finch, Caroline
- Date: 2015
- Type: Text, Conference proceedings
- Relation: http://purl.org/au-research/grants/nhmrc/1058737
- Full Text: false
- Description: Despite the proposed advantages of systems accident analysis (SAA) methods for understanding incident causation, these approaches have not been widely adopted by practitioners. This represents a significant gap between research and practice in accident analysis. The Understanding and Preventing Led Outdoor Accidents Data System (UPLOADS) provides a series of tools to address this gap. The aim of this study was to evaluate the validity of UPLOADS by comparing analyses generated by risk managers with those generated by researchers experienced in SAA. Twenty-three risk managers used UPLOADS to collect and analyse incident data from their organisations over a three-month period. The reports were then analysed by two researchers experienced in SAA, and the results compared with those generated by participants. Participants identified half as many factors as the researchers, and tended to attribute each incident to only one or two causal factors. The potential consequences for practitioners' understanding of incident causation and countermeasure development are discussed, as well as ways of improving the system. © Springer International Publishing Switzerland 2015.
How do I save it? Usability evaluation of a systems theory-based incident reporting software prototype by novice end users
- Authors: Grant, Eryn; Goode, Natassia; Salmon, Paul; Lenné, Michael; Scott-Parker, Bridie; Finch, Caroline
- Date: 2015
- Type: Text, Conference proceedings
- Relation: http://purl.org/au-research/grants/nhmrc/1058737
- Full Text: false
- Description: The level of usability achieved by a software tool is a key factor in its success and uptake by end users. This paper describes a study undertaken to evaluate the usability of a prototype incident reporting software tool. Novice end users completed a series of tasks using the software tool and then completed Ravden and Johnson's Human Computer Interaction (HCI) checklist. The findings identify aspects of the system that posed particular challenges for participants. Participants appeared to lack a clear understanding of the relationship between the information required from them and the accident analysis method underpinning the software tool. This is perhaps unsurprising, given that most incident reporting systems do not include such functions. The findings indicate that the tool needs to be more intuitive, so that users performing complex tasks can focus on understanding accident causation rather than on task instructions. The implications for the design of incident reporting software tools are discussed. © Springer International Publishing Switzerland 2015.