Improving the non-technical skills of hospital medical emergency teams: The Team Emergency Assessment Measure (TEAM™)
- Authors: Cant, Robyn; Porter, Joanne; Cooper, Simon J.; Roberts, Kate; Wilson, Ian; Gartside, Christopher
- Date: 2016
- Type: Text; Journal article
- Relation: EMA - Emergency Medicine Australasia Vol. 28, no. 6 (2016), p. 641-646
- Full Text: false
- Reviewed:
- Description: Objectives: This prospective descriptive study aimed to test the validity and feasibility of the Team Emergency Assessment Measure (TEAM™) for assessing real-world medical emergency teams' non-technical skills. Second, the present study aimed to explore the instrument's contribution to practice regarding teamwork and learning outcomes. Methods: Registered nurses (RNs) and medical staff (n = 104) in two hospital EDs in rural Victoria, Australia, participated. Over a 10 month period, the TEAM™ instrument was completed by multiple clinicians at medical emergency episodes. Results: In 80 real-world medical emergency team resuscitation episodes (283 clinician assessments), non-technical skills ratings averaged 89% per episode (39 of a possible 44 points). Twenty-one episodes were rated in the lowest quartile (i.e. ≤37 points out of 44). Ratings differed by discipline, with significantly higher scores given by medical raters (mean: 41.1 ± 4.4) than RNs (38.7 ± 5.4) (P = 0.001); this difference occurred in the Leadership domain. The tool was reliable, with a Cronbach's alpha of 0.78, high uni-dimensional validity and a mean inter-item correlation of 0.45. Concurrent validity was confirmed by a strong correlation between the TEAM™ score and the awarded Global Rating (P < 0.001), with 38.4% shared variance. RNs praised the instrument because it initiated staff reflection and debriefing discussions around performance improvement. Conclusion: Non-technical skills of medical emergency teams are known to often be suboptimal; however, average ratings of 89% were achieved in this real-world study. TEAM™ is a valid, reliable and easy-to-use tool for both training and clinical settings, with benefits for team performance when used as an assessment and/or debriefing tool. © 2016 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine
Promoting quality learning and teaching pedagogy: evaluating a targeted localised academic induction program (AIP) for the impact on continuing professional development
- Authors: Weuffen, Sara; Andrews, Tulsa; Roberts, Kate
- Date: 2020
- Type: Text; Journal article
- Relation: Australian Journal of Adult Learning Vol. 60, no. 2 (2020), p. 245-267
- Full Text:
- Reviewed:
- Description: Despite their position as providers of tertiary education, universities sit beyond normalised discourses of education where qualifications, registration, and continuing professional development are concerned. In this case study, we explore how participation in an academic induction program (AIP) builds foundational andragogy knowledge and skills and fosters individual commitment to continuing professional development (PD) for the critical engagement, maintenance, and enhancement of quality teaching practices. Through a poststructuralist lens, we gathered triangulated evidence via surveys (n = 32) and attendance data (n = 190). Our findings indicate a positive correlation between AIP attendance and initial PD engagement but identify a 35% decline in PD uptake six months post-AIP. Survey responses indicate that while an AIP is a valuable tool for prompting initial engagement in learning and teaching PD, the role and function of teaching within universities need to be elevated in order to support a career-long commitment to academic enhancement. © 2020, Adult Learning Australia. All rights reserved.