Community expectations and anticipated outcomes for crisis support services—Lifeline Australia
- Authors: Ma, Jennifer, Batterham, Philip, Kõlves, Kairi, Woodward, Alan, Bradford, Sally, Klein, Britt, Titov, Nickolai, Mazzer, Kelly, O’Riordan, Megan, Rickwood, Debra
- Date: 2022
- Type: Text , Journal article
- Relation: Health and Social Care in the Community Vol. 30, no. 5 (2022), p. 1775-1788
- Full Text: false
- Reviewed:
- Description: Crisis lines provide a critical first line of mental wellbeing support for community members in distress. Given increasing referrals to such services, there is a need to understand what the community expects of their role in public health responses. A computer-assisted telephone interview was undertaken between 28th October and 30th November 2019. The aim was to explore expectations and anticipated outcomes of Lifeline Australia's crisis support services from a nationally representative community sample (N = 1,300). Analysis was undertaken to determine if demographic variables (age, gender, Indigenous status, country of birth, culturally and linguistically diverse (CALD) status, sexual orientation, household composition, region and State/territory) and past service use affected community expectations. Results showed that a majority of respondents expected Lifeline to listen and provide support, recommend other services, and provide information. Help-seekers were expected to feel heard and listened to, receive safety advice or support to stay safe, and feel more hopeful. Lifeline was expected to prioritise people feeling suicidal, in immediate personal crisis, and experiencing domestic violence. Findings reveal that community members hold expectations for Lifeline Australia to serve as a suicide prevention and general crisis support service, which are congruent with the service's aims. There was little variation in community expectations of crisis support services based on demographic factors and past service use. The results show that the community has extensive and diverse expectations for this national crisis service to meet both short and longer-term needs for all vulnerable members of the community—entailing a very substantial public health service responsibility. © 2021 John Wiley & Sons Ltd.
Consumer perspectives on the use of artificial intelligence technology and automation in crisis support services : mixed methods study
- Authors: Ma, Jennifer, O’Riordan, Megan, Mazzer, Kelly, Batterham, Philip, Bradford, Sally, Kõlves, Kairi, Titov, Nickolai, Klein, Britt, Rickwood, Debra
- Date: 2022
- Type: Text , Journal article
- Relation: JMIR Human Factors Vol. 9, no. 3 (2022), p.
- Full Text:
- Reviewed:
- Description: Background: Emerging technologies, such as artificial intelligence (AI), have the potential to enhance service responsiveness and quality, improve reach to underserved groups, and help address the lack of workforce capacity in health and mental health care. However, little research has been conducted on the acceptability of AI, particularly in mental health and crisis support, and how this may inform the development of responsible and responsive innovation in the area. Objective: This study aims to explore the level of support for the use of technology and automation, such as AI, in Lifeline’s crisis support services in Australia; the likelihood of service use if technology and automation were implemented; the impact of demographic characteristics on the level of support and likelihood of service use; and reasons for not using Lifeline’s crisis support services if technology and automation were implemented in the future. Methods: A mixed methods study involving a computer-assisted telephone interview and a web-based survey was undertaken from 2019 to 2020 to explore expectations and anticipated outcomes of Lifeline’s crisis support services in a nationally representative community sample (n=1300) and a Lifeline help-seeker sample (n=553). Participants were aged between 18 and 93 years. Quantitative descriptive analysis, binary logistic regression models, and qualitative thematic analysis were conducted to address the research objectives. Results: One-third of the community and help-seeker participants did not support the collection of information about service users through technology and automation (ie, via AI), and approximately half of the participants reported that they would be less likely to use the service if automation was introduced. Significant demographic differences were observed between the community and help-seeker samples. 
Of the demographics, only older age predicted being less likely to endorse technology and automation to tailor Lifeline’s crisis support service and use such services (odds ratio 1.48-1.66, 99% CI 1.03-2.38; P<.001 to P=.005). The most common reason for reluctance, reported by both samples, was that respondents wanted to speak to a real person, assuming that human counselors would be replaced by automated robots or machine services. Conclusions: Although Lifeline plans to always have a real person providing crisis support, help-seekers automatically fear this will not be the case if new technology and automation such as AI are introduced. Consequently, incorporating innovative use of technology to improve help-seeker outcomes in such services will require careful messaging and assurance that the human connection will continue. © Jennifer S Ma, Megan O’Riordan, Kelly Mazzer, Philip J Batterham, Sally Bradford, Kairi Kõlves, Nickolai Titov, Britt Klein, Debra J Rickwood.