This article was published on March 24, 2021

The Trevor Project enlists Google’s AI to help combat the LGBTQ suicide crisis

AI for good


Image by: The Trevor Project

LGBTQPIA+ youth are five times more likely to attempt suicide than their heterosexual peers.

In 1998, The Trevor Project became the world’s first “national crisis intervention and suicide prevention lifeline for lesbian, gay, bisexual, transgender, queer & questioning youth.” In the years since, the foundation’s services have expanded beyond the telephone lifeline to include an online chat and a text-based counseling service, all of which are available 24 hours a day, 365 days a year.

But it’s not enough. According to a press release from The Trevor Project:

In the U.S. alone, the organization estimates more than 1.8 million LGBTQ youth seriously consider suicide each year, and at least one LGBTQ young person (13-24) attempts suicide every 45 seconds.

It takes a lot of volunteer counselors to make a difference, and according to the most recent data available, suicide rates continue to rise. That’s why The Trevor Project has committed to tripling the number of trained counselors in its ranks in 2021, and eventually expanding it to ten times its current size.

To accomplish this, the group’s AI division turned to the experts at Google.org to help with funding and know-how. Their combined efforts produced a novel training system called the “Crisis Contact Simulator” and a machine learning-powered assessment tool that detects high-risk users and ensures they get immediate help.
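The press materials don’t describe how the assessment tool works internally. Purely as an illustration of the triage idea — scoring incoming messages and fast-tracking those that look high-risk — here is a minimal Python sketch. The keywords, weights, and threshold are all hypothetical stand-ins; the real system is a trained machine learning model, not a keyword list.

```python
# Illustrative sketch of text-based risk triage. All terms, weights, and the
# threshold below are hypothetical -- The Trevor Project's actual tool is a
# trained ML classifier, not a keyword scorer.

RISK_TERMS = {
    "suicide": 3.0,
    "hurt myself": 3.0,
    "hopeless": 2.0,
    "plan": 2.0,
    "alone": 1.0,
}
HIGH_RISK_THRESHOLD = 4.0  # hypothetical cutoff


def risk_score(message: str) -> float:
    """Sum the weights of any risk indicators found in the message."""
    text = message.lower()
    return sum(weight for term, weight in RISK_TERMS.items() if term in text)


def is_high_risk(message: str) -> bool:
    """Flag a message for immediate attention when its score crosses the threshold."""
    return risk_score(message) >= HIGH_RISK_THRESHOLD


print(is_high_risk("I feel hopeless and I have a plan"))  # True
print(is_high_risk("I had a rough day at school"))        # False
```

In a production system, a flag like this would move the conversation to the front of the counseling queue rather than making any decision on its own.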

[Read: How to use AI to better serve your customers]

According to Amit Paley, CEO and Executive Director of The Trevor Project, the new tools will make the foundation’s planned expansion possible:

Technology and AI are critical tools to empower the special person-to-person connections between our crisis counselors and LGBTQ youth.

We also know that nearly 70% of our digital crisis counselors volunteer on nights and weekends, indicating a need for more training options outside of typical business hours. Adding the Crisis Contact Simulator into our counselor training program offers significant flexibility for our trainees, which creates a better experience for our volunteers and enables us to scale our crisis services to reach even more LGBTQ young people in crisis.

The simulator uses natural language processing to, essentially, create a chatbot that plays the role of a youth in crisis for counselors to talk to.

Dubbed “Riley,” the chatbot mimics a fictional teen from North Carolina who “feels anxious and depressed.” It lets counselors practice holding conversations while they’re still in training (everyone who works the helplines is fully trained). Rather than tapping fellow counselors for roleplay duty, the AI provides a uniform experience for every trainee.
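To make the persona idea concrete, here is a toy Python stub in the spirit of a persona-conditioned chatbot. The real Crisis Contact Simulator generates language with an NLP model; this sketch only shows how a fixed persona can keep a simulated teen’s answers consistent across turns. The persona fields and canned replies are entirely hypothetical.

```python
# Toy persona-conditioned chatbot stub. The actual Crisis Contact Simulator
# uses an NLP model; this only illustrates consistency via a fixed persona.
# All fields and replies below are hypothetical.

PERSONA = {
    "name": "Riley",
    "home": "North Carolina",
    "mood": "anxious and depressed",
}


def reply(trainee_message: str) -> str:
    """Return a persona-consistent response to a trainee's message."""
    msg = trainee_message.lower()
    if "name" in msg:
        return f"I'm {PERSONA['name']}."
    if "where" in msg or "from" in msg:
        return f"I'm from {PERSONA['home']}."
    if "feel" in msg or "how are you" in msg:
        return f"Honestly, I've been feeling {PERSONA['mood']} lately."
    return "I don't really know how to talk about it..."


print(reply("Hi, what's your name?"))        # I'm Riley.
print(reply("How are you feeling today?"))   # Honestly, I've been feeling anxious and depressed lately.
```

Because every trainee talks to the same persona, the training experience stays uniform — the property the quote below emphasizes about maintaining “a consistent emotional and experiential narrative.”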

Dan Fichter, Head of AI and Engineering at The Trevor Project, said the group was excited to bring the new system online:

Our Crisis Contact Simulator can engage in a prolonged back-and-forth dialogue with trainees and can use language in the same way people do, including language LGBTQ youth often use to describe their experiences and emotions. The simulator maintains a consistent emotional and experiential narrative in talking about real-life feelings and situations.

Google.org, for its part, sent nearly 30 fellows to work alongside The Trevor Project and helped distribute some $2.7 million in grant funding.

For more information about The Trevor Project, visit its site here.

If you or someone you know is experiencing a mental health crisis you can contact a suicide helpline by finding the appropriate listing for your location here.

To reach The Trevor Project’s helplines, check out the image below:

[Image: The Trevor Project’s helpline contact information. Credit: The Trevor Project]
