[Crisis Text Line is] a tech-driven nonprofit that uses big data and artificial intelligence to help people cope with traumas such as self-harm, emotional abuse and thoughts of suicide.
But the data the charity collects from its online text conversations with people in their darkest moments does not end there: The organization's for-profit spinoff uses a sliced and repackaged version of that information to create and market customer service software.
Crisis Text Line says any data it shares with that company, Loris.ai, has been wholly "anonymized," stripped of any details that could be used to identify people who contacted the helpline in distress.
In turn, Loris has pledged to share some of its revenue with Crisis Text Line. The nonprofit also holds an ownership stake in the company, and the two entities shared the same CEO for at least a year and a half. The two call their relationship a model for how commercial enterprises can help charitable endeavors thrive.
Here's more from the Loris.ai website on how the company uses the data from your communications about self-harm attempts:
We're a team of data scientists, technologists and behavioral linguistics experts solving for how to bring more empathetic conversations to the world.
We believe every customer interaction is an opportunity to create a relationship through real-time responses and an obsession with customer satisfaction. We believe every interaction is a chance to not only resolve tickets in record time but to create a customer advocate and bring Voice of Customer insights back to the business, all in real-time.
We're here to deliver on this mission by enabling fast growing online companies with out-of-the-box technology, industry leading implementation times, and by delivering best-in-class client service ourselves.
There's a lot more at the Politico link.
Suicide hotline shares data with for-profit spinoff, raising ethical questions [Alexandra S. Levine / Politico]