Dec 2, 2017
20,678
Crisis Text Line is one of the world's most prominent mental health support lines, a tech-driven nonprofit that uses big data and artificial intelligence to help people cope with traumas such as self-harm, emotional abuse and thoughts of suicide.

But the data the charity collects from its online text conversations with people in their darkest moments does not end there: The organization's for-profit spinoff uses a sliced and repackaged version of that information to create and market customer service software.

Crisis Text Line says any data it shares with that company, Loris.ai, has been wholly "anonymized," stripped of any details that could be used to identify people who contacted the helpline in distress. Both entities say their goal is to improve the world — in Loris' case, by making "customer support more human, empathetic, and scalable."

In turn, Loris has pledged to share some of its revenue with Crisis Text Line. The nonprofit also holds an ownership stake in the company, and the two entities shared the same CEO for at least a year and a half. The two call their relationship a model for how commercial enterprises can help charitable endeavors thrive.

Ethics and privacy experts contacted by POLITICO saw several potential problems with the arrangement.

Some noted that studies of other types of anonymized datasets have shown that it can sometimes be easy to trace the records back to specific individuals, citing past examples involving health records, genetics data and even passengers in New York City taxis.


Others questioned whether the people who text their pleas for help are actually consenting to having their data shared, despite the approximately 50-paragraph disclosure the helpline offers a link to when individuals first reach out.


Former federal regulator Jessica Rich said she thought it would be "problematic" for third-party companies to have access even to anonymized data, though she cautioned that she was unfamiliar with the companies involved.

"It would be contrary to what the expectations are when distressed consumers are reaching out to this nonprofit," said Rich, a former director of the Federal Trade Commission's Bureau of Consumer Protection. She later added: "The fact that the data is transferred to a for-profit company makes this much more troubling and could give the FTC an angle for asserting jurisdiction."

The nonprofit's vice president and general counsel, Shawn Rodriguez, said in an email to POLITICO that "Crisis Text Line obtains informed consent from each of its texters" and that "the organization's data sharing practices are clearly stated in the Terms of Service & Privacy Policy to which all texters consent in order to be paired with a volunteer crisis counselor."

In an earlier exchange, he emphasized that Crisis Text Line's relationship with its for-profit subsidiary is "ethically sound."

"We view the relationship with Loris.ai as a valuable way to put more empathy into the world, while rigorously upholding our commitment to protecting the safety and anonymity of our texters," Rodriguez wrote. He added that "sensitive data from conversations is not commercialized, full stop."

Loris' CEO since 2019, Etie Hertz, wrote in an email to POLITICO that Loris has maintained "a necessary and important church and state boundary" between its business interests and Crisis Text Line.


www.politico.com

Suicide hotline shares data with for-profit spinoff, raising ethical questions

The Crisis Text Line’s AI-driven chat service has gathered troves of data from its conversations with people suffering life’s toughest situations.
 

Lumination

Member
Oct 26, 2017
12,510
At face value, there are benefits to this. Aggregating data to track things like trends in people's mental health and so forth.

But do you trust that every person in these faceless orgs is doing the right thing with your data?

And if the non-profit holds an ownership stake in the tech company and they even shared CEOs... why isn't it just part of the non-profit? Is it because there's money to be made between the pinky promises?
 

Chikor

Banned
Oct 26, 2017
14,239
We need legislation to deal with online privacy. The free market and the regulatory framework we have right now do not work at all.
 
Oct 25, 2017
12,114
Gross, morbid, and predatory. How in the hell are you going to hide behind "well, they consented to it in the ToS"? You are talking about people who are reaching out in their darkest hour, and you are going to drop the "what, you didn't actually read that ToS agreement?"
 

Akira86

Member
Oct 25, 2017
19,598
tech driven non profit...big data....artificial intelligence...private info.....i was like.......LOL lower the boom on me, crush me giant monkey
 

Chikor

Banned
Oct 26, 2017
14,239
tech driven non profit...big data....artificial intelligence...private info.....i was like.......LOL lower the boom on me, crush me giant monkey
the red flags start coming and they don't stop coming.

friendly reminder that nonprofit is a tax designation, nothing more nothing less.




Also, I knew I remembered the name of that org

www.cnn.com

Crisis Text Line CEO fired amid staff revolt

Crisis Text Line, a high-profile crisis hotline backed by millions of dollars from some of tech's biggest names, said Friday that the nonprofit's board of directors "voted to terminate" its CEO and founder effective immediately following accusations of inappropriate conduct, according to a...

It's a good sign when the founder and CEO is so racist that the staff revolt to kick them out, right?
 

Dark Ninja

Member
Oct 27, 2017
8,073
I used to work at a non-profit and learned they were actually for-profit. This is why I don't trust any charity or non-profit org. I bet they go to the "it's just so we can sustain ourselves" excuse next.
 

Kastanjemanden

alt account
Banned
Jan 23, 2022
363
the red flags start coming and they don't stop coming.

friendly reminder that nonprofit is a tax designation, nothing more nothing less.




Also, I knew I remembered the name of that org

www.cnn.com

Crisis Text Line CEO fired amid staff revolt

Crisis Text Line, a high-profile crisis hotline backed by millions of dollars from some of tech's biggest names, said Friday that the nonprofit's board of directors "voted to terminate" its CEO and founder effective immediately following accusations of inappropriate conduct, according to a...

It's a good sign when the founder and CEO is so racist that the staff revolt to kick them out, right?

lol 40 hours a week

I bet they worked none and did nothing of value for that 350k

insane
 

Deleted member 4461

User Requested Account Deletion
Banned
Oct 25, 2017
8,010
I think that there's a world in which a for-profit company could access this data & have it avoid raising ethical flags.

For example, if the product being sold was mental health training. Or a program to address workplace culture and reduce stressors. Basically, anything that's tied back to the idea of preventing the kinds of problems the suicide hotline is used for.

But for customer service? I think that's too far over the already tight line. And I see that line about empathy, and I just know that they convinced themselves that they're doing the right thing. I think they genuinely believe it, in fact. Happens literally all the time in corporate America.
 

neon/drifter

Shit Shoe Wasp Smasher
Member
Apr 3, 2018
4,071
Oh wow that's the line I've frequently used.

That's twice now I've been fucked by suicide lines. Wow I give up.
 
May 26, 2018
24,045
I'd make this shit pants shittingly illegal if I were a king. For real. What it does to people who need help...who need someone to trust... it must be so destructive I can hardly even comprehend it.
 

Pau

Self-Appointed Godmother of Bruce Wayne's Children
Member
Oct 25, 2017
5,885
How hard is it to not be shitty?
 

construct

Saw the truth behind the copied door
Member
Jun 5, 2020
8,027
Tokyo
this is harmful. it takes a lot of courage to call these lines, and now you have to add another worry on top of it. this world is hopeless
 

jb1234

Very low key
Member
Oct 25, 2017
7,237
I don't know a lot of suicidal people who would call these lines and now they'd use it even less. Good going.
 

Rosebud

Two Pieces
Member
Apr 16, 2018
43,778
I don't know a lot of suicidal people who would call these lines and now they'd use it even less. Good going.

I had some thoughts in the past (not now!) and surely wouldn't call it because I heard some bad stories, that's why I also never recommend it

There must be a better way to help people
 
Dec 30, 2020
15,332
That's a nightmare. I can't begin to estimate how many people seeking legit help will now not bother out of fear of having their data harvested by these ghouls.
 

Soda

Member
Oct 26, 2017
8,917
Dunedin, New Zealand
At face value, there are benefits to this. Aggregating data to track things like trends in people's mental health and so forth.

But do you trust that every person in these faceless orgs is doing the right thing with your data?

And if the non-profit holds an ownership stake in the tech company and they even shared CEOs... why isn't it just part of the non-profit? Is it because there's money to be made between the pinky promises?

Yup, in a perfect world, this would probably be a good thing.

In the world we live in, I don't trust that shit.