Among the hundreds of other reasons, here is one more: it is time to stop using Facebook.
A police officer on the late shift in an Ohio town recently received an unusual call from Facebook.
Earlier that day, a local woman wrote a Facebook post saying she was walking home and intended to kill herself when she got there, according to a police report on the case. Facebook called to warn the Police Department about the suicide threat.
The officer who took the call quickly located the woman, but she denied having suicidal thoughts, the police report said. Even so, the officer believed she might harm herself and told the woman that she must go to a hospital — either voluntarily or in police custody. He ultimately drove her to a hospital for a mental health work-up, an evaluation prompted by Facebook’s intervention. (The New York Times withheld some details of the case for privacy reasons.)
Facebook has computer algorithms that scan the posts, comments and videos of users in the United States and other countries for indications of immediate suicide risk. When a post is flagged, by the technology or a concerned user, it moves to human reviewers at the company, who are empowered to call local law enforcement.
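The flow described above is a two-stage pipeline: an automated scan (or a user report) flags a post, and a human reviewer then decides whether to escalate to emergency responders. Facebook has not disclosed how any of this actually works, so the following is an entirely hypothetical sketch of that flow; the phrase list, threshold, and function names are illustrative inventions, not Facebook's system.

```python
# Hypothetical sketch of the two-stage pipeline the article describes:
# (1) automated scoring or a user report routes a post to human review;
# (2) a human reviewer decides whether to contact emergency responders.
# Nothing here reflects Facebook's real model, thresholds, or process.

def classifier_score(text: str) -> float:
    """Stand-in for a trained risk model: crude keyword matching."""
    risk_phrases = ["kill myself", "end my life", "suicide"]
    return 1.0 if any(p in text.lower() for p in risk_phrases) else 0.0

def triage(post: str, user_reported: bool = False,
           threshold: float = 0.5) -> str:
    """Stage 1: decide whether a post goes to human reviewers."""
    if user_reported or classifier_score(post) >= threshold:
        return "human_review"
    return "no_action"

def human_review(post: str, reviewer_judges_imminent_risk: bool) -> str:
    """Stage 2: reviewers are empowered to call local law enforcement."""
    if reviewer_judges_imminent_risk:
        return "call_emergency_services"
    return "no_action"
```

Note that every consequential judgment in this sketch, the threshold and the reviewer's call, is exactly what critics quoted below say Facebook has not made transparent or validated.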
“In the last year, we’ve helped first responders quickly reach around 3,500 people globally who needed help,” Mr. Zuckerberg wrote in a November post about the efforts.
But mental health experts said Facebook's calls to the police could also cause harm — such as unintentionally precipitating suicide, compelling nonsuicidal people to undergo psychiatric evaluations, or prompting arrests or shootings.
And, they said, it is unclear whether the company’s approach is accurate, effective or safe. Facebook said that, for privacy reasons, it did not track the outcomes of its calls to the police. And it has not disclosed exactly how its reviewers decide whether to call emergency responders. Facebook, critics said, has assumed the authority of a public health agency while protecting its process as if it were a corporate secret.
Yes, you read that right: "Facebook said that, for privacy reasons, it did not track the outcomes of its calls to the police." B.S. How about formal clinical trials, like the rest of the medical world? At a minimum, their algorithm should require FDA approval.
“It’s hard to know what Facebook is actually picking up on, what they are actually acting on, and are they giving the appropriate response to the appropriate risk,” said Dr. John Torous, director of the digital psychiatry division at Beth Israel Deaconess Medical Center in Boston. “It’s black box medicine.”
“In this climate in which trust in Facebook is really eroding, it concerns me that Facebook is just saying, ‘Trust us here,’” said Mr. Marks, a fellow at Yale Law School and New York University School of Law.
Right. Trust Facebook? Never. I submit the real reason that miscreant Zuckerberg is doing this: it is now well known that a plausible link exists between increased social media use and depression and suicide. Just say no to Facebook.