
AI that talks people out of suicide in locations where they are likely to take their lives

Image credit: istockphoto.com

Darko Savic, Aug 04, 2021
A Siri/Alexa-style AI that talks people out of suicide. Mount sensors, cameras, and speakers at popular suicide spots.

When a person is detected approaching a critical spot, the AI starts a casual conversation via the nearest speaker. It could address them in a friendly, soothing voice and try to gauge the state they are in. From there, it proceeds to talk the person out of suicide. If successful, it can offer to accompany the person home (or to a clinic) by continuing the conversation through an app or a website the person loads on their phone. The AI should remember every person and pick up future conversations where they left off.
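As a rough sketch of that flow, the detection trigger and per-person conversation memory might look something like the following. Everything here (the class names, the opener lines, the `person_id` coming from the camera pipeline) is a hypothetical illustration, not an existing system:

```python
from dataclasses import dataclass, field

@dataclass
class Session:
    """Conversation history for one recognised visitor, kept across visits."""
    person_id: str
    transcript: list = field(default_factory=list)

class PreventionAgent:
    def __init__(self):
        self.sessions = {}  # person_id -> Session, persisted between visits

    def on_person_detected(self, person_id, location):
        """Called by the sensor/camera pipeline when someone nears a hotspot."""
        session = self.sessions.setdefault(person_id, Session(person_id))
        if session.transcript:
            # Returning visitor: continue where the last conversation left off.
            opener = "Good to see you again. Last time we talked, you were telling me about your week."
        else:
            # First contact: open casually to gauge their state.
            opener = f"Hey there. Quiet evening at {location}, isn't it?"
        session.transcript.append(("ai", opener))
        return opener  # routed to the nearest speaker

agent = PreventionAgent()
print(agent.on_person_detected("visitor-17", "the east walkway"))
print(agent.on_person_detected("visitor-17", "the east walkway"))
```

The key design point is the session store: recognising a returning person is what lets the AI "continue future conversations where they left off" rather than restarting cold each time.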

This idea was inspired by Spook's bridge suicide shield.

Training the AI

The AI should be trained on large amounts of data and examples from real-life situations. It can practice remotely with psychologists worldwide and be released into the wild only once it has become an excellent negotiator. When it goes live and starts saving people, every case should be reviewed and the AI corrected for any bad choices it made.
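The review cycle described above could be sketched as a simple pipeline that turns psychologist-reviewed transcripts into training pairs for the next iteration of the model. The names here (`CaseReview`, `build_training_examples`) are hypothetical illustrations of the idea, not an existing system:

```python
from dataclasses import dataclass

@dataclass
class CaseReview:
    transcript: list       # (speaker, utterance) pairs from a live intervention
    reviewer_notes: dict   # utterance index -> corrected reply suggested by a psychologist

def build_training_examples(review):
    """Turn a reviewed case into (context, corrected_response) training pairs."""
    examples = []
    for idx, correction in review.reviewer_notes.items():
        context = review.transcript[:idx]        # everything said before the flagged line
        examples.append((context, correction))   # fine-tune toward the corrected reply
    return examples

review = CaseReview(
    transcript=[("ai", "Hello"), ("person", "Leave me alone"), ("ai", "Okay, bye")],
    reviewer_notes={2: "I hear you. I'll stay quiet, but I'm here if you want to talk."},
)
print(build_training_examples(review))
```

Each bad choice the reviewers flag becomes one concrete example of what the AI should have said instead, which is exactly the "review every case and teach the AI" loop the paragraph describes.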

Shock them with dark comedy

I know next to nothing about the psychology of talking suicidal people out of jumping, but I'm guessing that hearing something unexpected like "Hey, how about a joke from a robot before you jump?" would make them pause for a second. Then the AI could launch into a hard-life, relatable comedy routine that would make Seinfeld proud, and gradually start weaving in stories of people who decided to postpone ending their lives until "tomorrow" and have kept postponing it by a day ever since. Could something like that work? I don't know.

There would be plenty of opportunities to perfect the routine into something that statistically almost always works. Iterate and reuse.



Creative contributions

Perhaps AI could focus on the environment rather than interacting with the person directly

Spook Louw, Aug 04, 2021
Personally, I think the psychology involved when someone wants to take their own life might be too complex to leave to AI. Your idea did make me think of something else, though. Most suicides by jumping off a bridge seem to stem from momentary lapses of composure, committed in moments of despair and darkness. Some of the studies cited in my original idea suggest that if a person is unable to attempt suicide at the bridge immediately, they tend not to look for another place once they have had some time to recover from feeling hopeless. Perhaps AI could therefore be used at suicide hotspots to create positive, happy environments.
If a person is feeling suicidal, perhaps wholesome videos, comedy (as you suggested), or feel-good music could create an environment where they feel a little better, especially in places like bridges where a sudden burst of despair could lead to fatal lapses of judgement.

Maybe by focusing on creating an uplifting environment around suicide hotspots, we could help keep people from feeling that despondent at a place where they could easily take their own lives.
Darko Savic, 3 years ago
AI should by now be capable of holding a meaningful conversation. This was already possible in 2018: https://youtu.be/lXUQ-DdSDoE?t=57

The reason AI should do the talking is that we can't have psychologists waiting 24/7 at all the popular suicide spots. Even with a 5% success rate, it would save significantly more people than we currently do.
Spook Louw, 3 years ago
Darko Savic I realise that AI can hold a conversation quite well, I'm just thinking that in a conversation that is as important as talking someone out of suicide, the last thing you'd want to hear is "sorry, I do not understand". But you're right, AI would be available at all times, and its effectiveness would not be affected by emotional factors.
Darko Savic, 3 years ago
Spook Louw the AI could be trained never to show that it doesn't understand. It could use clever ways of changing the topic. It could be taught every trick in the book. Memory is no issue. It would continuously train itself and iterate on what works. Eventually, this would turn into something resembling a game of chess.

Initially, we could say "better AI than nothing", but soon enough the AI could do this highly specific job better than a human. Let's not forget the tens of thousands of hours of training before the AI takes on its first real-life case.
