
Online etiquette app

Image credit: https://alpha.aeon.co/images/fabcf900-f747-441b-957b-2da68e31a88e/header_ESSAY-swearing.jpg

Spook Louw, Mar 31, 2021
We've all experienced the consequences of our own actions or words in a momentary lapse of self-control.

I have this idea for an app that reads what you type, like predictive text, Grammarly, or spellcheck, but looks for certain keywords. Upon finding one of those keywords, it prompts you to take a different approach.

For example, if I were to type, "Fuck you!" the app could send a pop-up message saying "Are you feeling angry or aroused?"

Depending on my response, the app could then suggest alternative wording. This would be educational, helping people expand their vocabulary, getting them to read a bit, and teaching them to take a second to think before sending a message. But the approach needs to be entertaining. So, if I said that I am angry, the app could give me a Monty Python- or Stephen Fry-style paragraph to replace my expletive response. Or, if I chose "aroused", the app could quote Oscar Wilde or Jane Austen.

Imagine being able to get rid of all the "haha", "lol", "what the fuck", "wow", "shit", "damn" and the rest of the overused, meaningless placeholders we mistake for conversation nowadays, and instead gaining access to some proper literature in everyday communications.
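The core loop described above - scan the draft for flagged words, ask the user how they are feeling, and offer a literary replacement - could be sketched roughly like this. The word list, mood labels, and suggestion table are all illustrative placeholders, not a real curated corpus:

```python
import re

# Hypothetical mood-to-suggestion table; the quotes would come from a
# curated literary corpus in a real app.
SUGGESTIONS = {
    "angry": "A withering Monty Python-style retort could go here.",
    "aroused": "An Oscar Wilde or Jane Austen quotation could go here.",
}

# Illustrative flagged-word list.
FLAGGED = re.compile(r"\b(fuck|shit|damn|lol|wow)\b", re.IGNORECASE)

def check_message(text, mood_prompt):
    """If the draft contains a flagged word, ask the user for their mood
    (e.g. via a pop-up) and return a suggested replacement; otherwise
    pass the text through unchanged."""
    if FLAGGED.search(text):
        mood = mood_prompt()  # stand-in for the "angry or aroused?" pop-up
        return SUGGESTIONS.get(mood, text)
    return text

# Usage: simulate a user who reports feeling angry.
reply = check_message("Fuck you!", lambda: "angry")
```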
Creative contributions

The app puts a hold on the message when an inappropriate word is detected

jnikola Apr 02, 2021
Just as an addition to this extremely cool idea, what if the app had an additional feature that let you choose the "hold" time, e.g. 10 seconds? Next time you write an inappropriate word, the app puts a hold on the message for 10 seconds and, in the meantime, sends you the above-mentioned content that @Darko and @salemandreus suggested. It would be like somebody next to you saying "wait, take a big breath, this isn't the right way".
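A rough sketch of that configurable hold, assuming app-specific callbacks for detecting flagged words, sending the message, and showing the calming content (all three are hypothetical stand-ins, not a real API):

```python
import time

HOLD_SECONDS = 10  # user-configurable hold time, per the suggestion above

def send_with_hold(text, contains_flagged_word, send, show_calming_content,
                   hold_seconds=HOLD_SECONDS):
    """Delay sending when a flagged word is detected, showing calming
    content during the pause; otherwise send immediately."""
    if contains_flagged_word(text):
        show_calming_content()    # e.g. a quote, meme, or breathing prompt
        time.sleep(hold_seconds)  # the "take a big breath" pause
    send(text)
```

In tests the hold can be set to zero so the callbacks can be checked without waiting.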

What if all these things just backfire and provoke even more anger?
Samuel Bello, 3 years ago
Instead of a hold, the user can be made to watch ads as punishment. That way their time is wasted whenever they use curse words and revenue is generated for the platform they use the words on. Ads are ideal because they are generally interesting and can sometimes improve a person's mood. Ads are designed to distract people from what they are doing; this time the distraction can be put to some positive use.
jnikola, 3 years ago
Samuel Bello Good point!

salemandreus, 3 years ago
I agree with you: whether a design is calming or a manner of addressing an angry/anxious person is effective is not a simple question to answer!

So I made a detailed creative post below, proposing how we could address these and related concerns through using trained psychologists and UX/accessibility testing.

Essential Research and Approvals - psychologist-informed content, culturally (and otherwise) diverse user testing, and soothing UX/UI

salemandreus Apr 02, 2021
I'm combining the psychological and UX high-level concerns into one post, since, when user-testing the developed app, it would be difficult if not impossible to separate the two completely.
  1. Soothing and smooth UI/UX - We’d need to ensure that the entire user experience and design are specifically calming to the user.
This would require:
  • Efficient App Development - performance optimisation of the app across multiple devices. (A glitchy, delayed popup caused by memory inefficiency or poor integration with other apps, interrupting other processes while it loads, is a frustrating user experience; it would undermine the app's entire purpose and get it quickly uninstalled.)
  • Solid understanding of aesthetics in graphic and UX design
  • User Experience Evaluation and Testing:
  • Meticulous - pinpointing specifically what users find frustrating or helpful as it will inform our design.
  • Diverse sample set of users - crucial to ensuring nothing inflammatory/insensitive to users of different cultural backgrounds AND experiences which would undermine the app's effectiveness.
  • Large sample of test users - given the sensitivity of this content as a psychological aid and its specificity
  • Ethics Clearance - As UX testing here may be inextricable from some level of psychological testing on users it should get clearance, another reason trained psychologists should be involved to ensure we minimise potential for causing harm in either testing or the released product.

2. Content Research (psychologists) - Aside from being necessary from a marketing point of view to show the app's credibility, we want our messages to be effective in reaching the user, not to come across as condescending or tone-deaf to their struggles, or even cause them harm.

Therefore we'd need to defer to trained therapists on how to respond to the various situations identified. Since the developers and UX writers are likely not trained therapists, giving the wrong response to someone who is psychologically vulnerable could be catastrophic, both in terms of risking the safety of customers and in terms of being called out on social media for insensitivity or inexperience towards mental health problems, however well-intentioned. There are many subtle ways to get this wrong, such as well-meaning toxic positivity towards victims of gaslighting abuse, perpetuating c/PTSD triggers, and subtle ableism or systemic oppression (e.g. gender bias) in our use of language, which is of course a big intersection for many abuse victims in the first place. Thus even subtle advice-giving content and scenarios would have to be evaluated, if not designed, by psychologists.

3. Approving the App:
  • Accessibility testing: Specialists (psychologists and accessibility engineers) should also evaluate the app as a whole to determine whether any UX features could cause problems particularly for users with sensory processing disorders, eg if sounds are too jarring or visuals bright this could cause distress - this would undermine the whole purpose of our app otherwise.
  • Psychological testing: This would have the same bullet points as User Experience testing and also involves usage of the app, but with a different focus, although again potentially there would be some overlap in the two types of testing, given the effectiveness of the app's user experience at least partly relies on its effectiveness as a mental health/mindfulness/cognitive-behavioural aid and vice-versa.
Spook Louw, 3 years ago
Excellent! I think this is exactly how it should be approached.

Just on the point of whether it will actually end up making people angrier: I'm sure it can, but taking control of your actions must be a choice you make. In the same way, anger management classes will be infuriating if you do not embrace the process, or a loved one telling you to "calm down" at the wrong time could make you even angrier. The fact is that it will be your decision to download the app, and how much you benefit from it will be dependent on how much you embrace it.

That all being said, I think our ideas for the app have moved away from what the original focus point was intended to be. It has improved and there's no denying how important mental health is, yet I do want to take us back to the original idea for just a second because I think you'll find that many of the hurdles we now face disappear if we move back to a simpler idea.

Originally the app was not meant to be a tool for improving your mental wellbeing, it was meant to be a simple, fun way to enrich your vocabulary and be exposed to some great literature.

Focussing on mental health creates a list of problems:
1 - You need a psychologist's input (their methods also vary, so you'd probably need a team to reach a general conclusion).
2 - There is far less room for error when you start working with the fragile human psyche.
3 - It needs to work. Psychological help is judged the same way medicine is: if you are going to charge people for this service, you will need to provide some proof that you can achieve positive results (unless you're simply running a scam, but I don't think that's what any of us here had in mind).
And the list goes on: I'm sure the app would need to be approved by some board, the possibility of negative backlash or even legal action will always be there, etc.

And to be honest, I don't want my phone or computer to stop me from being angry at someone. Sometimes it's necessary. But instead of telling someone to "fuck off", one might quote Sir Terry Pratchett and say "May your genitals sprout wings and fly away."


AI-powered helper

Darko Savic, Mar 31, 2021
Depending on your profession (sales, PR, support, creative writing, etc) the AI assistant that watches as you type would monitor for specific cues and provide helpful suggestions when something triggers the need.

Example: if your job is to offer customer support and you get triggered by an angry customer, the AI assistant would detect sarcasm in your words and get you to take a break. It could bring up two short video clips in quick succession:
  • the first would remind you that life is beautiful and things are rarely worth stressing about
  • the second would bring up a lesson you need to see at that specific time - for example, how to talk to annoying customers
It could bring up quotes, memes, anything that relays the message well.

The same concept could be applied to anything where you express yourself through writing.
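As a very rough sketch of that cue-to-intervention routing: real sarcasm or anxiety detection would need a trained NLP model, so the detector below is just a keyword stand-in, and the intervention lists are illustrative placeholders:

```python
# Hypothetical cue-to-intervention table: each detected cue maps to the
# content (clips, quotes, prompts) the assistant would surface.
INTERVENTIONS = {
    "sarcasm": ["clip: life is beautiful",
                "clip: how to talk to annoying customers"],
    "anxiety": ["prompt: do you feel anxious right now?"],
}

def detect_cue(text):
    """Toy cue detector: flags drafts that read as sarcastic or anxious.
    A real assistant would use an NLP model here, not keyword matching."""
    lowered = text.lower()
    if "oh, great" in lowered or "thanks a lot" in lowered:
        return "sarcasm"
    if "i can't cope" in lowered:
        return "anxiety"
    return None

def assist(draft):
    """Return the list of interventions to show for this draft (possibly empty)."""
    cue = detect_cue(draft)
    return INTERVENTIONS.get(cue, [])
```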
salemandreus, 3 years ago
Could then integrate with wellness apps for larger solutions:
1- Raise self-awareness: "Do you feel anxious/depressed right now?"
2- Suggest self-care behaviours: "You've also not eaten, are you hungry?"
3- Raise concerns from larger patterns: "You've reported anxiety frequently and your sleep cycle has changed significantly over X days, do you need help?"
Darko Savic, 3 years ago
Hi salemandreus 🖖
That sounds like our inevitable future :) Also, combined with various sensors (detecting biochemicals in urine and stool), heart rate and breathing picked up via sensitive microphones, etc., the AI would be able to help even before we knew we needed it.

Set up a translation programme to filter vulgar words

Samuel Bello, Sep 08, 2021
Apps that are used for translation can be modified to treat offensive language as a different language altogether. The message would be translated into a more acceptable form and displayed to other users, with an option to see the original version. That way, no one sees vulgar words without their permission. Those who like to use curse words and do not mind reading them can set their accounts to show the words exactly as they are posted. Everyone is free to write what they like without being frustrated by a correction mechanism that disrupts or delays their messages.
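A minimal sketch of such a "translation" filter, assuming a hand-picked substitution table (a real deployment would curate this list and handle capitalisation properly):

```python
import re

# Hypothetical word-substitution table standing in for the "translation".
POLITE_FORMS = {"fuck": "frick", "shit": "shoot", "damn": "darn"}

PATTERN = re.compile(r"\b(" + "|".join(POLITE_FORMS) + r")\b", re.IGNORECASE)

def render_message(original, show_original=False):
    """Display either the raw message or its 'translated' form, depending
    on the reader's per-account preference."""
    if show_original:
        return original
    return PATTERN.sub(lambda m: POLITE_FORMS[m.group(0).lower()], original)
```

Because the substitution happens at display time, the sender's original text is stored untouched, which is what makes the per-reader opt-in possible.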


General comments

Samuel Bello, 3 years ago
Something like that is already in place on mobile phones, though their approach is more subtle than what the idea here suggests. Vulgar words are not added to the phone's dictionary at first. That way when users want to use those words they usually have to spell them out in full and add them to the phone's dictionary. Even after adding the words to the phone's dictionary, the autocorrect function will frequently change the user's swear words to the words that are closest in spelling to the intended word.

I believe the same approach can be used more effectively: every website would have its own dictionary, and vulgar words would have to be added and spelled out in full on each site before they could be used freely there. The library of vulgar words could be cleared frequently on all free keyboard apps to further discourage the use of such words.
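A tiny sketch of the per-site dictionary idea: a flagged word keeps being autocorrected to its nearest dictionary match until the user has explicitly added it to that site's dictionary. The data structures and function names are hypothetical:

```python
# Per-site learned words: site -> set of words the user explicitly added.
site_dictionaries = {}

def autocorrect(site, word, nearest_match):
    """Replace a flagged word with its nearest dictionary match unless the
    user has already added it to this site's dictionary in full."""
    learned = site_dictionaries.setdefault(site, set())
    if word.lower() in learned:
        return word
    return nearest_match

def add_to_dictionary(site, word):
    """Spelling the word out in full on a site adds it to that site only."""
    site_dictionaries.setdefault(site, set()).add(word.lower())

def clear_vulgar_library():
    """Periodic reset, as suggested above, to re-discourage the words."""
    site_dictionaries.clear()
```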