Conspiracy busting feature for social media where the link you share is not necessarily what others see

Image credit: Picture-alliance, dpa, S. Barbour

Darko Savic, Jun 15, 2022

Bias/conspiracy busting feature for social media platforms. A friend sends you a link, but when you open it, you don't see what they saw. Instead, you see how the fake content they thought was real was actually made.
  • Show people how easily they are fooled by content that plays into their pre-existing beliefs or biases.
  • People share content with like-minded friends. When those friends see how the shared content was fabricated (without the sharer realizing it), everyone involved starts to wonder what else could be wrong without anyone noticing.
  • Instill skepticism in circles that are easily fooled.
  • Get people to use their "inner content filter" and invest more effort into verifying what they see online.
How it works
Imagine a friend shares a video, image, or news item with you. When you open it, you don't see what they saw. Instead, you see the behind-the-scenes making of the fake content your friend thought was real. You see someone going to great lengths to con people into believing fake stuff. They are doing it for educational purposes, so that YOU can tell your friend how the content was made to look believable.
At the end you also see the fake/manipulated content your friend shared with you. This puts you in a perfect position to talk to your friend about what just happened. Maybe even bust their balls a little before revealing that the content was fake. If they refuse to believe you, show them the "making of" part.
The website that hosts the content decides when to substitute it, even though the link remains the same. First-tier shares show the fake content; second-tier shares (re-shares) show the "making of" content.
How does the software know whether you are sharing or re-sharing? The URL of a first-tier share carries an added variable that is removed as soon as someone opens the link. If they re-share, they do it without the variable in the URL, so they end up sharing "the making of" instead. Likewise, the content itself would have a share/link feature that works the same way.
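The token mechanism described above could be sketched roughly like this (a minimal assumption-laden sketch; the names `share_token`, `create_share_link`, and `resolve_content` are hypothetical, and a real platform would implement this server-side with persistent storage):

```python
# Sketch of first-tier vs. re-share detection via a one-time URL token.
# All names here are hypothetical, not from any real platform's API.
import secrets
from urllib.parse import urlparse, parse_qs, urlencode, urlunparse

FAKE_CONTENT = "the fabricated post"
MAKING_OF = "behind-the-scenes explainer"

# Tokens that are still valid for a first-tier view.
_active_tokens: set[str] = set()

def create_share_link(base_url: str) -> str:
    """Mint a first-tier link carrying a one-time share token."""
    token = secrets.token_urlsafe(8)
    _active_tokens.add(token)
    parts = urlparse(base_url)
    query = dict(parse_qs(parts.query))
    query["share_token"] = [token]
    return urlunparse(parts._replace(query=urlencode(query, doseq=True)))

def resolve_content(url: str) -> str:
    """Serve the fake on a tokened first open; the explainer otherwise.

    Opening the link consumes the token, so any re-share of the now
    token-less URL (or a reuse of the same link) lands on the
    "making of" content instead.
    """
    parts = urlparse(url)
    token = parse_qs(parts.query).get("share_token", [None])[0]
    if token in _active_tokens:
        _active_tokens.discard(token)  # one-time use: consumed on open
        return FAKE_CONTENT
    return MAKING_OF
```

So the first person to open a freshly minted link sees the fake; anyone opening the same URL again (or the bare URL without the token) sees the explainer, which matches the first-tier/second-tier behavior described above.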
Used as prank
I imagine that, after being schooled, people will want to see how many others fall for the same fake. Anyone who has already seen the "making of" part gets to choose which link to share. They can send the fake content to all their friends. But when those friends re-share it, the educational content is shown instead, and the cycle repeats.
Add-on feature
This feature could be built into any of the popular social media platforms. You could prank people with tweets, TikTok and YouTube videos, Instagram posts, etc. Outrage people with topics they (wrongly) strongly believe in.
Strict moderation
Content that plays into people's pre-existing misconceptions and biases is more likely to go viral within groups of similar people. It has the potential to reinforce the segregation of people into islands of believers, which makes such content dangerous for society. Anyone who wants to take advantage of the feature proposed in this idea would have to pass moderation/review before their content can go live.
Creative contributions

Avoiding misunderstandings

Povilas S, Jun 20, 2022
The person who receives the re-shared link should clearly understand that the sender saw something different. A clear explanation of what is happening should therefore be included in the re-shared content. Otherwise, the receiver might assume the sender also knows about the "making of" part and sent the link as proof of debunking the fake info. The receiver might comment back, but that might not be enough for both of them to realize what happened; they might still believe they saw the same content.
Even if the situation is briefly explained in the re-shared content, the receiver might still think it's some weird part of the game and not what actually happened. The explanation should be thorough enough to make the receiver believe it.
Darko Savic, 16 days ago
Yes, that's what prompts the second-tier viewers to go back to the first-tier sharer and tell them what is really happening and how they were tricked.


General comments

Shubhankar Kulkarni, 21 days ago
Great idea! I have a few questions: Who creates the educational content? Also, the "making of" material could be hidden from public view or may not exist on the internet at all. So, will it be recreated? And who takes responsibility?
Darko Savic, 21 days ago
Shubhankar Kulkarni The same person who creates the fake also creates the "making of", to show that it's a hoax. The "making of" could simply be an explainer from the creator, telling people why and how whatever they put together is flawed, yet why it looks plausible. I'm not sure about responsibility. I imagine there would be people who want to "cancel" whoever is doing this, even if it's done with good intentions.
Shubhankar Kulkarni, 20 days ago
Darko Savic I get it now. So, when you encounter a conspiracy, you study it, first create a fake video, and send it out. The people who try to forward it receive the alternative, educational video instead. This will need a significant amount of funding, though.
J. Nikola, 14 days ago
Shubhankar Kulkarni Not only funding; it would also require a person to open the link twice. In other words, if you never share the link, you only ever see the lie, not the truth.
Shubhankar Kulkarni, 10 days ago
J. Nikola This is a potential downside. What if I don't forward it but believe it? What if I tell my friends and family verbally but never forward it at all?