Bias/conspiracy-busting feature for social media platforms. A friend sends you a link, but when you open it, you don't see what they saw. Instead, you see the making of the fake content they thought was real.
Show people how easily they are fooled by content that plays into their pre-existing beliefs or biases.
People share content with like-minded friends. When those same friends see that the shared content was fake (and the sharer didn't know it), everyone starts wondering what else could be wrong without them realizing it.
Instill skepticism in circles that are easily fooled.
Get people to use their "inner content filter" and invest more effort into verifying what they see online.
Imagine a friend sharing a video, image, or news item with you. When you open it, you don't see what they saw. Instead, you see the behind-the-scenes making of the fake content your friend thought was real. You see someone going to great lengths to con people into believing fake stuff. They are doing it for educational purposes, so that YOU can tell your friend how the content was made to be believable.
In the end, you also see the fake/manipulated content your friend shared with you. This puts you in a perfect position to talk to your friend about what just happened. Maybe even bust their balls a little before revealing the content was fake. If they refuse to believe you, show them the "making of" part.
The website that hosts the content decides when to substitute it, even though the link remains the same: 1st-tier shares show the fake content; 2nd-tier shares (re-shares) show the "making of" content.
How does the software know whether you are sharing or re-sharing? The URL of a 1st-tier share carries an extra variable that is removed as soon as someone opens the link. If they re-share, they do so without the variable in the URL, so they end up sharing "the making of" instead. Likewise, the content itself would have a share/link feature that works the same way.
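The one-time-variable mechanism above could be sketched roughly as follows. This is a minimal illustration, not a real implementation: the class and content strings (ShareServer, FAKE, MAKING_OF) and the URL shape are all hypothetical, and a real platform would handle this server-side with redirects and a database of tokens.

```python
import secrets

# Hypothetical placeholder content for illustration only.
FAKE = "the believable fake content"
MAKING_OF = "behind the scenes: how the fake was made"

class ShareServer:
    """Sketch of a server that serves different content to shares vs. re-shares."""

    def __init__(self):
        self.valid_tokens = set()

    def create_share_link(self):
        """A 1st-tier share: the URL carries a one-time token variable."""
        token = secrets.token_urlsafe(8)
        self.valid_tokens.add(token)
        return f"https://example.com/content/42?t={token}"

    def open_link(self, url):
        """Opening the link consumes the token, so any re-share of the
        now-token-less URL gets the educational 'making of' content."""
        if "?t=" in url:
            token = url.split("?t=")[1]
            if token in self.valid_tokens:
                self.valid_tokens.discard(token)  # consumed on first open
                return FAKE
        return MAKING_OF
```

The first open of a fresh link returns the fake; the same URL stripped of its token (what a naive re-share would pass along), or the original URL opened a second time, returns the "making of" instead.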
I imagine that, after being schooled, people will want to see how many others fall for the same fake. Anyone who has already seen the "making of" part gets to choose which link to share. They can send the fake content to all their friends, but when those friends re-share it, the educational content is shown instead and the cycle repeats.
This feature could be built into any of the popular social media platforms. You could prank people with tweets, TikTok or YouTube videos, Instagram posts, etc. Outrage people with topics they (wrongfully) strongly believe in.
Content that plays into people's pre-existing misconceptions and biases is more likely to go viral within groups of similar people. It has the potential to reinforce the segregation of people into islands of believers, which makes such content dangerous for society. Anyone who wants to take advantage of the feature proposed here would have to have their content pass moderation/review before it can go live.