Really nice idea: concrete, and it solves a real problem. I think I may have found one more way to tackle psychopaths/sociopaths.
Random or on-demand "matches" and conversations of moderators with people on the platform.
To track people's behavior on the platform
To check up on someone when another user reports him/her
To keep harmful people "busy"
I imagine it to work in three ways:
1) Random conversations
Moderators, who would be professional psychologists, would randomly "match" with someone and start talking to them naturally. Eventually they would use some bait questions or topics to probe the person's responses. If everything seems fine, they would quietly leave the conversation. If they detect traits of psycho-/sociopathic behaviour, they would keep the conversation going until they are ready to "sandbox" the person.
2) On-demand conversations
It would work the same as the conversations described above, but the people would be chosen based on other users' experience, feedback, or reports.
3) Sandboxing
Once harmful people are detected, they would be "sandboxed": kept in a loop of matches with moderators and "therapeutic" or "equally harmful" chat bots, without ever reaching regular users. As Darko mentioned, matching two sociopaths with each other could work, too.
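To make the sandbox idea concrete, here is a minimal sketch of the matching rule it implies. All names (pools, the `pick_match` helper, the example users) are hypothetical illustrations, not an actual implementation of the platform:

```python
import random

# Hypothetical sketch: flagged ("sandboxed") users are only ever matched
# with moderators, chat bots, or other flagged users -- never with the
# regular user pool.

NORMAL_POOL = ["alice", "bob", "carol"]
SANDBOX_POOL = ["moderator_1", "therapy_bot", "flagged_user_7"]
FLAGGED = {"flagged_user_7"}

def pick_match(user: str) -> str:
    """Return a match partner; sandboxed users stay inside the sandbox pool."""
    pool = SANDBOX_POOL if user in FLAGGED else NORMAL_POOL
    candidates = [p for p in pool if p != user]
    return random.choice(candidates)
```

The key design point is the hard partition: once a user is in the flagged set, every future match is drawn only from the sandbox pool, so they stay "busy" without harming anyone.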