What moderation aims to do
We want VoiceChatMate to remain usable for people who follow the rules. That means addressing spam, harassment, sexual coercion, illegal activity, and other serious violations described in our community guidelines. Moderation is one layer of protection; your own judgment and the disconnect button are additional layers.
Reporting tools
Where the product offers reporting, tie your concern to the specific activity so reviewers can investigate it. For practical steps, see report abuse. We do not promise instant outcomes for every report; queue volume and severity affect timing.
Behavior expectations
Treat matches as people deserving basic respect. No threats, no doxxing, no non-consensual sexual content, and no attempts to exploit vulnerable users. The community guidelines spell out more detail; ignorance of the rules is not an excuse for harming others.
Anti-abuse direction
We review reports, apply restrictions when justified, and iterate on signals that indicate coordinated abuse. We avoid describing exact thresholds or internal dashboards here so bad actors cannot optimize around them.
Safety systems (high level)
The platform may combine automated signals, user reports, and human review. No system is perfect, so safe habits still matter. For product basics, see how it works and the FAQ.
Know the rules before you chat