Community Guidelines — mmeet
Effective date: April 17, 2026
mmeet is built on a simple idea: skip the endless chat and agree on a real-life meeting in under 5 minutes. For that to work, everyone here needs to feel safe and respected. These Community Guidelines apply to every profile, every mmeet video, and every interaction on the platform. Breaking them can lead to warnings, content removal, or permanent account removal, at our sole discretion.
Every profile photo and every mmeet video is automatically checked by AI moderation before it goes live. Fake photos, deepfakes, AI-generated faces, and content that breaks these guidelines are rejected automatically. Borderline cases are reviewed by our team.
mmeet is strictly for adults aged 18 and over. Age is verified at registration and re-checked server-side. If we suspect an account belongs to a minor, we delete it immediately and permanently.
Photos and videos are screened by AI moderation for nudity and sexual content on upload; anything that violates these rules is rejected before it reaches other users.
If you see something that violates these guidelines, use the report button on the profile. Our team reviews reports and can take action ranging from a warning to permanent account removal. Serious violations — especially those involving minors, violence, or criminal activity — are escalated to law enforcement.
To protect the privacy of everyone involved, we don't always share the outcome of a report, but every report is reviewed.
We may update these guidelines as the community grows and new situations appear. Material changes will be announced in the app. The date at the top always reflects the current version.
Questions about these guidelines, or a report that needs follow-up? Write to us: