Safety & Child Safety Standards
FRE is for adults looking for real connections. This page covers how we keep the community safe, what's prohibited, how to report, and our zero-tolerance policy against Child Sexual Abuse and Exploitation (CSAE).
Adults-only platform
FRE is exclusively for users aged 18 or older. Anyone under 18 is strictly prohibited from creating an account or using the service. If we have any reason to believe a user is under 18, the account is suspended immediately while we investigate, and removed if the suspicion is confirmed.
Identity verification
Every new account must pass live face verification before its owner can swipe, match, or send messages. The check uses AWS Face Liveness to confirm the user is a real human present in the moment, not an impersonator using stolen photos. Accounts that haven't passed verification can't reach other users.
Prohibited content and behavior
The following are not allowed on FRE. Violations result in content removal, account action up to a permanent ban, and, where applicable, a report to law enforcement:
- Any sexual content involving anyone under 18 — zero tolerance. See the CSAE section below.
- Nudity, sexually explicit content, or solicitation
- Harassment, threats, stalking, doxxing, or hate speech targeting people based on race, ethnicity, religion, gender, sexual orientation, disability, or other protected attributes
- Impersonation, fake profiles, or stolen photos
- Romance scams, financial fraud, or off-platform payment requests
- Violence, weapons, or content that promotes self-harm
- Spam, advertising, recruiting for other services, or commercial solicitation
- Sharing of someone else's private information without their consent
- Illegal activity or content that violates applicable law
Blocking and reporting
Every profile has a Block and Report option. Blocking is instant and mutual — neither user can find or message the other again. Reporting flags the user and their content for our moderation team.
- Open the user's profile or chat.
- Tap the menu icon (top right).
- Tap Report or Block.
- Choose a reason and confirm.
You can also reach the safety team directly by email — see the contact section below.
How we moderate
We combine automated checks with human review:
- Photo uploads pass through automated screening for nudity and other policy-violating content before they appear in the app.
- Reports are reviewed by our safety team. Confirmed violations trigger content removal and account action up to a permanent ban.
- Accounts that accumulate multiple confirmed reports are automatically suspended pending review.
- We cooperate with law enforcement when content violates the law, including reporting child sexual abuse material to the National Center for Missing & Exploited Children (NCMEC) and equivalent authorities.
Zero tolerance for CSAE
The following are strictly prohibited and result in immediate account termination, content removal, and, where applicable, a report to NCMEC and law enforcement:
- Any sexual content depicting, describing, or implying a minor (anyone under 18).
- Solicitation of minors for sexual or romantic purposes ("grooming").
- Sharing, requesting, or trading Child Sexual Abuse Material (CSAM) of any kind.
- Sexualised conversation involving minors, including roleplay.
- Promotion, glorification, or normalisation of child sexual abuse.
- Linking out to external sites or services that host or distribute CSAM.
- Any other content that sexualises, exploits, or endangers a minor.
How to report
If you encounter any of the above on FRE, please report it immediately. We treat CSAE reports as the highest priority and review them ahead of other moderation queues.
- From inside the app: open the offending profile or chat, tap the menu icon, then tap Report. Choose the reason that best matches (we have a category for minor / sexual content involving a child).
- By email: write to the safety team using the contact below. Include screenshots and the offender's username or profile URL if you have them. Do not include or attach the offending material itself.
How we respond
Once we receive a CSAE report or our automated systems detect potential CSAE content:
- The reported content is hidden from the app pending review.
- The reported account is suspended pending review.
- Our safety team reviews the report. Confirmed violations result in permanent account termination, deletion of the offender's data, and a permanent device-level block.
- Where required by law, we preserve evidence and report to NCMEC's CyberTipline and to relevant law enforcement.
- We cooperate with valid legal process from law enforcement worldwide.
Automated detection
All photos uploaded to FRE pass through automated screening before they become visible to other users. The screening checks for nudity and other policy-violating content. Photos that fail screening are blocked from publication and surfaced for human review. We continue to invest in detection technology and partner with industry initiatives focused on combating CSAE online.
Staff training
Our moderation team is trained on recognising CSAE indicators, including grooming patterns and coded language. Suspected CSAE cases are escalated to a designated safety lead for review and reporting.
Compliance
This policy is designed to meet or exceed the standards in:
- Google Play's Child Safety Standards policy
- Apple App Store Review Guideline 1.1.1 (Objectionable Content)
- 18 U.S.C. § 2258A (US reporting requirements) and equivalents in other jurisdictions
- The UK Online Safety Act, EU Digital Services Act, and other applicable regional laws
Contact
For CSAE reports, urgent safety concerns, or law enforcement inquiries:
If a child is in immediate danger, contact local emergency services first, then report to NCMEC at report.cybertip.org.