Safety & Child Safety Standards

FRE is for adults looking for real connections. This page covers how we keep the community safe, what's prohibited, how to report, and our zero-tolerance policy against Child Sexual Abuse and Exploitation (CSAE).

Adults-only platform

FRE is exclusively for users aged 18 or older. Anyone under 18 is strictly prohibited from creating an account or using the service. If we have any reason to believe a user is under 18, the account is suspended immediately while we investigate, and removed if the suspicion is confirmed.
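The age rule above implies a simple server-side gate at signup. A minimal sketch, assuming a stated date of birth is collected during registration (the function names, field, and cutoff handling are illustrative, not FRE's actual implementation):

```python
from datetime import date
from typing import Optional

MINIMUM_AGE = 18  # FRE is adults-only

def age_on(dob: date, today: date) -> int:
    """Whole years elapsed between dob and today."""
    years = today.year - dob.year
    # Subtract one year if this year's birthday hasn't happened yet.
    if (today.month, today.day) < (dob.month, dob.day):
        years -= 1
    return years

def may_register(dob: date, today: Optional[date] = None) -> bool:
    """Reject any signup whose stated date of birth is under 18."""
    today = today or date.today()
    return age_on(dob, today) >= MINIMUM_AGE
```

A stated date of birth is only a first filter; as described below, accounts suspected of belonging to minors are suspended and investigated regardless of what was entered at signup.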

Identity verification

Every new account must pass live face verification before its owner can swipe, match, or send messages. The check uses AWS Face Liveness to confirm that a real person is present at the camera in that moment, not an impersonator using stolen photos. Accounts that haven't passed verification can't reach other users.
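In outline, the server gates the account on the result of the liveness session. A sketch of that gate, assuming the session result carries a status and a 0–100 confidence score; the threshold here is an assumption, not FRE's actual value:

```python
# Assumed minimum liveness confidence (0-100) to pass verification.
LIVENESS_THRESHOLD = 80.0

def passes_liveness(result: dict, threshold: float = LIVENESS_THRESHOLD) -> bool:
    """Treat the account as verified only if the liveness session
    succeeded and the confidence score clears the threshold."""
    return (
        result.get("Status") == "SUCCEEDED"
        and result.get("Confidence", 0.0) >= threshold
    )

# With the AWS SDK (boto3), the result dict would come from a call like:
#   client = boto3.client("rekognition")
#   result = client.get_face_liveness_session_results(SessionId=session_id)
```

Until `passes_liveness` returns true for an account, it stays walled off from swiping, matching, and messaging.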

Prohibited content and behavior

The following are not allowed on FRE and will result in content removal, account action up to a permanent ban, and, where applicable, a report to law enforcement:

Blocking and reporting

Every profile has a Block and Report option. Blocking is instant and mutual — neither user can find or message the other again. Reporting flags the user and their content for our moderation team.

  1. Open the user's profile or chat.
  2. Tap the menu icon (top right).
  3. Tap Report or Block.
  4. Choose a reason and confirm.
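The mutual-block behaviour described above can be sketched as a symmetric relation: one block record hides both users from each other, in both directions, regardless of who initiated it. A minimal illustration (the data structure is a stand-in for whatever store FRE actually uses):

```python
# Unordered pairs of user IDs; frozenset makes the pair symmetric,
# so block(a, b) and block(b, a) store the same record.
blocks: set = set()

def block(a: str, b: str) -> None:
    """Record a block between two users; order doesn't matter."""
    blocks.add(frozenset((a, b)))

def can_interact(a: str, b: str) -> bool:
    """Neither user can find or message the other once blocked."""
    return frozenset((a, b)) not in blocks
```

Every lookup that surfaces one user to another (search, matching, chat delivery) would consult this relation, which is what makes the block both instant and mutual.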

You can also reach the safety team directly by email — see the contact section below.

How we moderate

We combine automated checks with human review:

Zero tolerance for CSAE

The following are strictly prohibited and result in immediate account termination, content removal, and, where applicable, a report to the National Center for Missing & Exploited Children (NCMEC) and law enforcement:

How to report

If you encounter any of the above on FRE, please report it immediately. We treat CSAE reports as the highest priority and review them ahead of other moderation queues.

  1. From inside the app: open the offending profile or chat, tap the menu icon, then tap Report. Choose the reason that best matches (we have a category for minor / sexual content involving a child).
  2. By email: write to the safety team using the contact below. Include screenshots and the offender's username or profile URL if you have them. Do not include or attach the offending material itself.

How we respond

Once we receive a CSAE report or our automated systems detect potential CSAE content:

Automated detection

All photos uploaded to FRE pass through automated screening before they become visible to other users. The screening checks for nudity and other policy-violating content. Photos that fail screening are blocked from publication and surfaced for human review. We continue to invest in detection technology and partner with industry initiatives focused on combating CSAE online.
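The publish-or-review decision described above can be sketched as a check over classifier output. The label names and threshold below are illustrative assumptions; a production system would map its classifier's taxonomy (for example, Amazon Rekognition moderation labels) onto its own policy categories:

```python
# Assumed policy categories and confidence threshold -- not FRE's
# actual taxonomy or tuning.
BLOCK_CATEGORIES = {"Explicit Nudity", "Suggestive"}
REVIEW_THRESHOLD = 50.0  # confidence (0-100) above which a category fires

def screen_photo(labels: list) -> str:
    """Return 'publish' if no policy category fires, otherwise
    'blocked_pending_review': the photo stays hidden from other
    users until a human moderator decides."""
    for label in labels:
        if (label["Name"] in BLOCK_CATEGORIES
                and label["Confidence"] >= REVIEW_THRESHOLD):
            return "blocked_pending_review"
    return "publish"
```

The key property is the ordering: screening happens before publication, so a flagged photo is never visible to other users while it waits for human review.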

Staff training

Our moderation team is trained to recognise CSAE indicators, including grooming patterns and coded language. Suspected CSAE cases are escalated to a designated safety lead for review and reporting.

Compliance

This policy is designed to meet or exceed the standards in:

Contact

For CSAE reports, urgent safety concerns, or law enforcement inquiries:

Child safety reports

If a child is in immediate danger, contact local emergency services first, then report to NCMEC at report.cybertip.org.