Content Moderation & Protection Policy

Updated: 14th April 2025

This Content Moderation & Protection Policy outlines Fanvue’s procedures for moderating user-generated content, verifying user identity, and protecting platform content. Its purpose is to ensure the safety, legality, and security of the Platform for all Users.

This Policy supplements the General Terms & Conditions and other applicable Fanvue Policies. If there is any conflict between this Policy and the General Terms, the terms of this Policy shall prevail for the specific subject matter it governs.


DEFINITIONS

“Creator” means a Creative User authorised to upload and monetise Content under the Creator Terms of Service.

“KYC” (Know Your Customer) means the process of verifying a user’s identity and age through an approved third-party provider.


1. USER SAFETY AND CONTENT SCREENING

  1. All Users must confirm that they are 18 or older when they first access the Platform. Any User who indicates they are under 18 is immediately redirected away from the Platform and denied further access.
  2. If Fanvue discovers that a User has misrepresented their age and is under 18, their account will be removed and their email address blacklisted.
  3. Fanvue uses automated tools to scan all uploaded User Content (e.g. profile photos, banners, and intro videos) both pre-publication and post-publication for nudity and other sensitive material that does not comply with the Acceptable Use Policy (see the illustrative sketch at the end of this section).
  4. Flagged Content is automatically blurred and is only shown after the User confirms via the “Display sensitive content” prompt.
  5. Users may report any Content they believe violates the Acceptable Use Policy or any other Policy. All reports are reviewed by Fanvue moderators under the procedures outlined in the Complaints Policy.
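
By way of illustration, the sketch below shows how an automated screen of the kind described in this section might decide whether an upload is published, blurred behind a confirmation, or held for review. It is a minimal sketch: the scanMedia client, its labels, and the routing rules are hypothetical and do not represent Fanvue’s or any vendor’s actual API.

```typescript
// Illustrative sketch only: the classifier client, labels, and routing rules
// are hypothetical and do not reflect Fanvue's or any vendor's actual API.
type ScanLabel = "nudity" | "sensitive" | "clean";

interface ScanResult {
  label: ScanLabel;
  confidence: number; // 0..1
}

// Hypothetical wrapper around an automated screening service.
async function scanMedia(mediaUrl: string): Promise<ScanResult> {
  // A real implementation would call the moderation vendor's API here.
  return { label: "clean", confidence: 0.99 };
}

// Decide how an upload is presented: published normally, blurred behind a
// "Display sensitive content" confirmation, or held for moderator review.
async function screenUpload(mediaUrl: string): Promise<"publish" | "blur" | "hold"> {
  const result = await scanMedia(mediaUrl);
  if (result.label === "clean") return "publish";
  if (result.label === "sensitive") return "blur";
  return "hold"; // non-compliant material awaits human review
}
```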

2. KYC AND CREATOR IDENTITY VERIFICATION

  1. Any User applying to become a Creator must undergo a full KYC verification through Fanvue’s approved provider (Ondato).
  2. The KYC process includes the following (see the illustrative sketch at the end of this section):
    1. Submission of valid identification documents;
    2. A liveness check involving real-time video or photo capture;
    3. Confirmation that the User is at least 18 years old.
  3. Creators must truthfully declare whether they intend to upload explicit content. This declaration is locked and cannot be changed by the User once submitted.
  4. Manual reviews are conducted on:
    1. All new Creators before their first withdrawal;
    2. Any Creator earning more than $100 in a 24-hour period;
    3. Any Creator whose behaviour patterns are flagged as suspicious by platform tools or community reports.
  5. Fanvue retains ID documentation securely via Ondato and links it to internal moderation systems for ongoing compliance and verification.
  6. If evidence arises that a Creator is impersonating another individual or does not have the rights to their uploaded Content, Fanvue may immediately disable or delete the relevant Content or account. For impersonation-related takedowns, see our DMCA Policy.
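
As an illustration of the checks in this section, the sketch below models a KYC approval and the manual-review triggers in simplified form. The field names and helper functions (kycApproved, requiresManualReview) are hypothetical; actual verification is carried out through Ondato.

```typescript
// Illustrative sketch only: field names, helper functions, and the handling of
// the $100 threshold are simplified; real verification runs through Ondato.
interface KycApplication {
  idDocumentSubmitted: boolean;
  livenessCheckPassed: boolean;
  dateOfBirth: Date;
}

// True when the applicant is at least 18 years old today.
function isAtLeast18(dateOfBirth: Date, now: Date = new Date()): boolean {
  const cutoff = new Date(now.getFullYear() - 18, now.getMonth(), now.getDate());
  return dateOfBirth.getTime() <= cutoff.getTime();
}

// A Creator application passes KYC only when all three checks succeed.
function kycApproved(app: KycApplication): boolean {
  return app.idDocumentSubmitted && app.livenessCheckPassed && isAtLeast18(app.dateOfBirth);
}

// Manual-review triggers listed above: first withdrawal, more than $100 earned
// in a 24-hour period, or suspicious behaviour flagged by tools or reports.
function requiresManualReview(opts: {
  isFirstWithdrawal: boolean;
  earningsLast24hUsd: number;
  flaggedAsSuspicious: boolean;
}): boolean {
  return opts.isFirstWithdrawal || opts.earningsLast24hUsd > 100 || opts.flaggedAsSuspicious;
}
```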

3. CONTENT MODERATION PROCESS

  1. All uploaded Content is automatically analysed through HIVE (an industry-standard, AI-based content moderation tool), which flags:
    1. Nudity or explicit acts;
    2. Weapons, drugs, or illegal activity;
    3. Violence or abuse;
    4. Age classification based on visible facial and body features;
    5. Hand signs, written language (digital or handwritten), and people counting;
    6. AI-generated Content.
  2. Content flagged by HIVE is queued for human moderator review (see the illustrative sketch at the end of this section). Moderators may:
    1. Delete the Content;
    2. Escalate for further investigation;
    3. Contact the Creator for clarification or documentation.
  3. Moderation is also supported by:
    1. Random spot checks (to ensure impartiality);
    2. Instructional checks based on system-defined behavioural patterns or risk thresholds.
  4. All moderators undergo regular training in legal compliance, safety protocols, and platform policies, and are subject to audit review for accuracy.
  5. Staff are trained to identify and remove Content that violates Fanvue’s standards and/or policies, and are subject to internal performance review procedures.
  6. Where Content is found to breach this Policy, Fanvue may apply a proportional deduction to the Creative User’s Balance, in accordance with the Acceptable Use Policy (Section 7.1.2).
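
The sketch below illustrates, in simplified form, how flagged Content might be routed into the human-review queue. The flag names and the escalation rule are hypothetical and are not HIVE’s actual output categories or Fanvue’s internal routing logic.

```typescript
// Illustrative sketch only: the flag names and routing rules are hypothetical
// simplifications of the categories listed above, not HIVE's actual output.
type ModerationFlag =
  | "nudity_or_explicit"
  | "weapons_drugs_illegal"
  | "violence_or_abuse"
  | "underage_suspected"
  | "signs_or_text"
  | "ai_generated";

interface FlaggedItem {
  contentId: string;
  flags: ModerationFlag[];
}

// Route flagged Content into the human-review queue, escalating the most
// serious categories ahead of routine checks.
function reviewPriority(item: FlaggedItem): "escalate" | "standard" {
  const escalated: ModerationFlag[] = ["underage_suspected", "violence_or_abuse"];
  return item.flags.some((flag) => escalated.includes(flag)) ? "escalate" : "standard";
}
```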

4. APPEALS AND ACCOUNT REVIEW

  1. Users whose Content or Accounts have been restricted, suspended, or permanently removed as a result of moderation enforcement may request a formal appeal.
  2. Appeals must be submitted within 7 calendar days of the enforcement action, via support@fanvue.com or through the live support feature.
  3. Fanvue will consider all available evidence, including new context or clarification provided by the User.
  4. While most appeal decisions are issued within 7 days, complex cases may take longer.
  5. This appeal process operates in conjunction with, and does not replace, your rights under the Complaints Policy.
  6. Moderation SLAs:
    1. Fanvue aims to review all content reports and moderation flags within 48 hours of submission. Reports are prioritised based on severity and potential risk to user safety (an illustrative sketch of these targets appears after this section).
    2. In urgent cases (e.g. CSAM, impersonation, serious legal risk), moderation response may occur within hours.
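
As a rough illustration of these targets, the sketch below computes a review deadline from a report’s severity. The 4-hour figure for urgent cases is an assumption for the example only; the Policy commits to a response within hours rather than to a specific figure.

```typescript
// Illustrative sketch only: the 4-hour figure for urgent cases is an assumed
// value; the Policy commits to a response "within hours" for such reports.
type ReportSeverity = "urgent" | "standard";

const SLA_HOURS: Record<ReportSeverity, number> = {
  urgent: 4,    // e.g. CSAM, impersonation, serious legal risk (assumed figure)
  standard: 48, // general content reports and moderation flags
};

// Deadline by which a report should be reviewed, measured from submission.
function reviewDeadline(submittedAt: Date, severity: ReportSeverity): Date {
  return new Date(submittedAt.getTime() + SLA_HOURS[severity] * 60 * 60 * 1000);
}
```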

5. CONTENT PROTECTION MEASURES

  1. Fanvue has implemented multiple features to detect and deter unauthorised use, reproduction, or scraping of Creator Content, including the following (an illustrative sketch of one such measure appears after this section):
    1. Watermarking;
    2. Right-click prevention;
    3. Screenshot detection (on supported devices);
    4. Session analytics and account locking for suspicious behaviour.
  2. Any attempt to access, scrape, download, or copy Content in a way that violates the Acceptable Use Policy will result in immediate investigation and possible account termination under General Terms – Section 16 (Termination).
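
The sketch below illustrates one client-side deterrent of the kind listed above, right-click prevention, using standard browser APIs. The selector name is hypothetical, and a measure like this is only one layer alongside the other protections described in this section.

```typescript
// Illustrative sketch only: a minimal client-side deterrent; the selector is
// hypothetical and this layer works alongside server-side protections.
function disableContextMenuOn(selector: string): void {
  document.querySelectorAll<HTMLElement>(selector).forEach((element) => {
    element.addEventListener("contextmenu", (event) => {
      event.preventDefault(); // blocks the browser's "Save image as…" menu
    });
  });
}

// Example: apply to all protected media containers on the page.
disableContextMenuOn(".protected-media");
```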

6. LAW ENFORCEMENT AND INDUSTRY PARTNERSHIPS

  1. Fanvue partners with leading content protection and anti-piracy vendors to monitor the internet for unauthorised distribution of User Content.
  2. Fanvue cooperates with law enforcement in all jurisdictions in response to reports of:
    1. Child sexual abuse material (CSAM);
    2. Human trafficking;
    3. Content involving non-consensual acts;
    4. Other criminal matters involving uploaded Content or communications on the Platform.
  3. Fanvue may share user information in accordance with the Privacy Policy and in line with applicable legal obligations.

Need Help?

If you have questions about any of our policies or need assistance, contact us at support@fanvue.com or use the live support feature on the Platform.

Fanvue reserves the right to amend this Policy from time to time. The latest version will always be accessible at legal.fanvue.com, and significant changes will be communicated where required.