Inlo's Child Safety Policy and Standards Against Child Sexual Abuse and Exploitation (CSAE)
Last Updated: July 24, 2025
1. Our Unwavering Commitment to Child Safety
At Inlo, we are dedicated to creating a safe and positive environment for all users of the Inlo apps. We have a zero-tolerance policy for Child Sexual Abuse and Exploitation (CSAE) and Child Sexual Abuse Material (CSAM). Our commitment extends to preventing, detecting, and responding vigorously to any content or activity related to CSAE on our platform. We adhere to applicable laws and industry best practices to ensure the safety of children.
2. Definitions
To ensure clarity, we define the following terms within this policy:
Child/Minor: Any individual under 18 years of age.
Child Sexual Abuse and Exploitation (CSAE): Refers to any content or behavior that sexually exploits, abuses, or endangers children. This includes, but is not limited to: grooming, child sex trafficking, sextortion of children, online enticement for sexual acts, and any form of sexual abuse or exploitation of a minor.
Child Sexual Abuse Material (CSAM): Any visual depiction, including but not limited to photos, videos, and computer-generated imagery, involving a minor engaging in sexually explicit conduct. The production and distribution of CSAM are illegal and will be treated as such.
3. Prohibited Content and Behavior
The following content and behaviors are strictly prohibited on the Inlo apps:
Sharing or Facilitating CSAM: Uploading, sharing, or attempting to share any form of CSAM.
Grooming: Any attempt to establish an inappropriate relationship with a child for exploitative purposes.
Solicitation: Soliciting sexual acts from a child or attempting to entice a child for sexual purposes.
Exploitative Content: Any content (text, images, video) that promotes, solicits, or facilitates CSAE.
Illegal Activities: Any other behavior that violates laws related to child protection.
False Reporting: Intentionally submitting false reports of CSAE.
4. Reporting Mechanisms
We encourage all users to report any suspected instances of CSAE or CSAM immediately. Your vigilance is crucial in maintaining a safe environment.
In-App Reporting: Users can report concerns directly within the Inlo apps using the designated reporting feature. This mechanism is designed to be accessible and straightforward.
Email Reporting: You may also send a detailed report to our dedicated child safety contact at: techatinlo@gmail.com
When submitting a report, please include as much detail as possible, such as usernames, descriptions of the content or behavior, and timestamps, to help us investigate efficiently.
5. Our Moderation and Enforcement Actions
Upon receiving a report or detecting potential CSAE/CSAM through our proactive measures, Inlo will take immediate and decisive action:
Rapid Review: All CSAE reports are prioritized and reviewed by trained personnel. We utilize available technology to aid in the detection of known CSAM.
Content Removal: Any identified CSAM or content that violates our CSAE policy will be removed immediately.
Account Termination: User accounts found to be involved in CSAE will be permanently terminated, and the individual will be banned from creating new accounts on Inlo.
Data Retention: We retain relevant data related to confirmed incidents of CSAE for the period required by law, to assist law enforcement in their investigations.
6. Cooperation with Law Enforcement and Authorities
Inlo is fully committed to cooperating with law enforcement and relevant child protection agencies.
Reporting to Authorities: We will promptly report all confirmed instances of CSAM and other CSAE activities to the National Center for Missing and Exploited Children (NCMEC) CyberTipline (for US-based incidents) and/or other relevant local and international law enforcement agencies.
NCMEC CyberTipline: https://www.missingkids.org/gethelpnow/cybertipline
Information Sharing: We will provide any requested user data and content to law enforcement in accordance with legal processes to aid in their investigations.
7. Proactive Measures
In addition to user reporting, Inlo implements proactive measures to detect and prevent CSAE:
Our internal teams receive training on identifying potential CSAE, understanding reporting protocols, and responding effectively to child safety concerns.
8. Child Safety Point of Contact
For any inquiries related to child safety or to report concerns, please contact:
Email: techatinlo@gmail.com
9. Policy Updates
This Child Safety Policy may be updated periodically to reflect changes in laws, best practices, or platform functionality. We encourage you to review this page regularly for any revisions.
By using the Inlo apps, you agree to comply with this Child Safety Policy and our [Terms and Conditions](https://www.theinlo.com/pages/childsafetystandards).