Last updated: April 27, 2026
Audio Status Maker prohibits any content that constitutes child sexual abuse material (CSAM), including AI-generated or synthetic CSAM, or that exploits, endangers, or facilitates the grooming of children. Users may not create, upload, store, or share such content using Audio Status Maker.
Audio Status Maker is a content creation tool. All user-generated content — including recordings and audio files — is processed and stored on-device. We do not upload, host, or have access to user-created media on our servers. We do not enable in-app communication between users. When users share content, they do so via their chosen third-party platform, which applies its own content moderation policies. Audio Status Maker is not responsible for content once it leaves the app, though we remain committed to cooperating with authorities if notified of misuse.
To report suspected child sexual abuse and exploitation (CSAE) involving Audio Status Maker, email audiostatusmaker@gmail.com with the subject line "CSAE Report." Include any relevant details, such as a description of the content and how Audio Status Maker was involved.
We review every CSAE report promptly. Where we become aware of CSAM, we voluntarily report our findings to the National Center for Missing & Exploited Children (NCMEC) and cooperate with law enforcement as required by applicable law. Because Audio Status Maker does not host user-generated content, it may not be a covered electronic service provider under 18 U.S.C. § 2258A; we are nonetheless committed to reporting voluntarily and in good faith. We reserve the right to use any available means to restrict access by users found to be violating these standards.
CSAE point of contact (Trust & Safety Lead): audiostatusmaker@gmail.com
For information about how we handle your data, please see our Privacy Policy.