Introduction
Apple’s App Store Review Guideline 1.2 (Safety: User-Generated Content) requires apps that host user-generated content to implement moderation and abuse-prevention measures. If your app was rejected with a notice like:
Missing required UGC precautions: no custom EULA, no content filtering, no user-blocking, no rapid response to reports
This guide explains how to add the necessary safeguards so your app can be approved without submitting a new build.
Implement Required Precautions
To address this issue, update your app to include the following safeguards; a minimal sketch of each follows the list:
- Enforce a clear EULA: Require users to agree to terms that explicitly prohibit objectionable content and abusive behavior.
- Filter inappropriate content: Implement automated or manual mechanisms to detect and hold objectionable posts for review.
- Enable user blocking: Provide a way for users to block accounts that engage in abusive or harassing behavior.
- Respond to reports within 24 hours: Remove reported content and eject the user who posted it within that 24-hour window.
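For the EULA requirement, the sketch below shows a minimal acceptance gate. The `TermsGate` type, the `hasAcceptedTerms` key, and `canUserPost()` are illustrative assumptions, not a standard API; in a WordPress-backed app you can also record the agreement against the user's account on the server.

```swift
import Foundation

/// Illustrative gate that blocks posting until the user accepts the EULA.
/// The storage key and method names are assumptions, not a standard API.
struct TermsGate {
    private static let acceptedKey = "hasAcceptedTerms"

    /// True once the user has agreed to the terms on this device.
    static var hasAccepted: Bool {
        UserDefaults.standard.bool(forKey: acceptedKey)
    }

    /// Record acceptance; call this from the EULA screen's "Agree" action.
    static func recordAcceptance() {
        UserDefaults.standard.set(true, forKey: acceptedKey)
    }
}

/// Call before showing any compose or post UI.
func canUserPost() -> Bool {
    guard TermsGate.hasAccepted else {
        // Present the EULA screen here and require explicit agreement
        // to terms that prohibit objectionable content and abuse.
        return false
    }
    return true
}
```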
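For content filtering, here is a minimal client-side pre-check that holds a post for review before upload. The word list and the `PostStatus` type are placeholders; in practice you would pair this with server-side scanning (see the FAQ below).

```swift
import Foundation

/// Moderation state attached to a new post. Names are illustrative.
enum PostStatus {
    case published
    case pendingReview   // held until a moderator approves it
}

/// A deliberately tiny blocklist; a real deployment would rely on a
/// maintained moderation service rather than a hard-coded list.
let blockedTerms = ["slur1", "slur2", "threat"]

/// Returns the status a new post should be submitted with.
func prefilter(_ text: String) -> PostStatus {
    let lowered = text.lowercased()
    let flagged = blockedTerms.contains { lowered.contains($0) }
    return flagged ? .pendingReview : .published
}

// Example: a flagged post is sent to the backend as "pending review"
// instead of going live immediately.
let status = prefilter("Some user-generated text")
```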
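For reporting and blocking, a sketch of the client-side calls is below. The `/wp-json/ugc/v1/...` routes and payloads are hypothetical placeholders for whatever your backend exposes; the 24-hour action on reports happens in your moderation workflow on the server, not in this code.

```swift
import Foundation

/// Client for hypothetical moderation routes on the app's backend.
/// The route names and payloads are assumptions for illustration.
struct ModerationClient {
    let baseURL = URL(string: "https://example.com/wp-json/ugc/v1")!
    let session = URLSession.shared

    /// Report a piece of content. The backend must remove the content
    /// and eject the offending user within 24 hours of such reports.
    func report(contentID: Int, reason: String) async throws {
        try await post(path: "reports",
                       body: ["content_id": contentID, "reason": reason])
    }

    /// Block another user so their content no longer appears for the reporter.
    func block(userID: Int) async throws {
        try await post(path: "blocks", body: ["blocked_user_id": userID])
    }

    private func post(path: String, body: [String: Any]) async throws {
        var request = URLRequest(url: baseURL.appendingPathComponent(path))
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        request.httpBody = try JSONSerialization.data(withJSONObject: body)
        let (_, response) = try await session.data(for: request)
        guard let http = response as? HTTPURLResponse,
              (200..<300).contains(http.statusCode) else {
            throw URLError(.badServerResponse)
        }
    }
}
```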
Respond in App Store Connect
In the Resolution Center, reply with a note such as:
“We’ve added a custom EULA, enabled automated content filtering, implemented user-blocking, configured flagged content to be held for moderation after a single report, and committed to a 24-hour response policy.”
Do I Need to Generate a New Build?
No. These changes are metadata and backend configuration, so no new binary is required.
Troubleshooting and FAQs
Q: What if I don’t have a built-in filtering tool?
A: Use a third-party moderation plugin or API (e.g., Akismet, CleanSpeak) that integrates with WordPress to scan and flag content.
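As an illustration of the request/response shape such services use, the sketch below calls Akismet's comment-check endpoint from Swift. In a WordPress setup this check normally runs server-side inside the plugin, and the API key should never ship in the app; the key, site URL, and field values here are placeholders.

```swift
import Foundation

/// Illustrative call to Akismet's comment-check endpoint. In production
/// this belongs on your server; the key and values below are placeholders.
func isSpam(_ content: String, userIP: String) async throws -> Bool {
    let apiKey = "YOUR_AKISMET_KEY"   // placeholder
    let url = URL(string: "https://\(apiKey).rest.akismet.com/1.1/comment-check")!

    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/x-www-form-urlencoded",
                     forHTTPHeaderField: "Content-Type")

    // Form-encoded fields Akismet expects; blog and user_ip are required.
    let fields = [
        "blog": "https://example.com",
        "user_ip": userIP,
        "comment_type": "comment",
        "comment_content": content
    ]
    request.httpBody = fields
        .map { "\($0.key)=\($0.value.addingPercentEncoding(withAllowedCharacters: .urlQueryAllowed) ?? "")" }
        .joined(separator: "&")
        .data(using: .utf8)

    let (data, _) = try await URLSession.shared.data(for: request)
    // Akismet replies with the literal string "true" (spam) or "false".
    return String(data: data, encoding: .utf8) == "true"
}
```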
Q: Will blocking a user in the app sync with my website?
A: Yes. User-blocking should be managed in your central WordPress database so it applies across both the web and app interfaces.
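As a sketch of that sync, the app can fetch the current user's block list from the central backend and filter its feed against it, so a block made on the web hides the same author in the app. The `blocks` route and the `Post` shape are hypothetical.

```swift
import Foundation

/// Minimal shape for illustration; your real model will differ.
struct Post: Decodable {
    let id: Int
    let authorID: Int
    let body: String
}

/// Fetch the IDs the current user has blocked from a hypothetical
/// central route, then hide those authors' posts in the app's feed.
func visiblePosts(from feed: [Post]) async throws -> [Post] {
    let url = URL(string: "https://example.com/wp-json/ugc/v1/blocks")!
    let (data, _) = try await URLSession.shared.data(from: url)
    let blockedIDs = try JSONDecoder().decode([Int].self, from: data)
    return feed.filter { !blockedIDs.contains($0.authorID) }
}
```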