Content Moderation
What do we moderate at Iconics Group?
Our multilingual moderators help your brand moderate prohibited content, including:
- Child Sexual Abuse Material (CSAM)
- Graphic violence, drugs, or weapons
- Hate speech, insults, abuse, bullying, or harassment
- Misogyny
- Nudity
- Scams or fraud
- Self-harm
- Sex solicitation
- Terrorism or radicalization
- Racism
- Politics
- and more …
Our approaches to content moderation
Proactive moderation:
Also known as pre-moderation: a person or tool reviews UGC and approves, rejects, or edits it before it goes live.
Post-moderation:
In this approach, a person or tool reviews UGC after publication. This means that UGC goes live prior to moderation.
Reactive moderation:
In the reactive moderation approach, other users can flag inappropriate content for the moderator(s) to delete.
Real-time automated moderation:
Automated moderation means that all UGC submitted to an online platform is automatically accepted, rejected, or sent to human moderation, according to the platform’s specific guidelines and rules.
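To illustrate how such rule-based routing can work, here is a minimal sketch of an automated moderation decision step. The category names, confidence threshold, and classifier interface are hypothetical placeholders, not Iconics Group's actual tooling or guidelines; a real platform would plug in its own model and rules.

```python
# Illustrative sketch only: a minimal rule-based router for real-time
# automated moderation. Categories, threshold, and classifier are hypothetical.
from dataclasses import dataclass

@dataclass
class ModerationResult:
    decision: str   # "accept", "reject", or "human_review"
    reason: str

# Hypothetical platform rules: categories that are always rejected, and a
# confidence level below which a human moderator must decide.
ALWAYS_REJECT = {"csam", "terrorism", "sex_solicitation"}
AUTO_THRESHOLD = 0.90

def moderate(text: str, classify) -> ModerationResult:
    """Route user-generated content according to the platform's rules.

    `classify` is any callable returning (category, confidence); a real
    system would supply its own model or keyword rules here.
    """
    category, confidence = classify(text)

    if category in ALWAYS_REJECT:
        return ModerationResult("reject", f"prohibited category: {category}")
    if category == "clean" and confidence >= AUTO_THRESHOLD:
        return ModerationResult("accept", "no prohibited content detected")
    # Anything uncertain is escalated to a human moderator.
    return ModerationResult("human_review", f"low-confidence label: {category}")

# Example usage with a stand-in classifier:
if __name__ == "__main__":
    def dummy_classifier(text):
        return ("clean", 0.97) if "hello" in text.lower() else ("scam", 0.55)

    print(moderate("Hello, community!", dummy_classifier))
    print(moderate("Win a free prize, click here", dummy_classifier))
```

The key design point is the three-way outcome: clear violations are rejected automatically, clearly clean content is accepted, and anything ambiguous is sent to human moderators, matching the accept/reject/escalate flow described above.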