
Not safe for work? Not a problem.

Marjorie Yoro

Patreon is an international platform that connects artists and creators with their biggest fans. Founded in 2013 by musician Jack Conte, the website provides visual artists, musicians, and other creators with business tools to run their own subscription content service. Creators can build relationships with their fans, offer subscribers exclusive perks, and receive funding directly from fans and patrons. The platform has over 100,000 monthly active creators and over 2 million monthly active patrons.


Patreon has had to navigate some choppy waters since its inception. It must vigilantly draw the line between funding art and hosting adult content, and between encouraging free discourse and removing speech some may consider offensive. As much as the site's openness allows original content to thrive, it could also become a megaphone for people with hateful views on race, gender, and sexuality.

The Challenge

Timely and efficient content moderation (a.k.a. kicking NSFW content to the curb)

As Patreon's community base continues to grow, enforcing its Community Guidelines across thousands of creators brings many challenges. Some creators post material that many consider borderline pornographic, racist, or sexist. Patreon recently decided to flag this content and remove the offending accounts from the site. With over 100,000 creator accounts, the Patreon team decided to leverage a mix of technology and human intervention for the audit process.

Patreon's Trust & Safety Team partnered with Boldr to thoroughly audit more than 7,000 creator profiles within a single week, an unprecedented pace.

The Approach

Content review, one by one

Patreon needed a reliable partner who could audit large numbers of flagged profiles with discernment and sensitivity. Download the PDF to read more.
