YouTube said Tuesday that it has removed over 100,000 flagged videos under its new hate speech policies.
The company says it also terminated more than 17,000 channels and removed over 500 million comments under the same rules.
In June, YouTube updated its hate speech policies to ban content that promotes extremist ideologies such as white supremacy, as well as certain conspiracy theories, like the false claim that the Sandy Hook Elementary School shooting never happened.
YouTube says it relies on a combination of people and technology to flag content for review. It uses what are called hashes, or digital fingerprints, to catch copies of known prohibited content before they're available to view.
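The idea behind hash matching can be sketched in a few lines. This is a simplified illustration using an exact cryptographic hash (SHA-256); YouTube has not published its implementation, and real systems typically use perceptual fingerprints that survive re-encoding and edits, which an exact hash does not.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a hex digest serving as the content's 'digital fingerprint'."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of fingerprints for known prohibited content.
known_prohibited = {fingerprint(b"known-bad-video-bytes")}

def is_known_copy(upload: bytes) -> bool:
    """Flag an upload whose fingerprint matches a known entry."""
    return fingerprint(upload) in known_prohibited

print(is_known_copy(b"known-bad-video-bytes"))   # exact copy -> True
print(is_known_copy(b"slightly edited bytes"))   # altered copy -> False
```

The second call shows the limitation of exact hashing: any change to the bytes yields a different fingerprint, which is why perceptual hashing is used in practice.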
YouTube said in a blog post: "We're investing significantly in these automated detection systems. ... An update to our spam detection systems in the second quarter of 2019 [led] to a more than 50% increase in the number of channels we terminated for violating our spam policies."
Across Google, YouTube's parent company, over 10,000 people have been tasked with detecting, reviewing and removing content that violates the site's guidelines. YouTube says it will release more information in the coming months on how it plans to promote a positive environment on the platform.