YouTube said machine learning was helping its human moderators remove almost five times as many videos as they had previously, and that 98% of videos removed for violent extremism are now flagged by algorithms.
In a blog post, YouTube chief executive Susan Wojcicki said the company was already taking "aggressive action" on comments, and was testing new systems that combine human and automated checks to counter threats.
YouTube last week updated its recommendation feature to spotlight videos users are likely to find the most gratifying, brushing aside concerns that such an approach can trap people in bubbles of misinformation and like-minded opinions. Adidas has called the situation "completely unacceptable," while Mars, along with other companies, has pulled advertising until safeguards are in place. By training those algorithms to do the same for other types of videos, such as the questionable uploads that targeted children, the platform will be able to take them down far faster than it can now.
On December 4, Alphabet-owned YouTube announced it would expand its team of reviewers handling extremist and violent content in 2018. Last month, a number of high-profile brands suspended YouTube and Google advertising after reports revealed that their ads were placed alongside videos filled with exploitative and sexually explicit comments about children.
The Mountain View tech giant has been facing a revolt by advertisers over ads paired with disturbing videos, such as those made by hate groups and religious extremists. "Equally, we want to give creators confidence that their revenue won't be hurt by the actions of bad actors," Wojcicki wrote.
Discussing the machine-learning program, Wojcicki noted that human reviewers remain essential both to removing content and to training the system, because human judgment is critical to making "contextualized" decisions about content.
Wojcicki said it would have taken 180,000 people working 40-hour weeks to assess the same amount of content.
Wojcicki noted that the video site can be used for good but also has a dark side, allowing bad actors to spread harassment, harm and hate.
To combat this issue, the video-hosting platform intends to "apply stricter criteria and conduct more manual curation" while simultaneously expanding its team of human reviewers "to ensure ads are only running where they should".