The video streaming giant has changed its policy so that moderators - who are employed to find and remove unsuitable and violent content - will work no more than four hours a day, in order to protect their mental health from what they might see.
Chief executive Susan Wojcicki told the South by Southwest conference in Austin, Texas, this week that the company would be hiring an extra 10,000 people to join the moderating team to help combat the issue, as well as making sure workers receive "wellness benefits".
She added: "This is a real issue, and I myself have spent a lot of time looking at this content over the past year. It is really hard."
The news comes after reports that many moderators hired by social media companies such as YouTube and Facebook leave their jobs within a year of starting because of the psychological toll the work takes on them.
In December last year, a woman hired by Facebook claimed she was tasked with reviewing as many as 8,000 posts a day with little training on how to handle the distress caused by the content, which can include images and videos of horrifying topics such as child sexual abuse, violence to animals, murder, and suicide.
Professor Neil Greenberg from the Royal College of Psychiatrists has said that people taking on jobs as moderators can become traumatised by what they see and can, in some cases, develop post-traumatic stress disorder (PTSD).
He told BBC News: "This is a positive first move from YouTube - a recognition that this material can be toxic.
"Supervisors need mental health awareness and peers should be trained to look for problems. Those developing issues need early intervention and treatments such as trauma-focused cognitive therapy."