YouTube Restricted Mode - Good Intentions, Bad Execution

It’s no secret that YouTube is fighting an uphill battle when it comes to moderating their content.

Between the dark humour of some of their largest creators and the mass boycott of YouTube advertising by major brands that found their ads appearing beside extremist content and hate-speech videos, YouTube have been making headlines a lot recently, and rarely for good reasons.

In an effort to save face, YouTube introduced Restricted Mode: an optional setting that promises to automatically filter adult content out of users' feeds. Good idea, right? Nothing could possibly go wrong.

Users soon discovered that YouTube's new filter was blocking swathes of LGBT-related content, regardless of how family-friendly the videos actually were: from same-sex kisses and coming-out videos to a lesbian couple's wedding vows. YouTube themselves were quick to reply, stating "The intention of Restricted Mode is to filter out mature content for the tiny subset of our users who want a more limited experience. LGBTQ+ videos are available in Restricted Mode, but videos that discuss more sensitive issues may not be."

It was quickly noted, however, that this wasn't entirely the case, with same-sex weddings and even music videos being blocked with little to no explanation as to why.

So what went wrong?

YouTube say that Restricted Mode uses several signals to decide what to class as mature content, including the video's title, description and metadata, along with age restrictions and community guideline reviews. Whilst it's possible that the algorithms were intentionally built to filter out LGBT content by a team with a very restricted view of what is and isn't socially acceptable, it's far more likely that, rather than prejudice in the design, YouTube's systems learned some misguided lessons from the community itself. What is deemed offensive varies hugely by age, race and social group, and simply from person to person. If enough people flag something as offensive, the chances are the algorithm will just take their word for it.
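To make that concrete, here's a minimal Python sketch of how a naive, flag-driven filter can over-block. This is purely illustrative and assumes nothing about YouTube's actual implementation; the keyword list, threshold and function name are all hypothetical.

```python
# A minimal sketch (not YouTube's actual system) of a naive
# "take people's word for it" filter: it combines keyword hits
# in the title/description with raw community flag counts, with
# no notion of who flagged the video or why.

SENSITIVE_KEYWORDS = {"mature", "explicit", "nsfw"}  # hypothetical seed list
FLAG_THRESHOLD = 50  # hypothetical cut-off


def is_restricted(title: str, description: str, flag_count: int) -> bool:
    """Return True if the video should be hidden in Restricted Mode."""
    text = f"{title} {description}".lower()
    keyword_hit = any(kw in text for kw in SENSITIVE_KEYWORDS)
    # Raw flag counts are trusted at face value: a coordinated
    # flagging campaign looks identical to genuinely mature content.
    mass_flagged = flag_count >= FLAG_THRESHOLD
    return keyword_hit or mass_flagged


# A family-friendly wedding video that has been mass-flagged is
# blocked just as readily as genuinely adult content.
print(is_restricted("Our wedding vows", "Two brides say 'I do'", 120))  # True
```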

YouTube has had a long-running problem with community moderation; let's not forget the backlash to the YouTube Heroes programme, which proposed crowdsourcing flagging and moderation to volunteers. Any algorithm designed to learn from community feedback needs solid guidelines and safeguards built in beforehand to deal with the inevitable trolls and false flags. Remember, the internet turned an innocent AI (Microsoft's Tay chatbot) into a Hitler-loving sexbot in under a day. Any algorithm that deals with content moderation needs a way of understanding social context, especially around potentially controversial issues.
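For illustration, here is a hedged sketch of one such safeguard: weighting each flag by the flagger's track record and routing borderline cases to a human reviewer. The data structures, thresholds and names are all hypothetical, not a description of any real system.

```python
# A sketch of flag-weighting as a safeguard against brigading:
# instead of trusting raw flag counts, weight each flag by the
# flagger's historical accuracy and queue borderline cases for
# human review. All names and thresholds here are hypothetical.

from dataclasses import dataclass


@dataclass
class Flag:
    user_id: str
    accuracy: float  # fraction of this user's past flags upheld on review


AUTO_RESTRICT = 40.0  # weighted score that restricts automatically
NEEDS_REVIEW = 10.0   # weighted score that queues human review


def moderate(flags: list[Flag]) -> str:
    # A flag from a user whose reports are rarely upheld counts for
    # almost nothing, blunting trolls and false-flag campaigns.
    score = sum(f.accuracy for f in flags)
    if score >= AUTO_RESTRICT:
        return "restrict"
    if score >= NEEDS_REVIEW:
        return "human_review"
    return "allow"


# 100 flags from habitual false-flaggers carry less weight than
# a handful of flags from reliable reporters.
brigade = [Flag(f"troll{i}", accuracy=0.05) for i in range(100)]
print(moderate(brigade))  # "allow" -- weighted score is only 5.0
```

The design choice here is the key point: the system still learns from the community, but it no longer treats every voice as equally trustworthy.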

YouTube’s response was surprisingly straightforward, stating “Over the last several months, and most definitely over the last few days from LGBTQ and other communities, we’ve gotten lots of questions around what Restricted Mode is and how it works. We understand that this has been confusing and upsetting, and many of you have raised concerns about Restricted Mode and your content being unfairly impacted. The bottom line is that this feature isn’t working the way it should. We’re sorry and we’re going to fix it”. They admitted that the system may never be completely perfect, due to the difficulty in getting automated algorithms to understand the nuances and context of topics, but pledged to continue training and improving their systems to make sure this sort of thing doesn’t happen again.

This reassurance from YouTube seemed to allay many people's concerns, with the community at large thankful for the honesty and for the assurance that the problem would be fixed swiftly.

Has your channel been affected by these changes? Let us know in the comments below.
