YouTube, the world's second-largest search engine, has long taken a stand against hate speech. Its hate speech policy for creators clearly states that it will remove content promoting violence against an individual or a group based on attributes such as age, caste, nationality, ethnicity, race, religion, or sex.
But it is important to note that creators are not the only source of hate speech. Some users' comments can also spread hatred across the community.
As a solution, YouTube has introduced a feature in which the system identifies potentially inappropriate comments. When it detects such a comment, an alert pops up asking the user to keep the comment respectful. The comment is not deleted when the alert appears; the user can choose to post it as written or edit it first.
Instagram introduced a similar feature not long ago. It followed the same procedure, reminding the user to "keep comments respectful" via a pop-up.
The feature is now rolling out on Android, and YouTube says it will be available across all platforms soon.
Alongside this feature, YouTube is also testing one that filters out hurtful negative comments so that YouTubers do not have to see them. These initiatives make it clear that YouTube intends to create a non-toxic atmosphere in which creators are treated with respect.
At the same time, YouTube has received complaints from creators around the world alleging that its monetization process is biased. According to them, the company does not treat all creators equally. YouTube acknowledged these concerns and said that the data it currently holds on creators' identities is not sufficient to resolve the issue. By "identity," it means attributes such as a creator's race, gender, and ethnicity.
YouTube addressed the issue in a blog post, writing that it will ask creators for this key information and then properly investigate whether the discrimination they describe has occurred. It added that it will run a complete check on its monetization systems to verify whether the issue actually exists. The blog also says the company is working on methods to identify patterns of hate and discrimination that target specific communities.
It is clear that YouTube has acknowledged some major ongoing issues with its platform, but it could not give an exact timeline for fixing them.