Hate speech is a serious threat to social media accounts, and for business pages, handling it is critical. Several social media platforms have added tools to combat it. Facebook enabled warning screens for comments that fall under its bullying policy, Instagram added anti-bullying tools, and Twitter enabled offensive comment warnings. Today, YouTube is doing the same as it enables offensive comment warnings.
YouTube's offensive comment warning is a pop-up reminding users to reflect before posting a comment. The pop-up appears on comments that may be hurtful, and its message reads:
“If you’re not sure whether your comment is appropriate, review our Community Guidelines.
Did we make a mistake? Let us know.”
This gives commenters a chance to reconsider before posting a comment that may be hurtful to other users. Still, the commenter fully decides whether to delete, edit, or post the comment. Alongside offensive comment warnings, YouTube is testing new comment filters that will automatically hold potentially hurtful comments for review.
YouTube enabled offensive comment warnings on 03 December 2020.
Implications for Marketers:
Offensive comment warnings are a welcome measure to promote respect, not only on YouTube but across social media networks. The feature raises users' awareness of inclusivity and social inequality. For marketers, it can serve as protection for ads and brands. It may not have a direct impact on campaign lift, but it can help build a good reputation online.