May 17, 2018
Facebook has never really had an issue with advertisers. While mainstream media highlighted how some have opted to pull spend, by and large Facebook remains a juggernaut because of the truckloads of money marketers pump into the business.
Likewise, Facebook has never really had an issue with members. While a handful have decided to close their accounts, they represent a minuscule percentage of the user base in the grand scheme of things.
What Facebook does have a serious problem with is the perception that its behaviour has become reckless, and that it has not taken on responsibility commensurate with the size of its user base. This has brought the ire of a number of national governments to its doorstep. It is a serious threat, and one the company needs to address before we, as marketers, decide that Facebook’s long-term future is too unstable to justify large advertising budgets.
To this end, Facebook has released further transparency details showing exactly how much content it removes. Peruse the 81-page Community Standards Enforcement Report and you will likely conclude that, between the automated systems and human reviewers, the company is doing a fairly good job of clearing the torrent of pornography, terrorist propaganda, fake accounts, spam, and graphic violence from its online estate.
To give one example, in the category of adult nudity and sexual content, Facebook removed 21 million pieces of offensive material, acting before users reported it 95.8% of the time.
All of this is reassuring, and that is the intention. Facebook has a lot it needs to do; that is what comes with the power of the community it has built. But importantly, the social network appears to be rolling up its sleeves to put the minds of advertisers, users, governments and your clients at ease.