Facebook temporarily suspended one of its users, Rima Regas, from commenting on public posts. The suspension was supposed to last a week; it has now been two weeks. The social network put her on time out after apparently receiving complaints about public posts she had made, and the same thing could happen to any Facebook user. Unfortunately, this does not appear to be an isolated incident.
Last weekend, technical evangelist Robert Scoble saw his comment blocked because Facebook deemed it “irrelevant or inappropriate”. When I inquired about the issue, Facebook told me the block was a false positive caused by an automatic spam filter. At the time, I also wrote this:
Facebook’s algorithms for comments made on Subscriber posts are apparently much pickier because anyone can reply to a public Facebook post. To be honest, I only find that slightly more comforting.
In her original Facebook status, Regas said “I don’t see a way to file a complaint or defend myself.” She isn’t the first to realize that getting in touch with Facebook is very difficult. In some two years of writing about Facebook, I have received hundreds of complaints about the company’s communication problems, from members and journalists alike. Regas’ story is just one of many I have written about publicly in order to get Facebook to respond.
Again, my problem with all this is not that Facebook’s reporting systems screwed up or were abused. That is bound to happen with any anti-spam implementation.
The worrying trend here is that Facebook continues to add features like this one without giving users an option to fight back.
#OccupyWallStreet demonstrates that the free flow of information can be disrupted in many ways: intentionally, accidentally, or unconsciously but automatically.
Events that could occur:
- Spam algorithms blocking e-mails containing certain references;
- Video distribution algorithms arguing that spam, copyright or other policies have been violated and removing content;
- Video distribution algorithms turning off the option to have third parties embed videos on their sites;
- Video distribution algorithms arguing that ‘the user’ has removed content or that his account has been terminated for some reason;
- Trending and ranking algorithms forgetting to trend and rank certain content;
- Content, websites and blogs accidentally being taken down “due to an automated process;”
- Network algorithms stating “The server encountered a temporary error and could not complete your request. Please try again in 30 seconds. That’s all we know;”
- Algorithms deciding that posted content should only be shown to the person who posted it;
- Algorithms accidentally polluting search indices with an abundance of irrelevant search results;
- Algorithms stating that there was a content delivery failure for whatever reason;
- Algorithms redirecting internet users from the content the user intended to visit to content that is probably much more to the user’s liking.
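None of the failure modes above require malice. To see how easily the first one arises, here is a minimal sketch of a hypothetical keyword-based spam filter; the blocklist and function are invented for illustration and bear no relation to Facebook’s actual, non-public systems. A filter this crude rejects any text containing a blocked phrase, regardless of context, which is exactly how legitimate speech becomes a false positive:

```python
# Hypothetical naive keyword filter -- NOT Facebook's real system.
# Any text containing a blocked phrase is rejected, context be damned.

BLOCKED_PHRASES = {"free money", "click here", "occupy"}  # assumed blocklist

def is_blocked(comment: str) -> bool:
    """Return True if the comment contains any blocked phrase."""
    text = comment.lower()
    return any(phrase in text for phrase in BLOCKED_PHRASES)

# Obvious spam is caught:
print(is_blocked("FREE MONEY!!! Click here now"))                  # True
# ...but so is a legitimate political comment (a false positive):
print(is_blocked("I marched with Occupy Wall Street yesterday"))   # True
# Harmless chatter passes:
print(is_blocked("Nice weather today"))                            # False
```

The point of the sketch is that the filter has no notion of intent: “occupy” on the blocklist silences both spammers and protesters, and without an appeals channel the person filtered out has no way to tell the system it was wrong.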
Seemingly mundane technical specifications of Internet routers and social-networking software platforms have powerful political implications. In virtual realms, programmers essentially set the laws of physics, or at least the rules of interaction, for their cyberspaces. If it sometimes seems that media pundits treat Facebook’s Mark Zuckerberg or Apple’s Steve Jobs as gods, that’s because in a sense they are—sitting on Mount Olympus with the power to hurl digital thunderbolts with a worldwide impact on people.
It’s The Algorithm Stupid! Part IV – Humanity becomes redundant
It’s The Algorithm Stupid! Part III
It’s the algorithm stupid! Part II
It’s the algorithm, stupid! Do algorithms offer the ultimate grounds for exoneration? Can they fail, or only the people writing them?