Unwanted Content

There are websites online dedicated to just about everything you can imagine.  That's a wonderful thing if you imagine wonderful things, but it's a horrendous thing if you imagine horrendous things and realise there are websites dedicated to those too.  Knowing this, there are places on the internet that neither I nor most of you will ever visit.  We understand that those sites exist and we don't like them; if we could, we would take them all down.  Most of us don't have the power to do that, and those who do have the power seemingly have no motivation to do anything about them.

Therein lies one of the growing concerns people have with the internet and the wealth of content that can be found on it - it encompasses all of human nature, and not all of human nature is kind and caring and bathed in rainbows and sunshine.  The presence of such darkness online is a reflection of the darkness in our society, and again, whilst we don't like it and want it to disappear, the vast majority of us have little power to actually tackle any of it - that's what law enforcement is meant to be for.

In both of these scenarios, the darkness as a whole is often not something that law enforcement actively seeks out, but rather something they respond to when it is reported.  The limitation of reactive policing is that it relies on someone actually seeing the content as bad and reporting it before law enforcement becomes aware of it.  We can leave the debate about the effectiveness of enforcement for another time, as that is a real problem both online and offline and an entire issue unto itself.  No, the major problem beyond effectiveness is that prerequisite of seeing the content as bad enough to report.  The bulk of this content is such that you have to go out of your way to find it; you won't stumble upon it by accident.  And if you went out of your way to find it, then you probably have an interest in it or a motivation to see it.  That makes you much less likely to report it, for the simple reason that your seeking it out is in itself a crime in most cases.

We then come to the conclusion that if you want to tackle this problem you really need a proactive policing policy, which requires law enforcement to hire people who will actively seek this content out.  The trouble with that option is a simple question - who would want to?

I spent three years as a moderator on a popular online forum.  In that time I had to read thousands of posts and review content uploaded by users, and throughout my time in that post one thing was apparent - it was depressing.  Not because of the monotony, but because people are vile.  I was lucky enough never to have observed anything that was in itself illegal, just things that broke the terms of use and community guidelines users had agreed to when they signed up.  Even so, some of the things people said which had to be removed were disgusting.  I've been a moderator on a few forums over the years, and some were a lot happier than others.  From best to worst, I gained enough of an understanding of people to know that anyone is capable of anything, and to imagine someone as incapable of anything is naive.  It's often the people you least suspect who turn out to be the worst offenders.

You don't have to look very far online to see some of the hate and abuse that people post.  The comment section on YouTube alone provides such an abundance of negativity that you eventually come to associate the two so strongly that you avoid comments entirely.  One argument I often hear for allowing all forms of comment is that of free speech and seeing both sides of an argument - I don't accept that as valid.  Up until this point you have probably agreed with almost everything I have written and considered the content I referred to - whatever you inferred from it - to be content that needed to be removed, yet when you reach this point there is resistance.  Moderation, it seems, is only tolerated when the thing being moderated is something you personally disagree with, not something people in general would want moderated.  Everything up to this point in this post concerned content that people in general agree should be removed, yet now you likely challenge that concept because it could infringe upon what you can post.

The limitation has to come from you, not from society.  That limitation has to be the recognition that whilst you have the right to think and feel and say whatever you want, you don't have the right to act upon it or to direct it at other people when they don't want to hear it or when it breaks the law.  You have the right to free speech, but I and everyone else have the right to ignore you and not listen to you.  If I don't want to hear it, and nor does anyone else, then we have the right to remove that content.

There's a growing challenge to the long-held position of many tech companies that their websites are platforms, not publishers, and should be exempt from the rules that apply to publishers.  That challenge is based on the fact that these websites, particularly those such as Facebook and YouTube, host the content and distribute it to those who come to see it, the same way traditional publishers put out content for people to purchase.  The argument that Facebook et al are not responsible for the content stored on their own servers cannot be held any longer; that same argument could be used by all of those who publish the content I mentioned at the start of this post, the content you likely agreed should be removed.  If it's not legal for you to print it and sell it, then it shouldn't be legal to display it on a website, serve ads next to it, and make money from it.
