Closing chatrooms won't stop child abuse

The rise in cases could be put down to the fact that those who abuse over the internet may be traced

Philip Hensher
Thursday 25 September 2003 00:00 BST

Microsoft have announced, quite out of the blue, that they are closing down all their UK chatrooms on the internet with effect from 14 October. Other providers of internet chatrooms are continuing their services for the moment, and the company is not proposing to close down its American chatrooms; however, it is going to ask users for a credit card number and a billing address before they can log on in America.

Never having used a chatroom myself, and indeed not being at all sure how you would set about entering one, I don't find the news immediately interesting or important. However, 1.2 million people in this country use chatrooms every month, so it is clearly a popular and perhaps even useful resource.

Chatrooms are principally used, it seems, for fairly desultory conversation about trivial things, and, very enthusiastically, for lonely or oversexed people to talk dirty to strangers or even to pick them up. Conceivably, however, they could form a resource through which users with a particular, uncommon enthusiasm, for growing cacti, say, or the novels of Maud Diver, could contact each other and discuss the subject to their hearts' content.

I'm not sure that this actually does happen to an appreciable extent, but clearly the resource does theoretically permit this, and could become a useful tool. It allows, too, the possibility of a direct, written conversation between two or three people, somewhere between an exchange of e-mails and a telephone conversation; something of obvious attractiveness.

The resource is a good idea in theory, and possesses a lot of useful applications as far as communication goes. The reason that Microsoft are closing down the UK chatrooms is that a series of cases of misuse and abuse have surfaced. The dangers arising from misuse are, it seems, substantial enough to justify removing the whole operation.

These misuses are to do with child abuse, and are certainly very frightening and worrying cases. Many users of the chatrooms are teenagers or younger, for obvious reasons; at what may be a difficult and confusing time in life, it is tempting to acquire the instant "friends" of a chatroom, and to pass yourself off as more attractive, more confident, funnier or more charming than you are in real life.

Chatrooms are, for all realistic purposes, anonymous. You can be anyone you choose. Unfortunately, that also means that you have no real idea whether the person you are talking to is who he says he is. In one high-profile case, a former US marine eloped with a 12-year-old girl. He claimed that he had no idea of her real age. In other cases, young teenagers have thought they were talking to someone their own age, and talked with greater freedom than was sensible; subsequently, they found they were talking to adults who were intending to abuse them.

It's not quite clear to me, or indeed to the authorities, how widespread such cases are, and whether in reality they justify closing down what is evidently a service which gives harmless pleasure to a great many people. It is perfectly possible to imagine that the loss of the service would be a great blow to many people who live in isolated or unsympathetic places, who use it simply for a little society. The question is whether the provision of the service creates the conditions for child abuse, or whether it is just a convenient medium. Would the abuse have happened anyway, or does the chatroom create the abuser?

It's a very difficult question to answer. Certainly, the vast ocean of pornography visible through the internet has clearly led to a sickeningly tedious level of sexual imagery throughout our culture. Everything is sex nowadays, and I can't help thinking that the limitless provision of sexual material has in part created the hunger. Perhaps an abuser does start more easily because the internet allows him to disguise himself so thoroughly; perhaps the medium does create the user.

If we could be sure that this was the case, then there should be no hesitation in closing down chatrooms, or at the least, putting measures in place which would make it impossible to use them entirely anonymously. But it doesn't seem undoubtedly true, as yet, and only a relatively small number of cases have surfaced.

It is worth saying, too, that when such tragic cases have come to light, the evidence of the chatroom exchanges has been of great importance in demonstrating the facts of the case. It is a very fine line to draw, but it might be argued that a more closely controlled and supervised structure might make it easier for the police to identify cases of abuse.

However many people are currently using MSN's chatrooms for these disgusting purposes, it can't be thought that when the chatrooms close down, the problem will vanish. Some of them, no doubt, are opportunistic abusers who would only abuse through this specific medium. Some of them, however, will certainly pursue their obsession through other, much more shadowy means.

Michael Wheeler, a 36-year-old man from Cambridge, could be prosecuted for 11 sex offences against young girls on the strength of the chatroom through which he contacted two 13-year-old girls. There seems little doubt that, without the chatroom, he would have found some other means to perpetrate the abuse. It is no consolation to his victims, but his use of the chatroom provided important evidence, and may have prevented other girls from being abused.

Microsoft, I suspect, are pulling the plug not out of concern for the welfare of their users, but for simple commercial reasons. The costs of properly policing the chatrooms would be considerable. As in America, it could be done at a stroke by requiring users to give a credit-card number, effectively barring children from these services. Other forms of identification, such as passport numbers, might be considered, unwilling as one is to hand over personal information of that sort to a private company; and the company certainly could not be given the means to check whether the information was correct.

But it is surely not outside the bounds of ingenuity to create a tool capable of identifying users with suspicious patterns of behaviour - ones who approach a series of other users in exactly the same terms, for instance, or users in teenage chatrooms who engage with one other user at enormous length. It would just be expensive.

Still more expensive, of course, is the risk of a civil lawsuit, and perhaps what has frightened Microsoft off is the idea that the parents of Michael Wheeler's victims, say, might with good reason bring a case against Microsoft for failing in their duty of care. They have washed their hands of the whole business, and many children's charities have welcomed the move.

I wonder whether it was quite wise to do so. The resource is, potentially, a useful and beneficial one, given better supervision. Moreover, one might reasonably ask whether the considerable increase in the number of child abuse cases recently may not be down, in part, to the fact that those who abuse over the internet may be traced. If chatrooms are all closed down and the number of child abuse cases subsequently falls, we would be most unwise to hail it as a success. It might simply be that the abuse has gone elsewhere, unnoticed.
