Out-Law News
27 Mar 2008, 6:17 pm
The measures are just some of the recommendations of a report by child psychologist and television personality Dr Tanya Byron which was commissioned by the Government.
Education minister Ed Balls said that the Government was "fully committed to implementing the report's recommendations," and that it would immediately begin establishing the recommended UK Council for Child Safety.
Byron addressed widespread fears about children's use of computers and the internet, including worries about violent computer games, exposure to potentially harmful discussions of self-harm and suicide, inappropriate conduct, and contact from people who want to harm them.
She said that parents' worries are not helped by the fact that they often understand the technology less well than the children they hope to protect.
"There is a generational digital divide which means that parents do not necessarily feel equipped to help their children in this space – which can lead to fear and a sense of helplessness," she said in the report. "This can be compounded by a risk-averse culture where we are inclined to keep our children 'indoors' despite their developmental needs to socialise and take risks."
Byron did say, though, that news coverage often distorted the issues and stripped children of the credit they deserve for being in control of the technologies they use.
"Headlines have contributed to the climate of anxiety that surrounds new technology and created a fiercely polarised debate in which panic and fear often drown out evidence. The resultant clamour distracts from the real issue and leads to children being cast as victims rather than participants in these new, interactive technologies," she said.
Byron has proposed the establishment of a national information campaign to inform parents and children about how to stay safe while using the internet.
She has also proposed that the new Council establish codes of practice to which internet publishers and social networking sites would sign up.
"The incentive for signing up to one of these codes would be the opportunity for companies to promote themselves as responsible businesses with an interest in online child safety," she said in her report. "It is likely that the main consequence of breaching the codes would be public censure by the Council. Avoiding this kind of reputational damage would be a strong incentive for companies to co-operate."
Byron said that social networking sites, for example, could adhere to the code of practice by having more stringent privacy controls for children than for adults.
Byron has recommended that the Council be established with its own secretariat by this time next year, although Balls did not give a concrete funding commitment to reporters at the launch of the study.
Byron said that user-generated content poses a risk of exposing children to inappropriate material. Though she recognised that moderation by the site operators themselves was often impractical, she said that sophisticated models of moderation could help keep harmful content away from children.
"Sites harness the social capital of their community of users to improve moderation. Reports from more than one user, from long-standing users or users who have been ‘rated’ highly by their peers can be flagged for attention with a higher priority, and such users can even be given moderation powers themselves, so that content they flag is removed until a moderator can look at it," said Byron in the report.
"This approach to moderation empowers children, young people and adults to be active participants in keeping themselves and others safe online, and making their web communities the kind of place they want to be."
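The escalation logic Byron describes — giving more weight to reports from multiple, long-standing or highly rated users, and letting trusted users' flags remove content pending review — can be sketched as follows. This is a hypothetical illustration only: the class names, weights and thresholds are assumptions, not part of any real platform's system.

```python
from dataclasses import dataclass

# Illustrative thresholds -- assumptions, not taken from the report.
TRUSTED_RATING = 10      # a user rated this highly by peers counts as "trusted"
ESCALATE_WEIGHT = 3.0    # combined report weight that raises queue priority

@dataclass
class Post:
    text: str
    report_weight: float = 0.0
    hidden: bool = False          # removed until a moderator can look at it
    high_priority: bool = False   # flagged for faster moderator attention

def report(post: Post, reporter_rating: int) -> None:
    """Register a report; trusted users' flags hide content immediately."""
    # Reports from highly rated users carry more weight.
    weight = 2.0 if reporter_rating >= TRUSTED_RATING else 1.0
    post.report_weight += weight

    # A trusted user's flag removes the content pending moderator review.
    if reporter_rating >= TRUSTED_RATING:
        post.hidden = True

    # Accumulated reports escalate the post in the moderation queue.
    if post.report_weight >= ESCALATE_WEIGHT:
        post.high_priority = True

p = Post("questionable clip")
report(p, reporter_rating=2)   # ordinary user: weight 1.0, post stays visible
report(p, reporter_rating=12)  # trusted user: post hidden, weight reaches 3.0
print(p.hidden, p.high_priority)  # True True
```

The design choice here mirrors the report's point: the community's "social capital" does the triage, so moderators review the most credible flags first rather than every piece of content.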
Byron said that content producers and hosts had told her during the report's consultation period that they were reluctant to monitor any content because doing so would make them liable for it under the E-Commerce Regulations.
She said that the Council, when founded, should investigate the issue.
"There have been suggestions that companies could minimise the risk of liability by engaging a third party to monitor the content on their site and explicitly inform them about content which breaches their site’s acceptable use policies," she said. "I recommend that the Council explores the possibility of developing such arrangements to minimise the risks of liability for companies that take steps to make their products safer for children."
Struan Robertson, a technology lawyer with Pinsent Masons and editor of OUT-LAW.COM, said that he doubted this measure would alter a company's responsibility for content.
"Monitoring user-generated content before or after it appears on a site is a good way to maintain the quality of a site's content," said Robertson. "If a site is aimed at children, there are even stronger arguments for pre-moderation. But as soon as you monitor content, you risk liability for anything that slips through the net, whether you check the content before or after it appears on the site and whether you monitor the content yourself or outsource the task to a third party."
Using an overseas monitor can save money, though, according to Robertson. The terms of engagement can also protect a site operator against fines or damages. "Your contract with a third party monitor can be used to shift the financial burden if something goes wrong," he said.