The response paper states that the duty of care will “only apply to companies that facilitate the sharing of user generated content, for example through comments, forums or video sharing”. This is a step in the right direction, but one hopes that later in the legislative process the idea of a duty of care will be put to bed. Lawyers, and lawmakers, know that in our jurisdiction the concept of a 'duty of care' has developed incrementally through case law, with some ebbs and flows, but with an overarching sense of logical legal narrative. In the limited scenarios in which duties of care have been established by statute, that has happened in order to fill a vacuum, for example in relation to occupiers’ liability. Applying this badge as a kind of panacea for online harms frankly smacks of crowd-pleasing rather than careful, legally informed thinking.
If anything, other indications in the response paper only serve to highlight how problematic continuing with the duty of care label will be. As already indicated, the response paper is clear that the regulator will not investigate or adjudicate on individual complaints. As a result, we currently face the prospect of a regulatory framework which will have the badge of a duty of care but which will leave individuals distinctly confused about what that duty of care means for them in terms of claiming redress. They will not be able to pursue a complaint to the regulator about their individual circumstances. They may be able to bring a claim through ordinary legal proceedings concerning what has happened to them, but that claim will need to be based on the existing framework of their legal rights. They will not be able to contend that they are entitled to remedies based on the new duty of care because the new regime will not establish a new private right of action in tort. How are judges supposed to navigate all of this?
Another confusing aspect of the response paper is the continuing lack of clarity around in-scope services. The scope originally proposed in the OHWP was that the legislation would “apply to companies that allow users to share or discover user-generated content, or interact with each other online.” The breadth of this was quite alarming. In the response paper the government said: “The legislation will only apply to companies that provide services or use functionality on their websites which facilitate the sharing of user generated content or user interactions, for example through comments, forums or video sharing.” However, the definition remains broad.
The government has assessed that "only a very small proportion of UK businesses (estimated to account to less than 5%) fit within that definition". If this was meant to be greeted as good news, it may well miss the mark. Assuming that “less than 5%” should be read as “something approaching 5%”, this still represents a very substantial number of UK businesses.
Getting the scope clear will be one of the biggest challenges for the government as it moves towards drafting legislation. For example, take the term 'website'. A common understanding of this term might be that it refers to “a page on the internet”. That is quite an outdated notion. Where do apps, chat rooms and chat functionality in games fit in, for example?
Connected with this is the fundamental problem around private communications. The OHWP and the response paper both make the point that many illegal or harmful communications take place in private spaces. Most respondents to the OHWP consultation said that private communications should not be within scope of regulation at all. There is a strong sense that the government is not happy with this, but does not have a clear plan for addressing the very difficult question of how private communications could be effectively brought within scope in a way which is lawful. Our current best guess is that there may be some lighter touch regulation around providing appropriate functionality for users to protect themselves in a private environment. Clearly, though, there is a lot still to do in this area.
Much remains unclear and there is hard work to do to make progress on the more difficult aspects of the online harms agenda.
It looks almost certain that Ofcom will be appointed as the regulator for online harms. Ofcom appears to have been the clear preference among those respondents who expressed a preference, and it is understandable that the government is reluctant to establish a brand new regulatory body in circumstances where the regulator’s task will not be straightforward. What is unclear, however, is just how effective Ofcom will be. It will be tempting for Ofcom to play to its strengths and to shape its role based on the strategies and tactics which it has learned from its current remit, especially in the context of broadcasting. That will only work to some extent, though, and Ofcom is likely to need to bring in new expertise and skills, particularly on the technology side, if it is to have credibility in its role.