In the 2000s, I attended various seminars in which commentators complained that there had not yet been an ‘internet election’ in the UK. The complaint ran that election debates were still largely led by elites in the broadcast and print media, with citizens on the digital media playing a more limited role than hoped. Fast forward to 2018, and it is clear that the era of the internet campaign has truly arrived. However, the shift has not generated the celebrations that the early optimists anticipated. The last couple of years have seen a spate of books published about democracy being under threat, often with considerable emphasis placed on the effects of the social media. Over the weekend, the House of Commons Digital, Culture, Media and Sport Select Committee published an Interim Report as part of its long-running inquiry into fake news. The Committee adds its voice to the concerns about the digital media, telling readers that ‘democracy is at risk, and now is the time to act, to protect our shared values and the integrity of our democratic institutions’.
The Committee’s inquiry started life in 2017 to look into general issues about fake news. As evidence about the methods used in the 2016 referendum campaign came to light, the Committee’s work expanded to look at some specific allegations. The Interim Report contains a summary of evidence concerning the campaign to leave the EU, while also providing various suggestions for prospective reforms. The Committee’s work is complicated by the fact that several other bodies are investigating the role of social media in elections and public life more generally – including the Information Commissioner, the Electoral Commission, the Cairncross Review, Ofcom and the House of Lords Communications Committee. The Government is also expected to publish a White Paper on the digital media in the autumn. Given the various overlapping investigations, the Interim Report can be seen as one stage in a broader process. As a result, the Interim Report is not a blueprint for comprehensive reform, but a call for action that highlights areas of concern for other bodies to consider in detail.
The urgency for action stressed by the Committee contrasts with the ‘wait and see’ attitude that was more common in the earlier days of the internet. Some of the regulatory issues mentioned in the Interim Report were foreseeable in the 2000s. However, the thinking then tended to be that additional regulation was not yet necessary given that the use of digital communications in elections was still evolving. There was also a fear that new controls would disproportionately stifle participation and discourage the use of the technology by citizens. While it is easy to see why such an approach was taken, the price is being paid now and some damage has already been done. Worse still, the problems have become prominent in relation to a divisive referendum campaign on a fundamental constitutional issue – a matter where the integrity of the process is particularly important.
The point can be seen in relation to the Interim Report’s recommendations on transparency. With printed election material (such as election leaflets), there is a requirement to include the name of the printer and promoter (the ‘imprint requirements’). The rule means that people know who is responsible for a publication and can thereby assess its credibility. In 2003, the Electoral Commission recommended extending the imprint requirements to digital communications (using powers under s143(6) of the Political Parties, Elections and Referendums Act 2000). However, that recommendation was not acted upon by the Government. Since then, concerns have been expressed about the role of anonymous communications in elections (which I commented on for this blog several years ago). The Interim Report renews the demand for the transparency requirements to be extended to digital material (as does a recent report from the Electoral Commission). The Committee also goes further and calls for the Government to consider ‘the feasibility of clear, persistent banners on all paid-for political adverts and videos, indicating the source and making it easy for users to identify what is in the adverts, and who the advertiser is’.
There are challenges in implementing such reforms to maximise transparency. Political communications online go far beyond the digital equivalents of traditional leaflets and posters. For example, issues of transparency arise if an organisation employs people to post political messages on social media posing as ordinary users, or pays ‘influencers’ to endorse a particular candidate. These examples show how much rests on the definition of ‘election material’ that the imprint requirements would apply to. There are also legitimate concerns about unduly burdening citizens, for example if a name and address have to be included whenever a person makes a partisan comment available to the public on the social media (and if such details have to be included on the user’s profile). Moreover, there are also questions of policy about whether it is desirable to allow certain anonymous communications in political debate more generally. The issues are not insurmountable, and even if there are limits to what can be regulated, applying the imprint requirements to the most obvious types of online political advertising would be a sensible starting point.
A further issue in relation to election communications concerns the fragmentation of audiences and the impact on political debate. There is much discussion about the digital media enabling people to go to their preferred media outlet that offers a specific political perspective and interpretation of events, and steering audiences away from a common national conversation. The Interim Report, however, focuses on a different, more specific type of fragmentation, in the form of micro-targeted political messages (using social media data to tailor messages to reflect the priorities of the recipient). While micro-targeting is not new, the level of data acquired via the digital media has allowed the tool to be developed with greater precision. One issue raised by the Interim Report is whether the acquisition and use of such data infringes a person’s privacy and data rights. More generally, micro-targeting raises ethical questions. While targeted campaigns can give people valuable information, they can also lead to the segmentation of campaigns. The criticism runs that different sections of society receive separate sets of promises, and people are less likely to get a sense of the overall package of issues at stake. People may have little idea about what is being said to others, which in turn gives greater scope for inconsistent promises to be made and fewer opportunities for messages to be scrutinised.
The primary solution put forward by the Committee is for greater transparency, including a ‘public register for political advertising, requiring all political advertising work to be listed for public display’. As the Electoral Commission noted earlier this year, such transparency is not only valuable to inform citizens, but also to enable regulators to track activity and ensure compliance with election spending rules. Beyond transparency, the Committee calls for an agreement on ‘a minimum limit for the number of voters sent individual political messages’. The latter proposal appears to invite political actors to agree to disarm and prevent an escalation in the precision of targeted messages.
There are a number of other steps that could be taken to address problems of fragmentation and micro-targeting, which were not taken up in the Interim Report. Digital intermediaries could be required to publicly display paid election advertisements in a special database (rather than listing details of ‘political advertising work’), so that people can see what is being promised to other voters. More broadly, the digital intermediaries could be subject to some election-related ‘public service’ obligations. This could include (non-targeted) free advertising for political parties on digital services in the run-up to an election, to ensure that users receive a number of general messages aimed at the public at large. Such an approach would attempt to harness the near monopoly position of certain intermediaries to facilitate a national conversation, and perform a function analogous to that played by the broadcast media.
While the discussion above refers to the problems in identifying a publisher and the fragmentation of the audience, the problem of fake news is primarily about the content of communications. That issue is particularly challenging, as the regulation of content is most likely to raise free speech issues. The Interim Report puts forward several approaches to deal with the issue. In relation to misinformation and misleading content, the Committee calls for the Government to support research into the proliferation of falsities online and into the methods to verify content and grade sources. When discussing the ways to define ‘fake news’, the Committee puts forward a more puzzling recommendation that the Government ‘uses the rules given to Ofcom under the Communications Act 2003 to set and enforce content standards for television and radio broadcasters, including rules relating to accuracy and impartiality, as a basis for setting standards for online content’. I say it is puzzling, as it is not clear from the Interim Report who those standards should apply to (and whether standards for professional broadcasters are appropriate for digital services or publishers). Maybe I am taking the reference to broadcast standards too literally, and the point could be for broadcast regulations simply to offer a starting point when devising a method to determine when content is false and when identifying certain types of public service obligation to be imposed on digital intermediaries.
In relation to content that is harmful or illegal (as opposed to false or misleading), the Interim Report places considerable responsibility on the digital intermediary. The Committee sensibly concludes that an intermediary is neither a traditional publisher nor a passive platform. Instead, the intermediary fits into a category of its own, with its own distinct responsibilities. Accordingly, the Committee calls for clear legal liability to require tech companies to act against illegal and harmful content (both on complaint and where the content is ‘easy for the tech companies to identify’). While action in the form of notice and takedown procedures is well established, an obligation to act prior to notification is more likely to generate controversy. In particular, there may be concerns that such an obligation requires monitoring, which may fall foul of the prohibition of general monitoring under EU law and also raise issues under Article 10 of the ECHR (on which the case law is mixed).
While not at the forefront of the Interim Report, the issue of money in politics underpins some of the problems identified. While early optimists hoped that the digital media would reduce the costs of campaigning, experience shows that money can still provide a significant advantage (for advertising, and for acquiring and analysing data). The lack of transparency discussed earlier provides one challenge, as it can be difficult to know what sums are spent online and to monitor compliance. There are also some longstanding problems, such as the ability of wealthy individuals to make very large donations to bankroll campaigns and the political activities of some companies. These problems are not new and reform proposals have been put forward on several occasions. However, the issues are politically sensitive and attempts to introduce change have previously been caught in political gridlock.
The history of election law is full of examples in which piecemeal reforms were enacted to deal with yesterday’s scandal, while failing to foresee the problems around the corner. The Digital, Culture, Media and Sport Committee is alert to this risk and the number of other investigations into related issues suggests that comprehensive change is on the cards. The challenge ahead is to flesh out detailed proposals for forward-looking reform, while navigating various political interests and ensuring that any measure respects political rights to freedom of expression and association.
Jacob Rowbottom is a Fellow of University College, Oxford and author of Media Law (2018).
(Suggested citation: J. Rowbottom, ‘Digital Communications, Social Media and the Integrity of Elections’, U.K. Const. L. Blog (31st Jul. 2018) (available at https://ukconstitutionallaw.org/))