The chair of the Electoral Commission recently warned that British elections face a ‘perfect storm’ of threats to their integrity from developments in social media and political advertising, and third-party state actors. Such assessments, supported by allegations about the UK-EU referendum and the 2017 general election campaigns, call into question the effectiveness of the current regulatory regime applicable to political campaign advertising and data protection. The series of allegations, currently being investigated by the Information Commissioner’s Office (ICO) and the Electoral Commission, relates to election campaign overspending and the misuse of personal data for the purpose of micro-targeting on social media platforms. Micro-targeting involves the gathering of personal data, primarily from online sources, which is processed and analysed to reveal patterns and make predictions about specific individuals’ behaviour and political opinions. Armed with this insight, it is alleged, tailored political campaign adverts are directed to individuals on social media platforms like Facebook and Twitter. However, the individual does not know that they have been profiled on the basis of their data or that they have been specifically selected for targeting. The supposed aim of this practice is to influence and change voting behaviour for the benefit of the political party or group deploying these techniques.
There is no evidence to show that micro-targeted adverts influence election or referendum results, and it is unlikely that any single campaign affected the outcome of the UK-EU referendum or the 2017 general election solely as a result of these tactics. The concern of this blog is simply whether, in the context of micro-targeted political campaign adverts, the regulatory framework fulfils its purpose of maintaining fair, clean and free elections. Despite the caveats above, what is certain is that a new approach to shaping political discourse and influencing democratic decision-making is emerging. Because this is a complex issue governed by a complex body of law, this post highlights only some of the issues, but it argues that the law is piecemeal and that there are specific gaps in the way it regulates this area. This in turn raises questions about how to better protect democratic processes.
The applicable regulatory framework
In several related ways, the law attempts to maintain a level playing field when it comes to political campaigning. Primarily, it does this by regulating the amount of money that can be spent during a referendum and election campaign through the Political Parties, Elections and Referendums Act 2000; by banning political adverts from television and radio broadcasting through the Communications Act 2003; and by preventing over-zealous campaigners from ‘robocalling’ the electorate in breach of the Privacy and Electronic Communications Regulations 2003. The laws governing elections and political advertising assume that the electorate must maintain a ‘free mind’. The law on data protection aims to maintain privacy for individuals online. But these new techniques expose gaps between the law’s aims and its reach.
Data Protection Act 1998
Profiling and the use of big data are not unlawful, but for data analytics to be lawful, requirements under the Data Protection Act (DPA) 1998 must be met. The DPA, drafted before big data and social media, regulates the processing of personal data by restricting how data can be recorded, stored, altered, used or disclosed. In summary, there are several gaps within this regime. Consent is a critical feature of the lawful use of personal data and, although not the only way to lawfully process personal data, it is a major factor in this context because there is no public interest justification which would release the data processor from otherwise needing consent. Even when consent is given, individuals often give it without realising the implications – or consent is obtained when there is no real alternative. Beyond consent, another gap lies in the reasonableness test for whether there is an unfair use of sensitive personal data, which is hard to satisfy for some elements of data and types of data use. Finally, even though data can be processed for legitimate political activities, what that means is unclear and could enable potential misuses of data to be regarded as legitimate.
Consent is predominantly granted to Data Controllers, such as Facebook or Twitter, when setting up an account; the controllers then pass it on to Data Processors, such as data analytics companies, for a secondary use without obtaining further consent. As data related to political opinions is sensitive personal data, it can only be lawfully and fairly processed if at least one of the Schedule 3 DPA 1998 requirements is met. These conditions include ‘explicit consent’, meaning: was consent freely given; was the consent specific and informed (as defined by article 2(h) of the European Data Protection Directive 95/46/EC); is there a level of transparency about what type of data is collected; is it clear what the data will be used for and who it will be shared with; and are details provided about how an individual may exercise their right to object or withdraw consent.
Even if the explicit consent condition is met, there is still the reasonable expectation assessment as to the use of that data. This means that even if explicit consent has been given, there could be a breach of the DPA 1998 if the processing of the data does not satisfy the fairness principle. Unfairness can most starkly be seen when there is a disconnect between the initial cause for data collection or data provision and the subsequent use of that data. We all submit details online for various purposes, but the clincher is the deceptive receipt of personal information, including political views, for direct marketing. Assessing whether such data use is unfair attracts a reasonableness test usually applied to the release of sensitive personal data in the context of Freedom of Information requests. The regulator may ask: ‘has someone’s sensitive personal data been handled in a way they would reasonably expect?’ and ‘has the organisation obtaining the data been open about the reason for obtaining it?’. Further, consent, if given, may not be adequate to satisfy the condition for processing if the individual had no real choice about giving it, such as when having to agree terms and conditions to set up a Facebook account.
Sensitive personal data may be processed by political parties and campaigners for ‘legitimate political activities’, permitting reference to the electoral register as per section 8 of the Data Protection (Processing of Sensitive Personal Data) Order 2000. What precisely ‘legitimate political activities’ can encompass is unclear, but paragraph 4 of Schedule 3 of the DPA 1998 lists exceptions that do not readily apply to the use of data for political adverts. More importantly, the scale of the data processed in such a scenario is likely to be excessive for the processing purpose as per Schedule 1, Part I(3) DPA 1998. Even if breaches are established, the remedies built into the DPA 1998 are so weak that claimants are bringing claims in the tort of misuse of private information against large technology corporations rather than relying on the right to compensation, which is limited to distress or financial loss resulting from a breach of the DPA 1998.
Political advertising and direct marketing
Social media exposes the restrictions on political advertising as utterly inadequate. Political campaign adverts are banned from television and radio broadcast, but party broadcasts are allowed because they are not classed as advertising. Political advertisements sent to individual voters on social media, however, are unregulated because non-broadcast political advertising was specifically excluded from regulatory oversight under the Communications Act 2003 (an Act which does not even mention the internet).
However, the practice may be caught to a limited extent by the DPA 1998. Under section 11(3) DPA 1998, direct marketing includes political communications directed to an individual. The ICO interprets that provision to include material designed to promote a political view, gain support in an upcoming election or referendum, or otherwise influence an individual. In the context of direct marketing, the political party or group commissioning the marketing has a duty to tell people, when their data is collected, how it will be used. Critically, as per section 27(5) DPA 1998, even if information is publicly accessible, this does not automatically mean that it can be reused for other (political) purposes – at least not without providing fair processing information.
The direct marketing laws are behind the times: they prohibit the practice through automated telephone calls, fax, email, text messages and post (unless the receiver has given consent), but there is no provision that explicitly applies to social media. Section 11 of the DPA 1998 gives the individual a right to request that their data is not processed for marketing purposes – a fairly ineffective protection in this context, as individuals do not usually know that their data has been processed for direct marketing. Facebook, data analytics companies, data brokers, political parties and whoever else might be collecting, accessing and analysing data do not inform the individual that their data is being used to evaluate their political preferences and to send them tailored, targeted political adverts. Further, micro-targeting is incredibly difficult to detect, as only the sender, the social media platform and the receiver know what has been sent to whom. This is a particularly problematic feature of new campaign techniques which Parliament will have to tackle.
The use of these techniques is dependent on social media, but monitoring the amount of resources channelled through this medium escapes meaningful oversight. In the last few years, a significant amount of campaign spending has been directed to market research and social media advertising. In 2015, the first year spending on digital advertising was reported, £1.6 million was spent by the main parties on digital marketing, mostly on Facebook. Analysis suggests such spending pays off. As researchers at the London School of Economics and Political Science and King’s College London have argued in a report on new political campaigning, election campaigning laws are weak and helpless in preserving the democratic process. The report points to the fact that it is easy to blur whether spending was directed towards national or local campaigns, because spending records do not make clear exactly what money was spent on and it is much harder to track online spending. This is important because campaigns can be won or lost by only a fraction of votes cast. By identifying swing seats or areas with untapped voting potential, campaigners can funnel resources into those areas to intensify their message towards the fraction of the electorate most likely to sway the result. This, the report argues, undermines the principle of a level playing field. Dr Martin Moore recently explained how spending on digital targeting evades transparency and oversight from the regulator. One way in which spending returns are ‘opaque’ is that, on social media, money can be spent to isolate and target Facebook account holders living within a particular region or constituency, but this will not necessarily be recorded under local spending; as a result, much more money can be allocated to that particular element of the campaign, because the national spending limit is much greater.
Does the use of social media to target voters in a way that is not transparent break the law? Possibly not, and that may be part of the problem. The law and its regulators are, apparently, such a ‘joke’ that ‘anybody who wanted to cheat the law could do it easily without people realising.’ As the Electoral Commission knows, trust is central to the electoral system. To ensure trust does not further deteriorate, the ICO and Electoral Commission must respond robustly and openly to the allegations, particularly if no breaches are found. It is unlikely they will, however. More broadly, as the relevant institutions and regulators are unable to keep pace with changing technology, urgent legislative reform is necessary. One step towards reform is the General Data Protection Regulation (GDPR), which will take direct effect in the UK on 25 May 2018 and is designed to be better suited to this data-driven world. The Data Protection Bill, which will implement the GDPR and repeal and replace the DPA 1998, is proceeding through the Houses of Parliament. Whether it is ambitious enough in dealing with the misuse of data for corporate gain remains to be seen, but Members of the House of Lords have indicated their fear that it is not. In addition, wholesale reform of electoral laws is necessary, at least to enable the electorate to see who is spending money to influence them and how. Further, the gap in the regulation of social media political campaign adverts needs addressing so that the law is at least consistent. Ideally, the legislature would resist the urge to take another piecemeal approach and engage in a comprehensive re-think of data and the implications of its uses in politics.
Bethany Shiner, Lecturer in law at Middlesex University and solicitor-advocate
(Suggested citation: B. Shiner, ‘Just the Politics of Persuasion: Are Gaps in Regulation Affecting Political Campaigning Methods?’, U.K. Const. L. Blog (1st Mar. 2018) (available at https://ukconstitutionallaw.org/))