Earlier this week, I delivered a talk at the MmIT 2016 Annual Conference in Sheffield about digital privacy and digital citizenship. The talk covers a range of themes (to the extent that I think I possibly try to cover too much ground in one short talk), touching on everything from ethics to democracy to surveillance to encryption to varying degrees. As is my way, the slides I posted online make little sense to the casual observer, because they are mainly text light and image heavy. So I thought I’d break the talk down here into various chunks by way of providing context (out of sheer laziness, the references are all on the slides at the end of this post in the relevant places…where they aren’t, I’ve added them in the text below).
I think our ethics as library workers (as outlined by CILIP and IFLA) are crucial to how we see privacy, surveillance and the relationship with democracy. Two ethical principles in particular stand out for me:
“Commitment to the defence, and the advancement of, access to information, ideas and works of the imagination.”
“Respect for confidentiality and privacy in dealing with information users.”
IFLA argue that:
“Library and information services should respect and advance privacy both at the level of practices and as a principle.”
(The key element for me in that quote is the notion that we should “advance” privacy, we should not be passive, we should actively promote and encourage it amongst library users.)
Compare and contrast with what is potentially coming down the track:
“Small-scale networks such as those in cafes, libraries and universities could find themselves targeted under the legislation and forced to hand over customers’ confidential personal data tracking their web use.”
There’s a clear and present threat here to library and information services, in all their forms. If we are required to retain data on the information seeking habits of our users and pass it to the security services on demand, then our users have no privacy and we are complicit in its violation. How we tackle this threat to our ethics is crucial, both in terms of our relevance (if we violate ethical principles as a matter of course, what is the point in their existence?) and, more importantly, in terms of the communities that rely on us.
When it comes to ethics and government surveillance policy there are big questions we need to confront and we need to find the answers that defend our communities. Ultimately the communities we serve must take priority over government policy. Governments come and go, the social inequality afflicting our communities never goes away.
What is surveillance?
Surveillance is presented as a tool of protection. It’s a way to protect you, your communities, your country. But surveillance is not solely about protection, it has a number of other effects. David Lyon, a leading figure when it comes to surveillance studies (I’d urge those engaged in labour and information labour to seek out his works on this topic), defines surveillance as follows:
“…the focused, systematic and routine attention to personal details for purposes of influence, management, protection or direction.”
It’s not solely a tool for protection. When we consider it in the other direction, it’s also about influencing, managing and directing. When a CCTV camera is placed on the streets, it’s not merely there to protect citizens; its effect is to manage the behaviour of those under its gaze, to make them behave in a particular way. This is the crucial element of surveillance that we need to consider, particularly when it comes to mass surveillance. Its existence, as Foucault argues, is enough on its own. It does not need to be active, its “permanent visibility…assures the automatic functioning of power”.
History of surveillance
Of course, the use of new technology in conducting surveillance is nothing new. In 1913, for example, suffragette prisoners had their photos taken without their knowledge, photos that were then used to conduct surveillance upon them after their release. The reasoning? They were a threat to the British Empire.
Similarly, in 1963, Robert Kennedy authorised the FBI to wiretap the telephones of Martin Luther King Jr. Following King’s assassination in 1968, Johnson ordered the army to monitor domestic dissident groups. The adoption of new technologies for “national security” purposes has a long history. It should have come as no surprise to anyone that the internet would also be used in this way.
But it’s not as though surveillance was pursued uncritically by the state. In a report published in 1967, the President’s Commission on Law Enforcement and Administration of Justice argued:
“In a democratic society privacy of communication is essential if citizens are to think and act creatively and constructively. Fear or suspicion that one’s speech is being monitored by a stranger, even without the reality of such activity, can have a seriously inhibiting effect upon the willingness to voice critical and constructive ideas.”
The ability to communicate and seek out information freely is vital in a functioning democracy. As Bauman notes:
“Democracy expresses itself in a continuous and relentless critique of institutions; democracy is an anarchic, disruptive element inside the political system: essentially, a force for dissent and change. One can best recognize a democratic society by its constant complaints that it is not democratic enough.”
The ability to investigate and critique is crucial; without that ability our system simply cannot be defined as democratic. Post-Snowden we can already see the impact mass surveillance has had on people’s willingness to seek out information on controversial topics. As Penney notes, Wikipedia pages on Al Qaeda and related topics have seen a marked decrease in views. The consequence of discouraging people from seeking out information on such topics is the impoverishment of political debate, something the National Telecommunications and Information Administration has warned of.
The growth of the internet has been coupled with the growing importance of data as a commodity. As with all commodities that can be harvested, companies seek to find ways to gather a larger and larger amount of data. As Sadowski warns:
“It has created an arms race for data, fueling the impulse to create surveillance technologies that infiltrate all aspects of life and society. And the reason for creating these massive reserves of data is the value it can or might generate.”
We see this approach taken by companies such as Google and Facebook who seek out new and innovative ways to collect more data that they can use to generate a profit.
Corporations also work with the state, sharing these new innovative data harvesting techniques. For example, Operation Mickey Mouse is a partnership between the Department of Defense and Disney whereby the former studies Disney’s use of technology and works in conjunction to “collect information on Beta testing operations that the popular theme park uses on their customers”.
21st Century Surveillance
Some terms to be familiar with:
The Five Eyes – an intelligence sharing partnership that comprises the United States, the United Kingdom, Canada, Australia and New Zealand.
Karma Police – Initiative launched in 2008 by GCHQ intending to record the browsing habits of “every visible user on the internet”. The system was designed to provide GCHQ with a web browsing profile for every visible user or a user profile for every visible website on the internet.
Tempora – GCHQ programme that led to interceptors being placed on 200 fibre optic cables carrying internet data into and out of the UK. Each cable potentially gives GCHQ access to 10 gigabits of data a second, amounting to around 21 petabytes a day across all 200 cables. Around 300 GCHQ and 250 NSA operatives are tasked with sifting through the data.
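Those headline figures can be sanity-checked with some simple arithmetic (a rough sketch based on the published estimates above; it uses decimal units and ignores protocol overheads):

```python
# Rough sanity check of the published Tempora figures:
# 200 tapped cables, each carrying up to 10 gigabits per second.
cables = 200
gigabits_per_second_per_cable = 10
seconds_per_day = 86_400

total_gigabits_per_day = cables * gigabits_per_second_per_cable * seconds_per_day
gigabytes_per_day = total_gigabits_per_day / 8        # bits -> bytes
petabytes_per_day = gigabytes_per_day / 1_000_000     # GB -> PB (decimal units)

print(round(petabytes_per_day, 1))  # 21.6 -- consistent with the ~21 PB/day estimate
```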
Investigatory Powers Bill
The key thing to look out for here is ICRs (internet connection records). From the Bill:
190 Subsection (9)(f) provides for the retention of internet connection records. Internet connection records are a record of the internet services that a specific device connects to – such as a website or instant messaging application – captured by the company providing access to the internet.
Those that hold the data requested under the provisions of the Bill are also prevented from communicating this request to the individual who created the data in question. So, for example, if a request was made to a public library authority for information regarding an individual’s search history, the library authority would not be able to inform the individual concerned. An invasion of their privacy compounded by the inability to flag this violation with them. Ultimately, the Bill undermines the ethical principles to which we should adhere and prevents us from warning our users of any violation of their privacy.
The UK government have been publicly hostile to the use of encryption technologies for some time, despite the fact that such technologies protect every single one of us from rogue states or individuals with malign intent. For David Cameron, the notion that individuals can communicate in private was an affront and a threat. Whereas in reality, in terms of democracy, the reverse is true: invasions of the privacy of communications are a threat and one that citizens should take seriously.
As for Theresa May, the new Prime Minister, she rejects the notion that we experience mass surveillance and yet proposed the Investigatory Powers Bill, which legislates for…well, mass surveillance. A bill that has also been rubber-stamped following an “independent” review by David Anderson QC, who argued that there was a “clear operational purpose” in gathering large volumes of data about individuals.
The “danger” of encryption
Repeatedly and persistently, encryption has been portrayed as a tool that assists terrorists in perpetrating violent acts. This was true after the attacks in both Paris and Brussels. In both cases, politicians and law enforcement pointed to encryption technology, and the perpetrators’ awareness of such technologies, as a key component in their ability to plan such attacks. In neither case has it been demonstrated that encryption played a crucial role. In the case of the Brussels attack, a laptop was found in a rubbish bin containing an unencrypted folder called “Target”.
Nor is there any evidence of growth in the use of encryption technologies. A 2015 wiretap report, for example, found a decline in the instances where law enforcement encountered encryption when authorised to conduct wiretaps.
Nothing to hide?
Of course, any discussion around security results in the old “nothing to fear” trope being thrown around by those seeking to degrade privacy. This is, of course, a nonsense. Did Doreen Lawrence have anything to hide when she and her family were placed under surveillance as a result of their efforts to apply pressure upon Scotland Yard to investigate the racist murder of Stephen Lawrence?
People of colour, immigrants, welfare recipients and political activists are all in the front lines when it comes to testing out surveillance techniques that are then utilised on the general public. As Virginia Eubanks argues in terms of America:
“Poor and working-class Americans already live in the surveillance future. The revelations that are so scandalous to the middle class – data profiling, PRISM, tapped cellphones – are old news to millions of low-income Americans, immigrants, and communities of color. To be smart about surveillance in the New Year, we must learn from the experiences of marginalized people in the U.S. and in developing countries the world over.”
As true in the United Kingdom and Australia as it is in the United States.
And of course, we must remember that the state is fluid, not fixed. It changes and adapts and criminalises. Furthermore, it is not us that determines whether we as citizens have done nothing wrong, it is the state. We simply do not have the power to determine that our actions will not result in sanction by the state. We may believe that they cannot sanction us, but ultimately it is not a decision that rests on our intuition, it rests on the interpretation and actions of the state.
The tools to help
There are, however, tools that can help protect our privacy. Tor Browser, for example, can help obscure our web browsing, protecting our intellectual privacy as we seek out information. PGP (Pretty Good Privacy) encryption helps ensure that individuals can communicate with each other securely and privately. But using PGP is not easy, it requires effort and a degree of social and cultural capital that not everyone can call upon.
Indeed, for many tools that provide protections, there are difficulties in terms of economic, social and cultural capital. In terms of smartphones, for example, 95% of Apple devices are encrypted by default, while only around 10% of Android devices currently in circulation are encrypted (estimates from earlier this year). Not everyone can afford an Apple device, and not everyone is aware of how to encrypt an Android device – resulting in what Chris Soghoian describes as a “digital security divide” (which I’d argue reinforces an intellectual privacy divide).
There are also a range of smartphone apps that offer secure communications (or at least claim to). But these must be treated with care. Smartphones are not a secure device for communication, no matter how secure the app claims to be (or how secure the app actually is). They leak metadata like nothing else. Alongside location data, they have a tendency to leak your mobility pattern (i.e. commuter routes between home and work, which can easily identify individuals), calls received, numbers dialled, keywords, mobile device IDs and so on.
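To illustrate just how revealing that metadata is, here is a minimal sketch (the pings, tower names and thresholds are all invented for illustration) that guesses a “home” and “work” location purely from timestamped location records, without ever touching message content:

```python
from collections import Counter

# Hypothetical, invented location pings: (hour_of_day, cell_tower_id).
# No message content is needed -- timing and location alone are revealing.
pings = [
    (2, "tower_A"), (3, "tower_A"), (23, "tower_A"),    # night-time pings
    (10, "tower_B"), (11, "tower_B"), (15, "tower_B"),  # working-hours pings
    (8, "tower_C"),                                     # one-off (commute)
]

def infer_pattern(pings):
    """Guess 'home' (most common night tower) and 'work' (most common day tower)."""
    night = Counter(tower for hour, tower in pings if hour >= 22 or hour < 6)
    day = Counter(tower for hour, tower in pings if 9 <= hour < 18)
    return night.most_common(1)[0][0], day.most_common(1)[0][0]

home, work = infer_pattern(pings)
print(home, work)  # tower_A tower_B
```

A handful of records is enough to separate home from work here; real-world deanonymisation studies use exactly this kind of pattern, at scale, which is why “the app is encrypted” is no defence against metadata collection.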
Tools such as Signal provide the best protection, but they protect confidentiality, not anonymity. Consequently, there is a need to know which app is best (Signal is a “better” choice than WhatsApp, for example). Again, social and cultural capital are key components in being able to secure communications and information seeking activities.
Given the extent of the digital divide, it is questionable to what extent individuals have the knowledge and capability to protect their communications and seek information in private. For example, 65% of C2DE households (defined as skilled, semi-skilled and unskilled manual workers and non-working individuals) lack basic online skills (managing, communicating, transacting, creating and problem solving). 42% of internet users use the same password on multiple platforms and only 25% of individuals read a privacy statement before using a service. On the other hand, 39% of internet users claim to be reluctant to hand over personal information before they can use a service.
The role of library workers
Of course, library workers have played a key role in helping to extend digital inclusion. But they have also seen their jobs diminished, libraries closed and services they previously provided outsourced to the private sector (e.g. Barclays Bank). The consequences of this are obvious. Many private sector companies have no interest in ensuring the privacy and security of individuals on the internet, because that limits their opportunities to market to them or to generate profit from the data they create.
In the case of Barclays, helping individuals create a Google account, showing them around the internet, and then closing by directing them to the help guides on the Barclays website runs the risk of delivering Barclays ads directly to the individual’s inbox. An individual who, by virtue of the fact that they sought out guidance on getting online, will more likely than not lack the knowledge and awareness to understand or limit the delivery of such adverts.
How library workers can help
A Council of Europe statement (backed by CILIP) on freedom of expression, declared that individuals must “decide for themselves what they should, or should not, access” and those providing the service must “respect the privacy of users and treat knowledge of what they have accessed or wish to access as confidential”. IFLA’s Statement on Privacy in the Library Environment reminded library workers that they have a responsibility to “reject electronic surveillance”, provide training on “tools to use to protect their privacy” and “respect and advance privacy at the level of practices and as a principle”.
The Library Freedom Project in the United States has been leading the way in this area, and slowly but surely it is being recognised in the UK by library workers that this is an area we need to be taking a lead on. The collaboration between Newcastle City Library and the North East branch of the Open Rights Group has shown the way. It is possible to teach privacy skills, to work to protect the intellectual privacy of our users, either within the confines of our work, or outside of it. It is possible. We just need to act collectively to ensure that it happens.
We are in a position to empower our library users, to give them the freedom to seek out information without impediment, to think freely, to exchange ideas freely and, ultimately, provide them with the tools to truly and meaningfully engage with the democratic process. Our ethical principles demand this of us, and we should not falter in resisting government policy that undermines these core ethical principles and that threatens the freedom of our users.