The moral case for stealing data
This is a write-up of a talk at Gresham College. Opinions are those of the speaker and author and do not represent positions of the UK Cyber Security Council.
Morality is a complex subject; we each likely view our version of morality as “correct” and consider any variations from it to be sliding towards “incorrect”. Morality is subjective and incredibly divisive, so much so that there are whole areas of philosophy dedicated to it.
Most of us probably consider the theft of data to be wrong. After all, we are all here striving to be cybersecurity professionals, and therefore perhaps find the idea of a moral case for data theft contradictory, perhaps even alarming.
The excellent lecture presented by Professor Victoria Baines tackles this in a very engaging, thought-provoking way, something that shouldn’t really come as a surprise when you consider the speaker’s history. You can read Professor Baines’ biography and achievements in a number of places online (including on the Gresham College pages1) but, in brief: Facebook’s Trust and Safety manager for Europe, the Middle East, and Africa; leader of the Strategy Team at Europol’s European Cybercrime Centre; Principal Analyst at the UK Serious Organised Crime Agency; graduate of Trinity College, Oxford; the list goes on.
With whistleblowers frequently in the news, this is a good time to cover the topic of data theft and morality. Whistleblowing is, in essence, taking information that is not publicly known or available and bringing it into the light of the public domain out of a sense of professional morality. The UK has law governing whistleblowing2, and that law clearly states that a disclosure “…must be in the public interest.” Professor Baines discusses whistleblowing, and the language used around it by both sides of the act itself, but she also examines examples of data theft by threat actors who claim to be acting out of moral outrage rather than for profit, and invites us to consider the wider ramifications of whether or not they succeed.
The lecture also discusses the history of data theft, and perhaps the earliest recorded example of unauthorised access to data, all the way back in 1194 when, at the Battle of Fréteval, Richard I captured the archives of Philip II. These allegedly detailed, among other things, the names of deserters to the French as well as the names and details of Philip’s spies and agents.
Since I’m a bit of a history nerd I found this fascinating, along with the follow-up on the medieval copyists, described by Professor Baines as the ideological ancestors of modern hackers because they treated text-based knowledge as information that should be available to all, openly distributed and freely recycled. Learning from history may seem a bit redundant when considering cyber security, which is not an ancient discipline, but the old cliché that ‘those who do not learn from history are doomed to repeat it’ is a cliché for a reason; it holds some truth. Clifford Stoll’s book, The Cuckoo’s Egg3, whilst hugely outdated in terms of technology, remains relevant to this day as it deals with the theft of data from computer systems, how it happened and how it was resolved. There’s a reason it is in the Ohio State University Cybersecurity Canon4 (a fantastic resource that any budding cybersecurity professional should check out): despite being decades old, its themes and subject matter are still relevant today. Ignore history at your peril!
Following the brief, interesting history lesson the lecture focused on whistleblowing and hacktivism, two subjects that are ever relevant in the landscape of today.
The main topics on whistleblowing were The Pentagon Papers and the fates of Chelsea Manning and Edward Snowden.
The Pentagon Papers were published by the New York Times and the Washington Post in 1971, having been obtained from Daniel Ellsberg, an employee of the RAND Corporation who had access to a top-secret Pentagon study of US decision making in relation to the war in Vietnam. The papers revealed to the public details of US involvement in the conflict, including the fact that its geographic scope was wider than reported and that the US government knew early in the conflict that it could not win. Ellsberg photocopied this data to prove what he was alleging and had therefore stolen top-secret documentation, a crime that could, in theory, be labelled as treason (and was in fact labelled as such by President Nixon in a recorded phone call with Henry Kissinger when the leak happened).
The question here becomes: as a whistleblower who believes what they are doing is morally right, are you prepared to follow through with your actions and risk being prosecuted under law, in this case for treason and espionage? Ellsberg was charged under the 1917 Espionage Act, but the charges were dropped and a mistrial declared due to procedural irregularities. It is important to note that he escaped prosecution not because what he did was protected under law, but because of irregularities in the trial process and evidence gathering. He may have acted on his moral conscience, but it potentially came with a hefty price tag.
Chelsea Manning was sentenced to 35 years in prison for leaking hundreds of thousands of classified or sensitive documents to WikiLeaks, including video footage of a US military helicopter killing civilians in Iraq. She served 7 of those years before President Obama commuted her sentence.
Edward Snowden faces up to 30 years in prison should he ever return to the US for his role in leaking classified information to the press that he had access to as an NSA contractor, including details of the PRISM programme of mass surveillance (@War by Shane Harris5 is worth a read if you are interested in how the NSA started out with signals intelligence in Iraq and helped give rise to the Military-Internet complex).
Something else the lecture raised was the impact of leaks. In the case of the Snowden leaks, secret information was made public that could benefit adversaries, presenting a moral dilemma as you may be harming active, serving military personnel. A US House of Representatives report concluded that, in collecting and stealing the files when he left the country, Snowden infringed on the privacy of thousands of government employees and contractors, a data breach by any other name. He is also accused of using his administrator access to search his coworkers’ personal drives and of obtaining his coworkers’ credentials by misleading them, in both cases obtaining data by unauthorised means.
Whistleblowing safeguards are in place throughout public and private institutions in the present day, but they don’t necessarily protect a whistleblower from retaliation, as shown by a Barclays chief executive being fined for attempting to identify an anonymous whistleblower. The knock-on effect was on internal whistleblowing itself, with the number of investigations opened falling from 364 in 2018 to just 67 by 2023.
Hacktivism was also covered in the lecture, from the likes of Anonymous shutting down dark web child abuse websites to The Impact Team, the group that exposed the user database of the Ashley Madison affairs dating site. It is argued that hacktivists use similar tactics to whistleblowers, though they usually involve committing cybercrime for their own ideological ends. These people and groups operate outside any established framework for whistleblowing, usually using illegal methods to expose information by breaking in from the outside. You may argue that you lose the moral high ground if you commit illegal acts to obtain and expose information you had no right to be privy to in the first place. In most cases of hacktivism discussed in the lecture the parties involved felt a moral imperative to release the information they had obtained, but if you release everything, unfiltered, into the public domain you risk inviting unintended consequences. When a whistleblower releases information through the press it is often reworded, made easier to read, and may have personally identifiable information removed where appropriate to prevent the very real threat of doxxing. If you publish a database in full online, as in the case of the Ashley Madison hack, you may well be putting the names and addresses of service users out in the open for all to see, inadvertently putting people in very real, physical danger. Though the threat actors may have moral reasons for undertaking these attacks and publishing data, they are not protected by the law in the way that a whistleblower may be.
The lecture also discusses ethical hacking and bug bounties, their place in cybersecurity, and the dangers that come with inviting hackers to attack your service or website for potential reward. There have been examples of staff working for coordinated bug bounty platforms stealing reports and selling them on as their own work. There is also the issue of whether a notified company actually remediates the disclosed vulnerabilities; as an auditor I am unfortunately familiar with risks being highlighted and mitigating actions being agreed, only to find on follow-up some time later that nothing has been done to remedy the issues.
There’s also the issue of being in control of your data in the first place and not exposing it to people who should not have access to it, particularly if that data, or the way it is being handled, is in the public interest to reveal. Jeffrey Goldberg, the editor-in-chief of The Atlantic, being inadvertently invited to a private Signal chat group used by high-ranking US government officials is a prime example, and one that makes me groan as an auditor, particularly when a member of the chat states “…I will do all we can to enforce 100% OPSEC.” without realising that a journalist has been invited to the chat in error, something that would likely be impossible on authorised communications channels.
The whole lecture was a great watch, and the questions helped to round out the discussion, particularly in terms of the moral quandaries that may result from whistleblowing: can we draw parallels between whistleblowing and civil disobedience in terms of exhausting all other avenues first, and does coming across evidence of war crimes render null and void any commitment to secrecy you have made to your government?
As someone on the periphery of cybersecurity looking in, someone looking to make a career change in the future, is this worth watching? Yes. Cyber security is an inherently human issue and humans are complex; all the “blinky light security”6 in the world cannot protect data if someone is using admin/Password123 as credentials. We each have a moral compass, and it invariably points in a slightly different direction to everyone else’s. That morality, our belief in what is ethical and what is not, can take a perceived black and white issue such as data theft and turn it all very grey. It’s not just about hackers breaking in and stealing your information; it’s also about doing what you can to ensure your organisation is doing the right things, that it is behaving ethically, and about ensuring that where frameworks or procedures are in place they are fit for purpose and actually followed. If you want to delve further into the morality of data theft, I’d recommend getting a copy of Cult of the Dead Cow by Joseph Menn7, which covers similar themes from within the hacking group and where its members find themselves differing on the morality of hacking and security.
Cyber security, I’ve learned, is a vast discipline; it encompasses far more than I had considered when I first started looking into it, and insightful lectures like this add depth and dimension to a part of it that I had perhaps previously considered more black and white than grey. Learning about different areas of the discipline has made it easier for me to narrow down where I want to go, moving on from the initial mindset of “I want to work in cyber security” to the more practical “I want to work in this specific area of cyber security”.
About the Author
Gareth Lawrence is an auditor in the public sector and a member of the Cyber Access Network (CAN). He carries out risk-based audits across business areas with a particular focus on IT and digital strategy, and is passionately upskilling into cyber security.
Notes
1 - https://www.gresham.ac.uk/speakers/professor-victoria-baines
2 - https://www.gov.uk/whistleblowing
3 - https://www.goodreads.com/book/show/18154.The_Cuckoo_s_Egg
4 - https://icdt.osu.edu/cybercanon/bookreviews
5 - https://www.goodreads.com/book/show/20448184-war
6 - A phrase used frequently by Kip Boyle and Jake Bernstein in the Cyber Risk Management Podcast when referring to new technical controls - https://cr-map.com/podcast/
7 - https://www.goodreads.com/en/book/show/42283862-cult-of-the-dead-cow