Lessons from the Cambridge Analytica Files: Don’t Be Evil

King's College, Cambridge. Photo via Charles Rolfe.

By Zoe Marks and Erica Chenoweth for Denver Dialogues

Over the weekend, The Guardian broke a series of stories about the misuse of Facebook data by Cambridge Analytica, a data-mining firm paid by its clients to influence elections, markets, and more. What struck us about this story is that a seemingly common set of professional opportunities enabled an enterprising academic to sell a tool of political manipulation that may have changed the course of history. Dr. Aleksandr Kogan – an American citizen working at the University of Cambridge with a fellowship at the University of St. Petersburg – departed from a collaborative research team that had developed the software behind Cambridge Analytica’s data-mining enterprise. Kogan, who had at one time apparently changed his name to Dr. Spectre, then launched a business as a solo consultant, replicated his collaborators’ method, and sold the product to Cambridge Analytica for a million dollars. You can read more about this episode here and here.

Kogan’s case is exceptional not only because of the scale of the potential fallout from his actions, but also because of his apparent disregard for basic tenets of professional ethics and collegiality. Accordingly, this unfolding story raises urgent questions and emergent lessons for those interested in engaged scholarship.

What is research and who is it for? Scientists at Cambridge University interested in social media and personality types created an app to download tens of thousands of Facebook profiles for research. Kogan was on the team and sold the research methodology to Cambridge Analytica, a company built for the purpose of buying and redeploying this data. As engaged scholars, we find it hard to fathom selling a methodology designed to plunder people’s private information. However, this case highlights a few urgent tensions in the current research system. Professional demands for research “impact” create expectations to apply our work to policy and practice. Scholars also often face salary or institutional pressures to undertake consultancy work and short-term projects that further motivate finding applications for research. Finally, universities themselves increasingly value such public-private partnerships because of initiatives like the government-led Research Excellence Framework in the UK or Centers of Excellence in the US. As our partnerships, research teams, and networks proliferate, it becomes harder to control where research “products” go – and harder to predict their mercurial market value. There may be few Aleksandr Kogans in the world, but there are many scholars undertaking private sector and governmental partnerships that lead to downstream research applications with which they may be uncomfortable or of which they may be (blissfully) unaware. Addressing these tensions requires an institutional commitment to protect the integrity of research as a craft, so that scholars are not forced into unsavory partnerships for either professional success or economic survival. It should also embolden us as scientists to ask hard questions of our clients and policy partners about how they intend to guarantee the ethical application of our research methods and findings.

To whom should we disclose our work and partnerships? Kogan sold Facebook data-mining technology without consent from the colleagues who built it. His departmental colleagues were unaware of his affiliation with the University of St. Petersburg and his consulting arrangements with Cambridge Analytica, neither of which appears on his professional CV. To us, this raises obvious questions about transparency, not just to the public or to one’s superiors, but also to colleagues and collaborators. Collegial relationships are fundamentally built on intellectual openness and trust. We assume that we are engaged in a common academic purpose and that our colleagues and collaborators will disclose any outside financial or institutional interests that might compromise the integrity of the project or rightful ownership of our ideas. This goes far beyond questions of intellectual property and highlights the urgent need for a two-fold culture shift. First, in our research projects, we need to set clear intentions about the purpose and applications of research from the outset. A useful visioning exercise for collaborative research teams would be for every research partner to write down a personal ethical statement and then share it with the team before drafting a set of collective commitments. This is something we intend to apply with immediate effect in our own collaborative work. Second, although scholars engage in more and more collaborative research, they tend to talk to their own colleagues less and less. In our institutions and departments, we need to overcome the increasing atomization and isolation of research by starting regular conversations about ideas and ethics. Doing so will create dialogue in conferences and in office corridors about reciprocal accountability, transparency, and mutual ethical investment in the collective enterprise of social science research.

If your research invents a genie, how do you ensure someone else won’t open the bottle? Kogan’s initial collaborators, Michal Kosinski and David Stillwell, created the original personality tests and Facebook apps whose techniques Cambridge Analytica eventually deployed to micro- and nano-target political messages on behalf of clients, including Steve Bannon and numerous nameless shell companies. Their data-mining and profiling techniques are being described as a “Frankenstein’s monster” that is “not under any human’s control”. This points to a terrifying reality: even with transparent partnerships and scrupulous collaborators, research methods and data can be appropriated for nefarious ends, and in ways we cannot imagine. As engaged scholars, we need to be proactive in securing our data, but also creative in improving systems for accountability. Just as liability standards cannot guarantee ethical research practice, we cannot rely on domestic or international legal regulations to ensure compliance with professional ethical norms. And without conversations and community transparency about whom we do research for and why, it is almost impossible to imagine ways we could become meaningfully accountable to the common good. Now is the time to think about more dynamic and supportive ethics review processes nimble enough to respond to problems as they emerge, rather than abdicating responsibility for unintended or unanticipated consequences.

Certainly this developing case raises numerous moral and political questions and concerns that extend far beyond the scope of this post. But the main takeaway can be summed up with Google’s famous motto: “Don’t be evil.”

Zoe Marks is a Chancellor’s Fellow and Lecturer at the University of Edinburgh, where she is director of the MSc in African Studies and the Global Development Academy. Her research focuses on peace and conflict, gender, and inequality in Africa. Erica Chenoweth is a Lead Editor and regular contributor at PV@Glance. 
