Will Cambridge Analytica Hurt Legitimate Research?

March 26, 2018

The scandal that has erupted around Cambridge Analytica’s alleged harvesting of 50m Facebook profiles, assembled from data provided by a UK-based academic and his company, is a worrying development for legitimate researchers.

Political data analytics company Cambridge Analytica – which is affiliated with Strategic Communication Laboratories (SCL) – reportedly used Facebook data, after it was handed over by Aleksandr Kogan, a lecturer at the University of Cambridge’s department of psychology.

This article by Annabel Latham originally appeared at The Conversation, a Social Science Space partner site, under the title “Cambridge Analytica scandal: legitimate researchers using Facebook data could be collateral damage”

Kogan, through his company Global Science Research (GSR) – separate from his university work – gleaned the data from a personality test app named “thisisyourdigitallife”. Roughly 270,000 US-based Facebook users voluntarily responded to the test in 2014. But the app also collected data on those participants’ Facebook friends without their consent.

This was possible due to Facebook rules at the time that allowed third-party apps to collect data about a Facebook user’s friends. The Mark Zuckerberg-run company has since changed its policy to deny developers such access.
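To make that mechanism concrete, here is a minimal, illustrative sketch of the kind of friend-data access the old platform rules permitted. The endpoint paths, the `friends_likes` permission and the token are simplified reconstructions of the long-retired Graph API v1.0, not a working recipe – v2.0 (released in 2014) and later versions removed this access.

```python
# Illustrative sketch only: Graph API v1.0 was retired in 2015, so none of
# this works today. It shows the shape of the access described above: a
# token granted by ONE consenting app user could enumerate that user's
# friends and, with extended permissions, read data about them too.
import requests

GRAPH = "https://graph.facebook.com/v1.0"
USER_TOKEN = "EAAB..."  # hypothetical token from a single consenting user

# v1.0 returned the user's full friend list; from v2.0 onwards the same
# endpoint returns only friends who also installed the app.
resp = requests.get(f"{GRAPH}/me/friends", params={"access_token": USER_TOKEN})
friends = resp.json().get("data", [])

# With a permission such as friends_likes, the friends' page likes were
# readable even though those friends never used (or heard of) the app.
for friend in friends:
    likes = requests.get(f"{GRAPH}/{friend['id']}/likes",
                         params={"access_token": USER_TOKEN}).json()
    print(friend.get("name"), len(likes.get("data", [])))
```

The asymmetry is the crux of the story: one user’s voluntary consent unlocked data about hundreds of friends who were never asked.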

Whistleblower Christopher Wylie, who previously worked as a contractor at Cambridge Analytica, told the Guardian that the company used the data to target American voters ahead of President Donald Trump’s victory in 2016. He claimed that Cambridge Analytica was a “full-service propaganda machine.”

Cambridge Analytica has denied any wrongdoing and said that the business tactics it used are widespread among other firms. For his part, Kogan insists that what he did was at all times compliant with the law – and also says, according to CNN, that he would be happy to testify before US Congress and talk to the FBI about the work he did for the company.

Facebook said on March 18 that it had suspended SCL, alleging that Kogan had “lied to us and violated our platform policies by passing data from an app that was using Facebook login to SCL/Cambridge Analytica.” Facebook states under part three of its platform policy that developers do not have permission to “transfer any data that you receive from us (including anonymous, aggregate, or derived data) to any ad network, data broker or other advertising or monetisation-related service.”

In a statement to Cambridge News, the University of Cambridge said:

We are aware that Dr Kogan established his own company, Global Science Research (GSR), of which SCL/Cambridge Analytica was a client. It is not uncommon for Cambridge academics to have business interests, but they must satisfy the university that these are held in a personal capacity and that there are no conflicts of interest.

It is our understanding that the thisisyourdigitallife app was created by GSR. Based on assurances from Dr Kogan as well as the evidence available to us, we have no reason to believe he used university data or facilities for his work with GSR, and therefore that there is no reason to believe the university’s data and facilities were used as the basis for GSR’s subsequent work with any other party.

A day after the Cambridge Analytica scandal hit, Facebook’s shares plummeted on Wall Street amid the privacy backlash. But could the incident affect legitimate academic research?

Implications

Social media data is a rich source of information for many areas of research in psychology, technology, business and the humanities. Recent examples include using Facebook to predict riots, comparing Facebook use with body image concerns in adolescent girls, and investigating whether Facebook can lower stress responses, with research suggesting that it may both enhance and undermine psycho-social constructs related to well-being.

It is right to believe that researchers and their employers value research integrity. But instances where an academic betrays that trust – even if data used for university research purposes was not caught in the crossfire – will damage participants’ willingness to keep trusting researchers. It also has implications for research governance and for companies’ willingness to share data with researchers in the first place.

Universities, research organisations and funders govern the integrity of research with clear and strict ethics procedures designed to protect participants in studies, such as where social media data is used. The harvesting of data without permission from users is considered an unethical activity under commonly understood research standards.

The fallout from the Cambridge Analytica controversy is potentially huge for researchers who rely on social networks for their studies, where data is routinely shared with them for research purposes. Tech companies could become more reluctant to share data with researchers. Facebook is already extremely protective of its data – the worry is that it could become doubly difficult for researchers to legitimately access this information in light of what has happened with Cambridge Analytica.

Data analytics

Clearly, it’s not just researchers who use profile data to better understand people’s behavioural patterns. Marketing organisations have been profiling consumers for decades – if they know their customers, they understand the triggers that prompt a purchase, and can adjust marketing messages to improve sales. Digital marketing has made this easier: people are constantly tracked online, their activities are analysed with data analytics tools, and personal recommendations are made. Such methods are core to the business strategies of tech giants such as Amazon and Netflix.

Information from online behaviour can be used to predict people’s mood, emotions and personality. My own research into Intelligent Tutoring Systems uses learner interactions with software to profile personality type, so that the system can automatically adapt tutoring to someone’s preferred style. Machine learning techniques can combine theories from psychology with newly found patterns – such as Facebook “likes” – to profile users.
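As a hedged illustration of that last point, the toy sketch below – entirely synthetic data, not a reconstruction of Kogan’s or anyone else’s actual model – trains a linear classifier to predict a self-reported trait from a binary matrix of page “likes”:

```python
# Synthetic, minimal sketch of likes-based profiling (in the spirit of
# published psychometrics studies, NOT any real deployed model).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_users, n_pages = 1000, 200

# X[i, j] = 1 if user i "liked" page j; y = a binary self-reported trait
# (say, extraversion) that we pretend correlates with liking behaviour.
X = rng.integers(0, 2, size=(n_users, n_pages))
w_hidden = rng.normal(size=n_pages)               # hidden like->trait signal
y = (X @ w_hidden + rng.normal(scale=2.0, size=n_users) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```

The design point is that even a crude linear model picks up signal once enough likes are observed per user – which is precisely what makes such data attractive for both research and commercial targeting.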

Eli Pariser, CEO of viral content website Upworthy, has argued against personalisation tools since 2011. He has warned of the dangers of information filtering, and believes that using algorithms to profile people and show them information tailored to their personal tastes is bad for democracy.

While these fears appear to be borne out by some allegations levelled against Cambridge Analytica, it’s worth noting that there has been no evidence to show that US votes were swung in favour of Trump due to Cambridge Analytica’s psychometric tool.

However, given his academic status, Kogan’s apparent decision to transfer Facebook data for commercial ends in violation of the social network’s policies could yet have explosive consequences, not least because researchers might find it more difficult to get Facebook – and its users – to agree to hand over the data for research alone.


Annabel Latham is a senior lecturer in the School of Computing, Mathematics and Digital Technology at Manchester Metropolitan University. She is a fellow of the HEA and a member of the BCS, the IEEE, the IEEE Women in Engineering Group and the IEEE Computational Intelligence Society.
