
    1 Apr 2017

    Facebook Failed to Protect 30 Million Users From Having Their Data Harvested by Trump Campaign Affiliate

    In 2014, traces of an unusual survey, connected to Facebook, began appearing on internet message boards. The boards were frequented by remote freelance workers who bid on “human intelligence tasks” in an online marketplace called Mechanical Turk, run by Amazon. The “turkers,” as they’re known, tend to perform work that is rote and repetitive, like flagging pornographic images or digging through search engine results for email addresses. Most jobs pay between 1 and 15 cents. “Turking makes us our rent money and helps pay off debt,” one turker told The Intercept. Another turker has called the work “voluntary slave labor.”
    The task posted by “Global Science Research” appeared ordinary, at least on the surface. The company offered turkers $1 or $2 to complete an online survey. But there were a couple of additional requirements. First, Global Science Research was only interested in American turkers. Second, the turkers had to download a Facebook app before they could collect payment. Global Science Research said the app would “download some information about you and your network … basic demographics and likes of categories, places, famous people, etc. from you and your friends.”
    “Our terms of service clearly prohibit misuse,” said a spokesperson for Amazon Web Services, by email. “When we learned of this activity back in 2015, we suspended the requester for violating our terms of service.”
    Although Facebook’s early growth was driven by closed, exclusive networks at colleges and universities, it has gradually herded users to agree to increasingly permissive terms of service. By 2014, anything a user’s friends could see was also potentially visible to the developers of any app that they chose to download. Some of the turkers noticed that the Global Science Research app appeared to be taking advantage of Facebook’s porousness. “Someone can learn everything about you by looking at hundreds of pics, messages, friends, and likes,” warned one, writing on a message board. “More than you realize.” Others were more blasé. “I don’t put any info on FB,” one wrote. “Not even my real name … it’s backwards that people put sooo much info on Facebook, and then complain when their privacy is violated.”
    In late 2015, the turkers began reporting that the Global Science Research survey had abruptly shut down. The Guardian had published a report that exposed exactly who the turkers were working for. Their data was being collected by Aleksandr Kogan, a young lecturer at Cambridge University. Kogan founded Global Science Research in 2014, after the university’s psychology department refused to allow him to use its own pool of data for commercial purposes. The data collection that Kogan undertook independent of the university was done on behalf of a military contractor called Strategic Communication Laboratories, or SCL. The company’s election division claims to use “data-driven messaging” as part of “delivering electoral success.”
    SCL has a growing U.S. spin-off, called Cambridge Analytica, which was paid millions of dollars by Donald Trump’s campaign. Much of the money came from committees funded by the hedge fund billionaire Robert Mercer, who reportedly has a large stake in Cambridge Analytica. For a time, one of Cambridge Analytica’s officers was Stephen K. Bannon, Trump’s senior adviser. Months after Bannon claimed to have severed ties with the company, checks from the Trump campaign for Cambridge Analytica’s services continued to show up at one of Bannon’s addresses in Los Angeles.
    “You can say Mr. Mercer declined to comment,” said Jonathan Gasthalter, a spokesperson for Robert Mercer, by email. 
    The Intercept interviewed five individuals familiar with Kogan’s work for SCL. All declined to be identified, citing concerns about an ongoing inquiry at Cambridge and fears of possible litigation. Two sources familiar with the SCL project told The Intercept that Kogan had arranged for more than 100,000 people to complete the Facebook survey and download an app. A third source with direct knowledge of the project said that Global Science Research obtained data from 185,000 survey participants as well as their Facebook friends. The source said that this group of 185,000 was recruited through a data company, not Mechanical Turk, and that it yielded 30 million usable profiles. No one in this larger group of 30 million knew that “likes” and demographic data from their Facebook profiles were being harvested by political operatives hired to influence American voters.
    Kogan declined to comment. In late 2014, he gave a talk in Singapore in which he claimed to have “a sample of 50+ million individuals about whom we have the capacity to predict virtually any trait.” Global Science Research’s public filings for 2015 show the company holding 145,111 British pounds in its bank account. Kogan has since changed his name to Spectre. Writing online, he has said that he changed his name to Spectre after getting married. “My wife and I are both scientists and quite religious, and light is a strong symbol of both,” he explained.
    The purpose of Kogan’s work was to develop an algorithm for the “national profiling capacity of American citizens” as part of SCL’s work on U.S. elections, according to an internal document signed by an SCL employee describing the research.
    “We do not do any work with Facebook likes,” wrote Lindsey Platts, a spokesperson for Cambridge Analytica, in an email. The company currently “has no relationship with GSR,” Platts said.
    “Cambridge Analytica does not comment on specific clients or projects,” she added when asked whether the company was involved with Global Science Research’s work in 2014 and 2015.
    The Guardian, which was the first to report on Cambridge Analytica’s work on U.S. elections, in late 2015, noted that the company drew on research “spanning tens of millions of Facebook users, harvested largely without their permission.” Kogan disputed this at the time, telling The Guardian that his turker surveys had collected no more than “a couple of thousand responses” for any one client. While it is unclear how many responses Global Science Research obtained through Mechanical Turk and how many it recruited through a data company, all five of the sources interviewed by The Intercept confirmed that Kogan’s work on behalf of SCL involved collecting data from survey participants’ networks of Facebook friends, individuals who had not themselves consented to give their data to Global Science Research and were not aware that they were the objects of Kogan’s study. In September 2016, Alexander Nix, Cambridge Analytica’s CEO, said that the company built a model based on “hundreds and hundreds of thousands of Americans” filling out personality surveys, generating a “model to predict the personality of every single adult in the United States of America.”
    Shortly after The Guardian published its 2015 article, Facebook contacted Global Science Research and requested that it delete the data it had taken from Facebook users. Facebook’s policies give Facebook the right to delete data gathered by any app deemed to be “negatively impacting the Platform.” The company believes that Kogan and SCL complied with the request, which was made during the Republican primary, before Cambridge Analytica switched over from Ted Cruz’s campaign to Donald Trump’s. It remains unclear what was ultimately done with the Facebook data, or whether any models or algorithms derived from it wound up being used by the Trump campaign.
    In public, Facebook continues to maintain that whatever happened during the run-up to the election was business as usual. “Our investigation to date has not uncovered anything that suggests wrongdoing,” a Facebook spokesperson told The Intercept.
    Facebook appears not to have considered Global Science Research’s data collection to have been a serious ethical lapse. Joseph Chancellor, Kogan’s main collaborator on the SCL project and a former co-owner of Global Science Research, is now employed by Facebook Research. “The work that he did previously has no bearing on the work that he does at Facebook,” a Facebook spokesperson told The Intercept.
    Chancellor declined to comment.
    Cambridge Analytica has marketed itself as classifying voters using five personality traits known as OCEAN — Openness, Conscientiousness, Extroversion, Agreeableness, and Neuroticism — the same model used by University of Cambridge researchers for in-house, non-commercial research. The question of whether OCEAN made a difference in the presidential election remains unanswered. Some have argued that big data analytics is a magic bullet for drilling into the psychology of individual voters; others are more skeptical. The predictive power of Facebook likes is not in dispute. A 2013 study by three of Kogan’s former colleagues at the University of Cambridge showed that likes alone could predict race with 95 percent accuracy and political party with 85 percent accuracy. Less clear is their power as a tool for targeted persuasion; Cambridge Analytica has claimed that OCEAN scores can be used to drive voter and consumer behavior through “microtargeting,” meaning narrowly tailored messages. Nix has said that neurotic voters tend to be moved by “rational and fear-based” arguments, while introverted, agreeable voters are more susceptible to “tradition and habits and family and community.”
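    The like-based prediction described in the 2013 Cambridge study can be illustrated with a toy model. The sketch below is purely hypothetical and is not the researchers’ actual method: the page names, labels, and training data are invented, and the per-like log-odds scoring is a simplified naive Bayes-style stand-in for the statistical models such studies actually use.

    ```python
    from collections import defaultdict
    from math import log

    # Hypothetical training data: each user is a set of page likes plus a
    # known binary label (standing in for a survey response, e.g. party).
    TRAIN = [
        ({"nascar", "country_music"}, "A"),
        ({"nascar", "hunting"}, "A"),
        ({"country_music", "hunting"}, "A"),
        ({"npr", "indie_films"}, "B"),
        ({"npr", "yoga"}, "B"),
        ({"indie_films", "yoga"}, "B"),
    ]

    def train_like_model(rows, smoothing=1.0):
        """Compute one log-odds weight per like, with Laplace smoothing."""
        counts = {"A": defaultdict(float), "B": defaultdict(float)}
        totals = {"A": 0, "B": 0}
        likes = set()
        for row_likes, label in rows:
            totals[label] += 1
            likes.update(row_likes)
            for lk in row_likes:
                counts[label][lk] += 1
        weights = {}
        for lk in likes:
            p_a = (counts["A"][lk] + smoothing) / (totals["A"] + 2 * smoothing)
            p_b = (counts["B"][lk] + smoothing) / (totals["B"] + 2 * smoothing)
            weights[lk] = log(p_a / p_b)
        return weights

    def predict(weights, user_likes):
        """Sum the weights of a user's likes; positive -> 'A', else 'B'."""
        score = sum(weights.get(lk, 0.0) for lk in user_likes)
        return "A" if score > 0 else "B"

    weights = train_like_model(TRAIN)
    print(predict(weights, {"nascar", "hunting"}))  # likes shared with group A
    print(predict(weights, {"npr", "yoga"}))        # likes shared with group B
    ```

    The point of the toy is only that likes are informative signals: once enough labeled users exist, unlabeled users can be scored from their likes alone, which is why harvesting friends’ like data was valuable even though those friends never answered a survey.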