The big news over the weekend was how Facebook, Trump and Cambridge Analytica worked together to weaponize people’s personal information against them to help Trump win the 2016 election, perhaps with the assistance of Russia. The truth is that this harvesting and manipulation of data is Facebook’s business model, and anyone who uses Facebook is participating. Facebook is “free”; how exactly do you think they make their billions?
American and British lawmakers demanded on Sunday that Facebook explain how a political data firm with links to President Trump’s 2016 campaign was able to harvest private information from more than 50 million Facebook profiles without the social network’s alerting users. The backlash forced Facebook to once again defend the way it protects user data.
Senator Amy Klobuchar of Minnesota, a Democratic member of the Senate Judiciary Committee, went so far as to press for Mark Zuckerberg, Facebook’s chief executive, to appear before the panel to explain what the social network knew about the misuse of its data “to target political advertising and manipulate voters.”
The calls for greater scrutiny followed reports on Saturday in The New York Times and The Observer of London that Cambridge Analytica, a political data firm founded by Stephen K. Bannon and Robert Mercer, the wealthy Republican donor, had used the Facebook data to develop methods that it claimed could identify the personalities of individual American voters and influence their behavior. The firm’s so-called psychographic modeling underpinned its work for the Trump campaign in 2016, though many have questioned the effectiveness of its techniques.
But Facebook did not inform users whose data had been harvested. The lack of disclosure could violate laws in Britain and in many American states.
(click here to continue reading Facebook’s Role in Data Misuse Sets Off a Storm on Two Continents – The New York Times.)
If you have time, you should read the tale of ex-Cambridge Analytica whistleblower Christopher Wylie in The Guardian/Observer, which includes this revelation:
Dr Kogan – who later changed his name to Dr Spectre, but has subsequently changed it back to Dr Kogan – is still a faculty member at Cambridge University, a senior research associate. But what his fellow academics didn’t know until Kogan revealed it in emails to the Observer (although Cambridge University says that Kogan told the head of the psychology department), is that he is also an associate professor at St Petersburg University. Further research revealed that he’s received grants from the Russian government to research “Stress, health and psychological wellbeing in social networks”. The opportunity came about on a trip to the city to visit friends and family, he said.
There are other dramatic documents in Wylie’s stash, including a pitch made by Cambridge Analytica to Lukoil, Russia’s second biggest oil producer. In an email dated 17 July 2014, about the US presidential primaries, Nix wrote to Wylie: “We have been asked to write a memo to Lukoil (the Russian oil and gas company) to explain to them how our services are going to apply to the petroleum business.” Nix said that “they understand behavioural microtargeting in the context of elections” but that they were “failing to make the connection between voters and their consumers”. The work, he said, would be “shared with the CEO of the business”, a former Soviet oil minister and associate of Putin, Vagit Alekperov.
“It didn’t make any sense to me,” says Wylie. “I didn’t understand either the email or the pitch presentation we did. Why would a Russian oil company want to target information on American voters?”
Lukoil is a private company, but its CEO, Alekperov, answers to Putin, and it’s been used as a vehicle of Russian influence in Europe and elsewhere – including in the Czech Republic, where in 2016 it was revealed that an adviser to the strongly pro-Russian Czech president was being paid by the company.
When I asked Bill Browder – an Anglo-American businessman who is leading a global campaign for a Magnitsky Act to enforce sanctions against Russian individuals – what he made of it, he said: “Everyone in Russia is subordinate to Putin. One should be highly suspicious of any Russian company pitching anything outside its normal business activities.”
(click here to continue reading ‘I made Steve Bannon’s psychological warfare tool’: meet the data war whistleblower | News | The Guardian.)
The attention led to Facebook suspending Mr. Wylie’s Facebook and Instagram accounts…
In the latest turn of the developing scandal around how Facebook’s user data wound up in the hands of Cambridge Analytica — for use in the development of psychographic profiles that may or may not have played a part in the election victory of Donald Trump — the company has taken the unusual step of suspending the account of the whistleblower who helped expose the issues.
(click here to continue reading Facebook has suspended the account of the whistleblower who exposed Cambridge Analytica | TechCrunch.)
Alexis Madrigal of The Atlantic writes:
Academic researchers began publishing warnings that third-party Facebook apps represented a major possible source of privacy leakage in the early 2010s. Some noted that the privacy risks inherent in sharing data with apps were not at all clear to users. One group termed our new reality “interdependent privacy,” because your Facebook friends, in part, determine your own level of privacy.
For as long as apps have existed, they have asked for a lot of data and people have been prone to give it to them. Back in 2010, Penn State researchers systematically recorded what data the top 1,800 apps on Facebook were asking for. They presented their results in 2011 with the paper “Third-Party Apps on Facebook: Privacy and the Illusion of Control.” Of the apps surveyed, 148 were asking for permission to access friends’ information.
But The Guardian’s reporting suggests that the company’s efforts to restuff Pandora’s box have been lax. Wylie, the whistleblower, received a letter from Facebook asking him to delete any Facebook data nearly two years after the existence of the data was first reported. “That to me was the most astonishing thing,” Wylie told The Guardian. “They waited two years and did absolutely nothing to check that the data was deleted. All they asked me to do was tick a box on a form and post it back.”
But even if Facebook were maximally aggressive about policing this kind of situation, what’s done is done. It’s not just that the data escaped, but that Cambridge Analytica almost certainly learned everything they could from it. As stated in The Guardian, the contract between GSR and Strategic Communications Laboratories states, specifically, “The ultimate product of the training set is creating a ‘gold standard’ of understanding personality from Facebook profile information.”
It’s important to dwell on this. It’s not that this research was supposed to identify every U.S. voter just from this data, but rather to develop a method for sorting people based on Facebook’s profiles. Wylie believes that the data was crucial in building Cambridge Analytica’s models. It certainly seems possible that once the “training set” had been used to learn how to psychologically profile people, this specific data itself was no longer necessary. But the truth is that no one knows if the Kogan data had much use out in the real world of political campaigning. Psychological profiling sounds nefarious, but the way that Kogan and Cambridge Analytica first attempted to do it may well have proven, as the company maintains, “fruitless.”
(click here to continue reading Cambridge Analytica and the Dangers of Facebook Data-Harvesting – The Atlantic.)
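The “interdependent privacy” mechanism Madrigal describes — one person installing an app exposes all of their friends’ profiles too — is easy to see in a toy model. This sketch is purely illustrative; the names and friend graph are made up, and real Graph API permission behavior of the era was more involved:

```python
# Toy model of "interdependent privacy": an app granted the
# friends-data permission can read not just its installers' profiles,
# but every profile in their friend lists as well.
friend_graph = {
    "alice": ["bob", "carol", "dan"],
    "bob": ["alice", "erin"],
}

def profiles_exposed(app_installers, friend_graph):
    """Return the set of profiles readable by the app: the installers
    themselves plus all of their friends."""
    exposed = set(app_installers)
    for user in app_installers:
        exposed.update(friend_graph.get(user, []))
    return exposed

# Only alice installed the app, yet four profiles become readable.
print(sorted(profiles_exposed(["alice"], friend_graph)))
```

This is how roughly 270,000 quiz-takers could yield data on more than 50 million profiles: each installer’s consent silently extended to everyone in their network.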
The way I personally deal with Facebook is by seeding it with incorrect information whenever I can, and by being diligent about deleting Facebook cookies from my browsers. Of course, I’m sure they know way too much about me, but at least some of their information is wrong.
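For the cookie-deletion part, here is a minimal sketch using Python’s sqlite3, since Firefox keeps cookies in a SQLite file named cookies.sqlite (table `moz_cookies`, with a `host` column). To stay runnable, the sketch builds a throwaway database with that schema rather than touching a real profile; against a real profile you would point `db_path` at the file in your Firefox profile directory, with the browser closed. Chrome uses a different, partly encrypted store, so this approach does not carry over directly:

```python
import os
import sqlite3
import tempfile

# Build a throwaway stand-in for Firefox's cookies.sqlite so the
# sketch runs anywhere. (Real schema has more columns; host is the
# one that matters here.)
db_path = os.path.join(tempfile.mkdtemp(), "cookies.sqlite")
conn = sqlite3.connect(db_path)
conn.execute("CREATE TABLE moz_cookies (name TEXT, value TEXT, host TEXT)")
conn.executemany(
    "INSERT INTO moz_cookies VALUES (?, ?, ?)",
    [
        ("c_user", "12345", ".facebook.com"),
        ("datr", "abcde", ".facebook.com"),
        ("session", "xyz", "example.org"),
    ],
)

# Delete every cookie set by a facebook.com host; other sites' cookies
# are left alone.
deleted = conn.execute(
    "DELETE FROM moz_cookies WHERE host LIKE '%facebook.com'"
).rowcount
conn.commit()

remaining = [row[0] for row in conn.execute("SELECT host FROM moz_cookies")]
print(f"deleted {deleted} cookies; remaining hosts: {remaining}")
```

Browser extensions and the browsers’ own site-data settings do the same job with less risk; the script just makes the mechanics concrete.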