Facebook’s business is all about tempting ordinary people to share as much information about themselves as they possibly can, then selling that data to various companies. I personally haven’t used my Facebook account much since 2013, but many millions of people worldwide now rely on Facebook to socialize with people they don’t often see in person. Some will even use Facebook Messenger on their phones to message someone who’s in the same house.
I don’t know who said it first, but it’s absolutely correct: if you aren’t paying money for a commercial service, you are the product, not the customer. A rich kid I knew several years ago had his Hotmail account taken over by malicious bots, and he asked me for help. He was shocked to discover that there was no phone number he could call to demand that Microsoft fix his Hotmail problem. He was used to getting any business to bend over backwards to help him, so he couldn’t understand why his family’s money didn’t matter in this situation. He had never paid a cent for his Hotmail webmail; he was the product, not the customer.
One of the many types of companies that are customers of services like Facebook is the kind that does market research for political campaigns. Cambridge Analytica is one of those companies, and it turns out that what they’ve been doing with Facebook data is unethical, to say the least.
Cambridge Analytica
Cambridge Analytica is a British political consulting firm which primarily works for right-wing politicians and causes. The company is partly owned by the family of American billionaire hedge fund manager Robert Mercer, and I suspect that the firm exists more as a way for Mercer to exert political influence than as a way to make more money. They seem to have had some success: they worked not only on the pro-Brexit campaign but also on Donald Trump’s 2016 campaign on the other side of the pond.
But it appears those successes were the product of some questionable means. Even before the recent Cambridge Analytica and Facebook news broke, the firm’s practices had raised some eyebrows.
From The New York Times in March 2017:
“Cambridge Analytica’s rise has rattled some of President Trump’s critics and privacy advocates, who warn of a blizzard of high-tech, Facebook-optimized propaganda aimed at the American public, controlled by the people behind the alt-right hub Breitbart News. Cambridge Analytica is principally owned by the billionaire Robert Mercer, a Trump backer and investor in Breitbart. Stephen K. Bannon, the former Breitbart chairman who is Mr. Trump’s senior White House counselor, served until last summer as vice president of Cambridge’s board...
In recent months, the value of Cambridge’s technology has been debated by technology experts and in some media accounts. But Cambridge Analytica officials, in recent interviews, defended the company’s record during the 2016 election, saying its data analysis helped Mr. Trump energize critical support in the Rust Belt. (Cambridge Analytica CEO) Mr. Nix said the firm had conducted tens of thousands of polls for Mr. Trump, helping guide his message and identify issues that mattered to voters.
But when asked to name a single race where the firm’s flagship product had been critical to victory, Mr. Nix declined.”
You don’t get to 50 million data breach victims without making a few enemies
Here’s the issue and why you should care about it. On Saturday, March 17th, it was reported that a whistleblower had exposed how Cambridge Analytica had been mining Facebook data unethically. It’s quite likely that you are one of the victims, especially if you identify as American on Facebook. From The Guardian:
“A whistleblower has revealed to the Observer how Cambridge Analytica – a company owned by the hedge fund billionaire Robert Mercer, and headed at the time by Trump’s key adviser Steve Bannon – used personal information taken without authorisation in early 2014 to build a system that could profile individual US voters, in order to target them with personalised political advertisements.
Christopher Wylie, who worked with a Cambridge University academic to obtain the data, told the Observer: ‘We exploited Facebook to harvest millions of people’s profiles. And built models to exploit what we knew about them and target their inner demons. That was the basis the entire company was built on.’
Documents seen by the Observer, and confirmed by a Facebook statement, show that by late 2015 the company had found out that information had been harvested on an unprecedented scale. However, at the time it failed to alert users and took only limited steps to recover and secure the private information of more than 50 million individuals.”
Cambridge Analytica has used Facebook data to tailor political advertisements to individuals. For example, if a Facebook user seems very fearful of becoming a victim of violence, CA would market anti-gun-control positions as making it easier for that user to buy a gun for self-defence, whereas if a user seems to be more of a stickler for the American Constitution, CA would market the same positions as being consistent with the Second Amendment. The AI used to mine data on specific users does its best to figure out what a target’s personality, worldview, and lifestyle are like.
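To make that concrete, here is a minimal, hypothetical sketch of the kind of rule a psychographic targeting system might apply. The trait names, thresholds, and ad copy below are invented for illustration; they are not Cambridge Analytica’s actual model or code.

```python
# A minimal, hypothetical sketch of psychographic ad targeting.
# Trait names, thresholds, and ad copy are invented for illustration;
# this is not Cambridge Analytica's actual model or code.

def pick_gun_rights_framing(profile: dict) -> str:
    """Choose an ad framing from crude personality/worldview scores (0 to 1)."""
    if profile.get("fear_of_violence", 0.0) > 0.7:
        return "Self-defence framing: protect your family, oppose new gun restrictions."
    if profile.get("constitutional_originalism", 0.0) > 0.7:
        return "Constitutional framing: defend the Second Amendment."
    return "Generic framing: know where the candidates stand on gun rights."


if __name__ == "__main__":
    voter = {"fear_of_violence": 0.85, "constitutional_originalism": 0.40}
    print(pick_gun_rights_framing(voter))  # prints the self-defence framing
```

Real systems reportedly score people on far more dimensions, but the principle is the same: the profile decides which framing of the same position you get shown.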
Here’s what CA has done
“The data was collected through an app called thisisyourdigitallife, built by academic Aleksandr Kogan, separately from his work at Cambridge University. Through his company Global Science Research (GSR), in collaboration with Cambridge Analytica, hundreds of thousands of users were paid to take a personality test and agreed to have their data collected for academic use.
However, the app also collected the information of the test-takers’ Facebook friends, leading to the accumulation of a data pool tens of millions-strong. Facebook’s ‘platform policy’ allowed only collection of friends’ data to improve user experience in the app and barred it being sold on or used for advertising. The discovery of the unprecedented data harvesting, and the use to which it was put, raises urgent new questions about Facebook’s role in targeting voters in the US presidential election. It comes only weeks after indictments of 13 Russians by the special counsel Robert Mueller which stated they had used the platform to perpetrate ‘information warfare’ against the US.
Cambridge Analytica and Facebook are one focus of an inquiry into data and politics by the British Information Commissioner’s Office. Separately, the Electoral Commission is also investigating what role Cambridge Analytica played in the EU referendum.”
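The mechanics described in the quote above boil down to one person’s consent unlocking data about many others. As a rough, historical illustration, the hypothetical sketch below shows how a quiz app holding one test-taker’s access token could have walked that person’s friend list and pulled friends’ likes under the old, since-retired friend-data permissions of Facebook’s Graph API. The endpoint version, fields, and token handling are simplified assumptions, and this kind of access has long been shut off.

```python
# A rough, historical sketch of friend-data harvesting in the style of the
# old Graph API v1.0 friend permissions. Endpoint version, fields, and the
# placeholder token are illustrative assumptions; Facebook removed this kind
# of friend-level access years ago, so this is not a working recipe.
import requests

GRAPH = "https://graph.facebook.com/v1.0"
ACCESS_TOKEN = "TEST_TAKER_USER_TOKEN"  # token granted by one quiz participant


def fetch_friends_with_likes(token: str) -> list[dict]:
    """Walk the test-taker's friend list, then pull each friend's likes."""
    harvested = []
    friends = requests.get(f"{GRAPH}/me/friends",
                           params={"access_token": token}).json()
    for friend in friends.get("data", []):
        likes = requests.get(f"{GRAPH}/{friend['id']}/likes",
                             params={"access_token": token}).json()
        harvested.append({
            "id": friend["id"],
            "name": friend.get("name"),
            "likes": [like.get("name") for like in likes.get("data", [])],
        })
    return harvested


if __name__ == "__main__":
    profiles = fetch_friends_with_likes(ACCESS_TOKEN)
    print(f"Harvested data on {len(profiles)} friends who never used the app")
```

Multiply that by hundreds of thousands of paid test-takers and you get a data pool tens of millions strong, which is exactly what the Observer reported.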
Here’s why you should care
As I mentioned, if you’re an American Facebook user, it’s very likely that you are one of the victims of this breach. Just as you trust a website that uses HTTPS to keep your data secure, you need to be able to trust the companies with whom (or through whom) you choose to share your personal information. Here are some of the matters which stand out to me.
First of all, lots of people love doing fun quizzes on the web. It harks back to the 90s, when teenaged me used to fill out printed quizzes in teen magazines to figure out which colour of dragonfly hair clips matched my soul. A lot of those purely-for-entertainment quizzes live on through websites like Buzzfeed. Even I have been known to be curious about what my favourite pair of shoes says about my preferences in ethnic cuisine.
thisisyourdigitallife may not have been presented on Facebook as something quite so frivolous, but people sure love to volunteer personal information about themselves online. It’s bad enough that the people who used thisisyourdigitallife weren’t fully informed about how their data would be used. The worst part is that data was also collected on those users’ “friends,” people who never agreed to use the quiz app in the first place. So there’s one group who were exploited through uninformed consent and another group who didn’t consent at all.
Both Cambridge Analytica and Facebook have been less than upfront about how they use data, even during legal proceedings.
“Last month both Facebook and the CEO of Cambridge Analytica, Alexander Nix, told a parliamentary inquiry on fake news that the company did not have or use private Facebook data.
Simon Milner, Facebook’s UK policy director, when asked if Cambridge Analytica had Facebook data, told MPs: ‘They may have lots of data but it will not be Facebook user data. It may be data about people who are on Facebook that they have gathered themselves, but it is not data that we have provided.’
Cambridge Analytica’s chief executive, Alexander Nix, told the inquiry: ‘We do not work with Facebook data and we do not have Facebook data.’”
I’m also concerned by Facebook’s weak incident response. Recall that the incident news broke on a Saturday. From WIRED:
“By Monday, Facebook remained frozen, and Zuckerberg and Sandberg stayed silent. Then, late in the afternoon in Menlo Park, more bad news came. The New York Times reported that Alex Stamos, the company’s well-respected chief of security, had grown dissatisfied with the top of senior management and was planning to exit in a few months. Some people had known this for a while, but it was still a very bad look. You don’t want news about your head of data security bailing when you’re having a crisis about how to secure your data. And then news broke that Facebook had been denied in its efforts to get access to Cambridge Analytica’s servers. The United Kingdom’s Information Commissioner’s Office, which had started an investigation, would handle that.
A company-wide Q&A was called for Tuesday but for some reason it was led by Facebook’s legal counsel, not its leaders, both of whom have remained deafeningly silent and both of whom reportedly skipped the session.”
Companies like Facebook will do their best to save face and assure the public they have changed their policies, but only when they know an incident will be reported. Cambridge Analytica is a grocery shopper, Facebook is a supermarket, and you, the ordinary users, are lined up on the shelves with price tags sticking out of your mouths. Only once the general public found out that Cambridge Analytica squeezed the human data produce too hard did Facebook kick them out of the store. For now, anyway.
The best advice I have for people is this: treat all of the information you put online as public. Most people cannot avoid using Facebook, and I must use Twitter for my work. I tweet about my work a lot, and I even share some innocuous personal stuff. But I cross my fingers and hope that as long as my mindset is that everything online is public, I’m handling my data more wisely than most.
But don’t count on these platforms to be motivated to protect you, because ultimately, you’re not the customer.