How The Great Hack ruined my weekend
In passing last week, my CEO mentioned that she had watched the recent Netflix documentary The Great Hack. Let me confess: I have a slightly Hermione Granger side to my personality that likes knowing what my teacher is reading. Saturday found me watching the documentary at a friend’s house with a bowl of salty tortilla chips and a cupful of lukewarm paranoia.
For those of you who haven’t watched, let me summarize: this is not a feel-good movie. In the mid-2010s, Cambridge Analytica’s business was to use data and analytics to influence social outcomes: governments, campaigns, and companies would hire CA to profile targeted groups of people for a specific purpose. Republican campaigns hired Cambridge Analytica during the 2016 U.S. Presidential election cycle; CA, in turn, had hired psychologist Aleksandr Kogan to build a personality-profiling survey to see what factors influenced voting habits.
Here’s the painful part: the survey responses went to Kogan’s custom app that collected Facebook data from users and their friends, data that included names, birthdates, and political affiliations, among other things. Cambridge Analytica then used that data to tailor media for specific voters in districts that were equally split between two candidates. Former CA employee Brittany Kaiser explains that they focused their propaganda storm on these so-called “persuadables” in narrow democratic elections most recently in the U.K., the U.S., and Trinidad and Tobago.
I’m not going to tell you to delete your Facebook profile. Yet I’ve noticed that the documentary has polarized the people around me; some found it thought-provoking, others inconclusive. For me it did one thing well: it showed how any person’s data gets disclosed (sometimes involuntarily, something Aleksandr Kogan later apologized for), then spread more broadly without consent, then finally used against us in sculpted political or economic advertising. The Great Hack was what finally made me, an already-cautious user given my industry, stop using apps until I’d read the fine print. It’s bad enough that data privacy is an afterthought for most companies; it’s worse when companies that claim to protect customer data actually sell it. It’s part of why I enjoy working at a company that takes a very public stance that your data privacy is precious. You should have the right to be forgotten as well as remembered, but on your terms.
We can’t take back what someone has already made known about us. It’s bad enough that my data may be for sale on the dark web today, or to a voter-profiling agency before my next local election. The longer machines churn through public and private data, the more tightly everyone’s past patterns get tied to future habits. There is a short list of governments that don’t want you to have that data privacy and have gone so far as to ban tools like VPNs altogether, because anonymity is a power they don’t want their citizens to have. In my view, machines and people are almost co-evolving, growing smarter together, teaching each other how to use data differently. My cupful of paranoia contains this thought: how my data (or that of those close to me) could be used against me tomorrow in ways I can’t imagine today.