Escaping the Algorithm: David Carroll Talks Data Privacy – Papermag

Posted: December 12, 2019 at 12:45 pm



We radiate data every day, all day. The more we interconnect, the more our information exhaust gets captured and refined into dark dossiers that identify and define us to unknown parties for unknown purposes. It's part of the reason why certain ads seem to know us better than we know ourselves. Algorithms mysteriously assemble our data into occasionally accurate predictions, but we're dispossessed of the knowledge to understand how it works. We're referred to by pseudonyms instead of the names our parents gave us, hashed numbers assigned as our unique identifiers. These mysterious IDs link us to our shadow profiles pumping through the data supply chain in high-speed auctions without our express knowledge or unambiguous consent. It's much easier to believe the conspiracy theory that our phones are always listening to us than it is to grasp the alien machine brain that has been modeling us in a simulation to get us to click, scroll, swipe and spend.
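
To make the idea of those "hashed numbers" concrete, here is a minimal, purely illustrative Python sketch of how a raw identifier such as an email address can be reduced to an opaque pseudonym. The normalization and matching schemes real identity vendors use are proprietary, so treat the specifics below as assumptions.

```python
# Illustrative sketch only: a toy example of deriving a pseudonymous ID
# from an email address. Real adtech identity graphs use proprietary
# normalization, salting and matching schemes not described here.
import hashlib

def pseudonymous_id(email: str) -> str:
    """Hash a normalized email address into an opaque identifier."""
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# The same person always maps to the same opaque string, so profiles can be
# linked across sites and auctions without ever storing the actual name.
print(pseudonymous_id("Jane.Doe@example.com"))
```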

You may have seen the documentary The Great Hack; that is, unless the algorithm decided not to push it on you. Along with Carole Cadwalladr, the reporter who broke the Cambridge Analytica scandal, and Brittany Kaiser, a former CA employee who recently published a memoir, the film follows my personal journey into the heart of darkness that was this data privacy cataclysm as I tried to get my own data back. (For those who need a quick refresher on the saga: Cambridge Analytica, the Steve Bannon co-founded spin-off of a British military contractor that was bought by American far-right mega-donor Robert Mercer, was hired by the Trump and Cruz campaigns to micro-target voters with psychologically enriched profiles generated from an illegal data-harvesting scheme that used a personality quiz to pilfer the data of 87 million unsuspecting Facebook users. Then, 30 million of those profiles were matched to voter files, allowing a data model to be generated and applied to more than 200 million registered voters, even people who weren't on Facebook. I sued the company in the UK for this mass data abuse against our country.)

There was something particularly surreal about having my name mentioned in UK Parliamentary hearings, especially after waking up before dawn in New York to watch the livestreams of the committee hearings in London investigating the scandal. The film captures me live-tweeting Parliament, digesting the incredible revelations from witnesses giving evidence in real time. Along with this public evidence, I excavated court filings from my lawsuit to find the legal proof that my worst fears had been justified: that Cambridge Analytica had indeed interfered in our democracy.

In a weird twist of fate, I had standing to sue because the infamous Cambridge Analytica ended up exporting Americans' voter data into the United Kingdom. As a consequence, their data crimes were exposed to European law, which grants people simple, fundamental data rights, rights that we don't (yet) grant ourselves in America. This includes the right to request your data from companies and organizations that collect and process it within the UK, regardless of citizenship. That means I wouldn't have had this legal right if Cambridge Analytica had kept our voter data here at home.

In the end, my effort to repatriate my voter profile resulted in the only criminal conviction of the company, because it refused to hand over all my data as ordered by the UK's data cops, the Information Commissioner's Office. The company offered me about a dozen data points after bragging it had up to 5,000 data points on every American adult. Experts who reviewed my data suspected that it was not only incomplete but also unlawfully assembled on UK soil.

Shining a bright light on this cloak-and-dagger world of mass data abuse revealed how personal data leakage is being exploited by a shadowy international election interference industry. The common sentiment "I don't care about my privacy because I have nothing to hide" doesn't grapple with how rogue actors don't have to abuse your own data in order to try to hack the whole electorate and alter the course of history. When I asked Brittany Kaiser, the former CA employee featured in The Great Hack, how she responds to people resigned to privacy apathy, she pointed out that "some people will use your data to manipulate you into hating your neighbor, or not wanting to vote. Others will make billions of dollars off of selling your data to others without your knowledge, sharing none of that value with you." Her new memoir Targeted makes a compelling case that leaving it up to people to opt out, rather than requiring that they first opt in, makes our country vulnerable to deception. In that way, data rights laws offer a kind of herd immunity against mass data abusers, but this saga also laid bare for Americans that we don't have data rights equal to those of our friends and allies in the EU.

Given these eye-opening, eye-watering revelations, people understandably want to know what they can do to protect themselves from all this data leakage. While the vast majority of us can't go off the grid, we can limit the flood of data coming out. Being privacy-defensive online is a bit like paying extra for organic food: It feels better to try and support the more ethical option, even if the overall impact on the massive global data industries involved may be minuscule. And building up your personal data privacy defenses is not just good self-care. It's a boycott of unfair industry practices and unchecked corporate power.

But similar to the carbon pollution problem, data radiation is a new facet of the human condition that we are made to feel personally responsible for, despite government being the only viable mechanism available to us that has ever succeeded at inhibiting such profit-driven destruction on an industrial scale. Just as government has so far refused to enact sweeping new policies to combat climate change, leaving us on our own to fight global warming via consumption choices that are limited to whatever the market offers (recycling, electric cars, organic and vegan foods), we are stuck contending with ad hoc market-based solutions to deal with mass data abuse (i.e., adblockers, lawsuits, privacy settings). It's a special kind of gaslighting that we are the ones who have been made to feel guilty about having our data abused by transnational organized crime to subvert our democracies.

Most laws outside of the EU, which grants data rights under the General Data Protection Regulation, impose all the consequences of making false choices on consumers. In America, we have virtually no data rights outside of rather narrow situations, like being a student in school (FERPA), a patient in a clinical setting (HIPAA) or a kid (COPPA). Of course, no one reads the privacy policy anyway, because there's no way to understand the implications.

Because we are mandated to do the work of being defensive by the corporate regime that insists "we care about your privacy" but never defines what that means, it takes time, patience and perseverance to send the signals to the marketing surveillance industrial complex that you want out. Even so, you are contributing to a metric that will be measured, expressing your preference to be left alone and to see ads detached from your dark data dossier. The more of us who do the work to opt out, the more the industry will register our boycott and fail at lobbying lawmakers to water down and carve loopholes into the new data rights bills being drafted in statehouses and on the Hill.

To become more privacy-defensive and join the boycott, the first thing to do is consider how much Facebook and Google you can eliminate from your life. Most of us probably cannot practically delete our accounts (hate to break it to you, but Instagram and WhatsApp are also owned by Facebook), but if you're lucky enough to not depend on these platforms for your school or job, you probably won't miss them that much.

Dig deep into the settings to hunt for the switches that turn ad targeting and personalization off. In your Google account settings, you can flick switches to "pause" data collection or set your data to auto-delete after a period of time, though by then they've already harvested your behaviors into their algorithms. Even ditching Chrome for another browser sends a strong signal. Alternatives like Safari, Firefox and Brave now block cross-site trackers by default.

After subscribing to your trusted news publications, you'll want to install a trustworthy adblocker on both your desktop and mobile devices, but be wary of the most popular options, which let trackers in by default. This is because Google and other adtech companies pay companies like Adblock Plus and AdBlock to be on a "whitelist" of ads that don't get blocked because they are deemed "acceptable." (I use the Disconnect.me Privacy Pro VPN on my iPhone, Ghostery for Firefox, Brave, Better Blocker in Safari and uBlock Origin in Chrome.) Probably the most expensive privacy-defensive act would be ditching Android for iPhone to more fully de-Googlize your life. In either case, dig into your phone's privacy settings and turn on "Limit Ad Tracking" while resetting your device identifier. This will offer some opt-out protection from ads and trackers embedded within apps, often the most pernicious kind, covertly pumping your data back to the motherships in Palo Alto and Mountain View and who knows where else. A conceptual sketch of how that "whitelist" loophole works follows below.
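
To make the "whitelist" issue concrete, here is a minimal, purely conceptual Python sketch of the decision an adblocker makes for each outgoing request. The domain names and the "acceptable ads" allowlist are invented for illustration; real blockers rely on large community-maintained filter lists and much more sophisticated matching rules.

```python
# Conceptual sketch of the per-request decision a tracker blocker makes.
# The domains and the "acceptable ads" allowlist are hypothetical; real
# blockers use large community-maintained filter lists and richer rules.
TRACKER_DOMAINS = {"tracker.example", "pixel.example", "ads.example"}
ACCEPTABLE_ADS_ALLOWLIST = {"ads.example"}  # the paid-for exemption described above

def should_block(request_domain: str, honor_allowlist: bool = True) -> bool:
    """Return True if an outgoing request to this domain should be blocked."""
    if request_domain not in TRACKER_DOMAINS:
        return False
    if honor_allowlist and request_domain in ACCEPTABLE_ADS_ALLOWLIST:
        return False  # the tracker slips through the "acceptable" loophole
    return True

print(should_block("ads.example"))                         # False: whitelisted by default
print(should_block("ads.example", honor_allowlist=False))  # True: a stricter blocker
```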

You've made it this far, mastered the controls, exercised your preferences and secured the perimeter. The creepy ads should be gone. But there are still untold volumes of your data being bought and sold on the open market with an unspecified half-life. Companies you've never heard of have been collecting and broadcasting your personal data and pseudonyms in real-time auctions that fire off in milliseconds and put a hidden price tag on your attention based on your own intimate traits and tendencies. And, ultimately, we can put in all this work to reduce, or even obfuscate, our dark data dossiers, but we can never really know how much is already out there or how long it will remain. Opting out is arguably a symbolic, political act once you realize the futility of the effort.
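
To give a rough sense of how those millisecond auctions put a price tag on attention, here is a deliberately simplified sketch of a second-price auction run over a pseudonymous profile. The bidders, traits and prices below are invented for illustration; real exchanges follow far more elaborate protocols, such as OpenRTB.

```python
# Toy model of a real-time ad auction. Bidders, traits and prices are
# invented for illustration; real exchanges follow protocols like OpenRTB
# and clear these auctions in milliseconds, billions of times a day.
def run_auction(profile, bidders):
    """Each bidder prices the impression off the profile's inferred traits;
    the highest bidder wins but pays the second-highest bid."""
    bids = {name: price(profile) for name, price in bidders.items()}
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    clearing_price = ranked[1][1] if len(ranked) > 1 else ranked[0][1]
    return winner, clearing_price

profile = {"interests": ["politics", "running"], "propensity_to_spend": 0.8}
bidders = {
    "shoe_brand":  lambda p: 2.50 * p["propensity_to_spend"],
    "news_app":    lambda p: 1.20 if "politics" in p["interests"] else 0.10,
    "generic_dsp": lambda p: 0.75,
}
print(run_auction(profile, bidders))  # prints ('shoe_brand', 1.2): your attention, priced
```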

Perhaps the most effective means of resisting the dark data industrial complex is to do some democracy. California's landmark Consumer Privacy Act becomes enforceable in January 2020 and moves the needle in the US toward the EU in terms of data rights and protections, while also starting to shift the burden of consent off of people and onto the shadow industry of extractors, refiners and arbitrageurs. Californians should learn about their new data rights and flex them. Research from the ad industry suggests that only a small minority of users will allow their data to be used to micro-target them with ads next year, upending the industry's well-worn claim that people prefer ad relevance over privacy. California also has the chance to further strengthen its law in an upcoming ballot initiative, addressing the critical role of political advertising and targeting and cutting into the heart of the Cambridge Analytica scandal.

California also joined Vermont in passing a law that mandates a state registry of data brokers, and for the first time, consumers can get a glimpse of a legally mandated list of previously unknown entities that have been profiting on our secrets for years, hidden in the shadows. New York is another economic powerhouse state debating a data rights act that goes further than California's. These important actions in statehouses raise the bar for Congress, which will have to extract serious concessions from the industry lobby that desperately wants these new state laws to be preempted by a national standard, as weak as it can get it, with as many loopholes as it can engineer. In that way, the most significant thing you can do as a US voter might not be tediously finagling fine-print settings but contacting your state representatives and asking them to stand up to the personal data industrial complex and co-sponsor a data rights bill that outdoes California's. Demand your right to know how your data could be affecting your life.
