Posts Tagged ‘privacy’

One year later, the harms of Europe’s data-privacy law

The European Union’s General Data Protection Regulation (GDPR), which went into effect just over a year ago, has resulted in a broad array of consequences that are expensive, unintended, or both. Alec Stapp reports at Truth on the Market, with more discussion at Marginal Revolution:

GDPR can be thought of as a privacy “bill of rights.” Many of these new rights have come with unintended consequences. If your account gets hacked, the hacker can use the right of access to get all of your data. The right to be forgotten is in conflict with the public’s right to know a bad actor’s history (and many of them are using the right to memory hole their misdeeds). The right to data portability creates another attack vector for hackers to exploit.

Meanwhile, Stapp writes, compliance costs for larger U.S.-based firms alone are headed toward an estimated $150 billion, “Microsoft had 1,600 engineers working on GDPR compliance,” and an estimated 500,000 European organizations have seen fit to register data officers, while the largest advertising intermediaries, such as Google, appear to have improved their relative competitive position compared with smaller outfits. Venture capital investment in Euro start-ups has sagged, some large firms in sectors like gaming and retailing have pulled out of the European market, and as of March more than 1,000 U.S.-based news sites were inaccessible to European readers.

More in Senate testimony from Pinboard founder Maciej Ceglowski via Tyler Cowen:

The plain language of the GDPR is so plainly at odds with the business model of surveillance advertising that contorting the real-time ad brokerages into something resembling compliance has required acrobatics that have left essentially everybody unhappy.

The leading ad networks in the European Union have chosen to respond to the GDPR by stitching together a sort of Frankenstein’s monster of consent, a mechanism whereby a user wishing to visit, say, a weather forecast is first prompted to agree to share data with a consortium of 119 entities, including the aptly named “A Million Ads” network. The user can scroll through this list of intermediaries one by one, or give or withhold consent en bloc, but either way she must wait a further two minutes for the consent collection process to terminate before she is allowed to find out whether or not it is going to rain.

This majestically baroque consent mechanism also hinders Europeans from using the privacy-preserving features built into their web browsers, or from turning off invasive tracking technologies like third-party cookies, since the mechanism depends on their being present.

For the average EU citizen, therefore, the immediate effect of the GDPR has been to add friction to their internet browsing experience along the lines of the infamous 2011 EU Privacy Directive (“EU cookie law”) that added consent dialogs to nearly every site on the internet.

On proposals to base legislation in the United States on similar ideas, see Roslyn Layton and Pranjal Drall, Libertarianism.org. [cross-posted from Cato at Liberty]

“Unconscious People Can’t Consent to Police Searches”

Police officers in Wisconsin “drew Gerald Mitchell’s blood while he was unconscious—to test his blood alcohol content after a drunk-driving arrest. The state has attempted to excuse the officers by citing an implied-consent statute, which provides that simply driving on state roads constitutes consent to such searches.” Although the right to privacy is not absolute, there are problems with that approach, made worse by a strange Wisconsin Supreme Court opinion extending to highway searches a Fourth Amendment search exception for “pervasively regulated businesses.” [Ilya Shapiro and Patrick Moran on Cato cert amicus brief urging the Supreme Court to review Mitchell v. Wisconsin]

Does European data privacy regulation help entrench U.S. tech firms?

Roslyn Layton, AEI, in November:

The EU’s General Data Protection Regulation (GDPR), along with similarly heavy-handed regimes such as California’s Consumer Privacy Act, entrenches established platforms that have the resources to meet their onerous compliance requirements. Since the GDPR’s implementation in May, the rank and market share of small- and medium-sized ad tech companies has declined by 18 to 32 percent in the EU, while these measures have increased for Google, Facebook, and Amazon.

Via Alex Stamos thread on Twitter (“Anybody wonder why the big tech companies didn’t really fight that hard against GDPR? It isn’t due to a newfound love of regulation”) by way of James Pethokoukis; more, Antonio García Martínez.

Buying a home? Feds want to know your identity

Another valued little piece of financial privacy being lost: in the name of enforcing anti-money-laundering and know-your-customer regulations, the Treasury Department’s Financial Crimes Enforcement Network has expanded a program the effect of which is to require disclosure of your identity if you buy a home in some parts of the country [Kathleen Pender, San Francisco Chronicle]

Related: British financial regulators adopt new approach of “shifting the burden of proof onto foreign investors; they must now prove their wealth is legitimate.” [Jeffrey Miron, Cato]

The NYPD’s DNA dragnet

New York City police have employed the equivalent of DNA dragnets, combining voluntary with covert (e.g., grabbing a discarded cup) collection methods. Thus, before identifying a suspect in the Howard Beach jogger case, “the NYPD collected well over 500 DNA profiles from men in the East New York area….But things get worse from there. For those people excluded from the jogger case, the Office of the Chief Medical Examiner, the city’s crime lab, permanently keeps those profiles in their databank [with more than 64,000 others] and routinely compares profiles to all city crimes.”

In other words, cooperate with police by giving a DNA sample in order to help solve (or clear yourself in) some dreadful crime, and you’re in the database to be nailed for anything and everything else in the future. “In this respect, [you] will be treated just like someone convicted of a crime.” And did you guess this? “Under their labor contract with the city, rank-and-file officers don’t give the lab their DNA, which means the lab can’t easily rule out possible crime-scene contamination.” [Allison Lewis, New York Daily News]

For LabMD, the consolation of a big win in court

Readers who watched the Cato forum last November on prosecutorial fallibility and accountability, or my coverage at Overlawyered, may recall the story of how a Federal Trade Commission enforcement action devastated a thriving company, LabMD, following a push from a spurned vendor. Company founder and president Mike Daugherty, who took part on the Cato panel, wrote a book about the episode entitled The Devil Inside the Beltway: The Shocking Exposé of the U.S. Government’s Surveillance and Overreach into Cybersecurity, Medicine and Small Business.

Last month two separate federal appeals courts issued rulings offering, when combined, some consolation for Daugherty and his now-shuttered company. True, a panel of the D.C. Circuit Court of Appeals, finding qualified immunity, disallowed the company’s claims that FTC staffers had violated its constitutional rights by acting in conscious retaliation for its criticism of the agency. On the other hand, an Eleventh Circuit panel sided with the company and (quoting TechFreedom) “decisively rejected the FTC’s use of broad, vague consent decrees, ruling that the Commission may only bar specific practices, and cannot require a company ‘to overhaul and replace its data-security program to meet an indeterminable standard of reasonableness.’” [More on the ruling here and here]

As usual, John Kenneth Ross’s coverage at the Institute for Justice’s Short Circuit newsletter is worth reading, both descriptions appearing in the same roundup since they were decided in such quick succession:

Allegation: Days after LabMD, a cancer-screening lab, publicly criticized the FTC’s yearslong investigation into a 2008 data breach at the lab, FTC staff recommend prosecuting the lab. Two staffers falsely represent to their superiors that sensitive patient data spread across the internet. (It hadn’t.) The FTC prosecutes; the lab lays off all workers and ceases operations. District court: Could be the staffers were unconstitutionally retaliating for the criticism. D.C. Circuit: Reversed. Qualified immunity. (Click here for some long-form journalism on the case.)…

Contrary to company policy, a billing manager at LabMD—a cancer-screening lab—installs a music-sharing application on her work computer; a file containing patient data gets included in the music-sharing folder. In 2008 a cybersecurity firm finds it and tells LabMD the file has spread across the internet. (Which is false.) When LabMD declines to hire the cybersecurity firm, the firm reports the breach to the FTC, which prosecutes the case before its own FTC judge. LabMD does not settle; the expense of fighting forces the company to shutter. The FTC orders LabMD to adopt “reasonably designed” cybersecurity measures. Eleventh Circuit: The FTC’s vague order is unenforceable because it doesn’t tell LabMD how to improve its cybersecurity.

Our friend Berin Szóka of TechFreedom sums it up: “The court could hardly have been more clear: the FTC has been acting unlawfully for well over a decade.” He continues by calling this “a true David and Goliath story”:

Well over sixty companies, many of them America’s biggest corporations, have simply rolled over when the FTC threatened to sue them [over data security practices]. … Only Mike Daugherty, the entrepreneur who started and ran LabMD, had the temerity to see this case through all the way to a federal court. …After losing his business and a decade of his life, Daugherty is a hero to anyone who’s ever gotten the short end of the regulatory stick.

[cross-posted from Cato at Liberty]

Europe’s new data-privacy law helps… guess who?

The European Union’s new privacy law, the General Data Protection Regulation, or GDPR, is sometimes defended as a response to the prospect that too much data will concentrate in the hands of the biggest corporate data users. Per the WSJ, however, one of its earliest effects “is drawing advertising money toward Google’s online-ad services and away from competitors that are straining to show they’re complying with the sweeping regulation.” In particular, Google is showing a higher rate of success in gathering individuals’ consents to be marketed to. [Tyler Cowen] With bonus mention of CPSIA: “The Inevitable Lifecycle of Government Regulation Benefiting the Very Companies Whose Actions Triggered It” [Coyote]

California’s privacy-law bomb

Eric Goldman, “A Privacy Bomb Is About to Be Dropped on the California Economy and the Global Internet”:

By tomorrow, the California legislature likely will pass a sweeping, lengthy, overly-complicated, and poorly-constructed privacy law that will have ripple effects throughout the world. While not quite as comprehensive as the GDPR, it copies some aspects of the GDPR and will squarely impact every Internet service in California (some of which may not currently be complying with the GDPR due to their US-only operations). The GDPR took 4 years to develop; in contrast, the California legislature will spend a grand total of 7 days working on this major bill. It’s such a short turnaround that most stakeholders won’t have a chance to participate in the legislative proceedings. So the Internet is likely to change radically tomorrow, and most people have no clue what’s coming or any voice in the process.

As bad as this sounds, the legislature’s passage of the bill is likely the GOOD outcome in this scenario. What could be worse?

Read on in the post for a discussion of the peculiar dangers of the contemporary California initiative process. And as predicted, the bill did pass, unanimously [Issie Lapowsky, Wired]

Chasing data portability on social media

Data portability mandates on tech companies like Facebook are sometimes conceived as a way to bring about more competitive market structures pleasing to antitrust enforcers by engineering a less “sticky” consumer experience. But is it really much of a solution to anything? [Alex Tabarrok citing Will Rinehart, American Action Forum; more, Tyler Cowen]