Police officers in Wisconsin “drew Gerald Mitchell’s blood while he was unconscious—to test his blood alcohol content after a drunk-driving arrest. The state has attempted to excuse the officers by citing an implied-consent statute, which provides that simply driving on state roads constitutes consent to such searches.” Although the right to privacy is not absolute, there are problems with that approach, made worse by a strange Wisconsin Supreme Court opinion extending to highway searches a Fourth Amendment search exception for “pervasively regulated businesses.” [Ilya Shapiro and Patrick Moran on Cato cert amicus brief urging the Supreme Court to review Mitchell v. Wisconsin]
The EU’s General Data Protection Regulation (GDPR), along with similarly heavy-handed regimes such as California’s Consumer Privacy Act, entrenches established platforms that have the resources to meet their onerous compliance requirements. Since the GDPR’s implementation in May, the rank and market share of small- and medium-sized ad tech companies have declined by 18 to 32 percent in the EU, while rank and market share have increased for Google, Facebook, and Amazon.
Via Alex Stamos thread on Twitter (“Anybody wonder why the big tech companies didn’t really fight that hard against GDPR? It isn’t due to a newfound love of regulation”) by way of James Pethokoukis; more, Antonio García Martínez.
Another valued little piece of financial privacy being lost: in the name of enforcing money-laundering and know-your-customer regulations, the Treasury Department’s Financial Crimes Enforcement Network has expanded a program the effect of which is to require disclosure of your identity if you buy a home in some parts of the country [Kathleen Pender, San Francisco Chronicle]
Related: British financial regulators adopt new approach of “shifting the burden of proof onto foreign investors; they must now prove their wealth is legitimate.” [Jeffrey Miron, Cato]
New York City police have employed the equivalent of DNA dragnets, combining voluntary with covert (e.g., grabbing a discarded cup) collection methods. Thus, before identifying a suspect in the Howard Beach jogger case, “the NYPD collected well over 500 DNA profiles from men in the East New York area…. But things get worse from there. For those people excluded from the jogger case, the Office of the Chief Medical Examiner, the city’s crime lab, permanently keeps those profiles in their databank [with more than 64,000 others] and routinely compares profiles to all city crimes.”
In other words, cooperate with police by giving a DNA sample in order to help solve (or clear yourself in) some dreadful crime, and you’re in the database to nail for anything and everything else in future. “In this respect, [you] will be treated just like someone convicted of a crime.” And did you guess this? “Under their labor contract with the city, rank-and-file officers don’t give the lab their DNA, which means the lab can’t easily rule out possible crime-scene contamination.” [Allison Lewis, New York Daily News]
Readers who watched the Cato forum last November on prosecutorial fallibility and accountability, or my coverage at Overlawyered, may recall the story of how a Federal Trade Commission enforcement action devastated a thriving company, LabMD, following a push from a spurned vendor. Company founder and president Mike Daugherty, who took part on the Cato panel, wrote a book about the episode entitled The Devil Inside the Beltway: The Shocking Exposé of the U.S. Government’s Surveillance and Overreach into Cybersecurity, Medicine and Small Business.
Last month two separate federal appeals courts issued rulings offering, when combined, some consolation for Daugherty and his now-shuttered company. True, a panel of the D.C. Circuit Court of Appeals, finding qualified immunity, disallowed the company’s claims that FTC staffers had violated its constitutional rights by acting in conscious retaliation for its criticism of the agency. On the other hand, an Eleventh Circuit panel sided with the company and (quoting TechFreedom) “decisively rejected the FTC’s use of broad, vague consent decrees, ruling that the Commission may only bar specific practices, and cannot require a company ‘to overhaul and replace its data-security program to meet an indeterminable standard of reasonableness.’” [More on the ruling here and here]
As usual, John Kenneth Ross’s coverage at the Institute for Justice’s Short Circuit newsletter is worth reading, both descriptions appearing in the same roundup since they were decided in such quick succession:
Allegation: Days after LabMD, a cancer-screening lab, publicly criticized the FTC’s yearslong investigation into a 2008 data breach at the lab, FTC staff recommend prosecuting the lab. Two staffers falsely represent to their superiors that sensitive patient data spread across the internet. (It hadn’t.) The FTC prosecutes; the lab lays off all workers and ceases operations. District court: Could be the staffers were unconstitutionally retaliating for the criticism. D.C. Circuit: Reversed. Qualified immunity. (Click here for some long-form journalism on the case.)…
Contrary to company policy, a billing manager at LabMD—a cancer-screening lab—installs music-sharing application on her work computer; a file containing patient data gets included in the music-sharing folder. In 2008 a cybersecurity firm finds it and tells LabMD the file has spread across the internet. (Which is false.) When LabMD declines to hire the cybersecurity firm, the firm reports the breach to the FTC, which prosecutes the case before its own FTC judge. LabMD does not settle; the expense of fighting forces the company to shutter. The FTC orders LabMD to adopt “reasonably designed” cybersecurity measures. Eleventh Circuit: The FTC’s vague order is unenforceable because it doesn’t tell LabMD how to improve its cybersecurity.
Our friend Berin Szóka of TechFreedom sums it up: “The court could hardly have been more clear: the FTC has been acting unlawfully for well over a decade.” He continues by calling this “a true David and Goliath story”:
Well over sixty companies, many of them America’s biggest corporations, have simply rolled over when the FTC threatened to sue them [over data security practices]. … Only Mike Daugherty, the entrepreneur who started and ran LabMD, had the temerity to see this case through all the way to a federal court. …After losing his business and a decade of his life, Daugherty is a hero to anyone who’s ever gotten the short end of the regulatory stick.
[cross-posted from Cato at Liberty]
The European Union’s new privacy law, the General Data Protection Regulation, or GDPR, is sometimes defended as a response to the prospect that too much data will concentrate in the hands of the biggest corporate data users. Per the WSJ, however, one of its earliest effects “is drawing advertising money toward Google’s online-ad services and away from competitors that are straining to show they’re complying with the sweeping regulation.” In particular, Google is showing a higher rate of success in gathering individuals’ consents to be marketed to. [Tyler Cowen] With bonus mention of CPSIA: “The Inevitable Lifecycle of Government Regulation Benefiting the Very Companies Whose Actions Triggered It” [Coyote]
Eric Goldman, “A Privacy Bomb Is About to Be Dropped on the California Economy and the Global Internet”:
By tomorrow, the California legislature likely will pass a sweeping, lengthy, overly-complicated, and poorly-constructed privacy law that will have ripple effects throughout the world. While not quite as comprehensive as the GDPR, it copies some aspects of the GDPR and will squarely impact every Internet service in California (some of whom may not currently be complying with the GDPR due to their US-only operations). The GDPR took 4 years to develop; in contrast, the California legislature will spend a grand total of 7 days working on this major bill. It’s such a short turnaround that most stakeholders won’t have a chance to participate in the legislative proceedings. So the Internet is likely to change radically tomorrow, and most people have no clue what’s coming or any voice in the process.
As bad as this sounds, the legislature’s passage of the bill is likely the GOOD outcome in this scenario. What could be worse?
Read on in the post for a discussion of the peculiar dangers of the contemporary California initiative process. And as predicted, the bill did pass, unanimously [Issie Lapowsky, Wired]
Data portability mandates on tech companies like Facebook are sometimes conceived as a way to bring about more competitive market structures pleasing to antitrust enforcers by engineering a less “sticky” consumer experience. But is it really much of a solution to anything? [Alex Tabarrok citing Will Rinehart, American Action Forum; more, Tyler Cowen]
The lead anecdote in a Bloomberg story on the evils of tech fine print is on PayPal deleting the accounts of persons who joined before age 18. Yet on its own internal evidence, this seemingly irrational action is pretty clearly a response to the risk of liability/regulatory exposures rather than some act of random malice. How many more instances of pointless runaround or “impenetrable legalese” are going to be occasioned by the ongoing push to regulate and assign new liability to data-intensive businesses? [Nate Lanxon, Bloomberg]