Hanlon’s Razor urges that we not attribute to malice what stupidity can explain. This is usually the case (cue B-movie music) WHEN REGULATORS ATTACK! The European regulators who want to mandate that all consumer electronics come with “easily” swappable batteries, and the British lawmakers who want to shanghai messaging apps into the wars on terrorism and CSAM, think they are being helpful. Are they?
Last week’s newsletter was relatively short. Today’s isn’t—sorry—so make a cup of tea, buckle in, and let’s transform and roll out.
If you’ve never looked at a teardown of a modern electronic device, you should. Here’s one for the iPhone 13 Pro, for example, with lots of pictures. Note how well-sealed everything is, allowing modern devices to survive not only an accidental dip into a pool but days underwater. Note how tightly packed everything is, how minuscule the tolerances. Does it seem like there’s room to add in hatches, latches, and fasteners that use non-specialist tools—the kind of thing necessary to allow “easy” removal of batteries? Of course not. And that’s before you consider the near-certainty that the batteries themselves would “need to be” (read, would be required to be, by other regulations) made “safer” (read, heavier, bulkier) for removal and handling. Those changes would necessitate substantial redesigns, adding even more useless weight and bulk.
One answer is, “No, no; they aren’t mandating the G4 Powerbook, they’re just mandating the 2011 MacBook Air, the battery of which was removable.” I’ve replaced batteries in 2011 MacBook Airs. Believe me, if that’s the standard, iPhone batteries are already “removable.” To read the regulations as meaningful quoad their target (smartphones) is to read the adverb “easily” as doing real work.
Before you get too cocky that this seems too stupid to pass, recall that these regulators are the same geniuses who brought you the obligation to click “yes, I accept the goddam cookies” on every single goddam website. “I’m from the government and I’m here to help,” indeed.
Worse yet, public sympathies will run against Apple on this. For ages, the myth was that Apple crippled old phones, forcing people to upgrade otherwise-viable hardware. Whence, Wired says that Apple “has a history of making its batteries difficult to replace and has taken flak for hitting users with dubious battery service alerts when trying to repair their own batteries.” (Links in original.) The truth was that iOS throttles your CPU as your battery depletes, stretching the charge as far as possible, and the older your battery (because battery performance diminishes over time) the faster it will deplete to the point where throttling starts. So, the lower your battery gets, the more iOS constricts performance, for good reasons, and the older your battery gets, the sooner the squeeze starts, for obvious reasons. Big deal, right?
Apple, however, did this secretly, which was dumb, because when the truth came to light, it was brandished as proof of the myth. Conspiracy theories contain many elements, including some that are true and well-sourced, which are used to bootstrap or credential the fantastic elements. Proving one more element true doesn’t prove the conspiracy theory. Except—for most people it does. Add some zest of hidden knowledge and you’re really stroking the part of some folks’ brains that loves being “in on it.” Let it sit for a few years, give people something else to think about (like, oh, say, a pandemic), and now all of a sudden you have a public that (thinks it) remembers Apple admitting to throttling phones.
If you ask people, “do you want benefit x,” sure, they’ll say yes, if only out of FOMO. Duh. Who doesn’t want user-swappable batteries, if there’s no trade-off? But there’s always a trade-off. Often, their answer’s still yes if you ask them something more closely approximating the real question: “do you want benefit x even if the consequence is y?” Because they think they want x more than they fear y, or because they think y will fall on someone else, or for any of the myriad reasons why people weigh factors poorly in the abstract. But when you give people a market choice between devices that are thin, light, durable, and waterproof, but you can’t swap the battery, and devices that are fatter, heavier, less durable, and less water-resistant, but you can swap the battery, virtually all consumers pick the former, and we know that because the latter exist and no one buys them.
§
Turning now to the Online Safety Bill (“OSB”) and its implications for encryption, I should underscore: You don’t need to understand encryption to understand end-to-end encryption. So wake up, you at the back! We’re not going to dive into the weeds of computer science as many explainers imagine is necessary. No, all you need is passing familiarity with the spy movie genre.
Imagine Bond and Moneypenny invent a language (a “code”), and Bond travels to Sumatra, whence he calls Moneypenny. If they speak in code, their conversation is “end to end encrypted.” You—assuming “you” are a major intelligence service of competent jurisdiction, or at least a good bluffer—can demand from the telephone company (“telco”) a recording of their conversation. But you can’t demand that the phone company provide a decoded recording; the telco doesn’t have the code. It can’t get the code. This is how many modern smartphone messaging services like iMessage and WhatsApp work. Although they exchange data across the public internet, the two endpoints agree on a code that they alone know. Only the endpoints have it: not the maker of the messaging app, nor the maker of the smartphone, still less the companies providing the wires and antennae across which the signals traverse the globe to reach one another.
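If you want to see what “agreeing on a code that they alone know” looks like mechanically, here is a minimal sketch in Python using the cryptography package’s X25519 key exchange. To be clear, this is an illustration under my own assumptions (the library, names, and parameters are stand-ins), not what iMessage or WhatsApp actually ship; real messaging protocols layer a great deal more on top.

```python
# Minimal sketch (not any real app's implementation) of how two endpoints can
# agree on a shared secret that no intermediary ever sees.
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

# Bond and Moneypenny each generate a private key and publish only the public half.
bond_private = X25519PrivateKey.generate()
moneypenny_private = X25519PrivateKey.generate()

# Each side combines its own private key with the other's public key...
bond_shared = bond_private.exchange(moneypenny_private.public_key())
moneypenny_shared = moneypenny_private.exchange(bond_private.public_key())

# ...and both arrive at the same secret, which never travels over the wire.
assert bond_shared == moneypenny_shared

# Derive the symmetric "code" (message key) from the shared secret.
code = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
            info=b"bond-moneypenny").derive(bond_shared)
```

The point is simply that the secret is computed at the endpoints and never handled by the telco, so there is nothing for the telco to hand over.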
Now imagine Bond communicates with Leiter via Moneypenny. Bond has a code with Moneypenny; Leiter has a different code with her. This is transport encryption. You still can’t demand the contents of the communications from the telcos, but you could demand it from Moneypenny. That’s how your bank’s website works; the little padlock (or the letters https) in the address bar tells you that your connection to that web server is encrypted. What happens with that data once it arrives at the server is anyone’s guess, but your connection to it is secure. And as that example with your bank demonstrates, you may not think you care about encryption—but you bloody well do.
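For contrast, here is an equally toy sketch of transport encryption: each hop is protected, but the relay in the middle necessarily sees the plaintext, which is exactly why the relay is the party you can subpoena. Again, the names and keys are illustrative assumptions, not any real protocol.

```python
# Toy sketch of transport (hop-by-hop) encryption: each leg is protected,
# but the relay decrypts and re-encrypts, so it can read every message.
from cryptography.fernet import Fernet

bond_to_moneypenny = Fernet(Fernet.generate_key())    # code Bond shares with Moneypenny
moneypenny_to_leiter = Fernet(Fernet.generate_key())  # different code Moneypenny shares with Leiter

ciphertext_leg1 = bond_to_moneypenny.encrypt(b"The package is in Sumatra.")

# Moneypenny, sitting in the middle, can (indeed must) read the plaintext
# before passing it along, which is why she can be ordered to hand it over.
plaintext_at_relay = bond_to_moneypenny.decrypt(ciphertext_leg1)
ciphertext_leg2 = moneypenny_to_leiter.encrypt(plaintext_at_relay)

assert moneypenny_to_leiter.decrypt(ciphertext_leg2) == b"The package is in Sumatra."
```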
OSB’s critics say that it threatens privacy. WhatsApp, Apple, and other vendors of messaging platforms (I’ll call them “WhatsApp et al”) think it so threatening to their ability to provide customers with secure services that they’re threatening to pull their products from British markets. The Home Office rejoins to Auntie that this is all to “protect the public from criminals, child sex abusers[,] and terrorists.” Well, why didn’t you say so! Government seldom justifies draconian new powers by insisting that it must protect the public from rollerbladers, soap actors transitioning to film, and omissions of oxford commas so blatant that SNOOTs who write newsletters will bother to use an alteration to fix it. No, when more power lurks behind the curtain, you will find center-stage and spotlit an existential crisis or a child in danger.
Exactly what OSB does or how it threatens the very existence of end-to-end encryption is less clear from the reporting than I’d like. The gist of the criticism is that OSB requires service providers to either do something technically impossible or rejigger their offerings to make it possible. Auntie, in the article linked in the last paragraph above, points up two problems. First, that OSB “lets the Home Office demand security features are disabled, without telling the public” and “wants messaging services to clear security features with the Home Office before releasing them to customers.” Second, that it would “require companies to install technology to scan for child-abuse material in encrypted messaging apps and other services.”
The latter seems to be the critics’ focus. In the same vein is The Verge, which in the article cited in my lede says that OSB “asks online tech companies to use ‘accredited technology’ to identify child sexual abuse content ‘whether communicated publicly or privately.’ Since personal WhatsApp messages are end-to-end encrypted, not even the company itself can see their contents. Asking it to be able to identify CSAM … would inevitably compromise this end-to-end encryption.” The Grauniad says, “[a]t the core of the dispute are clauses that allow Ofcom to compel communications providers to take action to prevent harm to users. Those clauses, privacy campaigners say, do not allow for the possibility that an encrypted messaging provider may be unable to take such action without fundamentally undercutting their users’ security.” And they continue, uncritically quoting a remarkable response from Downing Street:
A No 10 spokesperson dismissed the criticism. “Tech companies we believe have a moral duty to ensure they are not blinding themselves and law enforcement to unprecedented levels of child sexual abuse. We support strong encryption. This cannot come at the cost of public safety…. It does not represent a ban on end-to-end encryption, nor will it require services to weaken encryption. It will not introduce routine scanning of private communication. This is a targeted power to use only when necessary.”
In other words—“trust us.” As John Gruber observes, OSB proponents seem to
believe, wrongly, that it must be possible for these messaging platforms to add ‘good guys only’ back doors. That if they pass this law, the result will be that the nerds who work at these companies will be forced to figure out a way to comply. What will actually happen is that these companies will be forced to pull the services from U.K., because they can’t comply, unless they scrap their current end-to-end encryption and replace it—worldwide—with something insecure, which they aren’t going to do.
Well, they’re going to try not to do it, anyway. An open letter from the Top Men of WhatsApp, Viber, Signal, and other players in the encrypted messaging space (I’ll refer to them collectively as WhatsApp et al, too) focuses its concern here as well. The core of their case is this:
Around the world, businesses, individuals and governments face persistent threats from online fraud, scams and data theft. Malicious actors and hostile states routinely challenge the security of our critical infrastructure. End-to-end encryption is one of the strongest possible defenses against these threats, and as vital institutions become ever more dependent on internet technologies to conduct core operations, the stakes have never been higher. … Proponents [of OSB] say that they appreciate the importance of encryption and privacy while also claiming that it's possible to surveil everyone's messages without undermining end-to-end encryption. The truth is that this is not possible.
Thus, they argue, “[a]s currently drafted, the Bill could break end-to-end encryption,” because it contains “no explicit protection for encryption, and if implemented as written, could empower Ofcom to try to force the proactive scanning of private messages on end-to-end encrypted communication services—nullifying the purpose of end-to-end encryption as a result and compromising the privacy of all users.” (Apple didn’t sign the WhatsApp letter, but it blew a similar clarion of its own.)
If we want to unravel this, we’ll have to snap on a latex glove and go into the bill itself. (Why it seems beyond the mainstream media to quote, cite, or even hyperlink to legal materials, I do not understand. Explain the material in layman’s terms, yes, but if primary sources are available online, hyperlink them!) As best I can tell based on what I’ve recounted above, the controversy involves Section 122, though we also need to consult Sections 3, 4, 126, and 237 for some key definitions, so that’s where we’ll dig in.
Created in 2003, Ofcom is Britain’s equivalent of the FCC. OSB provides that if Ofcom “consider[s] that it is necessary and proportionate to do so,” Ofcom may “requir[e]” the “provider” of a “regulated user-to-user service” to take various actions. § 122(2). A “user-to-user service” is one by which “content that is [either] generated directly on the service by a user of the service, or uploaded to or shared on the service by a user of the service, may be encountered by another user … of the service.” § 3(1).1 It is not necessary that content be “actually shared” with anyone, nor that it be actually encountered; only that it exist there and that the service provide “functionality that allows such sharing.” § 3(2)(a).
This definition is broad. It makes a “provider” of almost any person or corporation offering almost any imaginable internet-leveraging service. For example, imagine that Alice creates a website cataloging locomotives and their service histories. Alice makes the content and pays web hosting company BobCo to provide the storage, connectivity, domain name registration, DNS, and all the other boffinry involved in hosting a website. In turn BobCo, which like most modern web infrastructure companies doesn’t own physical servers, leases computing and storage resources from datacenters operated by a company we’ll call… err… Marañon Web Services. MWS peers with your ISP, which conveys the ones and zeroes of Alice’s website from MWS’ datacenter to your laptop. Alice, by operating a website, is the “provider” of a “service” (viz. the website) by means of which “content is uploaded to or shared on the service” and may be encountered. BobCo is, too, and so is MWS, and so is your ISP. And any such “service” is a “regulated” one if it “has links with the United Kingdom” and does not meet a schedule of exceptions. § 4(2).
So that’s the who. Next, briefly, the how. Ofcom may oblige providers to act by issuing to them a “notice.” Before a notice may issue, Ofcom must “obtain a skilled person’s report” and issue a “warning” of the forthcoming notice, §§ 122(6), 123, and 124, and it must “particularly consider” a laundry-list of “matters … in deciding whether it is necessary and proportionate” to oblige action, § 125. As procedural safeguards go, this would be anemic for even routine government power. Cf. 5 USC §§ 551 et seq.; Perez v. Mortgage Bankers Association, 575 U.S. 92, 95-97 (2015).
Finally, we may turn to the what. Ofcom may require a service provider to “use accredited technology” to “identify” two categories of content, and to “prevent individuals from encountering” such content that would otherwise be communicated to them “by means of” the service. The two categories are terrorism-related content that is communicated publicly and Child Sexual Exploitation and Abuse (“CSEA,” coterminous with “CSAM,” the prevalent term stateside) content whether communicated publicly or privately. § 122(2)(a). Furthermore, Ofcom may require providers to “use … best endeavours to develop or source” accredited technology in order to identify and interdict the two content categories just mentioned (publicly-communicated terrorism content or publicly- or privately-communicated CSEA content). § 122(2)(b).
Under OSB, technology is “accredited” if it has on Ofcom’s advice been blessed by the Secretary of State (answerable to both cabinet and Parliament—in theory) as “meeting minimum standards of accuracy in the detection of terrorism content or CSEA content,” §§ 126(12) and (13), and “[i]f a provider is already using accredited technology in relation to the service in question, a notice may require the provider to use it more effectively (specifying the ways in which that must be done).” § 126(2).2 I should observe that it’s unclear at what level of generality “technology” is to be so blessed, but given Section 122(2)(b)’s authorization to mandate “develop[ment]” of it, the answer almost has to be “pretty darn general.” Thus, the requirement that “technology” be “accredited” seems like thin armor indeed.
Thus, if a provider or their service has a “link” with the UK, OSB authorizes Ofcom to issue individualized mandates to “providers” of products that convey data through the internet that could be “encountered” by someone. Such mandates may oblige the provider to invent, impose, or execute policies, practices, and technologies to identify and interdict certain data payloads.
The provisions relating to terrorism payloads specify “public” conveyance. This to me (and presumably to WhatsApp et al) doesn’t threaten private messaging. The provisions relating to CSEA material, however, explicitly include both public and private conveyance. Whatever else this distinction may mean, the latter is obviously broader. While WhatsApp et al are anticipating a problem that does not yet exist, they are not speculating unreasonably. If they have customers in the UK—and boy do they—it’s hard to see how they could not qualify as “providers” of “regulated user-to-user-service[s].” If Ofcom issues them with an identify-and-interdict mandate, WhatsApp et al may find themselves in a bind: They cannot scan in-transit materials on their services because such materials are encrypted (while in transit, anyway) with keys that WhatsApp et al don’t have.
OSB has an answer to that. As we have seen, it allows mandates to go further. They can oblige “develop[ment]” of “technology” (a word that has to be understood quite generally in an OSB context) to accomplish the mandated goals. That implies that mandates can literally order providers to change the behavior of their apps and technology platforms in order to facilitate the identification and interdiction. How far will this be taken, WhatsApp et al might worry, in targeted actions (divide and rule, amirite) at the discretion of a thinly-accountable regulator reporting to a barely-accountable government?
American legal doctrine contains much wisdom that applies beyond its narrow confines. From it, I would take two points that seem relevant here.
First, there’s a difference between statutes that can operate invidiously in some circumstances and those that are invidious in (almost) all circumstances.3 While it may seem that “passing on the validity of a law wholesale … [is] efficient in the abstract, any gain is often offset by losing the lessons taught by the particular, to which common law method normally looks. Facial adjudication carries too much promise of premature interpretation of statutes on the basis of factually bare-bones records.” Sabri v. United States, 541 U.S. 600, 609 (2004) (cleaned up). Thus, “a plaintiff seeking to render a law unenforceable in all of its applications must show that the law cannot be constitutionally applied against anyone in any situation.” June Medical Services v. Russo, 140 S. Ct. 2103, 2175 (2020) (Gorsuch, J., dissenting) (emphases in original).
Second, a case must be “ripe.” “One does not have to await the consummation of threatened injury to obtain preventive relief. If the injury is certainly impending, that is enough.” Thomas v. Union Carbide, 473 U.S. 568, 581 (1985) (cleaned up). But when harm depends on “future events that may not occur as anticipated, or indeed may not occur at all,” Texas v. United States, 523 U.S. 296, 300 (1998), courts may end up entangling (and should not entangle) themselves in “abstract disagreements,” Abbott Laboratories v. Gardner, 387 U.S. 136, 148 (1967). Prudence counsels against deciding too much too soon.
Given its open texture (too open to pass muster stateside, I think) and preenactment posture, any evaluation of how OSB may be used must be “riddled with contingencies and speculation….” Trump v. New York, 141 S. Ct. 530, 535 (2020). While it’s fair to say that uncertainty as to the scope and operation of OSB’s provisions will be a hardship on WhatsApp et al, hanging over them “continuing uncertainty and expense,” see Thomas, supra, 473 U.S., at 581, it seems patent that the concerns churned up by OSB will be “clarified by further factual development.” Id., at 569.
With these considerations in mind, I suggest that British legislators would do well to ask themselves two sequential questions as they weigh the concerns raised by WhatsApp et al in considering whether to enact Section 122. First, they should ask: has it a “plainly legitimate sweep,” Washington State Grange v. Washington State Republican Party, 552 U.S. 442, 449 (2008), i.e., can we imagine plausible cases where Section 122 could meaningfully advance compelling governmental interests? If not, there is nothing to weigh against the potential drawbacks and legislators should reject it. If so, legislators should then ask whether there are any imaginable applications that would be valid. If any exist, then because Section 122 authorizes rather than requires mandates, it is possible and wise to wait and see.
Both of these questions can reasonably be answered in the affirmative. As to the first, tackling child pornography and the child abuse it implies must plainly be a legitimate target of government policy, and seeking the assistance of the channels and instrumentalities through which such materials pass must obviously be a potentially fruitful avenue of attack. As to the second, while identify-and-interdict mandates might well be unconstitutional in the United States, there is no comparable bar in Britain, where Parliament’s power is plenary. While it is predictable that Ofcom may use its new powers stupidly and even abuse them, that is necessarily contingent and speculative, and those powers are amenable to legitimate use. For example, if Ofcom learns that BobCo (our hypothetical web hosting company from earlier) hosts a CSAM website, an order to disable the website and disclose what information BobCo has on its customer (minimally, payment information, one assumes) would obviously advance a legitimate objective and is neither technically impossible nor particularly objectionable.
WhatsApp et al speculate that they will be ordered to change their apps, for all users, such that a) there is no end-to-end encryption at all, b) there is no true end-to-end encryption because the provider will hold master keys to all communications, or c) there will be end-to-end encryption but the app will at each end scan and potentially act on all data before it is encrypted for transport. Call that last option “pre-transport scanning” (“PTS”).
WhatsApp et al assume that any mandate would require them to make changes to their apps per se, changes that would be broadcast to all devices, affecting all users. That’s my assumption, too, and it’s the most likely scenario, I think. But it’s not the only one imaginable. It’s not unthinkable that an Ofcom mandate could be far narrower. For example, when Apple and the FBI stood off in 2016 over access to the San Bernardino terrorists’ iPhone, Apple characterized the FBI’s request as follows:
[T]he FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software—which does not exist today—would have the potential to unlock any iPhone in someone’s physical possession.
Whatever else you can say about that “request,” it was not a demand that Apple make a change for all users. The FBI wanted a tool that it could manually install on specific devices of which it already had physical custody. Suppose Ofcom ordered Apple to make a bespoke version of iMessage tied to a specified individual Apple ID such that when that user’s iPhone next connected to the App Store and downloaded a routine update, that user’s iMessage app and that one alone would send Apple a copy of the keys. Easy? Probably not. Impossible? Unlikely. Not something Apple wants to do, for other reasons? Obviously. But it is not unthinkable.
What is to me the more persuasive point, though, is the capacity of PTS to fulfill many imaginable mandates. Note this critical observation from The Verge: “The consensus among legal and cybersecurity experts is that the only way to monitor for CSAM while leaving messages encrypted in transit is to use some kind of client-side scanning….” And that is technically feasible. Apple let that cat out of the bag when it very publicly contemplated doing PTS for CSAM. Voluntarily! The concept was, iOS would create a one-way hash of each image file on your device and compare those hashes to a database of CSAM hashes.4 Apple changed course only after the tech press went apeshirt. People don’t like the idea that their devices will be spying on them in case other people are using other devices for something bad. To be sure, Apple objected strenuously to what they saw as the FBI’s attempt to conscript their engineers in the San Bernardino case, and the public discontent that scuppered Apple’s CSAM project was real. But Apple’s voluntary flirtation with PTS dynamited any argument that if Ofcom demanded pre-transport scanning, it would ask the impossible.
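Mechanically, the hash-matching half of such a scheme is simple enough that a short sketch can show its shape. The code below is a deliberately naive illustration of my own, using an ordinary SHA-256 digest and an in-memory set of made-up entries; Apple’s actual proposal used a perceptual hash (NeuralHash) plus cryptographic blinding so that non-matches reveal nothing, which this toy ignores.

```python
# Deliberately naive sketch of the hash-matching step in client-side
# ("pre-transport") scanning. Apple's proposal used a perceptual hash and a
# private set intersection protocol; this toy uses a plain SHA-256 digest and
# an in-memory set, purely to show the shape of the idea.
import hashlib
from pathlib import Path

# Assume the provider ships a database of hashes of known prohibited images.
known_bad_hashes = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",  # made-up entry
}

def file_hash(path: Path) -> str:
    """One-way hash of a file's bytes; the hash itself reveals nothing about the image."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def scan_before_send(path: Path) -> bool:
    """Return True if an outgoing image matches the database, i.e. should be
    flagged before it is ever encrypted for transport."""
    return file_hash(path) in known_bad_hashes
```

The hard parts, in other words, are the matching database and the policy questions, not the plumbing.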
I understand the unspoken fear actuating WhatsApp et al. They’re afraid that if they don’t now work collectively to stop OSB from becoming law, they won’t later be able to fight if they are separately targeted in wasp-sting mandates. Maybe. But while I’m sympathetic to the parade of horribles wheeled out by OSB critics, they are too speculative for me to join their criticism.
§
Notes & Queries
Last week’s newsletter would have done well to link to videos discussing Taylor Swift’s re-recording project by producer Rick Beato and lawyer James Stone.
Today's title image was created by Johannes Landin and was released under a CC BY-SA license. Our theme music was composed by B.J. Leiderman.
Cornucopia
The United States is in the grip of a heat wave. This heat wave occurs annually between July and September; for more information, please consult a smug middle-school geography teacher.
Speaking of how the internet works.
Arvin Ash notes that we don’t know that photons are truly massless. For reasons Ash notes, any photonic mass must be nearly zero, making it individually trivial. But there are a lot of photons, which conjures an obvious thought: The idea that dark matter could be partially composed of literal light is just too delicious.
Florida Governor Ron DeSantis continues to retool his campaign for the Republican nomination. It seems futile. I have no inside information, but it’s plain as day that his original theory of the race was that without Trump in it (or with a diminished Trump), erstwhile Trump voters would pivot to the next most Trumpy candidate. So he positioned himself as such. The theory bit the dust when the race began with an undiminished Trump in it. That was the moment for DeSantis to abort his entry into the race. Ego said otherwise, so we’re now back to the race we had in 2016: With few exceptions (then Fiorina, now Christie), you have Trump and a raft of second-tier candidates waiting for him to implode, hoping they will absorb his supporters, terrified of saying anything to disinherit themselves.
Both “encounter” and “content” are expressly defined in Section 237. Intuitively enough, each is defined to mean, in the broadest sense, its normal meaning. “Content” includes “anything communicated by means of an internet service, whether publicly or privately, including written material or messages, oral communications, photographs, videos, visual images, music and data of any description.” “Encounter” means “read, view, hear or otherwise experience content.”
A “requirement to use accredited technology may be complied with by the use of the technology alone or by means of the technology together with the use of human moderators.” § 122(5).
To make the latter case, “the challenger must establish that no set of circumstances exists under which the Act would be valid.” United States v. Salerno, 481 U.S. 739, 745 (1987). Thus, for example, in Salerno, “[t]he fact that the Bail Reform Act might operate unconstitutionally under some conceivable set of circumstances is insufficient to render it wholly invalid….” Id.
More likely, it would take the one-way hash of each image that iOS doubtless creates for its own purposes. A one-way hash is a brief numeric representation of an image; they work better than you’d think. The database would not be Apple’s but one maintained by the “National Center for Missing & Exploited Children,” which I assume for sake of argument is an unimpeachable source.