
The Practical and Legal Implications of Secure Boot

Daniel W. Steinbrook

Technology Strategy & Analysis

July 02, 2019


We’ve spent the last two posts describing the most basic levels of system software. Most recently, we considered how a series of boot loaders creates and enforces a chain of trust.

Consumer devices didn’t include this feature in the past: a processor will happily execute whatever code is present in memory, without stopping to check for a valid digital signature first. In this post, we’ll examine some of the legal and technical consequences of this fundamental change over the past decade.


Antitrust concerns

Added security measures often come with new restrictions, and secure boot is no exception. A hardware-enforced chain of trust interferes with a user’s freedom to run nonstandard software or otherwise customize their device.

For example, some tech-savvy users prefer to run a different operating system, or even to tinker with one of their own creation. On a device with secure boot enabled, the operating system developer needs to obtain a valid digital signature of the boot components from the device manufacturer. Thanks to public-key cryptography, the developer generally cannot compute a trusted signature themselves; only the holder of the manufacturer’s private signing key can. Without a valid digital signature, the device will refuse to execute the boot loader, and without that boot loader, the alternative operating system cannot run. Because device manufacturers can decline to provide such a signature, they can tightly control which software their devices are permitted to run.
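To make that check concrete, here is a minimal sketch of the kind of verification a boot stage performs before handing control to the next one. It is written in Python with the cryptography package purely for illustration; the function name, PEM key format, and RSA/SHA-256 parameters are our assumptions, not any particular vendor’s implementation.

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding
    from cryptography.hazmat.primitives.serialization import load_pem_public_key

    def verify_boot_image(image_bytes, signature, vendor_public_key_pem):
        """Accept a boot image only if it was signed with the vendor's private key.

        Illustrative sketch: real firmware bakes the vendor's public key (or a
        hash of it) into read-only storage and uses vendor-specific formats.
        """
        public_key = load_pem_public_key(vendor_public_key_pem)
        try:
            public_key.verify(
                signature,
                image_bytes,
                padding.PKCS1v15(),  # assuming an RSA signing key for this sketch
                hashes.SHA256(),
            )
            return True
        except InvalidSignature:
            return False

Because only the manufacturer holds the matching private key, no one else can produce a signature this check will accept, and that is exactly what gives the manufacturer its gatekeeping power.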

Many device manufacturers began enabling secure boot on new Windows 8 laptops around 2012. This led to allegations of anticompetitive practices from some European users who had grown accustomed to installing Linux on similar computers. While Microsoft is no stranger to antitrust complaints (in fact, our team got its start working on the Microsoft Internet Explorer antitrust case over two decades ago), this time the complaint was less compelling: Microsoft’s Windows 8 certification requirements already obliged manufacturers to let users disable secure boot on these machines, precisely to avoid this issue.


Copyright uncertainty

“Jailbreaking” techniques were eventually developed for mobile devices as a way to bypass the restrictions imposed by secure boot. One way to jailbreak a mobile device involves breaking the chain of trust in the boot loader sequence so that alternative software can escape that locked-down environment.

On a device where secure boot cannot be turned off, breaking the chain of trust requires that security researchers identify a mistake or oversight in a boot loader that participates in that chain. (Naturally, these researchers are generally unsanctioned by the device makers.) Finding such a vulnerability can be quite difficult, and converting it into an exploit that has the desired effect is harder still. Once achieved, however, the exploit allows later boot loader stages to be modified to, for example, load a modified version of iOS that can run applications not found in the App Store.
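The structure of the chain is what makes a single flaw so consequential. The sketch below, which reuses the hypothetical verify_boot_image function from earlier, models each stage checking only the stage that follows it; if any one check can be bypassed, every later stage runs without scrutiny. This is a conceptual model, not real firmware.

    def execute_stage(image_bytes):
        # Stand-in for handing the CPU over to the verified image; on real
        # hardware this is a jump into the loaded code, not a Python call.
        ...

    def run_boot_chain(stages, vendor_public_key_pem):
        """Boot each stage in order, verifying it first.

        stages: list of (image_bytes, signature) pairs, e.g. a low-level boot
        loader followed by the OS kernel.
        """
        for image_bytes, signature in stages:
            if not verify_boot_image(image_bytes, signature, vendor_public_key_pem):
                raise RuntimeError("untrusted boot stage; refusing to continue")
            # If a bug lets an attacker slip past the check above, the modified
            # stage now controls what gets loaded next, and nothing downstream
            # re-verifies it.
            execute_stage(image_bytes)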

Circumventing the restrictions of secure boot could be considered bypassing a technological protection measure for the software. Consequently, jailbreaking could be interpreted as being prohibited by the Digital Millennium Copyright Act (DMCA). To address this, in 2012 the Library of Congress granted an explicit, if temporary, exemption for jailbreaking. Nevertheless, since jailbreaking essentially neutralizes the security benefits of secure boot, vendors still generally discourage it and often warn that doing so will void the device’s warranty.


Hardware adaptations

Because the humans who create software are imperfect, no vendor can prevent jailbreaking simply by shipping software that is entirely free of vulnerabilities. Sometimes, however, dedicated hardware can shore up the shortcomings of software security.

Consider that smartphones perform some especially sensitive operations, like fingerprint verification and mobile payment processing. The code for these functions can be executed in dedicated, isolated hardware, which lowers the risk that it can be manipulated by a malicious operating system, even if secure boot has been circumvented.

For example, many Android devices run on ARM processors, some of which implement a trusted execution environment (TEE) called TrustZone. This subsystem runs an entirely separate set of code from the main operating system, and the TEE is even initialized with its own secure boot sequence. It is not a perfect solution, however: code running in the TEE is still subject to its own share of potential vulnerabilities.
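As a rough mental model of why such isolation helps (and emphatically not the actual TrustZone interface), the toy sketch below gives the “secure world” sole custody of the enrolled fingerprint template, while the “normal world” operating system can only ask a narrow yes-or-no question. All of the class and method names here are invented for illustration.

    import hashlib
    import hmac

    class SecureWorld:
        """Toy stand-in for code running in the isolated environment."""

        def __init__(self, enrolled_template: bytes):
            self._template = enrolled_template  # never exposed to the normal world

        def check_fingerprint(self, candidate: bytes) -> bool:
            # Real matchers compare feature vectors; a constant-time comparison
            # of hashes stands in for that here.
            return hmac.compare_digest(
                hashlib.sha256(self._template).digest(),
                hashlib.sha256(candidate).digest(),
            )

    class NormalWorldOS:
        """Toy stand-in for the main operating system."""

        def __init__(self, secure_world: SecureWorld):
            self._secure_world = secure_world

        def unlock(self, scanned_fingerprint: bytes) -> str:
            # Even a compromised OS can only ask this narrow question; it has
            # no way to read the stored template out of the secure world.
            ok = self._secure_world.check_fingerprint(scanned_fingerprint)
            return "unlocked" if ok else "try again"

The point of the design is that the interface between the two worlds is deliberately narrow, so compromising the operating system does not automatically expose the secrets held on the other side.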


Conclusion

We have now seen how a modern device initializes and authenticates its software. We also briefly examined how increased software integrity changed how people, policies, and devices interact. It’s fascinating how a fairly mechanical startup process gained a heavy dose of cryptography, and with it a measure of legal scrutiny.

As technical analysts, we’ve spent considerable time investigating (and sometimes circumventing) security mechanisms like these. If you have a need to understand any opaque software or devices, get in touch.