MDJ 2020.02.14: What a signature grants
Why we need this convoluted system
We’ve covered that notarization isn’t about Apple controlling developers, even though it does impose some controls on developers. We’ve explained the fundamentals of digital signatures, certificates, code signing, and the “Developer ID” signature that Apple now “mandates” on all new installations of macOS.
That’s in scare quotes because you can still run any code you want, but you can’t run other people’s code by double-clicking an icon the first time without a code signature. Very little software is distributed on physical media anymore, and a warning about unverified downloaded code could easily scare away customers because antivirus vendors are hyping that exact threat (and more on that next week). Therefore, no developer wants to lose a customer because Apple made the customer scared to launch the app.
Yet you can easily argue these are all pains in service of a greater good—security, trust, and verification. Code signing is an absolute win—for organizations that need to control where their code runs, for customers who know that the bundle they installed is bit-for-bit exactly the bundle that the developer released, and for Apple to keep some form of control over developers.
Whoa! One of those things is not like the others! Why does Apple get to decide what code you can run on a computer you purchased? Where does the company get off telling developers, rather than suggesting to them, what code they can and cannot write?
The last question, at least, goes back to 1984: Apple has made it clear to developers that customers using developer software on Mac computers are both the developer’s and Apple’s customers. Apple insists on its own right to make operating systems that are safe for its customers to use, both originally and after upgrades. And in the Mac OS X-and-later era, the company increasingly includes both security and privacy in its definition of “safety.” If it absolutely must use certificate revocation power to provide those benefits to users, it reserves the right to do so.
No one blames any developer if this creeps them out. Over just the past few months, the company Blix has engaged in both a public and a legal battle with Apple Inc. over its inability to get its email client, BlueMail, back into the Mac App Store. Blix, through its subsidiary Blue Mail Inc., patented a feature that optionally routes email through a Blue Mail address. The idea is that you sign up for mailing lists with the Blue Mail address, and the people on the other end never harvest your real email address. Blix says that a similar feature in “Sign in with Apple” violates their patent, so they filed suit against Apple. Shortly after that, Apple removed BlueMail (the email client) from the Mac App Store.
Blix says that Apple gave "shifting reasons" as to why BlueMail did not qualify for the Mac App Store. As of this week, however, BlueMail is back in the Mac App Store. In a statement to MacRumors, Apple said that Blix was “proposing to override basic data security protections which can expose users’ computers to malware that can harm their Macs and threaten their privacy.” In the article, Joe Rossignol writes:
Specifically, Apple says its Developer Technical Support team advised the BlueMail team to make changes to how it packages its Mac app to resolve a security and privacy warnings issue related to the app creating a new binary with a bundle ID that changes on each launch.
Whoa! An application’s bundle ID is supposed to be the identification for a program, including its developer and program names in reverse-DNS format. You can’t usually see these in the human interface, but if you Control-click on any application bundle and open the Contents/Info.plist file, you can search for its property key of CFBundleIdentifier. For Safari, it’s “com.apple.Safari”; for Word, it’s “com.microsoft.Word”. They can be as complicated as the developer wants—for instance, Xcode uses “com.apple.dt.Xcode”, where the “dt” stands for “developer tools.” The deepest one we could find on our production system was an old installer: “
Rossignol does not make Apple's statement itself available to readers. However, if his report is correct, then each time you launched BlueMail, it was writing a new file containing executable code, assigning that file an unused bundle ID (likely with a random number in it), and trying to launch it. The seal of a Developer ID or Mac App Store app contains its bundle ID, so by definition, this cannot work with code signing.
It is impossible for an existing digitally signed seal to contain a bundle ID other than the one provided in the original program’s Info.plist file. The only way to create such a signature would be if the BlueMail app itself contained Blix’s private key and included a copy of the codesign tool. That’s because while macOS has all kinds of apparatus for verifying digital signatures, only codesign creates them.
Therefore, if the report is correct, the app was trying to create unsigned code bundles to execute. This is precisely the behavior you see in adware and what Malwarebytes calls "potentially unwanted programs"—using previously unknown bundle IDs in an attempt to prevent anyone from identifying their source.
That's bad! If they have code built into the app that needs to run as a separate process, or even as a plug-in, it should be in its own file in the app bundle where it can be signed and notarized. While Mac App Store submissions do not have to be notarized, they do need to be signed, and they must pass all the same checks as notarization before being approved, with the possible exception of using the Hardened Runtime. Apple creates a different seal for Mac App Store programs that gives the same benefit as notarization without using the exact same external process.
Trying to create new executable files on the fly, rather than just including them with proper signatures, is literally how a virus behaves. But even if it were a virus, and there’s no evidence that it is, the system mechanisms still protect you. As Ben Lovejoy points out at 9to5mac.com, “[Mac malware programs] are not apps that can spread from machine to machine, installing themselves. macOS doesn’t allow unsigned apps to be installed without user permission.”
Pulling the plug
While the code signature issues do not vary much between Mac App Store and Developer ID apps, Apple is stricter with the former. Mac App Store apps must use the app sandbox; Developer ID apps are free to ignore it but do now need to work with the Hardened Runtime. As far as we know, Blix was still free to distribute BlueMail as a Developer ID app while it was absent from the Mac App Store, provided that the security issues coming to light now did not break the Developer ID guidelines as well.
When does Apple pull the plug on a Developer ID app, and how? That is, how does the company decide to revoke a notarization ticket or a Developer ID certificate to prevent code from launching? And where do they get the right? The company pointed Jason Snell to a developer support document that answers these questions.
What right? The right of the Apple Developer Program License Agreement:
When joining the Apple Developer Program and accepting the Program License Agreement, developers agree to ensure that their software is safe and secure for their users. They also agree to cooperate with Apple systems, such as the notary service, designed to help protect users from malware (e.g., viruses, trojan horses, backdoors, ransomware, spyware) or malicious, suspicious, or harmful code or components when distributing Developer ID–signed Mac software outside the Mac App Store.
It is worth everyone’s time to read this short document, but in summary, Apple will not tolerate behavior in one of three categories:
Deceptive products mislead you about what they’re supposed to do or where they came from. Installing “additional software components on a user’s system other than those clearly described” is misleading, as is selling something as the solution to a non-existent problem, or using system resources not needed to do its task (“e.g., a hidden cryptographic currency miner”) without your consent.
Sticky is our term for products Apple calls “difficult or costly to remove.” We’ve all seen these in Web pages—fake alerts that you can’t close, pages that try to keep you from navigating away, and so on. This is the native code analog and includes offenses such as pervasive ads, attempts to prevent removal, or charging to remove the software.
Anti-security products violate your privacy or the system's attempts to protect either that privacy or the system itself. Examples include changing your settings without your consent (including just about any network or certificate setting); transmitting your private information without your informed consent (like a game that wants to upload your address book but lies to you about why it wants access to your contacts); or interfering with other software in an attempt to be more in control of the system. This does not apply to a utility like Keyboard Maestro, but would absolutely apply if a game included a Keyboard Maestro-like engine that watched or generated keystrokes or input events without your informed consent.
These are broad categories, and Apple lists specific behaviors only as examples, a way to remind developers that behaviors not mentioned in the document may still violate the program guidelines. Yet it's important to note that in all the years this revocation power has existed, we can't point to a specific instance of legitimate or popular macOS software being canceled through a revoked Developer ID. The vast majority of horror stories are about programs that can't get into (or can't update themselves in) the Mac App Store, where Apple has set stricter yet equally vague rules.
Apple has painted a picture of its power to revoke a developer or notarization certificate as one it does not want to use. We’ve seen no evidence to suggest otherwise. If the company suddenly placed a heavy hand upon the revocation mechanism, you’d be sure to see a lot of lawsuits and a stronger eye from regulators in the US, the EU, and elsewhere. At least one US presidential candidate has called for the Mac App Store and App Store (iOS) to be wrested from Apple’s control and run as a separate entity, perhaps as a non-profit. This is probably impractical in a whole bunch of ways we’ll get to one day, but Apple surely does not want to invite more scrutiny by taking valid software away from its own users.
And it is vital to have that switch. BlueMail was already dinged on other platforms for sending customer email addresses and passwords to BlueMail's servers. After responding, "we do not send passwords to our servers or to any 3rd parties,” the same tests found it sending both email addresses and passwords to their servers. Current tests do not indicate the same, but the company has already said once that it didn’t do the exact thing it did, so if you choose to believe it might still be happening but in a way that’s harder to detect, we’re not going to give you much argument.
From basic code signing to Mac App Store requirements and notarization for others, this is how you build up security and verification. You may not trust Apple in any moral sense, or Blix, or any developer, but the seals and certificates tell you that every organization involved in the signature, all the way to the root certificate authority, has verified to their satisfaction that the information they signed was accurate, and the system validates the cryptographic hashes to make sure none of it has changed. Everyone benefits from this system. It is an absolute good.
And once again, we’re at our length before we’ve gotten to the fire-breathing part! If possible, we’ll finish up in a Saturday issue. If not, you’ll get it Monday morning (US time).