When last we were virtually together (MDJ 2020.02.05), we explained how a digital signature for a macOS app is just a blob of data—cryptographic hash values for everything in an application’s package or file—encrypted with a private key. That key is paired with a public key in a certificate, signed by an authority that is almost always Apple Inc.
When it’s time to verify the app’s integrity, the operating system uses the public key in the certificate to decrypt the data, confirming that it was encrypted by the private key holder. The system then compares the decrypted data to newly computed cryptographic hash values for the same files, using the same algorithm, and confirms they are identical. If all the cryptography is valid, this process confirms that none of the files whose hashes were computed have changed at all since the signature was created. In other words, if the code was not infected with malware when it was signed, it’s not infected with malware now.
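If you're curious, you can ask macOS to perform that verification on demand. A minimal sketch using the codesign tool, assuming a hypothetical app at /Applications/Example.app:

```
# Verify the seal: recompute the hashes and compare them against the signed values.
codesign --verify --deep --strict --verbose=2 /Applications/Example.app

# Show who signed the app and which certificate chain anchors the signature.
codesign --display --verbose=4 /Applications/Example.app
```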
Apple calls this blob of data the seal, because like the seal on a rolled-up document, as long as it’s intact (cryptographically verified in this analogy), no one has tampered with the enclosed data. It’s more secure than a wax seal because wax can be melted and reapplied. That’s why notaries public use embossing seals that physically alter documents, often by puncturing them with tiny pinpricks in the pattern of a seal.
If you want to be a notary public, you must meet specific integrity requirements (that vary by jurisdiction), and get some training on the job's legal requirements. When a client asks you to notarize a document, you must either attest that you personally know and vouch for the identity of the client or attest that you reviewed state-approved identification that proves the client’s identity. You must then witness the client signing the document in question and record such in your notary log, then emboss the document with your notary seal containing your notary license number.
This weird process is a system of identity verification. The notary public’s name and identity are proven to the state when they obtain the notary license. The client’s name and identity are proven to the notary when the document is signed. Therefore, a notarized document is an attestation to all in that jurisdiction that the person who signed the document is, in fact, the person whose signature is on the document. The signature belongs to the person; the person is the author of the signature. There's no guessing it might have been forged—the notary saw the client sign it. (Some jurisdictions allow notaries to ask for approved ID that already contains a signature, just in case they want to compare it to what the client creates on the document.)
That’s the sense of trust Apple hopes to provide by naming its current level of verification notarization. It's not an exact analog, but neither are digital signatures to physical signatures. The idea is that notarized code is more trustworthy than code that's just signed—even though, digitally speaking, neither process creates trust, and Apple is the trust authority behind both processes.
Sign here
Apple calls the perhaps-simpler kind of code signature a Developer ID application because it calls the Apple-issued certificate (the one holding the public key that pairs with the developer's private key) a Developer ID. For the most part, developers don't have to go through Keychain Access to generate certificate requests, as the Xcode development environment handles the details for most kinds of programming. But if your company also uses its own certificates and wants to use them to sign your code, you'll have to go through the manual process in addition to the Xcode-managed one. As Apple explains in its Code Signing Guide:
Apple uses the industry-standard form and format of code signing certificates. Therefore, if your company already has a third-party signing identity that you use to sign code on other systems, you can use it with the macOS codesign command.
Similarly, if your company is a certificate issuing authority, contact your IT department to find out how to get a signing certificate issued by your company. However, while these valid certificates allow you to sign your code, you can only distribute through the App Store or the Developer ID program when you sign with a certificate issued by Apple.
(Let’s get one thing out of the way: the command-line tool for signing code is called codesign. It's the Unix way. Your correspondent always reads those eight letters on the first encounter as "co-design," and wishes the tool was named signcode, but it’s not and that’s that.)
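For the record, signing with a Developer ID from the command line looks roughly like this; the certificate name, team ID, and path are placeholders, and Xcode normally handles all of this for you:

```
# Sign the app with a Developer ID certificate from the keychain, adding a
# secure timestamp so the signature can outlive the certificate's expiration.
codesign --sign "Developer ID Application: Example Corp (ABCDE12345)" \
         --timestamp \
         /path/to/Example.app
```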
In other words, Apple requires that code be signed with an Apple-issued certificate to pass Gatekeeper protections. You can still run code that doesn’t do this! You’ll just have to manually open it with Control-Click in the Finder (or add it to the Security & Privacy preference pane) if it’s quarantined, such as when first downloaded or installed. In effect, Apple uses a specific system security policy to say, "We'll let you run code without jumping through a hoop if it's signed with a certificate that we issued. If we can't revoke that certificate when necessary, we're going to make you jump through a hoop, so you know we don't recommend this."
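You can watch both halves of that policy from the Terminal. A sketch, assuming a freshly downloaded (and therefore quarantined) copy of a hypothetical Example.app:

```
# Ask Gatekeeper whether it would let this app launch without any hoops.
spctl --assess --type execute --verbose /Applications/Example.app

# Check whether the app still carries the quarantine attribute set by the
# browser or other app that downloaded it.
xattr -p com.apple.quarantine /Applications/Example.app
```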
![](https://substackcdn.com/image/fetch/w_1456,c_limit,f_auto,q_auto:good,fl_progressive:steep/https%3A%2F%2Fbucketeer-e05bbc84-baa3-437e-9518-adb32be77984.s3.amazonaws.com%2Fpublic%2Fimages%2F8b712abf-c580-4b7d-899d-eb7ae9902c02_739x620.png)
The above figure shows the different kinds of certificates that Apple issues for code signing. The Apple Development certificate lets you futz around with your app on your machines all you want without worrying about Gatekeeper or other distribution security policies. However, once you start to share your code with others, you need an Apple Distribution certificate. The development certificate is limited to machines within your company or team ("team" being a separate concept in Apple's developer world, so that multiple product groups can work within the same company).
Anything signed with your team’s development certificate runs on your team’s machines, but not on any machines where that certificate is not installed and trusted. You can’t get around it by distributing your certificate, because Apple requires you keep it private to your team and can and will revoke the certificate if you don’t.
This may make some sense if you think about a company like Adobe that has at least two dozen separate Mac apps for Creative Cloud and other projects. Adobe management may not want the team that makes Audition to be able to create code whose signature says “the Adobe Photoshop team,” and vice-versa.
Adobe can get around internal problems by issuing its in-house certificates, requiring them to be installed and trusted on all development and test machines, and using them to sign development builds. If your keychain contains a trusted certificate that matches the key used to sign an app, Gatekeeper should let you launch an app signed this way without jumping through any hoops. It probably doesn’t, but the details on this could send you down the rabbit hole for a week or more.
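Developers who juggle several of these identities can see which ones are installed and valid on a given machine with Apple's security tool. A sketch; the identities in the comment are hypothetical:

```
# List the code signing identities available in the current keychains.
security find-identity -v -p codesigning

# Typical output lines look something like:
#   1) 1A2B3C... "Apple Development: Jane Developer (ABCDE12345)"
#   2) 4D5E6F... "Developer ID Application: Example Corp (ABCDE12345)"
```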
Designated requirements
What you might call "needlessly complicated," others would call "flexible." Lone developers can find all this too much, but huge companies that make big software for big constituencies might find it barely adequate. It's long been fashionable in Mac circles to rag on Microsoft Word as "bloatware" that has far too many functions for anyone to need, leading to the familiar "selling point" of new text editors that "it's just for text, not all that other stuff." No one likes to point out that Word's features did not evolve on their own while no one was looking. Customers asked for them. The macro language you sneer at might allow a paralegal to properly format hundreds of pages of briefs in 30 minutes instead of six hours. Embedding Excel objects in a Word document seems beyond stupid to a fiction author, but it helps a company down the street send updated documents to investors without cut-and-paste errors, and so on.
We mention this because the next bit of flexibility has caused lots of complications.
As part of a code signature, developers can include both requirements and restrictions. The requirements start with the obvious, such as “this code needs to verify with every signature matching a valid certificate.” Most of the requirements are just the code’s seal—hash values and certificate specifics that must match—but some can be more elaborate, like “this code must validate against a trusted certificate from my company with these exact values, and is a Perl script that requires Apple’s Perl interpreter, not a different one.” You can read about the Code Signing Requirement Language (it’s not much of a language) here.
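You can see what a requirement looks like in practice, because codesign will print the one embedded in any signed app. A sketch with a hypothetical app; the requirement shown in the comment is illustrative, not an exact transcript:

```
# Print the requirements (including the designated requirement) from the signature.
codesign --display -r- /Applications/Example.app

# A Developer ID app's designated requirement typically reads something like:
#   designated => anchor apple generic and identifier "com.example.app"
#     and certificate leaf[subject.OU] = "ABCDE12345"
```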
Restrictions are other parameters encoded into the seal, usually in the form of a property list, that tell the operating system not to allow some kinds of access. The most well-known of these restrictions is the Application Sandbox. The term sandbox first became popular when Java (the programming language, not the island) was going to rule the world and refers to the idea of an enclosed space in which it’s safe to play. Whatever happens inside the sandbox stays there; destroying the sandbox affects nothing in the larger world. Apps that run in sandboxes can’t delete arbitrary files on disk, access your personal information, or even send information over the network.
That would make for very dull programs outside of "Learn Swift" classes, so programs may breach the sandbox’s walls if they are entitled to do so. Those permissions are the oft-discussed sandbox entitlements, and they’re free for the asking. For example, the newest versions of Microsoft Word support dictation—speak into a connected microphone and Word can transcribe your text. But Word is sandboxed, so inside its code signature, Microsoft has included the com.apple.security.device.audio-input entitlement. It supersedes the older com.apple.security.device.microphone entitlement for apps like Word using the Hardened Runtime (more on that later).
Why bother with a sandbox if a program can escape it just by asking? It's still security. Programs only ask for those entitlements they need. If the app is somehow compromised and tries to reach beyond the sandbox, it fails when it tries to access resources the app did not need. In the same example, Word requests access to your Address Book because it prints labels (in fact, we first got the prompt to allow access to Address Book when we picked the "Labels…" menu item). However, Word does not need access to your calendars and accordingly does not include the com.apple.security.personal-information.calendars entitlement. If a future exploit compromises Word, it might be able to gain access to your contacts. It won’t be able to see your events.
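You can inspect those entitlements yourself, because they're part of the seal. A sketch, assuming a standard installation of Word; the exact keys vary by version:

```
# Dump the entitlements property list embedded in Word's code signature.
codesign --display --entitlements :- "/Applications/Microsoft Word.app"

# Among the keys you would expect to see:
#   com.apple.security.app-sandbox
#   com.apple.security.device.audio-input
```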
Sandboxed programs can't access anything that requires an entitlement unless they include that entitlement. They're restricted to reading and writing files in their own container (a part of the disk created for them with read-write access) so they can't drop things in a user's (or the system's) Library folder without permission. They create documents only in the Documents folder unless you permit them to write a file elsewhere by choosing a different location. They can't even ask to use the printing or Bluetooth subsystems without the appropriate entitlements encoded into the code's seal.
That was a good step, though hard on developers of advanced applications. (There are lots of issues, for example, with applications that host plug-ins. How they can load those plug-ins and verify that they are correctly code-signed is a bucket of ice water that we won’t be dumping on our head in this series.) But attackers keep getting smarter, and the defenses must rise to meet them. Starting with macOS 10.14 (Mojave), Apple includes an even stricter environment for code called the Hardened Runtime environment. Available in 10.14 and later when System Integrity Protection (SIP) is enabled, the Hardened Runtime removes several powerful programming techniques from applications because those techniques are mostly the domain of the OS itself.
For example, the Intel processors in every Mac can mark parts of memory intended for data as “non-executable,” meaning that the processor simply refuses to transfer control to code in that area of memory. That’s a powerful tool against code exploitation. If an attacker finds a bug that lets him copy 8 KB of data into a buffer meant to hold “email address,” he can put his code in part of that field and trick the system into calling it—but not if the system refuses to run any code in an area marked as data.
“Why don’t we turn that on for everyone?” you may cry. Well, we can’t. For starters, it would kill all the web browsers, which have to download JavaScript code, compile it, and then execute it. If the system enforced this execution protection on all apps, no app could compile code. Also, it’s hard to predict which applications are doing things like this, or whether they’re using development tools capable of making the distinction between “a buffer of downloaded data” and “a new block of memory to contain code I'm compiling now."
Therefore, in the same vein as sandboxing, the Hardened Runtime turns on these kinds of features unless applications say they need them disabled. As with sandboxing, apps must opt into the Hardened Runtime environment or, by default, they’ll be loaded only with the more lenient restrictions they’d already chosen in their seals. Sandboxed apps do not get Hardened Runtime by default, but must be modified to get it.
Under the Hardened Runtime, apps may not create writable and executable memory, with or without using the MAP_JIT flag, unless they specifically request it. (Developers can still request the ability to make completely unprotected writable and executable memory, and interestingly enough, must do so if they need the DVD Playback framework.) Hardened Runtime apps do not get access to DYLD environment variables that can alter how the system loads their code, nor can they disable the code signing protections for their own or hosted code. And by default, they get no access to audio input, cameras, location, contacts, calendars, photos, or even Apple events. Yet they can do all of these and more under Hardened Runtime if they tell the system they need these capabilities.
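Opting in happens at signing time. A sketch, with placeholder names, for a hypothetical app that compiles and runs code on the fly and therefore needs a JIT exception:

```
# Sign with the Hardened Runtime enabled (--options runtime), a secure
# timestamp, and an entitlements file requesting specific exceptions.
codesign --sign "Developer ID Application: Example Corp (ABCDE12345)" \
         --options runtime \
         --timestamp \
         --entitlements Example.entitlements \
         /path/to/Example.app

# Example.entitlements might request, among other exceptions:
#   com.apple.security.cs.allow-jit   (writable-and-executable memory via MAP_JIT)
```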
Notarize Public
Once an app is signed with the proper “Developer ID” certificate obtained from Apple and successfully works with the Hardened Runtime, it is finally time to submit it to Apple for notarization. This is both the easiest and most frustrating step for many developers.
It's easy for most developers because, for simple applications that use the Hardened Runtime, Xcode takes care of most of the details. After signing everything in the bundle properly and in the right order with cryptographic timestamps (another security measure), Xcode uses the command-line utility altool to submit the package to Apple’s notary service. The service checks all the signatures just as Gatekeeper and macOS itself would, and then it examines the binary code in the package—not just applications but plug-ins, command-line tools, helper apps—all of it.
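The submission Xcode automates boils down to roughly the following; the bundle ID, Apple ID, and file names are placeholders, and the password is an app-specific password stored in the keychain:

```
# Zip the signed app and hand it to Apple's notary service.
ditto -c -k --keepParent /path/to/Example.app Example.zip
xcrun altool --notarize-app \
             --primary-bundle-id "com.example.app" \
             --username "developer@example.com" \
             --password "@keychain:AC_PASSWORD" \
             --file Example.zip
```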
Apple says only one thing about this process: it checks for malware. That’s huge on its own, as we’d be willing to bet something embarrassing that Apple’s database of Mac malware is far, far more comprehensive than anything that antiviral software makers have. Apple could theoretically test against every security bug that it has fixed in the last 3-4 years and see if any submitted code is trying to use those bugs as exploits. Apple gives no statistics on what bad stuff notarization has found, but we’re comfortable guessing that, worldwide, it’s a lot. Like enough to scare any of us.
Second, we'd also bet that Apple is doing more analysis on the code for statistical purposes. This has let Apple reap considerable benefits in the past few years in iOS, and macOS could use the same. Binary analysis could reveal which coding patterns are the most popular, and therefore the best targets for optimization in the next version of the Objective-C and Swift compilers (or in the linker).
It could see how often developers use older APIs when it's time to consider either updating or deprecating them in a newer, more secure version of the OS. We know that Apple does search binaries for private APIs, including system programming interfaces (SPIs), and rejects those uses in the Mac App Store, so we'd imagine that if notarization doesn't do that now, it will someday. (SPIs contain good stuff, but they were neither designed nor tested outside of specific contexts not available to outside programmers. They are inherently dangerous, and they're private for good reason, no matter how frustrating that is.)
If Apple can build a sufficiently fast Intel simulator and run programs through simulated use, the notary service could use the data gathered from these thousands of packages as readily to improve the system as to mess with developers—if that’s what it chose to do. Since Apple now requires applications to be notarized to launch without obstacles, developers have a right to wonder what Apple is doing with their binaries—but also, we feel, should give the company the benefit of the doubt until there’s evidence Apple uses the data for anything but improving the OS for everyone.
Whatever happens during notarization besides validating signatures and checking for malware, Apple returns data to the developer containing either errors upon failure, or a log of potential issues (if any) and a ticket that is stapled to the submitted bundle, binary, or installer package. The ticket is a seal (digital signature) from Apple, using Apple's own certificates and signed with Apple's private key. Stapling just embeds the signature into the item that was notarized, usually the same way the code signatures are embedded. If a full ticket can't be embedded, the item gets a URL that Gatekeeper and macOS can use to download the ticket for verification.
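Once the ticket exists, stapling and checking it are one command each. A sketch with a placeholder path:

```
# Attach the notarization ticket to the bundle so Gatekeeper can verify it
# even when the machine is offline.
xcrun stapler staple /path/to/Example.app

# Confirm that a ticket is present and valid.
xcrun stapler validate /path/to/Example.app
```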
If the product in question is more complicated than standard Xcode output (bundle, tool, installer package), or if developers have more complex needs, they can call altool directly to manage their submissions and receipts from Apple’s notary service.
Either way, though, their biggest complaint is that the whole process is asynchronous. You need to notarize any code before releasing it, even to beta-testers, but once it's submitted, you don’t know when you’ll get a response. Apple currently says that most submissions get results back within 15 minutes, but 15 minutes is a long time to wait when you want to get a test build out to a customer with a bad bug, especially if you’re trying to iterate. Even instant code changes could be limited to four builds per hour if you have to notarize each one.
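The asynchrony shows up as polling: the submission returns a request UUID, and the developer (or a build script) has to keep asking until the verdict arrives. A sketch with a placeholder UUID and credentials:

```
# Ask the notary service whether a given submission has finished, and how.
xcrun altool --notarization-info 12345678-aaaa-bbbb-cccc-1234567890ab \
             --username "developer@example.com" \
             --password "@keychain:AC_PASSWORD"
```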
There are ways around it, of course, but it’s jumping through hoops. It’s an extra step during a build for any release that takes an unknowable amount of time. Developers like to spend their time creating features that will inspire customers to buy new versions, not on waiting for Apple to permit them to launch their own apps.
Why do users care?
It’s easy enough to see the benefits of code signing and notarization, but why both? Do users really care about Apple’s extra malware scan? Probably not, though they should. It’s a layer of protection beyond the Developer ID signature, and it doesn’t cost the user any time (unlike the developer).
"Wait a minute," you may object, "we were sold on this whole code signing thing as a way to make sure applications were verified and not compromised. Shouldn't Apple just revoke the code signing certificate if a developer's app is compromised?" Yes, Apple should do that, but they made it too complicated. Developers are encouraged to use a single Developer ID certificate for all their macOS code (and another single iOS certificate for all iOS programs, although that's a different thing since those have to go through the App Store). Revoking that developer certificate would prevent launching any of that developer’s programs, not just the one found to be infected.
This stopped being theoretical in 2017, when a compromised version of HandBrake managed to steal source code from one of Panic’s developers, possibly including the private keys to Panic’s Developer ID certificate. To guard against the possibility of attackers building compromised versions of Panic software bearing valid code signatures, Panic and Apple revoked the Developer ID certificate for all of Panic’s signed apps.
That was the right step then and now—if a Developer ID key is compromised, its certificate must be revoked to prevent malware. However, let’s take the more insidious example—a new strain of malware infects a developer’s machine and builds the malware into the application, rather than trying to inject it at some point after release. The app with malware included could get a valid code signature, pass notarization if Apple had not seen this or similar malware before, and get out into the wild.
When that bit of nastiness is uncovered, Apple can stop people from launching the compromised app by revoking the notarization ticket. Without notarization, the mitigation would again require revoking the Developer ID certificate, requiring all of that developer's customers (even those who had not upgraded to a new version) to obtain a new binary signed with a valid Developer ID. That’s a nightmare for everyone involved.
If any trusted certificate used in code signing or notarization is revoked, every app signed with that certificate, anywhere, sees its signature invalidated. If you're in a large company with its own certificates, the company can sign with those certificates and revoke them if there's an internal problem. Apple could theoretically revoke one of its own root certificates if it were somehow (almost unimaginably) compromised, rendering the threat moot to all machines connected to the internet. You'd need physical access to compromise a machine not connected to the internet, so between the two conditions, that should be all machines. (Side note: this is why we hate reports that say something like, "The person may or may not take this action." We knew that before the report. "May" and "may not" are the entire set of possible actions. The report has added no information.)
One more look
We’ve run way over our size allotment in this delayed issue, so we’ll return one more time to the topic for the fire-breathing finale on why all of this has been such a giant pain for every person involved.