Last Week on My Mac: Checking code can take longer now

Many of you have commented that you too find apps and command tools can now take surprisingly long to launch. My previous analyses have demonstrated how those delays can often be attributed to security checks made on components including frameworks and dylibs, but there remains some dispute over the nature of those checks. I believe there's convincing evidence that they are prolonged by the need to recompute hashes and CDHashes, while others are adamant that they are in fact 'malware scans'. This article considers new evidence.

Inspecting and analysing the log during the launch of large apps isn't the best way to get a clear view of what happens when macOS runs its security checks. There's a great deal going on at the time, with multiple security checks being performed for TCC, LaunchServices and RunningBoard in dialogue, sandboxes and containers to set up, and iCloud services to connect. Over the last week I've been tackling this more methodically, and here I use two launches of a simple command tool to give clearer insight into what's going on when macOS runs its security checks.

Methods

The command tool in question is my blowhole, just over 200 KB of simple code to write a message to the log, hence directly telling us when it’s run. It has its signing certificate embedded into its Mach-O binary, and is notarized. However, because it’s a single binary and not an app bundle, its notarization ticket is stapled to its installer package, and not to the tool itself.

The two runs I analyse here are:

  • On a Mac Studio M1 Max running macOS Ventura 13.4.1 at 14:32 on 4 July 2023. blowhole had already been installed on that Mac, but the copy being run was in a previously unknown location, ~/Documents, which normally forces macOS to perform more extensive security checks, including an XProtect scan and notarization checks, but not as thorough as when in quarantine.
  • On a Mac mini M4 Pro running macOS Sequoia 15.4.1 at 20:07 on 2 May 2025. Although blowhole had been installed and run some weeks previously, it hadn’t been run since then, and was expected to attract more extensive security checks, including an XProtect scan and notarization checks, but not as thorough as when in quarantine.

The first of those was collected using Ulbow, and the second by LogUI. Excerpts containing the milestone entries are given in the Appendix at the end. In the diagrams below, each milestone is shown with the time in seconds elapsed since the first mention of the binary.
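
If you'd like to gather similar milestone entries yourself, here's a minimal sketch driving the log command from Swift after a run of the tool; the predicate and the five-minute window are my choices for illustration, not those used by Ulbow or LogUI.

```swift
import Foundation

// Sketch: collect recent log entries relevant to the security checks discussed below.
// The predicate and the 5-minute window are assumptions made for illustration.
let log = Process()
log.executableURL = URL(fileURLWithPath: "/usr/bin/log")
log.arguments = [
    "show", "--last", "5m", "--info",
    "--predicate",
    #"subsystem == "com.apple.syspolicy" OR subsystem == "com.apple.xprotect" OR eventMessage CONTAINS "blowhole""#
]

let pipe = Pipe()
log.standardOutput = pipe
try log.run()
log.waitUntilExit()

let output = String(data: pipe.fileHandleForReading.readDataToEndOfFile(), encoding: .utf8) ?? ""
// Keep only the milestone lines of interest.
for line in output.split(separator: "\n")
where line.contains("GK ") || line.contains("Xprotect") || line.contains("blowhole") {
    print(line)
}
```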

Ventura

First mention of the tool to be run comes from AppleMobileFileIntegrity (AMFI), which leads immediately to its daemon amfid starting its assessment of the binary, with the entry SecTrustEvaluateIfNecessary to inspect the signature and the CDHashes it contains. As this takes a mere 0.003 seconds, and the next stage starts 0.01 seconds after amfid entered the path to the binary, all that could have done was confirm the integrity of the signature, its requirements, and that its hashes were already cached.
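
Although we can't see inside amfid, the public Security framework offers a rough local analogue of this evaluation. Here's a hedged sketch that checks a binary's static signature and reads its identifier, team and primary CDHash; the path is the article's example, and this doesn't reproduce Apple's internal hash caching.

```swift
import Foundation
import Security

// Sketch: a local approximation of static signature evaluation using public APIs.
// This is not what amfid/trustd do internally; it merely checks the same signature.
let url = URL(fileURLWithPath: "/usr/local/bin/blowhole") as CFURL

var staticCode: SecStaticCode?
guard SecStaticCodeCreateWithPath(url, [], &staticCode) == errSecSuccess,
      let code = staticCode else {
    fatalError("unable to create SecStaticCode for that path")
}

// Validate the embedded signature and its internal requirements.
print("signature valid:", SecStaticCodeCheckValidity(code, [], nil) == errSecSuccess)

// Copy signing information, including the identifier, team and primary CDHash.
var info: CFDictionary?
let flags = SecCSFlags(rawValue: UInt32(kSecCSSigningInformation))
if SecCodeCopySigningInformation(code, flags, &info) == errSecSuccess,
   let cfInfo = info {
    let dict = cfInfo as NSDictionary
    print("identifier:", dict[kSecCodeInfoIdentifier as String] ?? "none")
    print("team:", dict[kSecCodeInfoTeamIdentifier as String] ?? "none")
    if let cdhash = dict[kSecCodeInfoUnique as String] as? Data {
        print("CDHash:", cdhash.map { String(format: "%02x", $0) }.joined())
    }
}
```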

syspolicyd then records the start of the Gatekeeper process assessment and initiates the Gatekeeper scan, first beginning checks on the notarization ticket, then starting the XProtect scan while ticket checking proceeds. Of those, the XProtect scan completes first, returning its results to syspolicyd at 0.054 seconds after the start.

Ticket checking involves an explicit connection to iCloud using CloudKit, with abundant log entries. The CloudKit Ticket Store is found to be reachable, and the ticket checked for the CDHashes obtained earlier from the tool’s signature.

With both checks completed satisfactorily, at 0.192 seconds the Gatekeeper scan is declared complete, syspolicyd evaluates its result, and is almost ready for the tool to run. Before that can happen, details of the executable are entered into provenance tracking. syspolicyd confirms the evaluation allows the tool to run, and AppleSystemPolicy records the evaluation result.

Sequoia

The sequence here is very similar to that in Ventura, with some significant differences, marked in the emphasised items in the diagram above.

First, there’s a substantial delay of 0.063 seconds between amfid entering the binary’s path and the start of the Gatekeeper process assessment. This started with entries from amfid recording SecTrustEvaluateIfNecessary and trustd SecKeyVerifySignature, indicating that more took place here than in Ventura. However, there’s no evidence of any external signature checks being made, and it’s most likely that the binary’s hashes weren’t cached, so had to be recomputed to verify them. That delay is far too brief for any form of malware scan to have taken place at this stage.

When XprotectService reports that XProtect is performing the malware scan, it additionally reports the location of the XProtect rules being used. That’s because Sequoia introduced a new location used for those data files, in /var/protected/xprotect/XProtect.bundle/Contents/Resources/XProtect.yara as recorded here.

Next, the XProtect scan here takes 0.126 seconds, rather than 0.030 seconds in Ventura. This is the result of the huge growth in the number and complexity of the Yara rules used for this scan over the last two years. The Ventura scan was performed using version 2168 of those rules, a Yara file of 147 KB containing around 218 rules. By version 5296, used in the Sequoia scan, the file had grown to 947 KB with about 381 rules.
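
If you want to check which version of those rules is installed on your Mac, the following sketch reads it from the bundle at the Sequoia location quoted above, assuming the version number is recorded in that bundle's Info.plist as CFBundleShortVersionString (on older macOS the bundle lives under /Library/Apple/System/Library/CoreServices instead).

```swift
import Foundation

// Sketch: report the installed XProtect data version. The path is Sequoia's;
// the CFBundleShortVersionString key is an assumption about where the version lives.
let infoPlist = URL(fileURLWithPath:
    "/var/protected/xprotect/XProtect.bundle/Contents/Info.plist")

if let data = try? Data(contentsOf: infoPlist),
   let plist = try? PropertyListSerialization.propertyList(from: data, format: nil),
   let dict = plist as? [String: Any],
   let version = dict["CFBundleShortVersionString"] {
    print("XProtect data version:", version)
} else {
    print("XProtect bundle not found or unreadable at that path")
}
```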

The size of the binary being scanned also affects scan time, although in an unexpected way. A great many of the Yara rules used include upper limits on file size, so binaries larger than a few MB are subject to only a few rules, probably intentionally. Thus, larger binary files are likely to complete their XProtect scan in a shorter time than expected, and may even finish more quickly than smaller binaries, as sketched below.
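
To see why that happens, here's a deliberately simplified model, not real Yara evaluation: any rule whose file-size condition excludes the target can be skipped before its patterns are ever matched, so a large binary faces a much smaller rule set. The rule names and limits below are invented for illustration.

```swift
// Simplified illustration only: real Yara conditions are far richer than a size cap.
struct Rule {
    let name: String
    let maxFileSize: Int?   // nil means the rule has no filesize condition
}

// Hypothetical rules; the names and limits are invented for illustration.
let rules = [
    Rule(name: "small_dropper", maxFileSize: 3 * 1024 * 1024),
    Rule(name: "tiny_stub", maxFileSize: 200 * 1024),
    Rule(name: "any_size", maxFileSize: nil)
]

func applicableRules(forFileSize size: Int, in rules: [Rule]) -> [Rule] {
    rules.filter { rule in
        rule.maxFileSize.map { size <= $0 } ?? true
    }
}

print(applicableRules(forFileSize: 150 * 1024, in: rules).count)       // 3: a small binary meets every size condition
print(applicableRules(forFileSize: 50 * 1024 * 1024, in: rules).count) // 1: a 50 MB binary is pre-filtered by most rules
```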

As a result of these two delays, Gatekeeper’s XProtect results aren’t reported until 0.247 seconds have elapsed since the start, already over 0.05 seconds longer than the whole process in Ventura. However, in this case there’s no mention of provenance tracking, and the blowhole tool is finally run after 0.3 seconds, taking just over 150% of the time in Ventura.

Summary of security checks

  • Trust evaluation and signature verification, to confirm hashes and CDHashes if not cached;
  • Gatekeeper scan, including simultaneous ticket check and XProtect malware scan;
  • CDHash ticket check online using CloudKit with iCloud Ticket Store;
  • XProtect malware scan against Yara rules;
  • Gatekeeper evaluation for syspolicyd to allow or not;
  • Result registered with the kernel;
  • Command tool run.

Note that there’s no evidence of any OCSP checks being made with the certificate authority to determine whether certificates have been revoked. Additional time will be required if hashes and CDHashes have to be recomputed, and as a result of the growing number and complexity of Yara rules.
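
One way to trigger a comparable assessment by hand is to ask syspolicyd directly through spctl, as in this hedged sketch; note that spctl's verdict on a bare command tool can differ from that for an app bundle, and it won't show the individual ticket and XProtect steps that appear in the log excerpts below.

```swift
import Foundation

// Sketch: ask syspolicyd for a Gatekeeper-style assessment of the tool.
// spctl writes its verbose verdict to stderr; exit status 0 means it was accepted.
let spctl = Process()
spctl.executableURL = URL(fileURLWithPath: "/usr/sbin/spctl")
spctl.arguments = ["--assess", "--verbose", "/usr/local/bin/blowhole"]

let pipe = Pipe()
spctl.standardError = pipe
try spctl.run()
spctl.waitUntilExit()

print(String(data: pipe.fileHandleForReading.readDataToEndOfFile(), encoding: .utf8) ?? "")
print("exit status:", spctl.terminationStatus)
```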

Appendix: Log Milestones

Times are given in seconds, adjusted to a start of 0.0. The Ventura extract was obtained with log privacy disabled.

Ventura 13.4.1 2023-07-04 14:32:40 on Mac Studio M1 Max

XProtect version 2168, Yara file 147 KB, 218 rules
0.000000 AppleMobileFileIntegrity Checking in with amfid for DER co.eclecticlight.blowhole
0.001395 amfid Entering OSX path for /Users/howardoakley/Documents/blowhole
0.010608 syspolicyd GK process assessment: /Users/howardoakley/Documents/blowhole <-- (/bin/zsh, /System/Applications/Utilities/Terminal.app/Contents/MacOS/Terminal)
0.022891 syspolicyd GK performScan: PST: (path: /Users/howardoakley/Documents/blowhole), (team: (null)), (id: (null)), (bundle_id: (null))
0.023291 syspolicyd looking up ticket: {length = 20, bytes = 0xe0cad936293cea0807ec2e5193bed5f3f02dc019}, 2, 1
0.023316 syspolicyd cloudkit record fetch: https://api.apple-cloudkit.com/database/1/com.apple.gk.ticket-delivery/production/public/records/lookup, 2/2/e0cad936293cea0807ec2e5193bed5f3f02dc019
0.023554 XprotectService Xprotect is performing a direct malware and dylib scan: /Users/howardoakley/Documents/blowhole
0.053971 syspolicyd GK Xprotect results: PST: (path: /Users/howardoakley/Documents/blowhole), (team: (null)), (id: (null)), (bundle_id: (null)), {
XProtectMalwareType = 0;
XProtectSignatureVersion = 4321574501753746074;
}, version: 4321574501753746074
0.170283 syspolicyd CKTicketStore network reachability: 1, Wed Jun 21 19:33:35 2023
0.191576 syspolicyd GK scan complete: PST: (path: /Users/howardoakley/Documents/blowhole), (team: (null)), (id: (null)), (bundle_id: (null)), 4, 4, 0
0.191797 syspolicyd GK evaluateScanResult: 2, PST: (path: /Users/howardoakley/Documents/blowhole), (team: QWY4LRW926), (id: co.eclecticlight.blowhole), (bundle_id: NOT_A_BUNDLE), 0, 0, 1, 0, 4, 4, 0
0.191996 syspolicyd Putting executable into provenance with metadata: TA(917fa0aed8a1a838, 0)
0.191999 syspolicyd Putting process into provenance tracking with metadata: 1410, TA(917fa0aed8a1a838, 0)
0.192054 syspolicyd GK eval - was allowed: 1, show prompt: 0
0.192090 AppleSystemPolicy evaluation result: 17, allowed, cache, 1688477560
0.195467 blowhole Blowhole snorted!

Sequoia 15.4.1 2025-05-02 20:07:00 on Mac mini M4 Pro

XProtect version 5296, Yara file 947 KB, 381 rules
0.000 amfid Entering OSX path for /usr/local/bin/blowhole
0.063 syspolicyd GK process assessment: <private> <-- (<private>, <private>)
0.081 syspolicyd GK performScan: PST: (path: cc151acaee5bc8cd), (team: (null)), (id: (null)), (bundle_id: (null))
0.082 syspolicyd looking up ticket: <private>, 2, 1
0.082 syspolicyd cloudkit record fetch: <private>, <private>
0.121 XprotectService Xprotect is performing a direct malware and dylib scan: <private>
0.125 XprotectService Using XProtect rules location: /var/protected/xprotect/XProtect.bundle/Contents/Resources/XProtect.yara
0.247 syspolicyd GK Xprotect results: PST: (path: cc151acaee5bc8cd), (team: (null)), (id: (null)), (bundle_id: (null)), XPScan: 0,1089725382763820427,2025-05-02 19:07:00 +0000,(null),(null),file:///usr/local/bin/blowhole
0.283 syspolicyd CKTicketStore network reachability: 1, Fri May 2 17:21:52 2025
0.293 syspolicyd GK scan complete: PST: (path: cc151acaee5bc8cd), (team: (null)), (id: (null)), (bundle_id: (null)), 4, 4, 0
0.295 syspolicyd GK evaluateScanResult: 2, PST: (path: cc151acaee5bc8cd), (team: QWY4LRW926), (id: co.eclecticlight.blowhole), (bundle_id: NOT_A_BUNDLE), 0, 0, 1, 0, 4, 4, 0
0.296 syspolicyd GK eval - was allowed: 1, show prompt: 0
0.296 kernel evaluation result: 5, exec, allowed, cache, 1746212820, 4, 4, f1f7b76465d358b, 1746212820, /usr/local/bin/blowhole
0.303 blowhole Blowhole snorted!

For comparison, Catalina’s log entries are remarkably similar too.

A brief history of code signing on Macs

Mac OS didn’t require or even support the signing of apps or executable code for its first 23 years. Apple announced its introduction at WWDC in 2006, and it appeared in Mac OS X 10.5 Leopard the following year. This happened in conjunction with the release of the first iPhone, on which only code signed by Apple could be run, and may have been the first instance of an iOS feature being implemented in Mac OS.

In Mac OS, it was an Apple engineer known as Perry the Cynic, probably Peter Kiehtreiber, who claimed to have been responsible. As he told Jeff Johnson later, “I do work for Apple, and I designed and implemented Code Signing in Leopard. If you think it’s going to usher in a black wave of OS fascism, you have every right to blame me – it was, pretty much, my idea.”

Third-party developers were rightly concerned about Apple’s plans. In 2008, developers were told that “signing your code is not elective for Leopard. You are *expected* to do this, and your code will increasingly be forced into legacy paths as the system moves towards an ‘all signed’ environment. You may choose to interpret our transitional aids as evidence that we’re not really serious. That is your decision. I do not advise it.”

Despite those ominous remarks, it wasn’t until Gatekeeper was introduced in 2012 that code signing became of general importance. With Gatekeeper came the quarantine of apps downloaded from untrusted sources, and first run checks made on all quarantined apps, including ascertaining signing identity and code integrity.

Certificates

From the outset, there were two types of code signature: self-signed using an ad hoc certificate that has no chain of trust back to a root, and those using a certificate traceable back to Apple’s root certificate. While ad hoc certificates can provide a weak form of identification, almost all the value of code signing requires traceability to a certificate authority.

Apple therefore provides registered developers (who pay an annual subscription) with certificates for signing their code, but macOS doesn’t recognise certificates provided by any other authority. Certificates are also specific to their purpose: those used to sign apps for distribution outside the App Store, for example, are known as Developer ID Application certificates, and are distinct from those used to sign installer packages.

Until 2018-19, macOS stored information about valid certificates in a local ‘Gatekeeper’ whitelist database at /private/var/db/gkopaque.bundle, updated every couple of weeks. With the release of macOS 10.15 Catalina that database fell into disuse, and it wasn’t updated after 26 August 2019. Gatekeeper had started performing online checks to determine whether a certificate had been revoked by Apple, as the certificate authority, probably before El Capitan in 2015, but initially those were only performed on quarantined apps undergoing their first run. From around July 2019, in macOS 10.14.6, they were extended to include apps that had already cleared quarantine.

Checks with Apple to verify certificates are made using the Online Certificate Status Protocol, OCSP, which came under fire in November 2020, when Apple’s OCSP service failed, leaving many unable to launch apps. It was subsequently realised that those online checks weren’t encrypted and could have been exploited in man-in-the-middle attacks to identify users and their apps. Although Apple made some changes, its initial promises don’t appear to have been fulfilled.

CDHashes

When code signing was introduced, most attention was paid to the certificates it required, although Apple also stressed the importance of the cryptographic hashes in the code directory that’s actually signed. This is a data structure containing hashes for pages of executable code, resources, and metadata such as entitlements and the Info.plist property list, that are protected by the signature. The hash of each code directory is known as a cdhash, although here I’ll perversely refer to it as a CDHash for readability.

CDHashes were originally computed using SHA-1, but that was replaced by SHA-256 in macOS 10.12 Sierra, when SHA-1 became deprecated. Apps signed to work with 10.11 and earlier will therefore contain SHA-1 hashes, in addition to SHA-256 hashes if they’re intended for 10.12 and later.

Because hashes are unique and sensitive to the slightest change in the data, they have become increasingly used to check the integrity of signed apps and code, and to identify them. Make a tiny change in a signed app’s Info.plist and a CDHash check will report the error and refuse to open that app.
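
To illustrate that sensitivity, here's a small sketch using CryptoKit: flipping a single bit in some arbitrary property-list text produces an entirely different SHA-256 digest, which is why even a trivial edit breaks the corresponding slot in the code directory.

```swift
import CryptoKit
import Foundation

// Arbitrary example bytes standing in for part of an Info.plist.
let original = Data("<key>CFBundleName</key><string>Example</string>".utf8)
var tampered = original
tampered[0] ^= 0x01   // flip one bit of the first byte

func hex(_ digest: SHA256.Digest) -> String {
    digest.map { String(format: "%02x", $0) }.joined()
}

print(hex(SHA256.hash(data: original)))   // one digest
print(hex(SHA256.hash(data: tampered)))   // a completely different digest
```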

Errors detected following launch normally result in macOS crashing the app, with a code signing error.

Privileges and entitlements

From the outset, code signatures have been used by Apple to determine access to some privileges. Among those were keychain access, code injection, access to an app sandbox, and Parental Controls. Since then, they have extended to include kernel extensions and many controlled features such as the ability to use snapshot features in APFS, and even access to bridged networking in virtualisation on Apple silicon.

This was anticipated by Mike Ash in 2008, when he wrote “Perhaps initially there will be some APIs which are only available to signed applications. At some point Apple will decide that there are some areas of the system which are too dangerous to let anyone in, even when signed. Perhaps you will begin to need Apple approval for kernel extensions, or for code injection, or other such things.”

Mach-O binaries

Code signatures are suited to the app bundle structure, where they can be stored in their own folder. Single-file Mach-O executables don’t have that flexibility, but their signatures and CDHashes can be appended to the binary, or, when necessary, added in extended attributes (xattrs). Apple discourages the latter, as xattrs are prone to get stripped when transiting some file systems, so are less robust.
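
Here's a hedged sketch that simply lists whatever extended attributes a binary carries, without assuming any particular attribute names, so you can see whether a signature has been written out as xattrs rather than appended to the Mach-O file.

```swift
import Foundation

// Sketch: list extended attribute names on a file; the path is the article's example.
let path = "/usr/local/bin/blowhole"

let needed = listxattr(path, nil, 0, 0)
if needed > 0 {
    var buffer = [CChar](repeating: 0, count: needed)
    let length = listxattr(path, &buffer, needed, 0)
    if length > 0 {
        // The buffer holds a run of NUL-terminated attribute names.
        let names = buffer[..<length]
            .split(separator: 0)
            .map { slice in String(decoding: slice.map { UInt8(bitPattern: $0) }, as: UTF8.self) }
        print(names)
    }
} else if needed == 0 {
    print("no extended attributes")
} else {
    print("listxattr failed, errno \(errno)")
}
```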

Notarization

From the outset of the iOS App Store, and later that for macOS, apps provided through those stores have been signed not by their third-party developers but by Apple. That gives Apple full control over their contents and their CDHashes, and enables it to revoke an app by checking those, rather than having to revoke the signing certificate. However, as Apple doesn’t have any record of apps or code signed by developers using their certificates, it has no means of verifying those distributed outside its App Stores. This changed with the introduction of compulsory notarization in macOS Mojave 10.14 in 2018.

Although the App Store process and notarization have common objectives, of ensuring that apps and code aren’t malicious, and providing Apple with CDHashes and a copy of the app, they are also fundamentally different. Apps distributed through the App Store are reviewed by Apple, must conform to its rules, and are signed by Apple; notarized apps distributed outside the App Store are only checked for malware, aren’t required to comply with rules, and are signed by their developer.

This diagram shows the evolution of code signing on Macs, from pre-2007, through 2007-2018, to 2018 onwards. The release of macOS 15 Sequoia in 2024 now effectively blocks developers from distributing apps that aren’t notarized, by closing the simple Finder bypass that could previously be used to launch unnotarized apps.

Apple silicon

Although Apple had long maintained that users would remain able to run completely unsigned code in macOS, that too changed with the release of the first Apple silicon Macs in November 2020. All code run natively on ARM processors is required to be signed, although that could still be using ad hoc signatures, as originally allowed in 2007. Xcode, build tools and other systems for developing executable code for Macs have been modified to ensure that, when building apps and other executables that aren’t signed using a developer certificate, they are at least ad hoc signed. It’s thus well nigh impossible to build code that isn’t signed at all.
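
As a hedged sketch of that fallback, here's how an ad hoc signature can be applied and inspected with codesign; the binary path is a placeholder, and the 'adhoc' flag in the displayed output is what you'd typically look for.

```swift
import Foundation

// Sketch: apply an ad hoc signature (identity "-") to a placeholder binary,
// then display the resulting signature details.
func runCodesign(_ arguments: [String]) throws -> String {
    let process = Process()
    process.executableURL = URL(fileURLWithPath: "/usr/bin/codesign")
    process.arguments = arguments
    let pipe = Pipe()
    process.standardOutput = pipe
    process.standardError = pipe
    try process.run()
    process.waitUntilExit()
    return String(data: pipe.fileHandleForReading.readDataToEndOfFile(), encoding: .utf8) ?? ""
}

let binary = "/tmp/mytool"   // placeholder: an unsigned binary you have built
_ = try runCodesign(["--sign", "-", "--force", binary])
print(try runCodesign(["--display", "--verbose=2", binary]))   // flags=0x2(adhoc) indicates an ad hoc signature
```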

Ad hoc signatures are also used in codeless apps such as Web Apps introduced in macOS Sonoma in 2023. These provide a property list defining the app’s scope in terms of its domain URL, its Home page within that, and an icon to use. LaunchServices registers them against a UUID, applies an ad hoc signature, and keeps a record of the app bundle’s CDHashes from that signature, against which to validate its contents before trying to run it in the future.

Abuse

Before notarization became widespread, some malicious software was signed using Apple Developer ID Application certificates. Obtaining a developer ID on the ‘black market’ hasn’t been costly or difficult, but until recently relatively few have gone to the trouble. This may be the effect of certificate revocation: once signed malware has had its certificate revoked, it’s dead and can’t be run on a Mac again, while ad hoc and unsigned malware has been harder to block.

With notarization more strictly enforced, particularly in macOS Sequoia, there have been occasional malicious apps that have slipped through Apple’s checks and been notarized. This is more demanding, and requires techniques to obfuscate code to evade detection. Even when successful, the lifetime of notarized malware is likely to be short, and for most it isn’t worth the effort.

Conclusion

Seventeen years ago, Mike Ash expressed his concern that code signing would lead to Apple having to approve the apps we can run on our Macs. Although in certain respects Apple does control what our apps can do, we can still run many apps that I’m sure wouldn’t meet its approval, and code signing has played an important role in preventing those that are malicious. Things could have been far worse.

References

Mike Ash, Code Signing and You, 7 March 2008, an invaluable contemporary summary
Apple’s Code Signing Guide, last updated 13 September 2016
Apple’s Inside Code Signing series:
TN3125 Provisioning Profiles
TN3126 Hashes
TN3127 Requirements
TN3161 Certificates

I’d like to acknowledge the help of Jeff Johnson of Underpass App Company in providing information for this brief history, although all errors are mine alone.

How macOS Sequoia launches an app

Each new version of macOS has increased the complexity of launching apps, from the basics of launchd, through the addition of LaunchServices, to security checks on notarization and XProtect. This article steps through the major landmarks seen when launching a notarized app that has already passed its first-run checks and is known to macOS Sequoia 15.3, on an Apple silicon Mac.

Rather than trying to provide a blow-by-blow account of what’s written in the log over the course of thousands of entries, I’ve extracted landmarks that demonstrate when each subsystem gets involved and its salient actions. These have been gleaned from several similar app launches, and are ultimately timed and taken from one complete record of one of my simpler notarized apps, which has no entitlements and uses only basic AppKit features. The app hadn’t been through quarantine, as it had been built and notarized on the same Mac, and had been run previously, although not since the previous boot. It had thus already been registered with LaunchServices and other subsystems. The host was a Mac mini M4 Pro, so timings should be briefer than on many other Macs; the app was run from the main Applications folder on the internal SSD, and Apple Intelligence (AI) was enabled.

LaunchServices and RunningBoard

LaunchServices has been around for many years, and handles many of the tasks exposed in the Finder, including mapping of document types to app capabilities, Recent Items and Open Recent lists, making it the backbone of app launching. RunningBoard was introduced in Catalina and has steadily assumed responsibility for managing resources used by apps, including memory and access to the GPU. Although the test app doesn’t have any of its resources managed by RunningBoard, LaunchServices launched it through RunningBoard.

RunningBoard’s first task is to create a job description, which it helpfully writes to the log as a dictionary. This is a mine of useful information, and has replaced the copious information compiled by LaunchServices in the past. This includes:

  • a dictionary of Mach services
  • whether Pressured Exit is enabled
  • a full listing of environment variables, such as TMPDIR, SHELL, PATH
  • RunningBoard properties including another TMPDIR
  • whether to materialise dataless files.

Once that job description has been constructed for the app, RunningBoard tracks the app and its assertions, providing a detailed running commentary through the rest of the app’s life. LaunchServices still performs its traditional tasks, including creating an LSApplication object and sending an oapp AppleEvent to mark the opening of the app, and launchd still reports that it’s uncorking exec source upfront.

When the app is running, its preferences are loaded via the user instance of cfprefsd, and its pasteboard is created. Almost 0.1 seconds later (0.3 seconds after the start of launch) there’s a sustained flurry of log entries concerning Biome, and signs of AI involvement (Apple silicon only). The latter include a check for the availability of generative models and WritingTools. There are also entries referring to the loading of synapse observers.

LaunchServices log entries are readily accessed through its subsystem com.apple.launchservices, and RunningBoard through com.apple.runningboard.

Security and privacy

The first serious engagement in security is the verification of the app’s signature and its evaluation by Apple Mobile File Integrity (AMFI, using amfid). Shortly after that comes the standard Gatekeeper (GK) assessment, with its XProtect scan, starting less than 0.1 second after the start of launch. Immediately after the start of that scan, XProtect should report which set of data files it’s using. In Sequoia those should be at /var/protected/xprotect/XProtect.bundle/Contents/Resources/XProtect.yara. That scan took just over 0.1 second.

While XProtect is busy, syspolicyd checks the app’s notarization ticket online, through a CloudKit connection with the CKTicketStore. That’s obvious from log entries recording the network connections involved, and the complete check takes around 0.05 second. Once that and the XProtect scan are complete, syspolicyd reports the GK scan is complete, and evaluates its result.

At about the same time that the Gatekeeper checks are completing, privacy management by TCC (Transparency, Consent and Control, in tccd) is starting up. Its initialisation includes establishing the Attribution Chain for any Mach-O binaries run by the app, so that TCC knows where to look for any required entitlements. Following that, TCC writes bursts of entries as different components, such as the Open and Save Panel service, are set up for the app.

The final phases of security initialisation come in provenance tracking, which first appeared in macOS Ventura. This may be associated with presence of the extended attribute com.apple.provenance, but details are currently sketchy.
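
A hedged sketch for checking that xattr's presence is given below; its contents are opaque, so this only reports whether it exists and how large it is, and the app path is a placeholder.

```swift
import Foundation

// Sketch: report presence and size of the com.apple.provenance extended attribute.
let appPath = "/Applications/Example.app"   // placeholder path

let size = getxattr(appPath, "com.apple.provenance", nil, 0, 0, 0)
if size >= 0 {
    print("com.apple.provenance present: \(size) bytes")
} else {
    print("no com.apple.provenance xattr (errno \(errno))")
}
```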

Following syspolicyd in the log is best done through its subsystem com.apple.syspolicy; you can watch XProtect using com.apple.xprotect, and TCC using com.apple.TCC.
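
For watching all of these subsystems live while you launch an app, here's a minimal sketch using log stream; the choice of subsystems in the predicate is simply assembled from those named above.

```swift
import Foundation

// Sketch: stream live log entries from the subsystems involved in app launch.
// Run this, then launch the app of interest; interrupt with control-C when done.
let predicate = """
subsystem == "com.apple.launchservices" OR \
subsystem == "com.apple.runningboard" OR \
subsystem == "com.apple.syspolicy" OR \
subsystem == "com.apple.xprotect" OR \
subsystem == "com.apple.TCC"
"""

let log = Process()
log.executableURL = URL(fileURLWithPath: "/usr/bin/log")
log.arguments = ["stream", "--info", "--predicate", predicate]
try log.run()
log.waitUntilExit()
```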

Overall

Downloadable PDF: applaunch153

Main landmarks with elapsed time in seconds:

  • 0.000 Finder sendAction
  • 0.023 LaunchServices, launch through RunningBoard
  • 0.029 RunningBoard launch request
  • 0.043 AMFI evaluate
  • 0.066 Gatekeeper assessment
  • 0.080 XProtect scan
  • 0.085 check notarization ticket
  • 0.187 TCC checks
  • 0.204 launched
