How to Build Unsigned macOS Apps

Sometimes you have to build a project for customers whose IDs are under their control, without having access to their signing assets. This is how.

Our past workflow for delivering app builds to customers involved a back and forth of signing credentials. Mostly we used development signing assets: on the one hand we could sign the deliverables ourselves, and on the other hand the customers were not comfortable granting us access to their developer team or adding us as developers on a specific app so we could upload builds. The solution that freed us from that dependency and simplified the process was to provide unsigned builds.

As our builds are created through GitLab CI using xcodebuild on the command line, we have to provide the CODE_SIGNING_REQUIRED and CODE_SIGNING_ALLOWED build settings.

/usr/bin/xcodebuild archive \
    -workspace 'Example.xcodeproj/project.xcworkspace' \
    -scheme 'Example' \
    -archivePath 'Example' \
    TEAM_ID='' \
    DEVELOPMENT_TEAM='' \
    CODE_SIGNING_ALLOWED='NO' \
    CODE_SIGNING_REQUIRED='NO' \
    CODE_SIGN_IDENTITY='' \
    CODE_SIGN_STYLE=''

I left out some arguments in the example above for clarity. The resulting Xcode archive will contain an unsigned build. Also, xcodebuild will not fail during the build because of possibly missing signing assets, which can otherwise happen when you build an app involving app and group IDs that are not under the control of your own Apple developer account.
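
To double-check the result, the app inside the archive can be inspected with codesign. A minimal sketch, assuming the invocation above produced Example.xcarchive in the current directory (adjust the path to your product name):

# For an unsigned app, codesign reports "code object is not signed at all".
/usr/bin/codesign --display --verbose=2 \
    'Example.xcarchive/Products/Applications/Example.app'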

Security Concerns

Disclaimer: Even though I am always interested in improving my skills and extending my knowledge, my expertise naturally focuses more on software development than on security. After all, I am a software developer, not a cybersecurity expert, and by far not an all-knowing being. So take everything after this disclaimer with a grain of salt.

Theoretically, the build and delivery of unsigned builds enables malicious and covert tampering under specific circumstances. Apps are built on some build machine of the providing organization, then have to be delivered through whatever file transfer method, and are finally signed on some machine of the customer. Until the customer has signed the app with their own signing assets, the product is unprotected against malicious tampering. If the providing organization's infrastructure is well secured and the file delivery is under its control, the problem is restricted to the point at which the customer takes over the deliverable. That reduces the attack surface but does not eliminate it completely. If the customer's devices involved in the signing process are compromised, the code can still be tampered with. This is no longer a problem for the providing organization but for the receiving party and the end users of the app.
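
To illustrate that handover, signing and verifying the received app on the customer's side could look roughly like this. This is a sketch, not our customers' actual procedure; the Developer ID identity name and app path are placeholders, and apps with nested frameworks may require signing the nested code first.

# Sign the received app with the customer's own identity (placeholder name).
/usr/bin/codesign --force --options runtime \
    --sign 'Developer ID Application: Example Customer (TEAMID1234)' \
    'Example.app'

# Check the resulting signature.
/usr/bin/codesign --verify --strict --verbose=2 'Example.app'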

On the other hand: depending on the degree of intrusion on the providing organization's side, signing would not help either. If an attacker gains access to the build machine of the organization providing the builds, any signing assets available there may be exposed, too. That would also enable malicious tampering, including re-signing the altered code.

It is all about securing the process. Preliminarily signing the deliverable, even if only with a development certificate, is not a solution in itself but another factor that makes it harder for attackers to meddle with the code of a deliverable.
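
For comparison, such a preliminary development signature mostly amounts to not disabling signing in the CI build. A rough sketch, assuming the providing organization's own team can provision the app's identifiers; the team ID below is a placeholder, and -allowProvisioningUpdates lets xcodebuild manage profiles automatically:

# Archive with the provider's development signing instead of no signing.
/usr/bin/xcodebuild archive \
    -workspace 'Example.xcodeproj/project.xcworkspace' \
    -scheme 'Example' \
    -archivePath 'Example' \
    -allowProvisioningUpdates \
    DEVELOPMENT_TEAM='DEVTEAM123' \
    CODE_SIGN_STYLE='Automatic'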

About The Author

Peter Thomas Horn is a professional software developer at Open-Xchange specializing in the Apple platform. He previously worked for a decade across the full stack of various web technologies, having originally started with Java on Windows at the age of 12. While staying humble about throwing around buzzwords like "VR" and "machine learning", he occasionally encounters problems, and their fitting solutions, that he considers worth sharing.