Under The Microscope

Code Signing and You

At WWDC ’06 I was looking over the session list and picking out all of the ones they didn’t announce ahead of time when one of them stood up, extended its finger, and poked me right in the eye. “Code Signing”. There wasn’t much of a description, but there didn’t need to be.

The place was packed. I was obviously not the only one to think this was an important session. And for good reason: we are starting to see how code signing is gradually but deeply changing the nature of the platforms we work on.

Code Signing Background

Like most technologies, code signing itself is neutral, or ought to be. It can be used for good or evil. Code signing is basically a way to cryptographically prove the origin of a particular piece of code, nothing more.

The way things are built today, there are really two kinds of code signature. Or more accurately, there are two types of certificates used to sign code. There are self-signed certificates, where the certificate is created by the signer and has no connection to the outside world. This sort of certificate says absolutely nothing about the signer, but it lets you know that two pieces of code were signed with the same certificate, and therefore presumably by the same person. And then there is code signed using a “real” certificate, which is itself signed by a certificate authority. This means that the authority vouches for the identity of the certificate owner, generally by obtaining some sort of government identification from them. If you have code signed with this kind of certificate, you have good assurance that it was signed by the person whose name appears on the certificate, and that it is thus as trustworthy as they are.
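The mechanism underneath all of this is ordinary public-key cryptography. As a rough sketch, using generic openssl commands rather than Apple’s actual codesign tool and signature format, signing and verifying a piece of code might look like this:

```shell
# Create a self-signed certificate: no authority vouches for the
# name inside it, so it proves continuity of identity, not identity.
openssl req -x509 -newkey rsa:2048 -keyout key.pem -out cert.pem \
    -days 365 -nodes -subj "/CN=Example Developer"

# Produce a detached signature over some "code"
echo 'int main(void) { return 0; }' > example.c
openssl dgst -sha256 -sign key.pem -out example.sig example.c

# Verification proves the code is unmodified and was signed by
# whoever holds the private key matching this certificate
openssl x509 -in cert.pem -pubkey -noout > pub.pem
openssl dgst -sha256 -verify pub.pem -signature example.sig example.c
# prints "Verified OK"; fails if example.c is altered after signing
```

Apple’s real format embeds the signature and certificate chain inside the application bundle, but the guarantee is the same: the bits haven’t changed since whoever held the private key signed them.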

Apple currently uses these capabilities in a few beneficial ways. There are several pieces of Mac OS X which depend on knowing the identity of an application. For example, the keychain tracks per-application access privileges. The Leopard firewall can be set to only allow access to certain applications. Parental Controls allows a user to determine which apps another user is allowed to run.

Many of these things existed before code signing (which is to say before Leopard), but they were relatively fragile. Every time you updated an application the system would have to treat it as if it were a new application. With signed applications, the system can see that an update came from the same people and should therefore have the same level of trust as the old version, so it doesn’t have to re-prompt the user for keychain access or any of the rest. This is good for users and it’s good for developers.

(There is a certain amount of faith involved here. Anyone who has used enough software should know that the next version of something isn’t always equally trustworthy. Apple’s decision to always give new versions the same level of trust is interesting and not entirely obvious, but this is a subject for another day.)

Apple also uses code signing in some odd ways. For example, OS X includes a function called task_for_pid() which, among other things, is the gateway for techniques like mach_inject. Back in the old days, early 10.4 and below, it could be called on any process owned by the same user as the caller. Sometime in the 10.4 era this changed to require the caller to be either root or a member of the procmod group. This makes sense; code injection is dangerous, and requiring approval from the user is a good thing. On 10.5 the rules are the same for unsigned apps, but signed apps get a free pass. This is strange. Apparently Apple believes that having a signature automatically makes an application “good”, or that merely being accountable is enough of a barrier.

Code Signing Going Forward

Apple clearly plans to take this further. Here is a choice quote from Apple:

SIgning [sic] your code is not elective for Leopard. You are *expected* to do this, and your code will increasingly be forced into legacy paths as the system moves towards an “all signed” environment. You may choose to interpret our transitional aids as evidence that we’re not really serious. That is your decision. I do not advise it.

Posted to apple-cdsa on March 3, 2008

This makes sense for APIs such as the keychain where being able to identify the source of code is useful. But how many such APIs are there on the system? Where else can you be forced onto legacy paths for having unsigned code?

Apparently the answer is “everywhere”:

In order to achieve the nirvana of only running valid code, the system must completely refuse to run unsigned code. Since that would really have ruined third party developers’ Leopard experience, we don’t do that in Leopard (except for the Parental Controls and firewall cases, where we surreptitiously sign unsigned programs when they are “enabled” to run).
Eventually you will all have signed your recent releases, and we’ll have fixed all the (important) bugs and closed all the (important) holes, and a switch will materialize to this effect – to refuse (at the kernel level) to run any code that isn’t valid.

Posted to apple-cdsa on March 3, 2008

What purpose does this serve? Remember that being signed only tells you about the origin of code, it doesn’t tell you if that code is good or not. And self-signed code doesn’t even tell you that: it only lets you determine that two pieces of code signed by the same entity actually came from that entity.
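To make the self-signed case concrete: nothing stops two different people from minting self-signed certificates bearing the exact same name. Only the certificate itself, identified by its fingerprint, links one signature to another. A quick illustration, again with generic openssl commands rather than Apple’s tools:

```shell
# Two strangers each mint a self-signed certificate with an
# identical subject name
openssl req -x509 -newkey rsa:2048 -keyout k1.pem -out c1.pem \
    -days 1 -nodes -subj "/CN=Honest Developer"
openssl req -x509 -newkey rsa:2048 -keyout k2.pem -out c2.pem \
    -days 1 -nodes -subj "/CN=Honest Developer"

# The claimed names are identical...
openssl x509 -in c1.pem -noout -subject
openssl x509 -in c2.pem -noout -subject

# ...but the fingerprints differ, and the fingerprint is the only
# thing that ties two signatures back to the same signer
openssl x509 -in c1.pem -noout -fingerprint -sha256
openssl x509 -in c2.pem -noout -fingerprint -sha256
```

So the name on a self-signed certificate is worthless as identification; all the system can honestly conclude is “same certificate as last time” or not.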

The iPhone SDK

The answer to this might be found in the iPhone SDK. Just released, it offers a development environment very similar to the desktop edition of Mac OS X. All the standard UNIX stuff is there, many APIs are the same, and many (such as the GUI parts) are similar but adapted for the mobile environment.

However, the environment is different in one important way. Apple is the gatekeeper:

Phones will only run apps signed by Apple. It also applies FairPlay to the package.

Twitter message from Deric Horn on March 6, 2008

Let me repeat that: if Apple doesn’t sign your iPhone app, it does not run.

Even for local development, you need to get the code signed. The iPhone SDK is free, but by itself it won’t let you load apps onto an iPhone. When you pay Apple the $99 to enroll in the program, they send you a certificate which can be used to sign your applications. However, apps signed this way will only run on iPhones which have been provisioned with that certificate.

To distribute your application to other people, you must go through Apple, and Apple has explicitly stated that they are going to be vetting the apps before they give their blessing. Steve Jobs identified six types of bad behavior which would cause them not to sign an app:

  • Illegal
  • Malicious
  • Unforeseen
  • Privacy
  • Porn
  • Bandwidth hog

Some of these make good sense. Malicious apps and apps that violate your privacy are bad. But then again, Apple’s definitions of these may not agree with yours. Bandwidth-hogging apps arguably need to be regulated to keep the EDGE network up and running, assuming the restriction only applies to EDGE. On the other hand, there are cell phones which can run any code without vetting by the manufacturer, and none of them have destroyed a carrier’s data network. Apple probably has little choice but to block illegal apps, but once again Apple’s definition of illegal is not going to be everybody’s. I can only assume that they will be applying the legal standards of California, USA to all apps, even if the developer is in Lithuania and the user is in Italy.

And then we have “Porn”. You have to wonder why this is on the list. As long as the application doesn’t hide its nature, there’s nothing harmful about it, it’s generally legal, and a porn application can be just as well behaved toward the EDGE network and toward the user as any other application. Apparently this is on the list just because some people think it’s morally wrong. Apple may or may not believe this, but they at least think enough people will that it’s not worth allowing. So now Apple is making moral judgments about the apps they sign.

The most worrying one on the list, of course, is “Unforeseen”. This is basically a catch-all intended to give Apple an out in case anything comes up which they don’t feel like letting onto the device. Maybe some new class of evil app is developed which doesn’t quite fit into the above categories and Apple needs to block them. Or maybe Apple just doesn’t feel like having any competitors in a particular market, and wants to shut them all out.

Compare this to the current situation on the Mac. I can develop, distribute, and install programs which Apple has never even heard of. I have absolute freedom to do as I wish in this regard. This means that I have the freedom to install bad stuff which will destroy my system or spy on me. But it also means that I have the freedom to install good programs which Apple wouldn’t approve.

The Future

Ultimately I think the trend is bad. Code signing itself is a neutral technology, but it gives incredible power to the system vendor, and that power is just waiting to be exercised and abused. I believe that the iPhone is serving as a testbed to see how users and developers will react to an environment with ubiquitous code signing and control. If it goes well, I think we can expect to see our desktop Macs gradually move in this direction as well. Judging by how badly Apple’s developer servers were flattened during the SDK release, it seems like there’s no way it won’t go well.

I’m sure it will be a gradual process. If 10.6 ships and suddenly nothing will run without Apple approval, there will be a huge revolt among users and developers. In 10.5, code signing is pretty much innocent. In 10.6, given what Apple has revealed, I would expect to start seeing some restrictions in place. Perhaps initially there will be some APIs which are only available to signed applications. At some point Apple will decide that there are some areas of the system which are too dangerous to let anyone into, even when signed. Perhaps you will begin to need Apple approval for kernel extensions, or for code injection, or other such things. Then one day Apple may decide that unvetted code is too dangerous. Maybe advanced users could still be allowed to use it, but a setting may show up: “Allow unapproved applications”. It will, of course, be off by default.

Would life really be so bad in such a world? After all, even in the worst case, hacks would no doubt appear to disable the signature checks. But at this point the ecosystem has been severely damaged. Any application which requires such a scary setting to be changed is not going to get a very large audience. This is bad news for the developer. And with such a reduced audience, the amount of such software made is going to be much less, which is in turn bad news for the user. It could very well be a wonderful environment for the average Joe whose every need is met by Cupertino-approved wares, but it’s certainly not the kind of environment I want on my desktop.
