Under The Microscope

Code Signing and You

At WWDC ’06 I was looking over the session list and picking out all of the ones they didn’t announce ahead of time when one of them stood up, extended its finger, and poked me right in the eye. “Code Signing”. There wasn’t much of a description, but there didn’t need to be.

The place was packed. I was obviously not the only one to think this was an important session. And for good reason: we are starting to see how code signing is gradually but deeply changing the nature of the platforms we work on.

Code Signing Background

Like most technologies, code signing itself is neutral, or ought to be. It can be used for good or evil. Code signing is basically a way to cryptographically prove the origin of a particular piece of code, nothing more.

The way things are built today, there are really two kinds of code signature. Or more accurately, there are two types of certificates used to sign code. There are self-signed certificates, where the certificate is created by the signer and has no connections with the outside world. This sort of certificate says absolutely nothing about the signer, but it lets you know that two pieces of code were signed with the same certificate, and therefore presumably by the same person. And then there is code signed using a “real” certificate, which is itself signed by a certificate authority. This means that the authority vouches for the identity of the certificate owner, generally by obtaining some sort of government identification from them. If you have code signed with this kind of certificate, you have good assurance that it was signed by the person whose name appears on the certificate, and is thus as trustworthy as they are.
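The trust relationships described above can be sketched with a toy model. To be clear, this is not real X.509 or Apple's signing machinery, and the names and keys are all invented; an HMAC stands in for a real public-key signature purely to illustrate the logic:

```python
# Toy model of the two certificate types. HMAC is a stand-in for real
# public-key signatures; all keys and names here are made up.
import hmac
import hashlib

def sign(key: bytes, data: bytes) -> bytes:
    return hmac.new(key, data, hashlib.sha256).digest()

def verify(key: bytes, data: bytes, sig: bytes) -> bool:
    return hmac.compare_digest(sign(key, data), sig)

# Self-signed: the key says nothing about who holds it...
self_key = b"made-up-by-the-signer"
app_v1 = b"binary one"
app_v2 = b"binary two"
sig1 = sign(self_key, app_v1)
sig2 = sign(self_key, app_v2)

# ...but it does link two pieces of code to the same (anonymous) signer.
assert verify(self_key, app_v1, sig1) and verify(self_key, app_v2, sig2)

# CA-signed: an authority additionally vouches for the signer's identity
# by signing a statement binding a human-readable name to the signer's key.
ca_key = b"certificate-authority-root"
statement = b"Jane Developer owns key " + self_key
cert = sign(ca_key, statement)

# A verifier who trusts the CA can now attach a real name to the code.
assert verify(ca_key, statement, cert)
```

In the real thing the signer's key is asymmetric, so anyone can verify a signature without being able to forge one, but the trust relationships are the same: the self-signed case gives you consistency, and the CA adds a name you can hold accountable.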

Apple currently uses these capabilities in a few beneficial ways. There are several pieces of Mac OS X which depend on knowing the identity of an application. For example, the keychain tracks per-application access privileges. The Leopard firewall can be set to only allow access to certain applications. Parental Controls allows a user to determine which apps another user is allowed to run.

Many of these things existed before code signing (which is to say before Leopard), but they were relatively fragile. Every time you updated an application the system would have to treat it as if it were a new application. With signed applications, the system can see that an update came from the same people and should therefore have the same level of trust as the old version, so it doesn’t have to re-prompt the user for keychain access or any of the rest. This is good for users and it’s good for developers.
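The difference between the two approaches can be sketched in a few lines. This is a hypothetical model, not how Mac OS X actually stores its trust decisions; hashes and a hard-coded signer name stand in for the real mechanisms:

```python
# Toy sketch of why identity-based trust survives updates while
# hash-based trust does not. Names are hypothetical.
import hashlib

def app_hash(code: bytes) -> str:
    return hashlib.sha256(code).hexdigest()

# Pre-Leopard style: trust is keyed on the exact binary.
hash_trust = set()
v1 = b"AwesomeApp version 1"
hash_trust.add(app_hash(v1))

v2 = b"AwesomeApp version 2"
assert app_hash(v2) not in hash_trust   # an update looks like a new app

# Signed style: trust is keyed on the signer's identity instead.
identity_trust = {"Awesome Software Inc."}

def signer_of(code: bytes) -> str:
    # Stand-in: a real system would verify the embedded signature and
    # extract the signer's identity from the certificate.
    return "Awesome Software Inc."

# The update carries the same identity, so it inherits the old trust
# without re-prompting the user.
assert signer_of(v2) in identity_trust
```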

(There is a certain amount of faith involved here. Anyone who has used enough software should know that the next version of something isn’t always equally trustworthy. Apple’s decision to always give new versions the same level of trust is interesting and not entirely obvious, but this is a subject for another day.)

Apple also uses code signing in some odd ways. For example, OS X includes a function called task_for_pid() which, among other things, is basically the gateway for things like mach_inject. Back in the old days, basically early 10.4 and below, this could be called on any process owned by the same user as the caller. Sometime in the 10.4 days this changed to require the caller to be either root or part of the procmod group. This makes sense; code injection is dangerous and requiring approval from the user is a good thing. On 10.5 the rules are the same for unsigned apps, but signed apps can get a free pass. This is strange. Apparently Apple believes that having a signature automatically makes an application “good”, or that merely being accountable is enough of a barrier.

Code Signing Going Forward

Apple clearly plans to take this further. Here is a choice quote from Apple:

SIgning [sic] your code is not elective for Leopard. You are *expected* to do this, and your code will increasingly be forced into legacy paths as the system moves towards an “all signed” environment. You may choose to interpret our transitional aids as evidence that we’re not really serious. That is your decision. I do not advise it.

Posted to apple-cdsa on March 3, 2008

This makes sense for APIs such as the keychain where being able to identify the source of code is useful. But how many such APIs are there on the system? Where else can you be forced onto legacy paths for having unsigned code?

Apparently the answer is “everywhere”:

In order to achieve the nirvana of only running valid code, the system must completely refuse to run unsigned code. Since that would really have ruined third party developers’ Leopard experience, we don’t do that in Leopard (except for the Parental Controls and firewall cases, where we surreptitiously sign unsigned programs when they are “enabled” to run).
Eventually you will all have signed your recent releases, and we’ll have fixed all the (important) bugs and closed all the (important) holes, and a switch will materialize to this effect – to refuse (at the kernel level) to run any code that isn’t valid.

Posted to apple-cdsa on March 3, 2008

What purpose does this serve? Remember that being signed only tells you about the origin of code, it doesn’t tell you if that code is good or not. And self-signed code doesn’t even tell you that: it only lets you determine that two pieces of code signed by the same entity actually came from that entity.
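The limits of "signed" are worth making concrete. In the same toy model as before (HMAC standing in for real signatures, all names invented), a policy of "accept anything with a valid signature" stops nothing, because an attacker who modifies code can simply re-sign it with a fresh key:

```python
# Toy sketch: "run anything validly signed" is a null policy.
# HMAC is a stand-in for real signatures; names are hypothetical.
import hmac
import hashlib

def sign(key: bytes, code: bytes) -> bytes:
    return hmac.new(key, code, hashlib.sha256).digest()

def is_validly_signed(key: bytes, code: bytes, sig: bytes) -> bool:
    return hmac.compare_digest(sign(key, code), sig)

dev_key = b"honest developer"
app = b"legit code"
sig = sign(dev_key, app)

# Attacker tampers with the app, then re-signs with a throwaway key.
app = app + b" + evil payload"
attacker_key = b"anonymous attacker"
sig = sign(attacker_key, app)

# The modified app still carries a perfectly valid signature; only the
# identity behind it has changed. Accepting any valid signature buys
# accountability, not protection.
assert is_validly_signed(attacker_key, app, sig)
assert not is_validly_signed(dev_key, app, sig)
```

What a signature actually prevents is *silent* alteration: the attacker cannot keep the original signer's identity on the modified code. Any real protection has to come from a policy about *which* identities to trust.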

The iPhone SDK

The answer to this might be found in the iPhone SDK. Just released, it offers a development environment very similar to the desktop edition of Mac OS X. All the standard UNIX stuff is there, many APIs are the same, and many (such as the GUI parts) are similar but adapted for the mobile environment.

However, the environment is different in one important way. Apple is the gatekeeper:

Phones will only run apps signed by Apple. It also applies FairPlay to the package.

Twitter message from Deric Horn on March 6, 2008

Let me repeat that: if Apple doesn’t sign your iPhone app, it does not run.

Even for local development, you need to get the code signed. The iPhone SDK is free, but by itself it won’t let you load apps onto an iPhone. When you pay Apple the $99 to enroll in the program, they send you a certificate which can be used to sign your applications. However, they will only work on iPhones which have been provisioned with this certificate.

To distribute your application to other people, you must go through Apple, and Apple has explicitly stated that they are going to be vetting the apps before they give their blessing. Steve Jobs identified six types of bad behavior which would cause them not to sign an app:

  • Illegal
  • Malicious
  • Unforeseen
  • Privacy
  • Porn
  • Bandwidth hog

Some of these make good sense. Malicious apps and apps that violate your privacy are bad. But then again, Apple’s definitions of these may not agree with yours. Regulating bandwidth-hogging apps to keep the EDGE network up and running makes some sense, assuming the restriction only applies to EDGE. On the other hand, there are cell phones which can run arbitrary code without any vetting by the manufacturer, and none of them have destroyed a carrier’s data network. Apple probably has little choice but to block illegal apps, but once again Apple’s definition of illegal is not going to be everybody’s. I can only assume that they will be applying the legal standards of California, USA to all apps, even if the developer is in Lithuania and the user is in Italy.

And then we have “Porn”. You have to wonder why this is on the list. As long as the application doesn’t hide its nature there’s nothing harmful about it, it’s generally legal, and a porn application can be just as well behaved toward the EDGE network and toward the user as any other application. Apparently this is on the list just because some people think it’s morally wrong. Apple may or may not believe this, but they at least think enough people will that it’s not worth allowing. So now Apple is making moral judgements of the apps they sign.

The most worrying one on the list, of course, is “Unforeseen”. This is basically a catch-all intended to give Apple an out in case anything comes up which they don’t feel like letting onto the device. Maybe some new class of evil app is developed which doesn’t quite fit into the above categories and Apple needs to block them. Or maybe Apple just doesn’t feel like having any competitors in a particular market, and wants to shut them all out.

Compare this to the current situation on the Mac. I can develop, distribute, and install programs which Apple has never even heard of. I have absolute freedom to do as I wish in this regard. This means that I have the freedom to install bad stuff which will destroy my system or spy on me. But it also means that I have the freedom to install good programs which Apple wouldn’t approve.

The Future

Ultimately I think the trend is bad. Code signing itself is a neutral technology, but it gives incredible power to the system vendor, and that power is just waiting to be exercised and abused. I believe that the iPhone is serving as a testbed to see how users and developers will react to an environment with ubiquitous code signing and control. If it goes well I think we can expect to see our desktop Macs gradually move in this direction as well. Judging by how badly Apple’s developer servers were flattened during the SDK release it seems like there’s no way it won’t go well.

I’m sure it will be a gradual process. If 10.6 ships and suddenly nothing will run without Apple approval there will be a huge revolt among users and developers. In 10.5 it’s pretty much innocent. In 10.6, given what Apple has revealed, I would expect to start seeing some restrictions in place. Perhaps initially there will be some APIs which are only available to signed applications. At some point Apple will decide that there are some areas of the system which are too dangerous to let anyone in, even when signed. Perhaps you will begin to need Apple approval for kernel extensions, or for code injection, or other such things. Then one day Apple may decide that unvetted code is too dangerous. Maybe advanced users could still be allowed to use it, but a setting may show up, “Allow unapproved applications”. It will, of course, be off by default.

Would life really be so bad in such a world? After all, even in the worst case, hacks would no doubt appear to disable the signature checks. But at this point the ecosystem has been severely damaged. Any application which requires such a scary setting to be changed is not going to get a very large audience. This is bad news for the developer. And with such a reduced audience, the amount of such software made is going to be much less, which is in turn bad news for the user. It could very well be a wonderful environment for the average Joe whose every need is met by Cupertino-approved wares, but it’s certainly not the kind of environment I want on my desktop.

70 Responses to “Code Signing and You”

  1. Wikiwikiman says:

    If the only “normal” way to install an application on your iPhone is through Apple and/or the iTunes Store, what am I supposed to do if I would like to develop some very personal application and use that – as the sole user – on my own iPhone? Call it a hack or a kludge, call me a hobbyist: if it’s something useful to me, why shouldn’t I be able to put it on a device that I own? Not every good idea can be implemented as a web app – especially those that are somehow linked to a function on an altogether rich platform like the iPhone.

    I can understand, even applaud, the benefits of code signing – but code signing is NOT code distribution. I think we’re all better off if the two remain separate…

  2. GeorgeT says:

One thing that really pissed me off is that I can’t even write apps to use for myself on my iPod touch. Why should I pay $100 to use an app that I wrote myself?

  3. Jason says:

    Mike said: “a good OS should not allow arbitrary apps to stuff up the device”

    Well…yeah. But how many consumer-oriented OSes actually are that robust? A game from a big company can still lock OS 10.4 hard (though I don’t personally game much).

    “If you’re going to be putting iffy apps onto your phone then you should realize that this may make it less reliable, but that should be your choice to make.”

    Ahhhh, protecting the clueless phone user from themselves ;) Since Apple will likely never actually pre-test apps themselves, the signing is useless from a protection perspective. You can have a signed app that still kills you. And until the word gets back to Apple and it’s revoked, it will nail others too.

    So what’s your position? No form of signing with any kind of implementation is good at all? What I actually like is…ALL code gets signed, with a gatekeeping authority, whose only role it is to verify that you are who you say you are so you can always be tracked down. By the end user, the OS maker, etc. Nothing else though…no selective signing (unless known to be illegal or malicious). As for revocation…what I prefer is, a user can decide, “revoke all apps by creator xyz on my system”.

  4. bobdole says:

    um, I think it’s pretty clear that apple won’t allow porn because that would make them legally a “pornography distributor” which is something they don’t want to deal with.

  5. ethan says:

    This is identical to how verizon uses the brew structure to manage their app stack.

    GM talked to us about delivering sales information via cell phones but brew is not useful for enterprise level apps. They would not put an app only for GM sales guys on their app stack so there was no way to get it on the cells. GM had a corporate licensing via verizon so there was no options. They were more interested in games and stupid ringtones.

    It’s the same with apple-they are doing nothing different from the industry.

  6. Ben K says:

I am very wary of the possible restricted future of code signing as well, but I’d rather have a pretty clear policy as Apple seems to put down for the AppStore, than the frivolous policy of MacUpdate to accept or deny applications at will.

  7. Mike says:

    Wikiwikiman: You can pay your $99 and be able to at least put it on your own device. But, as should be obvious from my post, I think this is a poor situation.

    Jason: You make a good point about OS X still failing to protect itself from everything. The system-wide fragility of the OpenGL stack never ceases to frustrate me. As for my position, I want the restrictions to be optional. I’d also like the default position to not require Apple approval, but that’s secondary. Mainly I just want to be able to put the software I want onto the device I own without feeling like a criminal.

    ethan: I agree completely that Apple is not doing anything different from the rest of the cell phone industry. And that’s a big problem, because the (American) cell phone industry completely blows.

    Ben K: Keep in mind that I can easily get software for my Mac without ever laying eyes on MacUpdate, but it is not possible to bypass Apple’s App Store.

  8. Anonymous says:

    >I also don’t pay $70/month for my iPod Touch to work, but all of these restrictions hit it just as much as they would an iPhone.

    You got me there. I hadn’t considered that angle. I agree with you. I too would prefer if we could load whatever we wanted. I was trying to decipher why Apple would establish the limits that they have.

  9. Perry The Cynic says:

    A system that runs unsigned programs will also run modified copies of a signed program after the attacker has stripped off the (now broken) signature. Thus you cannot be sure that you’re running only “intact” programs unless you prohibit running unsigned ones. Put differently, in a Code Signing driven environment, access must be white-listed; it cannot be black-listed.

    There may well be APIs in the future that will not serve unsigned clients. That by itself is not a restriction. Signing code is free, and it’s anonymous if you want it to be. You don’t gain or lose anything by signing your code, except protection from alteration by someone else (not the developer). For APIs that are interested in stability of identity (as e.g. the keychain APIs), that’s all that is needed.

    APIs that want to express policy restrictions in terms of developer identity can do so. In fact, we expect a Mac’s administrator to want to do this. Note the flow here – the developer provides information about identifying his code; the administrator uses this information to decide what to run (and how). The administrator is the ultimate authority on a Mac’s security policy. I don’t expect that to change. (DRM is the exception, of course – Apple is legally required to keep you from hacking DVD Player.app, for example.)

    A policy of “run anything signed” is actually the null policy – it does not restrict anything (except developers who refuse to sign their code for some reason, I suppose – but a motivated user can sign such developers’ code himself just fine). In practice, administrators will pick more restrictive run policies – but what policy they pick is their business, and Apple’s job is to make it easy to do reasonable things (and possible to do even weird things :-).

    The phone is different. I understand the fears and worries and concerns, and I can’t meaningfully answer them here. For all practical purposes, at present, Apple plays the role of the administrator on a phone, and its user is just an ordinary user. I get that you don’t like that. Be sure to let Apple know.

But please understand that from where we sit, the case for Code Signing on a Mac – in the service of its administrator/owner – is very compelling. In a sense, the Mac is living on borrowed time – viruses and worms and other nasty bit-critters will surely come our way, and going to an all-signed environment is one of the most potent weapons we have to keep your systems from being overrun. I realize this capability *could* eventually be abused in various ways, and I trust you’ll all keep Apple honest about it. But it can also be a powerful force *for* you.

    Oh, and to put that to rest: I do work for Apple, and I designed and implemented Code Signing in Leopard. If you think it’s going to usher in a black wave of OS fascism, you have every right to blame me – it was, pretty much, my idea.

    — perry

  10. Bertie says:

    First of all, great article!!

    My point of view on the above issue: I agree that “a lot of other people don’t care, they want their e-mail and web programs to run without sending their credit card number to Nigeria”, so for them having signed application is fine. But on the other hand other people would like to be able to run whatever software they want on their iPhone (without being obliged to jailbreak it, I mean). For example, I’d like VoIP, and I’m not sure Apple would allow it…

    The *only* problem is that Apple is the one to decide which program is “good” or “bad” (for them, not generally speaking).

Signature in itself would be OK, if any developer was able to sign and distribute his own software on the iPhone, which means without going through iTunes. Average users would then buy their software from iTunes and would therefore have Apple’s check on it (for what it’s worth, we don’t know how Apple will check that the code is not malicious: pre-check seems impossible, and post-check not very good); other more adventurous users could buy their iPhone software somewhere else.

    Unfortunately that’s not what we’re heading for.

  11. Bertie says:

    I think you’ve got a good point about testing and alpha/beta releases.

    @Brian Erdelyi
    “I also believe this will enable Apple to further lock down the iPhone and prevent unlocking while forcing us to use particular carriers (a practice which I believe is illegal in some parts of Europe).”
Yes, this is illegal in France (where I live). You cannot force people to buy two products at once, i.e. in our case an iPhone and a monthly GSM plan. You must allow people to buy them separately. For the iPhone it means an unlocked iPhone at 750€, an iPhone on the Orange network with no plan at 650€, and an iPhone with a two year Orange plan at 399€ (+ the price of the plan).
Normally for the same reason it should be possible to buy a PC with no operating system (Windows…), but the consumer associations are having a hard time trying to enforce this. When I buy a PC I get a license for Windows in a sealed envelope. If I refuse the license I should get a refund of the Windows OS (while keeping the PC). But this refund is impossible to get, and actually it is impossible to know the price of Windows on top of the hardware’s price in the first place.

  12. rvr says:

    i think you’ve made some good points in this post, and the concerns are valid. i believe, and i hope my faith is deserved, that locking down os x in this fashion, even if done gradually, would create such a backlash that apple would look very bad for doing it. i also think they know this. the platform has been successful and has continued to survive, especially during the dark years before jobs returned, in no small part due to the vibrant community of developers like r.m., and the core of users who have high standards in the apps they use. i must believe that apple recognizes this. i know they sometimes walk the fine line between supporting the developer community and cannibalizing it (by incorporating things as os features, etc), but i think they know if would not be in their best interest to severely piss them off or constrict the ecosystem so much.

    i also hope that apple recognizes this will be the best course, in the long run, for the iphone. is it more likely that the mac ends up going the direction of the iphone, or the iphone ends up going the direction of the mac. we can only wait and see. what’s clear to me is that the community is the best place, in the long run, to vet apps and give them recognition. there will always be more manpower than one company can afford, and there’s a big advantage in having a wide range of people providing opinions. i hope apple sees it this way as well, and that the iphone ecosystem can evolve in this direction.

  13. Nathan de Vries says:

    Welcome to the world of mobile phones. You’re new here, aren’t you? :)

    This has been a problem since the beginning of time. Applications need to be signed with the manufacturer or phone provider’s certificate in order to get access to integral functionality (camera, phone book, filesystem etc.). Then the phone provider decides to get greedy (*cough*T-Mobile*cough*) and strips manufacturer certs off the phones so that developers need to pay twice to play. Things get greedier and greedier as you go up the chain.

    So yeah, this is not new; the whole certificate industry is messed up.

    Companies like Adobe, Apple (in this case), Microsoft, Thawte, Verisign etc. have all positioned themselves so that a) they get paid under the false premise of security, and b) they retain the role of gatekeeper. Look at Microsoft: back in the late 90s they owned 5% of VeriSign’s equity. Should it come as a surprise that VeriSign is the only acceptable certificate authority for signing device drivers under Windows?

    Code signing for developer identification is a good idea (I like Thawte’s Web of Trust concept), but the idea of needing to pay dozens of profiteering gatekeepers for my iPhone app, J2ME app, Symbian app and SSL-enabled web app is ridiculous. Only problem is, the issue is much bigger than Apple-targeted development.

  14. Brian Erdelyi says:

    @Bertie: Yes, France is the country I was specifically thinking about. I see a lot of similarities with code signing and selling DRM’d music that has been a hot topic in Europe.

    As others have mentioned, I think this will oppress users and suppress the innovation of new applications that haven’t been conceived yet.

I think this is extortion. Some developers may be unable or unwilling to pay $99 or 30% to distribute applications. Even if signed, there is no technical reason that an application must be distributed by Apple (others are already developing and distributing applications now). They too are a software development company and this gives them an unfair advantage over others… it’s clearly a conflict of interest.

    I like the idea of code signing so I can identify the source of an application but not so it can be used against me and force my usage of an application. Consumers should not be forced to delegate the responsibilities of informed consent to a third-party.

    I like the idea of a repository for downloading applications but not one that creates a monopoly and restricts my choice to obtain software elsewhere.

  15. Mike says:

    Perry The Cynic: Thanks so much for stopping by. I really appreciate your efforts to keep the lines of communication open.

    I appreciate that this infrastructure is targeted at giving the user more power over his own device, and if that’s where it stays then I’m quite happy with it. However, the sudden appearance of the iPhone SDK, using code signing to completely lock down the device, makes me pessimistic. Maybe it is fundamentally different, maybe Apple would never dream of doing this on a “real” Mac, but it makes me paranoid.

    I find it a little odd that you describe an environment which only runs signed code as “nirvana”, yet at the same time admit that the simple possession of a signature is meaningless. If you accept anything that’s signed, then an attacker can merely re-sign the app after modifying it. You gain no protection from alteration, only some small form of accountability over it.

    I’m also somewhat doubtful about how well this will protect from viruses and worms. All the worms will be signed in such an environment, and will appear as legitimate as they always have. A user may have to take some action to approve them, but he intends to take that action anyway since he will have to do it for any new application. It may allow an administrator to save his underlings from themselves, but that case is significantly more rare than a user who owns his own machine.

    My understanding of how code signing is implemented is that it won’t detect an application whose security has been compromised by means of a buffer overflow or other such exploit. It may limit the damage which can be done by such an application once it has been compromised, but then we come back to having the exploit code re-sign any applications it modifies. Certainly you’d be able to exploit Safari’s image handling to upload all of the user’s credit card numbers to a server in Uzbekistan without needing to hit any other apps.

    If Apple can resist the temptation to turn this into a tool of evil then I think it can be a good thing, although of course it is far from a panacea. But I’m afraid that the iPhone has demonstrated that Apple may not be very temptation-resistant.

    One last note: I think you meant to say, “Apple is legally required to make a totally half-assed attempt at keeping you from hacking DVD Player.app”.

  16. Devon says:

    I can see how Apple can think of this as a good thing in an enterprise environment but in a home environment I should be the only one that says what can run on my computer.
    If they want to have a sandbox account in 10.6 where only signed apps can run to “protect” you from viruses and trojans using code signing and only trusting root authorities, that might be an OK feature.

    However, if they think they’re going to make this the default that I can’t even run my own program I just wrote without signing up for a developer account and getting a valid Apple issued certificate, I’ll just be dropping the Mac for good(been using it since my IIci). I don’t think it’d ever come to that but I’m just putting that out there.

I was initially excited that I could buy an iPhone and put my own apps on it but now that I have to pay money for that privilege even if I don’t plan on selling any applications, that really stinks. I understand they want to keep the iPhone stable for the majority of users but there should be a way for people to do what they want with the device without resorting to hacking. If it does come down to hacking the iPhone to get my own apps on there without being signed I will do it. The pre release iPhone OS 2.0 already got hacked to run apps without Apple’s signing so I’m hopeful for the future.

    I shouldn’t have to be forced to run Windoze Mobile or Google’s phone OS just so I can put my own apps on the phone.

  17. zanyterp says:

There is already a system that implements the idea of “turn it all off unless it is approved and then re-approved,” similar to what the “allow unapproved applications” setting might be, and it is one of the more…disliked features in a competing OS: the user account control (or whatever the UAC acronym expands to) in Vista.

    But I could be wrong and it wouldn’t be that annoying or concerning (the warning is identical sounding when you disable it to be able to use your computer without being asked 5 times if you really want to run that application).

    And I too would have issues with needing approval from a 3rd party to run ALL my applications. Let me trust the company that writes it and go from there.

  18. Mark says:

    I fully understand the nature of this essay, as well as the comments, so I realize that the following statements are somewhat simplistic:

    Why do we need porn *apps*? Users can already put as much porn on an iPhone/Touch as RAM will allow, so what’s the big deal? I’m sure there might be a few clever ideas out there for a game or something, in which you undress a model as you score points (or whatever), but is the blocking of such apps really a great loss?

    Many iPhone users are teens, so I’m sure that politically, Apple scores points for blocking “porn apps.” But in reality, who needs porn in the form of an “app”?

    I do understand that the point, here, is freedom of choice: we should be entitled to install whatever we like on our devices. Of course I get that. I just don’t think the porn example is the best example, in particular — especially since (a) it’s already doable and (b) code signing doesn’t affect current functionality at all.

    Carry on… :-)

  19. Chris says:

    No, code signing is definitely NOT innocent on 10.5. Mac OS X will actually modify programs if you’ve enabled the application-level control of the firewall. Thus it will modify all online games. If the service you’re using to play the game online checks your application to make sure you’re not cheating, it will always fail because the program has been modified.

    For example, we’ll take a look at Starcraft. If you enable the firewall and set it to allow Starcraft to have incoming connections, Mac OS X will append some junk (obviously not junk, but looks like that to me) to the file “Starcraft (Carbon)”. The first time you attempt to connect to Battle.net after installing or updating Starcraft, Mac OS X will prompt you to cancel or allow. Bothersome, given the fact that Starcraft is full-screen usually, but not a problem compared to what happens next. Because Battle.net has already checked your program, you will be allowed to connect and play games. However, the application has already been modified, and subsequent attempts to connect to Battle.net will return an “unidentified application” error. After discovering this I turned off my firewall and stuck to Little Snitch for my filtering needs.

In other words, code signing is not inherently neutral the way it’s implemented in Mac OS X; it’s actually inherently negative.

  20. Serenapk says:

    omg.. good work, man

Comments for this post have been closed. Thanks for reading!