Whether it is Stuxnet or an iPhone virus, it is people who cause the trouble. But let’s go back to how the story began: Just a few days ago, it was still unthinkable for an iPhone to be infected with a virus. The very concept of the App Store – which only distributes software authorized by Apple – seems to make the spread of viruses through its apps impossible. It is the same belief we once held about one of Siemens’ controllers: "They can never be subject to viruses." It was only a matter of time before both assumptions were proven wrong.
What happened then? Any software running in a closed system, like an iPhone, must be signed by a software publisher. For this purpose, the developer uses a key pair consisting of a private and a public key. The private key is kept secret and used for the cryptographic signature. The public key is signed by the manufacturer of the closed system, in this case Apple, with its own private key (the root key). The resulting electronic document – which includes the developer’s public key and the signature from Apple – is called a certificate. For validation purposes, the closed system only requires the public root key that Apple already ships in iOS: "Only developers that I know and trust are allowed to run software in my closed system."
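The chain of trust described above can be sketched in a few lines. This is a deliberately simplified toy model: real iOS code signing uses asymmetric cryptography (the verifier holds only public keys), whereas this sketch stands in HMACs for signatures so it runs with the Python standard library alone. All key names and values are invented for illustration.

```python
import hashlib
import hmac

def sign(key: bytes, message: bytes) -> bytes:
    # Toy stand-in for an asymmetric signature over the message.
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(key: bytes, message: bytes, signature: bytes) -> bool:
    return hmac.compare_digest(sign(key, message), signature)

# Apple's root key; the device ships with the means to verify against it.
apple_root_key = b"hypothetical-apple-root-key"

# A developer's key, certified by Apple: the certificate says
# "this developer is trusted by Apple."
developer_key = b"hypothetical-developer-key"
certificate = sign(apple_root_key, developer_key)

# The developer signs the compiled app binary.
app_binary = b"\x00compiled app bytes\x00"
app_signature = sign(developer_key, app_binary)

def device_accepts(app: bytes, sig: bytes, dev_key: bytes, cert: bytes) -> bool:
    # The device checks the chain: first that Apple vouches for the
    # developer's key, then that this key really signed the app.
    return verify(apple_root_key, dev_key, cert) and verify(dev_key, app, sig)

print(device_accepts(app_binary, app_signature, developer_key, certificate))   # True
print(device_accepts(b"tampered app", app_signature, developer_key, certificate))  # False
```

Note what the device never checks here: *what* the app does. It only checks *who* signed it – which is exactly the gap the XcodeGhost attack exploited.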
In a jailbreak, the user of the device deliberately undermines this mechanism: a modified operating system simply skips the check. Any software can then run on the device, but the user of a jailbroken phone inadvertently opens the door to viruses as well. The new incident, however, affects legitimate users (those not using jailbreaks) too.
And why is the iPhone case so similar to Stuxnet? In both cases, the software developer’s development environment was attacked. The virus had already taken hold of the software after compilation, but before the application was signed. When the developer signed the software, he signed the virus along with it, and it thus passed every verification check unnoticed. Compared to this, the Stuxnet attack occurred at an alarmingly lower level. The new incident exploited a human vulnerability – convenience, first and foremost – by offering a tampered pirated copy of Xcode for download. China was affected more significantly, as the use of pirated products is widespread there and usually regarded as a minor offense.
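The simplest defense against a tampered toolchain is to verify the download against the checksum published by the vendor before installing it. The sketch below shows the idea with the Python standard library; the file and digest are stand-ins created on the fly, since a real check would compare against the digest Apple publishes for the genuine Xcode installer.

```python
import hashlib
import os
import tempfile

def sha256_of_file(path: str) -> str:
    # Hash the file in chunks so large installers do not need to fit in memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Stand-in for the downloaded installer (a real check would hash the actual
# Xcode download).
payload = b"pretend this is the Xcode installer"
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(payload)
    installer_path = f.name

# Stand-in for the vendor's published digest.
published_digest = hashlib.sha256(payload).hexdigest()

integrity_ok = sha256_of_file(installer_path) == published_digest
print("integrity check passed" if integrity_ok else "REFUSE TO INSTALL")
os.unlink(installer_path)
```

A checksum fetched from the same mirror as the download proves little, of course; it must come from a channel the attacker does not control, such as the vendor’s own site over HTTPS.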
What are the takeaways from this incident?
- Even free software needs protection against piracy, protection against reverse engineering, and very robust integrity protection.
- Software must be signed in a trusted environment. For instance, the private key should be stored safely in a secure hardware element.
- Even in a closed system, we should not assume that all software will be reviewed in detail and take our security for granted. The review process is only one link in the protection chain.
- A security solution is only as good as the weakest point in the chain. Even the best approach may be undermined if it is not applied holistically.
- A protection solution must offer the same level of security across all platforms. This is where a professional solution like CodeMeter comes into play.
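The first takeaway – robust integrity protection – means software should be able to detect that its own code has been modified. The sketch below shows the bare idea in Python by hashing a function’s bytecode; the expected digest is computed on the fly here to keep the example self-contained, whereas a real product would embed the digest at build time and protect the check itself against tampering.

```python
import hashlib

def critical_function(x: int) -> int:
    # Stand-in for logic worth protecting.
    return x * 2

# In a real build, this digest would be recorded at build time and stored
# where an attacker cannot easily rewrite it along with the code.
expected_digest = hashlib.sha256(critical_function.__code__.co_code).hexdigest()

def integrity_ok() -> bool:
    # Recompute the digest of the code at run time and compare.
    current = hashlib.sha256(critical_function.__code__.co_code).hexdigest()
    return current == expected_digest

print(integrity_ok())  # True
```

This is only the weakest-link lesson in miniature: an attacker who can patch the code can usually also patch the check, which is why commercial protection ties such checks to hardware or to encrypted code.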
Siemens responded quickly and did a good job after all. Let's hope that Apple is equally responsive. If you are ready to implement the lessons learned from this episode, you can count on CodeMeter, our all-in-one protection suite, and on the professional expertise of our team.