The Apple way of (in)security

Tight hardware control and strict application policies reduce risk. So why doesn't everybody take Apple's approach to security?

When will Apple become a juicy target for hackers and cyber-crooks? Industry experts have predicted that as Apple's market share grows, so will the malware targeting its platforms.

To date, it hasn't happened. Why not?

Is the Mac really that much more secure than the PC in terms of design or policy? Or is Apple's market share still below some sort of malware tipping point? After all, the PC has a much larger market share and is the platform of choice for most businesses. Thus, it's a far more enticing target for hackers. Right?


Not necessarily. There are two key markets where Apple is a leader: MP3 players and smart phones. iPods rule the music world, and even as they've become internet-capable, hackers have ignored them. Contrast that to USB storage, printers and other peripherals that have posed serious security problems. (http://defensetech.org/2008/11/20/pentagon-slammed-by-cyber-attack/)

The same is true of the iPhone. Granted, the iPhone isn't the smart-phone leader in terms of market share (that would be BlackBerry in the U.S. and Symbian worldwide) (http://techcrunch.com/2010/02/23/smartphone-iphone-sales-2009-gartner/), but it's surely number one in terms of prestige.

Yet, the attacks are few and far between.

This doesn't mean Apple platforms don't have vulnerabilities. They do. Plenty of them. The March software update of the Leopard and Snow Leopard operating systems corrected a record-setting 92 vulnerabilities, a third of which were critical. Apple's software has just as many flaws as everyone else's.

Control Mitigates Risk

What's different about Apple's software is that it is very tightly controlled.

The Apple way of security is this: control the hardware, tightly control your own software, control where users get third-party software, control what type of software can be installed and then control what that software can do after installation. As a result, you have controlled malware.

You may have a load of vulnerabilities, but this process makes it difficult for hackers to exploit them.

Despite industry and investor pressures, Apple has never decoupled its software from its hardware. In fact, when Mac clone companies have emerged, Apple has gone on the offensive with lawsuits and PR attacks. (http://consumerist.com/2009/11/federal-judge-rules-against-scrappy-mac-clone-manufacturer-psystar.html)

With tight hardware control, Apple doesn't need to worry about a slew of hardware vulnerabilities, such as faulty drivers or communication ports accidentally left open by default.

Next, Apple strictly polices the software that runs on its devices. If you want your software on the iPhone, there's one way to do it: get your app approved by Apple and distributed through the App Store.

You can't install an app unless Apple approved it in the first place, so software that does something malicious or questionable, such as piggybacking spyware, gets filtered at the gate. If you're embedding adware or spyware, good luck getting approval.

If an approved app has a flaw that wasn't uncovered in the review process, Apple's larger process still mitigates the risk. Approved apps must be self-contained, so an exploit of one app, whether through a logic flaw, file tampering or a buffer overflow, stays confined to that app and can't reach other applications or the operating system. Apple has thus contained an entire class of attack.

Even the browser, a juicy attack target, is closed off in Apple's world. Many plug-ins never get approved, even when they have broad acceptance. Case in point: Apple still doesn't allow Adobe Flash because of its security risks. The Flash player has direct access to the underlying operating system and file system. In Apple's world, that's risky behavior, and the company refuses to allow it.

Control the device, control the operating system, control third-party software and add-ons, and again, you've essentially controlled malware.

If Apple's Way Is So Secure, Why Isn't Everyone Doing It?

Other vendors aren't following Apple's lead for some very compelling reasons. The first is price. When the operating system is decoupled from the hardware, competition among hardware vendors drives down the price of the device.

This happened with PCs, as we all know, and it's happening again with smart-phones. Android had been on the market less than a year when the price point for many of its phones fell below $100 (factoring in carrier subsidies). Then, for the 2009 holiday season, carriers offered two-for-one deals, dropping the price below $50.

Apple never wins on price and never intends to. That's fine when there is tangible product differentiation, as with the Mac, but it's a riskier approach in the mobile market, where hardware and operating systems matter less and less while immediate access to a wide array of apps matters more. Apple seems to recognize this (or is bowing to pressure from AT&T), since outdated iPhone models have been priced more and more competitively in recent months.

The second drawback of Apple's way of security is inflexibility. Need a custom application for your business? Then you'd better go with something other than Apple, or brace yourself for a long, arduous approval process. Developing and running a custom app without Apple's seal of approval violates the Apple EULA. It's easy to see why the PC is the platform of choice in most business sectors.

Third, there is the speed issue. The app market is changing so fast these days that a closed app store could soon be a competitive disadvantage. Consumers, especially younger ones, expect their phones to keep up with the latest fads. A closed market that forces a lengthy approval process on developers will always lag behind an open market.

Finally, all of these problems combine to create the problem of market share. Closed systems are good from a security and maybe even an aesthetic standpoint, but they rarely capture the market share of open platforms.

That doesn't mean Apple's choice isn't a good one. It works for Apple, but the model doesn't translate well from a high-end niche to the broader market.

Is There a Middle Way?

Even though security and openness seem to be mutually exclusive, Google may have already stumbled upon a happy medium.

Android phones come locked down out of the box. The Android Market contains only approved apps, and the default mode is to only allow downloads from the Android Market.


True, it takes only a couple of taps to open your phone to outside apps, but at the very least, shipping the device this way protects a whole class of users from themselves. The naïve user who opens every spam message and thinks CD-ROM trays are cup holders won't click out of the default mode. Of course, such users won't be early adopters of smart phones, but we'll conveniently ignore that for the moment.

Granted, a simple default protection does nothing to address hardware flaws, OS exploits, flawed plug-ins, etc., but it may not have to. The heart of Google's strategy is to move everything off of the client and into the cloud.

Throw cloud computing into the mix and the Apple way may translate after all. If the time comes when pretty much anything of value resides in the cloud and not on the device, then hackers will shift their attention away from devices.

The risks won't disappear, but the burden is on cloud providers to handle these risks, not end users. And any time you take security decisions away from end users, you've improved security.

Morey Haber is Vice President of Business Development for eEye Digital Security, a provider of unified vulnerability and compliance management solutions based in Irvine, Calif. For information visit www.eeye.com.
