Why do developers prefer iOS?
Kevin Partner investigates why developers are opting for Apple's mobile OS, and what this means for other online businesses
Why do app developers prefer iOS? And what can an online business learn from this fact?
At first glance, a bias towards iOS makes no sense, since Android-based smartphone activations outstripped those of the iPhone some time ago, passing 500,000 per day by mid-2011.
That’s a huge market, and Google’s lead is likely to extend as vendors introduce more capable entry-level smartphones that attract users upgrading from “dumb” mobiles. Still, the majority of new apps are written for the iPhone.
A glance at the Showcase page of cross-platform development tool Corona SDK reveals that of the 25 most recent Corona apps, 17 were for iOS and ten for Android, with only two targeting both. Since Corona makes it equally easy to build for either OS, this split reflects developer choice rather than any difference in ease of development or feature set.
I don’t buy the common explanation that developers are frustrated by Android’s “fragmentation”. Certainly, there’s a huge range of devices (Google informed me that 685 different models would be able to download my recent app), but they differ from each other far less than Windows PCs do.
Take a look at Google's published breakdown of Android versions in use and you'll see that the majority of Android users are on version 2.2 or later, with almost 90% on Froyo or Gingerbread – between which you'll be hard-pressed to spot any difference. Yes, fragmentation was a problem in the early days, but since 2.2 it's been one app for all.
Android device screen sizes and resolutions do vary, but again within a small range. A table on Google's Developer Console shows that, for my app, five of the six most popular devices have an 800 x 480 screen and the same aspect ratio, making it possible to scale the same code for all – which Corona does automatically.
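Corona's automatic scaling is driven by a config.lua file in the project root, which defines a virtual content area that the runtime stretches to fit each device's physical screen. A minimal sketch might look like this (the content dimensions are illustrative – the values you choose depend on the artwork you design for):

```lua
-- config.lua: define a virtual content area that Corona
-- scales to fit each device's physical screen.
application =
{
    content =
    {
        width = 480,         -- virtual width in content units
        height = 800,        -- virtual height in content units
        scale = "letterbox", -- preserve aspect ratio; pad the
                             -- overflow on mismatched screens
    },
}
```

With something like this in place, the same positioning code runs unchanged on an 800 x 480 handset and on higher-resolution screens alike, which is why the variety of Android displays matters less than it first appears.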
And given that Samsung’s Galaxy S II accounts for 30% of all downloads on its own, it isn’t too difficult to identify the prime target. Overall, I’ve found writing Android apps no more complex than for Apple, and publishing them several orders of magnitude easier.
So although fragmentation is a popular criticism of Android, in my view it's groundless. The perceived "coolness" of iOS is certainly another factor, but the main reason developers choose iOS is simpler: money. According to mobile analytics firm Flurry, for every pound an app makes on iOS it will bring in only 24p on Android. Rovio famously chose not to charge for Android versions of Angry Birds because it makes more money there from in-app advertising, and the reluctance of Android users to pay for apps has become as much a part of received wisdom as fragmentation – and is perhaps just as false.
Culture probably plays a part: most Apple users have been exposed to the iTunes ethos of paid-for content, many having upgraded to the iPhone from an iPod. But given the ubiquity of the iPod, most Android users will also have owned one at some point, so I don't buy the argument that Android fosters a something-for-nothing culture any more than other mobile OSes do. I doubt the majority of recent Android phone buyers are more than dimly aware of which OS they have, let alone of its supposed culture!