This article is obviously biased toward extolling the advantages of iOS's uniformness, yet the best example it takes to argue against Android's fragmentation is an application that fails to support only 7.8% of the Android market! How is this a proof that fragmentation is causing a "shocking toll"?
For the curious, I decompiled this "Temple Run" app. It requires Android 2.2 or higher, because its AndroidManifest.xml declares the use of OpenGL ES 2.0 which was introduced in that version of Android. (And the app's minimum API level is Android 2.1.) The app also needs android.hardware.sensor.accelerometer and android.hardware.touchscreen.multitouch, but virtually all Android 2.2 devices have these capabilities.
So, effectively, Temple Run works on any Android 2.2+ device, which represents 92.2% of the devices in the wild: http://developer.android.com/resources/dashboard/platform-ve... When the developer says they support 707 of the 1443 unique devices on Android Market, this means that these 707 devices represent 92.2% of the market, whereas the article tries to present this as "only half the market is supported".
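To make the distinction concrete — 707 of 1443 is only ~49% of device *models*, while those models account for 92.2% of the *installed base* — here's the arithmetic as a quick sketch (figures taken from this comment and the dashboard linked above):

```java
public class Coverage {
    // Share of device models supported, vs. share of the installed base reached.
    static double modelShare(int supportedModels, int totalModels) {
        return 100.0 * supportedModels / totalModels;
    }

    public static void main(String[] args) {
        // 707 of 1443 unique devices on Android Market (per the developer's numbers)
        double models = modelShare(707, 1443);
        // Android 2.2+ share of active devices (per the platform dashboard)
        double installedBase = 92.2;
        System.out.printf("Models supported: %.1f%%; installed base reached: %.1f%%%n",
                models, installedBase);
    }
}
```

Counting models treats a long tail of rare handsets the same as a best-seller, which is exactly how "supports only half the market" gets conjured from "reaches 92% of users."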
Yet another article written by an iOS fanboy trying to unfairly depict the state of the Android ecosystem... Nothing to see here.
The problem is that a lot of Android devices that claim to support these specs do so in buggy and inconsistent ways. GL shaders, for instance, sometimes have to be tweaked for individual devices.
GL shaders have to be tweaked on every platform, and every device. The only thing that's consistent about them is that they're inconsistent everywhere. For example, certain drivers will reject shaders using integer literals where floats are expected, others won't; different GPUs will also behave different with respect to precision, or with how they handle a NaN in math.
That's GPU "fragmentation." You have multiple hardware/firmware/driver implementations of OpenGL. Unless you prescribe a GPU and software stack, you will always find things that slip past compatibility tests.
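Since shader source on Android usually ships as plain Java strings, the integer-literal pitfall mentioned above can even be caught before a strict driver sees it. A minimal, hypothetical lint sketch (the shader snippets and the regex are illustrative only, not an exhaustive check):

```java
public class ShaderLint {
    // Strict GLSL ES compilers reject an integer literal where a float is expected.
    static final String RISKY =
            "void main() { float alpha = 1; gl_FragColor = vec4(alpha); }";
    static final String PORTABLE =
            "void main() { float alpha = 1.0; gl_FragColor = vec4(alpha); }";

    // Naive check: flag float variables initialised with a bare integer literal.
    static boolean hasIntLiteralFloatInit(String src) {
        return src.matches(".*float\\s+\\w+\\s*=\\s*\\d+\\s*;.*");
    }

    public static void main(String[] args) {
        System.out.println("risky: " + hasIntLiteralFloatInit(RISKY));       // true
        System.out.println("portable: " + hasIntLiteralFloatInit(PORTABLE)); // false
    }
}
```

A lenient driver accepts both strings, which is why the bug only surfaces on some hardware.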
I disagree wholeheartedly, this is one of the least biased articles on iOS vs Android I have ever read.
And while I have yet to experience a single app that's not working on my not-exactly-new Desire Z with stock ROM (2.3.3?), apart from some games that are tablet-only, I do agree on the problem part.
The happy users won't all rate, but the unhappy ones are usually quick to downrate. A frustration I can understand as someone who used an HTC Tattoo with 1.6 for a while.
I agree with what you're saying... mostly. However, there IS a problem with devices that ONLY meet the requirements in buggy ways due to manufacturer alterations/additions. It's silly to ignore this problem and claim it doesn't exist.
Who claims it doesn't exist? I think people are rightly saying that it is hardly the hysterical disaster that some -- for self-serving reasons -- try to present it as.
Android has compatibility tests, and they keep growing in scope and value. OpenGL ES 2.0, for instance, demands certain givens that some devices don't actually meet -- those devices should be removed from OpenGL ES 2.0 compatibility (which, as a manifest directive, would eliminate most games from being available).
Yes, Android is a free-wheeling mess. But I am not concerned.
I see this as a replay of the 1980s. Apple was the dominant player with a premium option on nice hardware, a consolidated set of software options, and facing fragmented opposition. (Then the PC/Windows landscape. Now Android.)
Then Steve Jobs was yanked from the picture (then by being fired, now by dying), Apple lost its focus, and the fact that so many people were on the messy platform caused it to win in the marketplace.
There are big differences. Tim Cook is hopefully not as incompetent as John Sculley proved to be. Google is not Microsoft reborn.
But we've seen this before. The rule of thumb in computers for decades has been, "the commodity always wins". And Android is better positioned to be that commodity than Apple is.
> I see this as a replay of the 1980s. Apple was the dominant player with a premium option on nice hardware
No they weren't. In Jobs's first stint at Apple he was beaten by Commodore.
Commodore were vertically integrated. They owned the company who designed the chips in early Apple computers. They focussed aggressively on price, and they won the battles with the Vic 20, C64 and early Amigas. After a bunch of crazy decisions at the end of the decade they eventually lost the war and died a death in the mid 90s.
As they say, it's the winner of the war who gets to write the history books, and in Cupertino they've done an excellent job of it.
Then the commodity is the iPad. Eventually this may change but it's 2012 and a serious competitor hasn't been built yet (going by what I've seen from CES).
There are still some major differences between the early 80s and the current market:
* IBM: They owned the office long before Apple came along and continued that dominance with their invention of the PC. IBM isn't in the tablet business. RIM tried to make a play and did poorly. Increasing corporate tablet sales are going to Apple.
* Price: Compared to a Mac the PC was cheap. Nowadays tablet makers are having a hard time building a sub-$500 tablet that will sell.
This is likely to change in the next generation though. Currently the sub-$500 tablets are pretty poor but cheap Tegra 3 devices are going to be good enough for a lot of people.
This was said about every previous generation of Android tablet. The problem is Apple's execution has been flawless: price, performance, delivery dates, supply, apps, and marketing.
A lot of people are assuming things will play out the way they did in the 80s. Like Apple didn't learn anything. They realize developers, price, supply all those things matter. Which is why I don't believe 2012 will be anything like 1984.
The first generation of Android phones was terrible too.
What I hope and expect to happen is that in the next year or two we see real competition in the tablet market. Apple is likely to stay on top at least in the near future but I predict lower cost Android tablets are going to start to exert real pressure on them soon.
There's only been one generation of Android tablets that's played out, and they all came with Honeycomb on Tegra 2s. Even Google admitted that HC was a thrown-together hack, so it's not surprising that the devices weren't particularly competitive. The upcoming Tegra 3 devices are the second generation.
They were preceded by a generation of single-core Android 2.x tablets, which were terrible. And that includes the original Galaxy Tab that was sufficiently Google-supported to get access to all the proprietary Google apps and the Marketplace.
The best devices of the Honeycomb generation are pretty compelling, especially after getting an upgrade to ICS.
The only Google-sanctioned "tablet" from that time was, as you mention, the Galaxy Tab. The only reason it had the Market was because it was essentially a big phone. In Europe, you could even pop a SIM in it and make calls out of the box. However, the others were not sanctioned, thus hardly qualifying as a "generation", and even the Galaxy Tab was discouraged by various people involved in Android development.
Not to mention that the PC vs. Mac wars have come full circle: with PCs completely commoditized, no one is able to differentiate or offer compelling products, while Apple's market share and margins continue to grow.
PC margins have always been razor thin and Mac margins have always been better. PCs have always been commodity, that's the whole point. People used to refer to them as "beige boxes."
After his return, Jobs managed to turn Macs into fashion accessories, first with the iMac. This put the Mac into a category where it wasn't competing directly with PCs, where it's been ever since.
But let's not kid ourselves. Apple lost the battle for the desktop computer in the 80s, and it never regained it. Jobs returned Apple to profitability by giving up on that market and positioned the company into a new one.
One difference that the OP failed to mention, between desktop computers and cellphones, is that cellphones are worn and therefore have always had an element of fashion to them. It's unlikely that cellphones will ever be "beige boxified" since there's no customer that sees them as being purely utilitarian and will ignore physical aesthetics.
> no customer that sees them as being purely utilitarian and will ignore physical aesthetics.
Except for, you know, about 3 billion customers in Asia and Africa who cannot pay a premium for aesthetics, but whom cheap dumbphones first allowed to communicate beyond their immediate surroundings (no landlines there), and whom cheap Android smartphones are now (or in the near future) first allowing to access the internet.
Are these people in the third world putting their credit card into an app store and buying software? How do cheap android devices in third world countries validate android as a developer platform in the long run?
I'm not trying to be snarky here btw, legitimate questions.
Eventually they will buy software and services that provide value to them, which can be things for which first-world countries have an existing infrastructure, but which have so far been completely unavailable to these people.
For example, in parts of Africa, transferable pre-paid phone credit has become a de facto bank account for many people who've never had access to non-cash transactions.
Other examples: third-world countries have many subsistence farmers. Accurate local weather forecasts and information about current market prices in neighboring towns could be very valuable to them.
As for payment - credit cards may not play a huge role, but centralized app stores can easily support a diverse range of nation-specific payment options (like the phone credit mentioned above).
In places like Africa you buy phones from a stall in a market square. No contracts, because they're not enforceable.
I think the only credible (and very real[1]) competitor to low-spec Android handsets is Windows Phone, which with the Tango update will run on really low-spec hardware. Nokia is playing a solid game here.
[1] Real because Nokia rule the feature-phone market, not so much in Asia but very much in Africa. Nokia have a reputation for quality there that Android, as the new kid on the block, will have a hard time displacing.
Which part of 'on-contract' did you not understand? The full phone price is included in the price of the contract, with interest.
Of course you can buy sub-$50 phones on contract too, in which case the monthly payment would be much lower than AT&T's $99 iPhone plan, of which about half goes towards the phone payment.
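The arithmetic behind that claim, with hypothetical round numbers (a $50 handset amortized over a 24-month contract, versus roughly half of a $99 plan going toward the phone):

```java
public class PlanMath {
    // Monthly handset installment when the phone's price is spread over a contract.
    static double monthlyInstallment(double phonePrice, int contractMonths) {
        return phonePrice / contractMonths;
    }

    public static void main(String[] args) {
        // Roughly half of a $99/month plan covering the iPhone, per the parent's estimate:
        System.out.printf("iPhone share of plan: ~$%.2f/month%n", 99 / 2.0);
        // A $50 handset over a 24-month contract:
        System.out.printf("Cheap handset: ~$%.2f/month%n", monthlyInstallment(50, 24));
    }
}
```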
But let's not kid ourselves. Apple lost the battle for the desktop computer in the 80s, and it never regained it. Jobs returned Apple to profitability by giving up on that market and positioned the company into a new one.
Are you seriously describing the company making 3 times as much profit as their nearest competitor and consistently outgrowing them all for more than 20 straight quarters as "giving up on the market"?
In Android you don't have to support any device you don't want to, just exclude it from being shown to those users. Same with the OS version, if you don't want to support 1.6 you can make it unavailable to those users.
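That filtering is driven by the manifest (minSdkVersion plus <uses-feature> entries); Android Market simply hides the app from devices that fail the constraints. A plain-Java model of the decision — the logic here is a hypothetical simplification, though the feature-name strings are the real ones:

```java
import java.util.Set;

public class MarketFilter {
    // Is the app listed for this device? Mirrors minSdkVersion and <uses-feature>.
    static boolean visibleTo(int deviceApiLevel, Set<String> deviceFeatures,
                             int minSdkVersion, Set<String> requiredFeatures) {
        return deviceApiLevel >= minSdkVersion
                && deviceFeatures.containsAll(requiredFeatures);
    }

    public static void main(String[] args) {
        Set<String> device = Set.of("android.hardware.sensor.accelerometer",
                                    "android.hardware.touchscreen.multitouch");
        // A device on API 8 (Android 2.2) with multitouch sees the listing...
        System.out.println(visibleTo(8, device, 8,
                Set.of("android.hardware.touchscreen.multitouch")));
        // ...a device stuck on API 4 (Android 1.6) never does.
        System.out.println(visibleTo(4, device, 8, Set.of()));
    }
}
```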
Most apps don't have fragmentation problems unless you're doing something tricky and hardware dependent, like trying to do fancy things with the camera. I've released several semi-popular apps with few bug complaints.
Finally, the author's link to one badly voted Reddit comment with little discussion made me think he was just out to write an Android bashing article as link bait.
In Android you don't have to support any device you don't want to, just exclude it from being shown to those users.
This will:
1) Be quite tricky, due to the sheer number of devices making it difficult to test everywhere.
2) Not be viable if you only support the devices you know to work, due to the amount of fragmentation.
Most apps don't have fragmentation problems unless you're doing something tricky and hardware dependent
How about reading from a zipped resource on storage? (10x slower on Galaxy S, in some circumstances, due to filesystem bugs)
How about doing an SQL query with a JOIN? (100x slower, in some circumstances, on pre-2.3 devices due to a query optimizer bug in the SQLite version they ship)
Showing a splash screen? (Top cut off on Samsung devices running Gingerbread, nowhere else. Still tracing the reason)
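Bugs like the pre-2.3 JOIN slowdown typically force version checks into app code. A hypothetical sketch of such a workaround (table and column names invented; API level 9 is Android 2.3):

```java
public class QueryStrategy {
    static final int GINGERBREAD = 9; // Android 2.3, which shipped a newer SQLite

    // On old devices, avoid the JOIN the buggy optimizer mishandles and
    // fetch the second table separately, combining results in Java.
    static String pickQuery(int sdkInt) {
        if (sdkInt >= GINGERBREAD) {
            return "SELECT t.title, a.name FROM tracks t JOIN artists a ON t.artist_id = a.id";
        }
        return "SELECT title, artist_id FROM tracks"; // then look artists up by id
    }

    public static void main(String[] args) {
        System.out.println(pickQuery(8));  // pre-2.3: no JOIN
        System.out.println(pickQuery(10)); // 2.3+: single JOIN query
    }
}
```

Every workaround like this is another code path to test per OS version, which is the real cost being described.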
Finally, the author's link to one badly voted Reddit comment with little discussion made me think he was just out to write an Android bashing article as link bait.
That's a valid criticism. The author found one person that didn't want to pay a dollar, and a few hundred calling him out on being stupid.
Can we get more income comparisons between popular apps on iOS and Android? That's the real data.
Fragmentation is not that bad of a problem because the distribution is far from even: you can still support most devices (as in 90%+) with just a handful of types. You can also restrict support by limiting the minimum OS version. The 707 devices supported cover more than 90% of the devices used.
The article is linkbait. And less than stellar journalism. Looks like someone had to fill the weekly quota of articles and took the easy option.
But it's a bad comparison.
Instead of bashing we should be thankful for the choices.
I can't read about this so-called fragmentation anymore; it's ridiculously overblown, seriously.
First of all, Android provides a really good set of APIs to make the development of apps easy across multiple devices with very different hardware specs.
There may be device-specific bugs, but well... that's business as usual.
If you guys are all for comparison, then write an Android program for one device (and test for it) and compare it to the iPhone version.
Comparing iOS vs Android head-to-head is really flawed; it's stupid.
Also, it's not like iOS apps are bug-free. I have seen enough bugs on my iPad to say that general app quality is not as different as people claim.
I really, really hate supporting an app on Android. This is a big part of the reason for that.
As an example, at one point it apparently would simply refuse to start on the Droid X, despite running great on the original Droid I had to test it with, and the VM I set up to emulate the X. Without access to a physical Droid X, it couldn't be debugged. The app doesn't require anything from the hardware besides a GPS, so I was at a loss for what would cause this.
Because it's a huge pain to properly test new versions, I tend not to keep the Android version up to speed with the iOS version of the same app. As a result of that, I get a lot of complaints from Android users that feel short changed and bad reviews from people who feel like the app isn't keeping pace with the alternative apps (mostly they use the web app as reference).
It's just a really crappy situation. I've thought about dropping Android support, but then they would just complain that the iOS crowd get an app and they don't, so that wouldn't be much better. At least I wouldn't have a low star rating publicly attached to my brand, though.
> As an example, at one point it apparently would simply refuse to start on the Droid X, despite running great on the original Droid I had to test it with, and the VM I set up to emulate the X. Without access to a physical Droid X, it couldn't be debugged. The app doesn't require anything from the hardware besides a GPS, so I was at a loss for what would cause this.
If your Android app is similar to the screenshots, it really is quite trivial and terrible looking. I don't mean to sound harsh, but it looks like you did the absolute minimum work possible to "support" Android. Blaming feedback on the platform seems like you're being a bit dishonest with yourself.
It's also interesting that you have a lower rating, and more negative feedback (26% 1 star) on iOS than you do on Android.
Yeah, that's pretty harsh given that you haven't tried it. Of course it could be better with more work put toward it; my point was that the platform has made it a pain. Things that worked on the first version of the iOS SDK still work well, but the same hasn't been true of Android (mostly the screen sizing). I unfortunately don't have time to update it with new versions of Android. I don't dislike Android, but I don't like it as a dev platform. The bad reviews I see of the iOS client are actually mostly unrelated to issues with the client. This is in contrast to the Android side.
In any case, I was mostly referring to the email feedback I get about the two versions.
I honestly am not trying to sound harsh, but I've found that this discussion provides a very convenient excuse for a lot of people. If your app isn't great on Android, it's Android's fault. As to the reviews on iOS (which I browsed on my iPad 3rd gen), a large percentage of them complained about the app outright crashing. How is that not a problem on the client?
Hm, I didn't see those in my brief look this morning (was on my phone), I'll have to take a look. There was a syncing issue with the last version which was causing some crashes, but I think it should be fixed in this version. For this version's reviews, I just see a login problem and a couple complaining of it not loading anything (usually connectivity issues). I should probably prompt users for reviews to get the skew away from those motivated by negative experiences.
It's obviously not Android's fault entirely, but as a result of their choices (letting handset manufacturers run wild with no real standardization, including making custom versions of the OS), getting things to look and work perfectly is more work. For a one man shop and some occasional contractors, that makes continuing development less of an obviously good idea because for the amount of time that can be spent on it, the results won't be as good, and it's a direct result of the way Google has been handling their ecosystem. For people whose whole job it is to make apps, sure, Android is a perfectly fine platform, but it's higher maintenance than the iOS side of things.
I think it's very valid criticism, and it can't really be compared with Windows. There are many differences:
- Windows has a lot of APIs and hardware abstraction levels. Microsoft has always been known to change things very, very slowly, and keep backward compatibility.
- Hardware makers don't mess with Windows nearly as much as they do with Android. They will install a Norton trial and that's it. They don't remove the UI and put their own.
- Microsoft acts as a gate to what innovation hardware vendors can bring in, even more so today with driver testing and signing. Remember, back in the day, when an OEM did something crazy you'd get BSODs from bad drivers. Now that doesn't happen anymore, because everything is so much more robust. Android is still nowhere near that, and Google isn't doing much, relying on the open community.
None of your differences are really the differences you think they are.
Android has a lot of APIs and hardware abstraction levels. Of course it does or it wouldn't work across over a thousand devices with a pretty high degree of success.
OpenGL ES 2.0 itself is an abstraction that has allowed for lots of triple-A (relative to mobile) iOS apps to painlessly get ported to the NDK on Android.
To get the market and the Google apps -- to be a Google sanctioned Android device -- the maker has to pass the Android compatibility tests. Should they be more stringent? Absolutely, but they exist and have done a good job.
The linked article is terrible, and horrendously biased, using exaggerated language and making completely unsupported claims at every turn. What Android has accomplished is amazing, and the level of compatibility is incredibly high; but yes, with over a thousand devices, not all of which are certified, hundreds of millions of users, and hundreds of thousands of apps, you will find problems. Big surprise.
What I'm curious about is why fragmentation is such an issue on mobile devices, when it was never this big a deal on desktops. Outside of cutting-edge 3D games, Windows developers don't need 100 laptops to test all the different hardware permutations for their software.
> What I'm curious about is why is fragmentation such an issue on mobile devices, when it was never this big a deal on desktops.
Stable APIs and abstractions, better renderers, the ability not to go fullscreen, and nobody caring about dpi. Resolutions changed very slowly, especially before the 00s and 3D games. Also culture, somewhat, probably: if the game does not work due to a bizarre combination of drivers, it's the user's problem.
And it's definitely not true that "it was never this big a deal on desktops", stable hardware target is regularly mentioned as a reason to go console by developers.
Heck, "it was never a big deal" is factually incorrect.
It is STILL a big deal:
Fragmentation means your ATI card not working one day, or your novice user having to go dig through pages of support to figure out how to debug a faulty driver.
Or it's your printer not working on XP but working on 7, or your games being unplayable on one setup and working on another without a hiccup, or any of a massive set of issues that people take for granted when they have to deal with PCs.
Fragmentation is an issue; it's just that people have gotten used to it and familiar with it.
I would say it's as much of a deal on mobile as it is on desktop. The problem is how badly Android is doing here compared to Apple. If there were only Android, the problem would be the same for the developers, but they'd accept it in return for the income.
Right now, if you can only target one platform, it's a no-brainer against Android.
Note that you say "outside of cutting-edge 3D games", but remember that games are a significant part of apps. And cutting-edge on an Android device doesn't have to mean that much.
Lastly, consider how console game sales are doing compared to PC game sales. And how developers tend to treat PC game development as a result.
- Dell, HP, Compaq, Acer, etc didn't invent entirely new UI's for the computers they sold: everybody's OS looked basically identical and had similar interface elements and metaphors.
- Screen sizes and resolutions increased slowly and didn't involve taking over the entire screen as with smartphones. Software that previously took up the whole screen at 1024x768 simply ran as a smaller window at 2560x1440 and wasn't forced to scale up to full screen.
Android apps that are designed for smaller screens leave black space on the new mega-screened phones. Users pan you in the reviews pretty hard for this, though. The expectations for apps that conform properly to every phone are a lot higher than they used to be on early PCs, I think.
I don't think early PC games had such a direct feedback mechanism as the Android store has.
Evil developer didn't allow his app to run on your phone? Punishing him is only 2 clicks away!
App doesn't have a feature you want? Blackmailing the author by giving him a 1 star review is only 2 clicks away!
Reminds me a bit of the restaurants complaining about online reviews.
My app was written for the G1 by a friend on a correspondingly early version of the SDK back when the G1 was the only Android handset, and it doesn't scale up to full screen on the bigger phones. It might be the case that they've released autoscaling and it just wasn't around when it was developed, or it might just be that he used the SDK improperly and neglected some existing autoscaling options.
Yes, as you guessed, the problem you're seeing is because the app was developed for a very early version of the Android API. Here's how to fix the app (requires a recompile):
"If your [application's] manifest file has either android:minSdkVersion or android:targetSdkVersion set to "4" or higher, then the Android system will scale your application's layout and assets to fit the current device screen, whether the device screen is smaller or larger than the one for which you originally designed your application. As such, you should always test your application on real or virtual devices with various screen sizes and densities."
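The scaling described in that quote is density-based: layouts are written in density-independent pixels (dp) and converted per device as px = dp × dpi / 160. A worked sketch of the conversion:

```java
public class DpToPx {
    // Density-independent pixels to physical pixels: px = dp * (dpi / 160).
    static int dpToPx(float dp, int dpi) {
        return Math.round(dp * dpi / 160f);
    }

    public static void main(String[] args) {
        System.out.println(dpToPx(48, 160)); // mdpi baseline: 48 px
        System.out.println(dpToPx(48, 240)); // hdpi: 72 px
    }
}
```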
I was going to make this exact post. I realized that:
1. a lot of this is handled by APIs
2. we still have to deal with hardware issues, such as having the correct video driver and a correctly configured OS to play non-casual videogames. Some programs refuse to work with certain hardware, like Dungeon Master 2 not running with Creative sound cards.
If you are a mobile software developer who refuses to deal with Android under the pretext it is "fragmented", you will simply be obliterated by your competition. The Android market is so large, and becoming larger every day, that other developers will gladly do the work you refuse to do. And they will succeed, just like the desktop/server software development world has learned how to cope with "fragmentation" in the Windows/Linux ecosystem by supporting the various OS versions, editions, service packs, distributions, locales, etc.
It does not matter what you think about Android fragmentation. It exists. Deal with it.
The point about the market being 'large' is the very thing that fragmentation undermines. What matters is how many paying customers can receive a good user experience from your app, and what this costs to provide.
You also assume that Android will continue to outpace iOS in terms of growth - i.e. the trajectory will be like the PC rather than like the iPod.
It matters very much what you think about Android fragmentation. Dealing with it has been a costly mistake for some developers, and could be for you.
Stop complaining about fragmentation and define yourself which devices you support!
It's crazy to think that a common OS among such a wide variety of phones is enough to achieve perfect portability. Let's take an analogy: Your Windows PC runs an OS very similar to the ATM across the street (I'm not even mentioning the wide array of platform Linux runs on). Do you really expect your app to run on all Windows machines flawlessly?
The real issue is that Google is only too happy to brag about number of Android users, or apps in their market (what's the proper naming convention now? Play?). Corporate bullshit meant to appease shareholders and attract developers.
You can sit all day complaining that Android isn't the developer paradise it was advertised, or you can be more pragmatic, restrict yourself to fewer devices and achieve quality.
But we, as developers, are greedy and pretentious. We want our apps to run on every machine possible, and be featured on top of their stupidly competitive "market".
I blame Apple for the broken model (but hey, it works for them), and Google for not knowing better.
I think you missed the part where I mention that Windows _runs on an ATM!_ I'm not crazy enough to suggest that Windows apps running on your Dell laptop wouldn't run on a Toshiba laptop. After all, they're both clones of the same machine (http://en.wikipedia.org/wiki/Pc_clone)
I've programmed ATMs, and dendory has it exactly right: those are completely regular stock PC hardware running a completely regular copy of Windows. The only thing special about them is that they're physically secured and have some very unusual peripherals and assorted drivers.
Honestly I never did program on an ATM so I may be wrong. But I'm willing to assume that unusual peripherals and drivers are analogous to the portability issues across android phones. Genuinely asking, not claiming to have the truth.
There's actually a standard for ATM hardware interfaces originally developed by Microsoft (WOSA/XFS, now CEN/XFS), which is a major reason why ATMs run Windows.
When I worked in the industry 10 years ago, there were lots of ambiguities in the standard and the drivers were pretty flaky, so the situation there may actually have looked like Android does now.
But ATM-specific hardware and software is a niche market with few customers paying lots of money each, so the pressure to eliminate incompatibilities is not as strong - you can just throw people and money at fixing issues as they crop up.
The situation is completely different with mass-market PC hardware and software though, and I'd say that it should be possible for Android to get there as well, since it's also a mass-market product.
Actually why would you think a Windows app wouldn't run on that ATM? I'd bet most would. It's probably a normal PC, with all the normal hardware. It would be slow, because it's most likely a slow one with no GPU acceleration and such, but I wouldn't be surprised if 95%+ of Windows apps ran on it. The only reason you, yourself couldn't test it is because they (obviously) run their own ATM software full screen and block you from installing stuff. But why wouldn't you think an ATM admin couldn't fire up Solitaire on it?
> Let's take an analogy: Your Windows PC runs an OS very similar to the ATM across the street [...] Do you really expect your app to run on all Windows machines flawlessly?
Yes, I do. (With limited exceptions.)
Software often has minimum specs to run. Apart from games the minimum spec will often be for machines that are obsolete.
So (apart from games) I would expect an app created for one machine running MS Windows to work on another machine capable of running the same version of MS Windows. And those versions of Windows will last for several years.
Compare that with Android as applied to cell phones. Not everyone upgrades their phone every 18 months[1].
Somehow thousands of Android developers still seem to manage fine. Some parts of the article made me quite suspicious that it was really an Apple marketing piece.
Our experience supporting both iOS and Android versions of our app for 18 months (>1M daily users) is that Android users are much less satisfied. We are twice as likely to get a support request from an Android user and those requests tend to be more complex (and qualitatively more grumpy).
For example, take push notifications. C2DM only works on 2.2 devices with a logged-in Google account, so you need a fall back push mechanism. Even with C2DM you must struggle with various challenges surrounding power management and wifi sleep. In comparison, APNS just works.
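The fallback described above boils down to a transport decision at startup. A simplified, hypothetical sketch (C2DM requires API level 8, i.e. Android 2.2, plus a logged-in Google account):

```java
public class PushChooser {
    enum Transport { C2DM, POLLING_FALLBACK }

    static Transport choose(int sdkInt, boolean hasGoogleAccount) {
        // C2DM only works on 2.2+ devices with a Google account; otherwise poll.
        return (sdkInt >= 8 && hasGoogleAccount) ? Transport.C2DM
                                                 : Transport.POLLING_FALLBACK;
    }

    public static void main(String[] args) {
        System.out.println(choose(10, true));  // C2DM
        System.out.println(choose(10, false)); // POLLING_FALLBACK
        System.out.println(choose(7, true));   // POLLING_FALLBACK
    }
}
```

On iOS there is no equivalent branch: APNS is the only transport and every supported device has it.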
Also, Android aggressively deprecates. With ICS there's a brand new Fragments API, which to be sure has a backward compatibility library, but now you are faced with the choice of rewriting your app to fit the new UI style (and potentially confusing existing users), limping along with the old Android styles, or supporting both and making your life miserable. Hey, and don't forget landscape mode and multiple aspect ratios!
" A previously (very) successful game on iOS, it was brought over to Android in order to take advantage of the huge number of devices that run the OS."
Right. A game with graphical assets designed for a couple of fixed screen sizes and resolutions is going to have problems on a system that forces developers to handle multiple screen dimensions and resolutions.
Shocked. I'm shocked.
The one thing Android "fragmentation" (meaning whatever it means in which ever context) hasn't caused for applications that were developed for Android is compatibility problems. Android apps are remarkably compatible on hardware that they were never tested on.
There is plenty to hate about OEMs' craplets and "enhancements" - which comprise the more commonly held definition of "fragmentation" - but they have had very little effect on application compatibility.
This very "hard" problem was solved a long time ago by releasing your product along with a free demo (or shareware, or whatever). As a developer you simply have to make sure your product works on the most popular platforms, or whatever kind of platforms you want. Before the user buys, he tries it, so there will be fewer complaints that the app does not work for them.
Hardware and software will always be fragmented in a free market, and that's ok.
How do I know? Because the same fragmentation exists on iPhone, just in smaller quantities. To say it is not there on iPhone at all is lying a bit, Republican-style.
Now let's see someone be honest about this on both platforms, or STFU.
I run a 10-person development team, split 50:50 Android / iOS, under a digital agency. We build apps for carriers (account management apps), brands, etc.
From my own personal experience, which having built apps since the first iOS SDK and Android 1.6 is hopefully fairly reliable, the QA and testing burden on Android apps is considerably higher than iOS.
That in itself isn't totally surprising, because there are far more handsets on the market. But that's not the real issue. Say I'm writing an app for a network carrier. Carriers typically want their apps to run on all hardware they've sold for the past two years (i.e., everyone who's still under contract).
On iOS, that's iOS 3-5. Very soon, it'll be iOS 4-5 (4 was released in June 2010).
On Android, that's 1.6, 2.0, 2.1, 2.2, 2.3, 3.0, and 4.0. The key difference is that when Apple releases a new version of iOS, it stops selling the old one. That's not the case with Android: even today there are still phones on the market running 2.1.
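Supporting that 1.6-to-4.0 spread in a single binary usually means gating newer APIs at runtime. A sketch of the usual pattern (the API-level constants are real; taking `sdkInt` as a parameter rather than reading `Build.VERSION.SDK_INT` directly is just to keep the sketch self-contained):

```java
// Sketch: runtime API-level gating, the standard way one .apk spans
// both old and new Android versions.
public class ApiGate {
    public static final int DONUT = 4;      // Android 1.6
    public static final int HONEYCOMB = 11; // Android 3.0 (native Fragments, ActionBar)

    // On-device you'd pass android.os.Build.VERSION.SDK_INT here.
    public static String uiStyle(int sdkInt) {
        if (sdkInt >= HONEYCOMB) {
            return "holo";   // Holo theme / ActionBar era
        }
        return "legacy";     // pre-3.0 options-menu style
    }
}
```

Every such branch is one more code path to test across the handset matrix, which is where the extra QA burden comes from.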
When you say "the same fragmentation exists on iPhone but smaller in quantities" - I don't really think you know what you're talking about. If you're developing an app now you can pretty easily support iOS 4.3 upwards, which would cover all handsets actively sold by Apple for the past 21 months or so (the 3GS onwards: three phone models and two tablets). How many Android phones running how many versions have been released in the past 21 months? Far, far more.
This doesn't inherently make Android a bad platform: lots of people choose to develop PC games even though there's a much higher testing burden than on consoles (in terms of the varying environments and OS versions). I think Android is great: I'm not arguing against developing for it. But I think you'd be very hard pressed to find someone experienced with both the iOS and Android platforms who'd disagree with the notion that fragmentation on Android takes a lot more time to deal with than on iOS.
Apple's major versions come out once a year, while Android's about twice per year. In this case Android 2.1 (early 2010) is even newer than iOS 3.0. Are you saying nobody has an iOS 3.0 iPhone anymore?
I don't think phasing out versions is a big deal for Android. It happens about as fast as for iOS (~3 years). What is a bigger deal is how fast the latest OS/API gets adopted. iOS can push out the new version in days to at least 50%-75% of the users, while for Android it takes about a year to be on 75% of the devices (that's how long it took for Gingerbread).
And the issue with that is that users don't get to take much advantage of the most cutting-edge software until months after release, when (or if) they get their upgrade, while on iOS they can do so just days later.