There’s an article making the rounds right now about how applications on iOS crash more often than applications on Android. I’m not going to detail the entire methodology – the article itself does so – but it does raise an interesting talking point about how both mobile operating systems handle application crashes and updates.
Before we get going, my personal experience with both operating systems is definitely in line with the findings of the study and its data. On my Galaxy SII, I’ve probably seen like two, maybe three application crashes in the past few months, while my iPad sees several application crashes per week. Back when I still used my 3GS, the situation was the same. Of course, your personal experience may differ.
The interesting thing here is that Android and iOS handle application crashes differently. On iOS, there’s no indication an application crashed; you’re just dumped back at the home screen. On Android, you get the force close dialog. There’s something to be said for the cleaner approach in iOS, but personally, I like to be notified of a crash. Hey, that’s just me. This could also mean that most people simply don’t realise an application has crashed; “I must’ve pressed the home button or something”.
So, if we were to assume the data reflects reality – why do iOS applications crash more often than Android applications? I think a very big difference is that applications on Android can silently update in the background (if enabled in the Market), whereas iOS applications have to be updated manually. Whenever someone hands me their iPhone, I often notice they haven’t updated their applications.
Furthermore, Android developers can update their applications far more easily than iOS developers can. Google is far less strict with its market, meaning developers can push updates instantly, quickly fixing bugs. On iOS, each update has to go through the certification process, so developers tend to hold off on pushing updates until they can include several fixes at once – fewer updates, meaning bugs can roam free on iOS devices for longer. All this shows that there’s really no ‘right’ or ‘wrong’ way to curate an application store.
Of course, there are countless ifs and buts attached to this data, so don’t accept it as some sort of clear and uncontested truth. Still, the points raised are interesting.
Your statement that iOS apps are more crash prone because Android apps are updated more often is just silly. Did you ever wonder why C++ apps crash more than Java apps? Why would you think it would be different in the mobile space?
With the explosion of mobile app development there has been an influx of very average programmers who don’t know much about memory management. And when you consider the limited amount of RAM the iPhone and iPad have, it all starts to make sense. The majority of these iOS app crashes can probably be attributed to bad programmers who can’t manage their memory properly.
You know, it’s funny… if somebody makes a comment about how many desktop Java apps run like ass, people are quick to blame programmers as well. Surely, it CAN’T be the languages that suck…
Have you considered upgrading your Pentium with 1 GB of RAM? As for languages that suck, well, they really don’t get much worse than Objective C.
Well, I would upgrade my Pentium to 1 GB, but it doesn’t have any more memory slots and maxes out with two 64 MB SIMMs.
Except for C/C++, Java
Couldn’t agree more. A bastardized mix of C and Smalltalk, and neither bit bears even the slightest syntactic resemblance to the other. Even C++ is cleaner, and that’s saying something. Why anyone thought the design of Obj-C was a good thing I couldn’t guess, and why NeXT (and as a consequence Apple) went with it is one of those mysteries we’ll probably be wondering about forever.
I actually love Objective-C, for high level application development it provides a lovely runtime oop environment… That said, I always use C++ for embedded design.
-edit- Both Obj-C and C++ are vastly superior to the abortion that is Java and C#… the Fisher-Price of programming languages.
Life is full of mysteries indeed, especially when you have no clue what you are talking about.
Objective-C, unlike C++, is a strict superset of C. And it does object orientation just like the creators of Smalltalk intended it: via messages.
So if you claim Objective-C bears no semantic resemblance to C or SmallTalk, then I have to assume that you have never really programmed seriously on any of those three languages, or “semantic resemblance” does not mean what you think it does.
That is not what I said. I said that Obj-C is two languages smashed together, and the two bits don’t resemble one another. The Smalltalk-like objective bit doesn’t fit in with the regular C bit when maintaining code, and poorly-written Obj-C code is sometimes a pain to decipher, much more so than even straight C.
P.S. You would debate better if you leave out accusations.
And sometimes “poorly written code that is hard to maintain” has nothing to do with the technical merits of the language, and it’s simply an indication that the original coder and the maintainer are just bad programmers who don’t understand the language.
Again, Objective-C is an object-oriented extension to an imperative language. The C code in Objective-C looks and behaves just like C. The object-oriented syntax and behavior is lifted from Smalltalk.
And there is a very clear reason for that: The syntactical differences in ObjC make very clear under which programming model one is operating, unlike C++ for example.
It is very simple: if you see something that does not look like C, you know that you’re in the OO part of the programming model, and vice versa. Really, it does not get any more elegant than that. Apparently you consider one of the principal features of the language to be a bug or a side effect, which led me to believe that you indeed do not understand what ObjC is.
Given the fundamental differences between imperative/functional and OO programming models, I would make the case that trying to shoehorn both under a C syntax, which was derived from a purely imperative model, is a recipe for disaster. Which is why so many C++ programmers end up producing bastardized C: procedural code with useless OO wrappers.
That wasn’t really what you were going at, above… (but quite nice cop-out)
Are you implying that Java sucks?
That said, Java’s Swing toolkit was always a slow RAM hog when used without proper knowledge.
I’m implying that the gold rush on mobile apps has attracted subpar developers who write crash-prone iOS apps because they don’t understand memory management.
I’m pretty sure that Objective-C uses some kind of memory management… It’s not just plain C.
Aye, it uses reference counting…
(2.0 also has a GC but that isn’t present in iOS)
Not really.
The GC introduced in 2.0 was a failure, because it was not compatible with many Objective-C frameworks.
The reference counting mechanism introduced recently is a joke, because basically the compiler relies on certain programming patterns to guess what to do. And this, again, only works with certain frameworks.
http://clang.llvm.org/docs/AutomaticReferenceCounting.html
Like Apple devs? Because right now the most crash-prone app on my iPad is Safari.
The problem isn’t with the developers. They want to focus on making the app do what they’re imagining; why should they waste time on memory management? C# and Java prove you can develop without manual memory management.
No, it’s the programmers. You can perform bad memory management in every programming language (including java).
Ex:
In C/C++, if you don’t release memory, you’ll get memory leaks.
In Java or .NET, if you keep references to objects you no longer need, the GC cannot reclaim their memory, and the runtime will keep requesting new memory blocks from the OS with every new().
Memory management is the responsibility of the programmer, not the language (despite what some Java or .NET folks say).
They don’t. You are talking out of your ass.
Actually they do. Do some simple research instead of pulling it out of your ass.
You’re the one who made an outlandish claim, not me, therefore the burden of proof is on you. The very fact that you didn’t post any evidence of your arbitrary assessment is why I qualified it as talking out of your ass.
So, I eagerly await you to post your own research. It shouldn’t take long, since you claim such research is simple.
Since I’m in a good mood though, I’ll warn you about a couple of pitfalls that may undermine your research credibility:
– anecdotes don’t count (things such as “I use this thing written in C++ and it crashes all the time”)
– any comparison of the rate of crashing between Java and C++ applications must be adjusted to take into account the much larger number of C++ applications compared to Java applications.
Let’s see your research. I’m sure it will be enlightening.
I use KDE, which is written in C++. It crashes on me (not necessarily the whole DE) more in a month than all the Java apps together I’ve ever used. Claim proved.
Anyway, I’m not sure Java SE and the Dalvik VM have so much in common that we can make any valid deductions from the former’s performance.
Don’t use KDE. It’s a beast that tries to do too much with too much code, etc. Something leaner and more modular is probably going to be more reliable.
I use KDE every day at work. It very rarely crashes, and the KDE/Qt/C++ apps I run also crash very rarely.
However, a large Java app I use every day (BCeSIS) also crashes pretty much every day.
Thus, your claim is disproven. :p
Hence why the GP (GGP?) listed anecdotes as “not allowed”.
Eclipse alone crashes enough to turn that statistic around.
Well… Technically they do. But mostly because it’s easier to screw up a C++ app and/or memory management.
It’s easier to screw up the design of a java program and get away with it. c++ punishes bad programming practices a lot harder.
C++ punishes bad programming practices? No, it punishes the subsequent maintainer of that program.
As design goes, it’s as easy to screw up C++ as it is Java.
Java is not native, so when it hits what would be a segfault, it just throws an exception and goes on. While that may corrupt the application state to the point where it still blows up shortly after, that is not because of the fault itself, but a consequence of it. A C++ application won’t survive calling a method through an invalid pointer, an array overflow, or a division by zero. It will close instantaneously.
That said, Java does suck, and I code in C/C++/Lua when I can.
I don’t think a buggy java application doing the things you mention will survive it any better than C++.
I doubt many Java developers set up exception handlers to recover gracefully (or at all) from a division by zero, an out-of-bounds array access, or trying to dereference a null pointer.
The thing is, a simple try {} catch (e) { e.printStackTrace(); } autogenerated by Eclipse will usually (and often accidentally) catch them and allow the application to go on. As long as you catch a generic exception instead of a specific one, the application will stay open without any additional/intentional work.
Handling a division-by-zero error in C++ is trivial. Subclass std::runtime_error and handle it like you would any other exception.
Only if you’re using Visual C++ and its silly structured exception handling, which is pretty bad for various reasons, including performance. Throwing an exception on a division by zero is non-standard behavior.
Also, a division by 0 won’t necessarily cause the program to halt. If you work with floats, it will usually just yield a special value: infinity for x/0, or NaN for 0/0.
The best way to deal with divisions by 0 is to special case the situations where they can happen, or assert if it is never supposed to happen.
If you don’t explicitly deal with the division by 0 cases it can result in either a crash (that includes uncaught exceptions) or bugged behavior, in absolutely any language.
Eh? Any compiler with even a basic form of standards compliance can produce code that handles a division-by-zero exception. As I said, it’s trivial to implement, coming in at a few lines of code.
Of course, but that’s irrelevant to the point I was making, which is that division-by-zero is not necessarily fatal in C++. The OP was claiming this to be an inherent and irresolvable flaw in the language; I showed that the language provides you the tools you need to handle it if you so wish.
“Dividing by zero” errors implicitly assume that you’re working with integer types, at least in this neck of the woods. Surely everyone knows that it means something else when working with floats?
Don’t try to argue on these forums. People here are heavily biased when it comes to anything Apple. E.g. you always read stories about how bad Apple is with patents, but you hardly read anything about Motorola, Nokia, Microsoft and all the others doing the same. Apple is bad for producing in China, but Motorola, Nokia and the rest doing the same are never mentioned (and surely not consumers, who above all like to buy cheap PCs, TVs and other consumer electronics, which are the root cause of the misery going on in consumer electronics). Etc. etc.
If the horizon of somebody is so narrow that it equals his/her point of view, you’d better leave him/her alone ..
When I write iOS apps, they do…
I blame everyone except myself. I also invoke my right to blame every layer of software my code sits on top of, including previous chunks of code I wrote but others looked at.
Most of the crashes on my iPad are low-memory conditions.
So on a device essentially without multitasking and with a fixed amount of RAM you get crashes caused by low memory; that’s rather brilliant.
You confirm the bad applications on iOS part at least, since it can only be applications with memory leaks or broken application designs requiring more than the existing memory.
On Android, when it runs out of memory, the application locks up and continues to hog resources. A dialog pops up asking what to do with the app. When I see one of those I have to reboot the whole system; otherwise that app will freeze up again in short order if I just relaunch it.
You do realize that it essentially sends a kill -9 to the process when you press force close?
And I suggest you remove the “Advanced” Task Killer to remove the problem of memory hogs.
I just set up long-press on my back button to force close apps (in the developer settings); it works pretty well when apps lock up.
I agree with the crashes being due to poor programming. There’s this app, for example, iFormulas, that only worked on iOS 3; as soon as iOS 4 came out it stopped functioning, it just crashes. I’m surprised it is still in the App Store.
…my Android phone (LG Optimus V) gets full system crashes (forced automatic reboots) quite frequently when running Google Maps and certain GPS applications.
I have seen a few programs crash without bringing down the whole system, but sometimes it’s fixed by clearing the program’s data (Winamp). In other cases – the AccuWeather widgets – I just uninstalled them to get them to stop, because once they started they wouldn’t quit.
I think to a great extent it depends on one’s usage patterns, as well as the limitations of the hardware.
For example, my Nook Color with Cyanogen 7 stable has app crashes on a daily basis, and system halt dialogs every so often. The crazy thing is, it’s usually core Google apps that crash. I’d say Google Reader, Books and Docs are the worst offenders.
On the flipside, the iPod touch I used to own rarely suffered app crashes, and I only recall one spontaneous reboot the entire two years I owned it.
I also have a Motorola Cliq that will reboot several times a day whether it has the original Android/MotoBlur 1.5, the official 2.1 update or Cyanogen 7 installed. Obviously it’s a case of buggy hardware.
So in short, I don’t put a lot of faith in the “Android crashes less” claims. The true issue is that there is such a small pool of devices that run iOS and a comparatively huge selection of Android hardware out there. A few bad eggs truly can ruin a company’s reputation.
But again, as the article notes, how many times have you had the app just close without your conscious decision to exit it? Because that makes it seem like you never actually had a crash…
If you’re referring to the iPod touch, I said I rarely had apps crash, not “never”. As the article says, when an iOS app crashes it’s silent and just takes you back to the Springboard. That happened to me perhaps five times in the entire time I owned the iPod, and from what I recall, never in a native app.
I’ve found just the opposite. My Samsung Galaxy S crashes all the time; apps hang, the home screen hangs. It’s really awful, actually – so awful that I bought an iPhone 4S, which, aside from having Location Services get stuck in an area of poor reception, has given me no problems and has yet to crash on me!
My experience is the same. I’ve never had an iPhone or iPad because they’re too expensive, but I’ve had so many problems on Android (Samsung Galaxy S and, before it, an HTC Hero running Android 1.5 and then 2.1) with apps hanging and crashing. Until last week, Facebook was the worst offender: it hung all the time, especially when you opened any timeline, when it would stall for several seconds – or until you switched out of the app and back in again – before it would let you scroll, and the notification list had the same problem. The Mail app isn’t much better – it often hangs when opening messages. I would probably get an iPhone if I could afford one. I really wish MeeGo had done better – it looked like a much better open-source phone OS than Android, and it used C++, not Java.
Another day, more iOS FUD from Thom
If this is indeed a real problem, the solution is already here: ARC.
I’ve not observed either Android or iOS to be particularly bad with crashing apps… which, given the fact that iOS programmers have to deal with both pointers and no garbage collection, is quite a feat.
I’m not so sure that the amount of crashes is down to the OS or platform in this case; I think it has more to do with the developers who make the apps. Heck, it seems that people in the iOS camp especially are a bunch of unprofessional pains in the — who keep releasing a new version of their apps every day or two. Why can’t they roadmap them and release them when they have something worth releasing? Probably related: they probably didn’t test their apps properly either. This to me is the downfall of iOS’s automatic updating – having to update 600 apps (of the 2000 I have downloaded) each night, of which only a few may have added something worthwhile.
One runs apps on a custom JVM (Android – Dalvik); the other runs apps natively with an Objective-C runtime. The benefit of iOS is speed of binary execution. The downside is hard crashes. It’s way more difficult (impossible, if not using the NDK) for Android developers to dereference a null pointer. If they do something stupid like divide by 0, they probably have it wrapped in a try/catch block and handle the error – thereby preventing a hard crash.
On my iPad and phone the crashes I get are usually from memory intensive apps that are competing for resources from other apps I had in the background. Things like video editing, drawing apps, and video games. I never have crashes in the core system software or the preinstalled apps. The good thing is that it tends to be just the app that drops away, and it never seems to affect the rest of the system.
Crashes are so rare for me in the desktop OS X world. I am sure having a sea of system resources available hides memory leaks that stand out quickly in the tight memory space of a mobile device.
Let me guess: most crashes are from Python/Ruby/Java developers turned wannabe-rich iOS developers who have never coded with pointers in their lives.
Or had to care about manual memory management.
My iPhone 3GS started to have more crash problems with iOS 4.x. It was much more stable – it never crashed – with 3.x.
But still, crashes are not frequent and I’m happy with it. My old HTC G1 (Android 1.6) crashed every single day. And I see my friends’ Android devices crashing all the time, and with simple (even native) apps.
There are some new (expensive) android phones that can’t even show photos without lags. That’s just humiliating.
Long live iOS!
Most of the crashes I have seen on iOS come from callbacks out of third-party libraries and, on rare occasions, the original libs. Say I make a threaded app that fetches data and responds with a callback – that’s all nice, until the object that requested the data gets freed because it’s no longer in use by the main application. When you fetch data, what can happen does happen: you never know when you will get it, or if you will get it at all.
The problem then is telling the fetcher that the object that requested the data no longer exists and can’t be called without breaking the application. Many libraries I have tried actually have functions to tell them “I’m being removed, do not respond to me”, but they often work badly, since the work is done in separate threads, so you can still get timing issues.
Maybe I’m stupid and doing stuff wrong, but I have yet to find a good way to see if a pointer to an object is still valid. And many times you are working against a deadline and using loads of third-party components, and hunting down bugs in those is really time-consuming.
Another problem with iOS apps is that prices are so low that, in order not to lose money, people rush development and cut corners to keep it from becoming too expensive. I believe this is tossing money in the bin, since updates to such software will be too expensive.
Android is not a standard Linux kernel.
http://elinux.org/Android_Kernel_Features
Ashmem is part of the secret of why bad coders on Android get away with being bad coders: if an ashmem block is not currently pinned, the kernel can dispose of it when the Linux kernel runs out of memory.
Next is the Android low-memory killer, which is more likely to take out something in the background than in the foreground. iOS will kill pretty much anything, and something killed due to running out of resources also shows up as a crash.
Dalvik also does better garbage collection.
There are design differences that explain what is being seen.
Well, in my personal camp, where I use my own phone, crashes have happened in the past from bad apps, doing bad things.
Now, “crashes” is a bad term, because it is imprecise.
Crashes that happen that require the phone to be rebooted are not acceptable.
An App crash is annoying, but acceptable.
I have friends who I see rebooting their iPhones because the entire phone becomes unresponsive, which I must admit very rarely happens on my HTC EVO. Rarely, though, do they complain about app crashes. But it seems to me iOS requires rebooting the entire iPhone way too often.
Now, if you give the consumer the ability to contribute changes or make corrections to your phone’s software, obviously that is going to speed corrections, and increase reliability.
Android does this. Apple assumes the customer doesn’t need that ability, which means technically you should see more crashes, and slower correction times for issues because the customer can’t do it themselves, they have to wait for Apple.
I can also see Android being a platform which yields more reliable uptime for the phone, due to its apps’ use of Java.
Which as people have pointed out is a very effective sandbox, able to limit memory and system resources that prevent the app from whacking the phone.
Since this is a virtual machine environment, those sorts of things are quite easy to do.
I am not familiar with iOS enough to know whether or not they have a similar ability.
-Hack
My iPod touch, which is now roughly 3 years old, frequently crashes apps. It has come on gradually and relates to the degrading of the storage with age and use.
The music app, which has always been on and changes little, seems to work reliably. Some of the newspaper apps that reload their text and images regularly fail within 15 minutes or so of use.
Solid state storage does not have the durable reliability we imagine. You get used to trusting it and then it fails.
I’m probably out of step – most normal people replace their handheld devices every 18 months or so.
The Java/ C++ debate is only part of the debate … not all of it.
Far from it – for one, in the case of the most typical handheld electronic thingy nowadays, the mobile phone (which is what really brought DAPs and digital cameras to humanity), most of the 5+ billion mobile subscribers own their devices and are on prepaid, not on contract. The average usage time is, IIRC, at least twice as long as you imagine.
(Also, flash storage typically accounts, I believe, for its slow degradation. I haven’t experienced the problems you describe; maybe yours lie elsewhere.)
As a recently graduated software developer – who did my internship at a company that required an Android application, then found a job at a company that required me to develop an iOS application – I can give a small insight into this from a developer’s perspective.
The reason my internship was successful comes down to one thing, and one thing only: Android uses a version of Java, and I had enough experience with that language from my school courses.
My software development education started with ANSI C, then C++, Java, C#; along the way we touched some other languages like Python and PHP. This transition from lower-level languages to higher-level languages was mainly due to changes in the marketplace.
It also meant that a lot of lower-level skills became less and less relevant, mainly the use of pointers and including assembler in your application. Garbage collection also became a process you did not have to keep track of yourself.
Higher-level languages do all this for you, which means programming has a lower barrier to entry. It does, however, make programming more like building with Lego than like carpentry.
Some of my fellow students don’t have a clue what individual systems do, but can hack something together that on first inspection looks somewhat like what the client specified.
iOS, on the other hand, uses a dialect of C, Objective-C, that at first glance was completely unreadable to me. I had a hard time following what happened and what was done. Method names that interleave their multiple arguments were a big WTF for me.
Now, the main point here is that I did not have a large investment in this language, and my development time went grossly over budget. Couple that with having to develop on a system I had never used before (Mac, instead of Windows or Linux) and an incomplete development toolkit – only recently (in the last few months) has the debugger in Xcode become useful; no more setting breakpoints and not stopping, etc.
Another thing my fellow developers struggled with was the use of pointers and garbage collection. As far as I have seen, in iOS you have to do it yourself. This can of course lead to highly efficient applications, but in the hands of novices who can only do Lego, it pretty much means applications will break at some point, because the developers have no clue what is happening.
And you can be certain that hardly anyone making apps for iOS had any prior investment in Objective-C. Everyone just jumped on the bandwagon.
P.S. I own neither device, no interest too, I have an old Nokia C1209 and I’m keeping it till it breaks!
Yeah, sometimes I wonder what people learn at universities nowadays.
Back in my day, every student had several lectures on computer design, assembly (MIPS and x86 in my case) and C before moving on to higher-level languages.
C was the language required for about 70% of the university projects.
So it always amazes me that today’s kids don’t know anything about low-level programming.
Can one really compare Android OS to Apple OS when it comes to app stability?
Each app is at a different stage of development within its market. Also, each device is in a different state of condition while it runs an app, with different daemons running (depending on the make and carrier of the device).
To get the absolute best comparison, one would need to have various Android Devices and iOS devices tested in a controlled environment.
Another thing to consider: how are the devices used? The more bells and whistles in play, the greater the chance of failure. Android incorporates more user control and customisability.
One could say that iOS apps are more stable simply because of the restrictions Apple imposes on its market. Fewer bells, fewer whistles… less of a product for comparison.