Linked by Thom Holwerda on Sat 1st May 2010 18:52 UTC, submitted by kragil
PDAs, Cellphones, Wireless "Android is fairly unique in the ways it allows multiple applications to run at the same time. Developers coming from a different platform may find the way it operates surprising. Understanding its behavior is important for designing applications that will work well and integrate seamlessly with the rest of the Android platform. This article covers the reasons for Android's multitasking design, its impact on how applications work, and how you can best take advantage of Android's unique features."
Thread beginning with comment 422035
paradigm
by stooovie on Sat 1st May 2010 19:58 UTC
stooovie
Member since:
2006-01-25

Nice and very informative article. I actually believe the need for multitasking as we know it now should be completely eradicated. When you think about it, opening applications is just a crutch designed to obfuscate the memory limitations of today's PC architecture. Apps are really just an abstraction, interfaces between user and data. Ideally, the user shouldn't even worry about opening an app to be able to work with data. We can get there in the future when there is no difference between short term (RAM) and long term (HDD) memory.

I think that's where Apple (and others, of course) is headed with state-saving on the iPhone and the Dock in OS X, which further blurs the indication of which applications are running with each release.

Reply Score: 4

RE: paradigm
by pel! on Sat 1st May 2010 20:29 in reply to "paradigm"
pel! Member since:
2005-07-07

That's just wrong.

Multitasking has very little to do with memory (although memory management may be required) and everything to do with concurrency of processes - i.e. faking concurrency by interrupting program execution every now and then to let another process use the CPU(s) for a while.

And yes, it involves context switching, but that is not necessarily the same thing as memory management.
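The time-slicing idea can be sketched with a toy round-robin scheduler (Python generators standing in for processes; a real kernel preempts via timer interrupts rather than cooperative yields, but the interleaving is the same):

```python
# Toy round-robin scheduler: each "process" is a generator that yields
# control back to the scheduler, simulating a timer interrupt.
def process(name, steps):
    for i in range(steps):
        yield f"{name}:{i}"  # run one time slice, then get preempted

def round_robin(procs):
    """Interleave the processes, one time slice at a time."""
    trace = []
    while procs:
        proc = procs.pop(0)          # next process in the run queue
        try:
            trace.append(next(proc)) # let it run one slice
            procs.append(proc)       # then send it to the back of the queue
        except StopIteration:
            pass                     # process finished; drop it
    return trace

trace = round_robin([process("A", 2), process("B", 2)])
# trace == ["A:0", "B:0", "A:1", "B:1"] - each process makes progress,
# so both appear to run "at the same time" on a single CPU
```

Note that no memory management appears anywhere in this sketch, which is exactly pel!'s point: the scheduling and the memory handling are separate concerns.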

Reply Parent Score: 1

RE[2]: paradigm
by oreissig on Sat 1st May 2010 21:34 in reply to "RE: paradigm"
oreissig Member since:
2010-05-01

Multitasking has very little to do with memory (albeit memory management might be required)

That's exactly the point: mobile devices are low on RAM and do not have storage suitable for swapping/paging the way larger computers do. So memory management IS key to serious multitasking on a phone.
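Android's answer (as the linked article describes) is to keep recently used apps resident in RAM and kill the least recently used ones when memory runs low. A toy sketch of that policy; the 100 MB budget and the app sizes are made-up numbers for illustration:

```python
# Toy low-memory policy in the spirit of Android's: processes stay
# resident until memory pressure forces the least recently used one out.
def admit(resident, new_app, budget_mb):
    """resident: list of (name, mb) pairs, most recently used last."""
    resident = list(resident) + [new_app]
    killed = []
    while sum(mb for _, mb in resident) > budget_mb:
        killed.append(resident.pop(0))  # evict the least recently used app
    return resident, killed

resident, killed = admit([("mail", 40), ("maps", 50)], ("game", 30), 100)
# "mail", the least recently used app, gets killed to make room;
# resident == [("maps", 50), ("game", 30)], killed == [("mail", 40)]
```

Which is why Android apps must be written to save their state before being killed, rather than assuming they will stay in memory.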

Reply Parent Score: 1

RE[2]: paradigm
by tyrione on Sat 1st May 2010 23:01 in reply to "RE: paradigm"
tyrione Member since:
2005-11-21

That's just wrong.

Multitasking has very little to do with memory (although memory management may be required) and everything to do with concurrency of processes - i.e. faking concurrency by interrupting program execution every now and then to let another process use the CPU(s) for a while.

And yes, it involves context switching, but that is not necessarily the same thing as memory management.


It seems rather odd to me that assigning an order of precedence to each process is now being described as "faking" concurrency.

Concurrency of threads within a process, now that's an even finer-grained approach.

Reply Parent Score: 2

RE: paradigm
by oreissig on Sat 1st May 2010 21:32 in reply to "paradigm"
oreissig Member since:
2010-05-01

This concept of unified memory for both RAM and HDD is not new; in fact, it has been practiced for years in IBM's minis (the System i's single-level storage): http://en.wikipedia.org/wiki/IBM_System_i#Features

Reply Parent Score: 1

RE: paradigm
by Neolander on Sun 2nd May 2010 06:44 in reply to "paradigm"
Neolander Member since:
2010-03-08

Nice and very informative article. I actually believe the need for multitasking as we know it now should be completely eradicated. When you think about it, opening applications is just a crutch designed to obfuscate the memory limitations of today's PC architecture.

What is the link between multitasking (having several apps running seemingly at the same time, crucial for any modern computing) and opening apps (which means, I suppose, loading them into main memory)?

And why would opening apps be bad? Do you want to keep everything running in the background permanently? That would require an insane amount of memory on a computer where, say, the whole Adobe Creative Suite plus Windows Vista plus Cubase are installed. And if you use swap space, you are effectively back to loading the app from the HDD (and hence will experience lag and everything).

Do you think that main memory will someday be large enough that we won't need slow but large storage to hold all the apps on an average computer? I don't see that happening at the moment, because at least until now, every time RAM increased, software got more bloated in order to fill it, in an amoeba-like fashion... But who knows, those MRAM and memristor things could change the computing world as we know it...

Apps are really just an abstraction, interfaces between user and data.

No. It's this illusion of a user-data interface, and the unified "data" model that comes with it, that is the abstraction. Apps (processes) are needed at the core of an OS in order to separate the various pieces of work. That separation allows countless Good Things, including better security, bug containment, allocation of hardware to one task at a time, and so on...

Ideally, user shouldn't even worry about opening an app to be able to work with data.

Yes, maybe the unified data model should be pushed even further. But it only works if you assume that all applications exist to work with user data, a model which has some limitations when you have modern computing in mind: what about games? What kind of user data do they operate on? Same for web browsers?

But whether it catches on or not, I think that apps (processes) will continue to exist at the core of an OS as a developer abstraction for a long time. It's a rather simple abstraction, and it works damn well. It's like threads: everyone seems to hate them, but I have yet to encounter an Occam-style replacement...

We can get there in the future when there is no difference between short term (RAM) and long term (HDD) memory.

Well, consider computing a long time ago. HDDs were 20 MB and that seemed huge. If we still built applications the way we did back then, the 2 GB of RAM in most modern computers would be enormous, and would let you keep everything in RAM and store only saved data in non-volatile memory.

Sadly, it didn't happen. Instead, every time RAM grew bigger, software got bloated enough to fill it, in an amoeba-like fashion. DOS fit in a few dozen KB of memory; now Windows XP takes hundreds of MB and its successors even more, even though the purpose of an operating system (helping the user safely work with data and apps, and helping developers write those apps) hasn't changed that much. And don't get me started on Steinberg's and Adobe's software compared to their freeware counterparts (where any exist).

Even if something as fast as RAM and as large and non-volatile as an HDD/SSD appeared someday, it would stay expensive and unreliable for ages. In the meantime, RAM would have grown larger and faster, software would have grown ten times bigger, and the old organization would take hold again. At least that's how I see it.

I think that's where Apple (and others, of course) is headed with state-saving on the iPhone and the Dock in OS X, which further blurs the indication of which applications are running with each release.

Saving state cannot be applied to everything, mostly because...
-> Storage memory is incredibly slow. If you have 10 GB of apps in memory plus swap, saving it all when the machine turns off and loading it back when it turns on will take several minutes, which is unacceptable.
-> Endlessly saving state is not good for cleanliness. Of course, in an ideal environment where bugs and memory leaks didn't exist, it would be fine. But the human species is not ready for bug-free coding. If there's a memory leak somewhere, the OS's memory usage will gradually grow until the software fills the entire RAM and swapping kicks in, causing a major performance loss. Sometimes it's good to close everything, turn off the computer, and go back to a "clean" state, instead of always keeping some mess around in main memory...
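The first point is simple arithmetic. Assuming a 100 MB/s sequential write speed (a generous figure for 2010-era laptop drives), dumping 10 GB of state to disk takes well over a minute each way:

```python
# Back-of-envelope: time to write 10 GB of application state to disk.
# The 100 MB/s figure is an assumption, on the optimistic side for
# consumer hard drives of the day.
state_gb = 10
write_mb_per_s = 100
seconds = (state_gb * 1024) / write_mb_per_s
print(f"{seconds:.0f} s")  # about 102 seconds, and the same again to load
```

And that's the best case of one sequential pass; scattered swap pages make it far worse.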

Reply Parent Score: 1

RE[2]: paradigm
by Neolander on Sun 2nd May 2010 10:16 in reply to "RE: paradigm"
Neolander Member since:
2010-03-08

-storage memory +non-volatile memory
I must have fallen asleep for a second...

Reply Parent Score: 1

RE[2]: paradigm
by stooovie on Sun 2nd May 2010 15:01 in reply to "RE: paradigm"
stooovie Member since:
2006-01-25

I know all that, Neolander ;) I was talking about user interaction paradigms as a whole.

Reply Parent Score: 2