Linked by Thom Holwerda on Fri 21st May 2010 12:45 UTC, submitted by martini
OS/2 and eComStation: After god knows how many years, it's finally here: the final release of eComStation 2.0. We first reported on eComStation 2.0 back in December 2005, when the first beta was released, and between then and now we've seen countless betas and release candidates come and go, but the wait is finally over.
RE[3]: Hmm
by vodoomoth on Fri 21st May 2010 21:08 UTC in reply to "RE[2]: Hmm"


"32bit kernel? That is not... a good choice for an OS, too limited. Even today, people can not use 32bit Windows to its full extent, more and more need the power and memory freedom from 64bit. If you really need stability and performance and no practical limitations, choose a real Unix, not Linux."


That's what I was saying in my previous comment: we've just settled for what's been handed to us as some sort of OS gospel dyed with consumerism. Does anyone recall what the standard laptop RAM size was in 2000? I'm really interested in knowing. I have memories of my father using a word processor on an Amstrad CPC6128 in the late 80s. In 2000, I was using Office 97 and Windows 95 on a 133 MHz Pentium Olivetti laptop with god knows how much RAM. What is it that's so crucial that Word 2007 can do today that I couldn't do in 2000 with Word 97?

Can you think of a single relatively common application (or task) that:
- exists today
- didn't exist in 2000
- requires that amount of RAM?

I can't. Please, don't reply "games".

I've never filled the 2 GB in my laptop in two years, so "memory freedom" is something I had never thought about.

The questions that arose from that comment are:
- why can't OS makers make smart OSes? Remember that 32-bit versions of Windows up to Vista were limited to about 3 GB of usable RAM? 32-bit Mac OS X 10.5 managed more than that, so obviously there was a problem with Windows. Likewise, my RAID controller and Intel Turbo Memory are mutually exclusive; there's no explanation anywhere and I just have to deal with it.
- what's the proportion of 32-bit XP, Vista and Windows 7 installations compared to their 64-bit counterparts? Is it so unbalanced in favor of 64-bit that the viability of 32-bit over the coming 5 years is questionable?
- is RAM really the reason for a 64-bit architecture? I thought the benefit was twofold: the speed of data transfers to memory and between registers, and the width of computations (a small sketch follows this list).
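As an aside on that last point, the headline practical argument for 64-bit is address space: the width of a pointer caps how much memory a single process can even name. The C sketch below is purely my own illustration (any C99 compiler should build it; nothing here comes from the article): it prints the pointer width of the build it runs as and the corresponding theoretical address-space ceiling, 4 GiB for a 32-bit build and an astronomically larger figure for a 64-bit one.

#include <stdio.h>
#include <stddef.h>

int main(void)
{
    size_t ptr_bits = sizeof(void *) * 8;

    /* Largest amount of memory a pointer of this width can address, in bytes. */
    long double max_bytes = 1.0L;
    for (size_t i = 0; i < ptr_bits; i++)
        max_bytes *= 2.0L;

    printf("pointer width: %zu bits\n", ptr_bits);
    printf("theoretical address space: %.0Lf bytes (%.1Lf GiB)\n",
           max_bytes, max_bytes / (1024.0L * 1024.0L * 1024.0L));
    return 0;
}

Whether a given OS actually lets a 32-bit machine use all of that space (or more, via tricks like PAE) is exactly the kind of choice the first question above is complaining about.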


"... and in some time, hardware will cease to support 32bit architecture."

There's no way this will ever happen. As far as I know, all x86 CPUs still support using 8-bit registers in instructions and memory I/O, and the same goes for 16-bit. Unless OS and CPU makers agree to drop the sacrosanct backwards-compatibility paradigm, forcing that choice on compiler and IDE vendors, unless there's a radical shift and "legacy" becomes a banned word, this will not happen.
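To illustrate what I mean by the legacy register views, here is a small sketch of my own (it assumes GCC or Clang inline assembly on an x86-64 machine, and is only an illustration): the same physical register is still addressable as AL (8-bit), AX (16-bit), EAX (32-bit) or RAX (64-bit), so the narrower ways of using it never went away.

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    uint64_t value = 0x1122334455667788ULL;
    uint8_t  low8;
    uint16_t low16;
    uint32_t low32;

    /* The "a" constraint puts 'value' in RAX; AL, AX and EAX are simply
       the 8-, 16- and 32-bit views of that same register. */
    __asm__("movb %%al, %0"  : "=r"(low8)  : "a"(value));
    __asm__("movw %%ax, %0"  : "=r"(low16) : "a"(value));
    __asm__("movl %%eax, %0" : "=r"(low32) : "a"(value));

    printf("AL  = 0x%02x\n", low8);
    printf("AX  = 0x%04x\n", low16);
    printf("EAX = 0x%08x\n", low32);
    return 0;
}

The assembly itself isn't the point; the point is that these narrower register names still exist and still work, decades after they were introduced.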

Furthermore, there's technically no justification for not supporting 32-bit on a 64-bit architecture. Computation-wise, you just ignore the higher bits; same thing when reading from memory: read 64, use 32. Writing to memory would require reading 64 bits (or, in the worst case with alignment considerations, 128 bits) first, changing only 32 of them, and writing all the bits back. That's what happens when overwriting 1 byte in a binary file: the whole sector (or cluster) is read before being written back.
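Here is roughly what that read-modify-write looks like, as a sketch under my own naming (store_low32 is an illustrative helper I made up, not a real API): to replace only the low 32 bits of an aligned 64-bit word, you read the whole word, keep the half that isn't changing, merge in the new value, and write the whole word back.

#include <stdio.h>
#include <stdint.h>

/* Replace only the low 32 bits of a 64-bit word: read all 64 bits,
   keep the high half, merge in the new low half, write all 64 bits back. */
static void store_low32(uint64_t *word, uint32_t new_low)
{
    uint64_t old    = *word;                        /* read 64 bits    */
    uint64_t merged = (old & 0xFFFFFFFF00000000ULL) /* keep high half  */
                    | (uint64_t)new_low;            /* new low half    */
    *word = merged;                                 /* write 64 bits   */
}

int main(void)
{
    uint64_t word = 0xDEADBEEFCAFEBABEULL;
    store_low32(&word, 0x12345678U);
    printf("0x%016llx\n", (unsigned long long)word); /* 0xdeadbeef12345678 */
    return 0;
}

This is the same shape of work as the sector example: touch a small part, but move the whole unit.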

Of course, software makers will get either lazy or greedy, come up with an excuse that 32-bit support is slowing things down (as if speed ever worried them), and we'll have to move on to something supposedly faster, only to realize that we should have stayed where we were (my personal experience when I moved from 10.5.8 to Snow Leopard).
