Linked by Thom Holwerda on Fri 21st May 2010 12:45 UTC, submitted by martini
OS/2 and eComStation: After god knows how many years, it's finally here: the final release of eComStation 2.0. We first reported on eComStation 2.0 back in December 2005, when the first beta was released, and between then and now we've seen countless betas and release candidates come and go, but the wait is finally over.
RE: Hmm
by warpcafe on Fri 21st May 2010 15:48 UTC in reply to "Hmm"

Hi,
you are right. There is a lot it can't do, even at its high price. The thing is, people still think eCS WILL or MUST compete with Windows [whatever], Linux flavours or the Mac... but that's simply not true. Besides, nobody forces you to buy it.

While its UI will never be as pretty as Windows' or the Mac's, and its core will never be as "good" or "open" as Linux's, it's being "the thing in between" that I like. Windows hides its ugly internals behind polished interfaces, and Linux will (forever) do exactly what you want it to do only if you quit its GUI and type a 300+ character command line by heart in a shell.

True, the kernel is 32-bit "only" and will perhaps never grow beyond that unless rewritten from scratch (good luck with that...), and at some point hardware will cease to support the 32-bit architecture. Who cares? I'll ride the pony till it dies. When that day comes, I can still decide which OS to go with... Haiku? Why not.

Until then, I gladly pay 149 bucks for an OS that has no exposure to any virus threat and (especially important in Germany) no way of incorporating government backdoor spyware. I can surf the web, do email and use word processing and spreadsheets. That's it. For entertainment, I have a TV, a Wii, a smartphone and -yes- a Windblows machine here (which runs Ubuntu in a VM in case I need to bring some work home...)

Cheers,
Thomas

Score: 3

RE[2]: Hmm
by nt_jerkface on Fri 21st May 2010 16:26 in reply to "RE: Hmm"

OS/2 is dead.

The only place this OS fits is in business terminals and memories.

You're better off paying a psychiatrist $150 so he can help you with your denial issues.

Score: 2

RE[3]: Hmm
by renhoek on Sat 22nd May 2010 21:25 in reply to "RE[2]: Hmm"

"You're better off paying a psychiatrist $150 so he can help you with your denial issues."


No, I'm not.

Score: 2

RE[2]: Hmm
by Kebabbert on Fri 21st May 2010 16:30 in reply to "RE: Hmm"

This high price is for those companies that have an OS/2 solution and must upgrade. They are the only ones willing to pay. OS/2 will never get any new users, and they know it. They are not after new users; it is better to milk the old cow as much as possible. If they really wanted to increase the user base, they would set a much more reasonable, lower price. But they know they cannot compete with the open source OSes.

A 32-bit kernel? That is not... a good choice for an OS, too limited. Even today people cannot use 32-bit Windows to its full extent; more and more need the power and memory freedom of 64-bit. If you really need stability, performance and no practical limitations, choose a real Unix, not Linux.
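To make "too limited" concrete, here's a rough C sketch of my own (nothing OS/2-specific; the exact ceiling depends on the OS's user/kernel address-space split and overcommit policy):

    /* Keep allocating 64 MiB chunks until malloc() fails. Built with
       "gcc -m32" this stops near 2-3 GiB: the 32-bit address space,
       not the installed RAM, is the limit. A 64-bit build sails past
       that, so we cap the run at 16 GiB ourselves. Leaks on purpose. */
    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        const size_t chunk = 64u * 1024 * 1024;      /* 64 MiB per call */
        unsigned long long total = 0;                /* 64-bit counter either way */
        while (malloc(chunk) != NULL) {
            total += chunk;
            if (total >= 16ULL * 1024 * 1024 * 1024) /* stop a 64-bit build */
                break;
        }
        printf("got about %llu MiB of address space\n", total / (1024 * 1024));
        return 0;
    }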

BTW, it is funny that OS/2 got its GUI tech from the Amiga, whereas the Amiga got the REXX scripting language in return, as ARexx.

Score: 2

RE[3]: Hmm
by vodoomoth on Fri 21st May 2010 21:08 in reply to "RE[2]: Hmm"


"A 32-bit kernel? That is not... a good choice for an OS, too limited. Even today people cannot use 32-bit Windows to its full extent; more and more need the power and memory freedom of 64-bit. If you really need stability, performance and no practical limitations, choose a real Unix, not Linux."


That's what I was saying in my previous comment: we've just settled for what's been given to us as some sort of OS gospel dyed with consumerism. Does anyone recall what the standard laptop RAM size was in 2000? I'm really interested in knowing. I have memories of my father using a word processor on an Amstrad CPC6128 in the late 80s. In 2000, I was using Office 97 and Windows 95 on a 133 MHz Pentium Olivetti laptop with god knows how much RAM. What is it that's so crucial that Word 2007 can do today that I couldn't do in 2000 with Word 97?

Can you think of a single relatively common application (or task) that:
- exists today
- didn't exist in 2000
- requires that amount of RAM?

I can't. Please, don't reply "games".

I've never filled the 2GB in my laptop in 2 years. So memory freedom is something I had never thought about.

The questions that arose from that comment are:
- why can't OS makers make smart OSes? Remember that 32-bit versions of Windows up to Vista were limited to about 3GB of usable RAM? 32-bit Mac OS X 10.5 managed more than that, so obviously there was a problem with Windows. My RAID controller and the Intel Turbo Memory are mutually exclusive; there's no explanation anywhere and I just have to deal with it.
- what's the proportion of 32-bit XP, Vista and Windows 7 installs compared to their 64-bit counterparts? Is it so unbalanced in favor of 64-bit that the viability of 32-bit in the coming 5 years is questionable?
- is RAM the reason for a 64-bit architecture? I thought the benefit was twofold: speed of data transfers to memory and between registers, and width of computations (a quick sketch below).
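To illustrate the "width of computations" part, a tiny C sketch of my own (nothing Windows-specific): a product that overflows 32 bits comes out exact in 64 bits, and on a 64-bit CPU that wide multiply is one native instruction, while a 32-bit CPU has to synthesize it from several narrow ones.

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        uint32_t a = 100000, b = 100000;    /* true product: 10^10 */
        uint32_t narrow = a * b;            /* wraps modulo 2^32 */
        uint64_t wide = (uint64_t)a * b;    /* exact in 64 bits */
        printf("32-bit: %u\n", (unsigned)narrow);           /* 1410065408 */
        printf("64-bit: %llu\n", (unsigned long long)wide); /* 10000000000 */
        return 0;
    }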


"... and at some point hardware will cease to support the 32-bit architecture."

There's no way this will ever happen. As far as I know, all x86 CPUs still support using 8-bit registers in instructions and memory I/O, and the same goes for 16-bit. Unless OS and CPU makers agree to drop the sacrosanct backwards-compatibility paradigm, thereby forcing it upon compiler and IDE vendors, unless there's a radical shift and "legacy" becomes a banned word, this will not happen.
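You can even see that compatibility from plain C, since the fixed-width types below map onto the 8-, 16- and 32-bit register views (AL, AX, EAX) that x86-64 still carries. A trivial sketch of my own:

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* Each addition happens at its declared width; the narrow ones
           wrap exactly as on the older CPUs, because the instruction set
           still defines native 8-, 16- and 32-bit operations. */
        uint8_t  b = 250;          b += 10;  /* wraps mod 2^8  -> 4 */
        uint16_t w = 65530;        w += 10;  /* wraps mod 2^16 -> 4 */
        uint32_t d = 4294967290u;  d += 10;  /* wraps mod 2^32 -> 4 */
        uint64_t q = 4294967290u;  q += 10;  /* no wrap -> 4294967300 */
        printf("%u %u %u %llu\n", (unsigned)b, (unsigned)w,
               (unsigned)d, (unsigned long long)q);
        return 0;
    }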

Furthermore, there's technically no justification for not supporting 32-bit on a 64-bit architecture. Computation-wise, you just ignore the higher bits; same when reading from memory: read 64, use 32. Writing to memory would require reading 64 bits (or, in the worst case, 128 bits, with alignment considerations) first, changing only 32 and writing everything back. That's what happens when overwriting 1 byte in a binary file: the whole sector (or cluster) is read before being written back.
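Here is that sector analogy as a rough C sketch (my own illustration; "example.bin" is just a placeholder path): to change one byte you read the whole aligned block, patch it, and write the whole block back.

    #include <stdio.h>

    #define SECTOR 512

    /* Overwrite one byte at absolute offset 'pos' in 'path', touching
       the file only in whole 512-byte sectors. Error handling trimmed. */
    int patch_byte(const char *path, long pos, unsigned char value) {
        unsigned char block[SECTOR];
        long start = (pos / SECTOR) * SECTOR;    /* align down to a sector */
        FILE *f = fopen(path, "r+b");
        if (!f) return -1;
        fseek(f, start, SEEK_SET);
        size_t got = fread(block, 1, SECTOR, f); /* read the whole sector...  */
        if (pos - start >= (long)got) { fclose(f); return -1; }
        block[pos - start] = value;              /* ...modify a single byte... */
        fseek(f, start, SEEK_SET);               /* reposition between read and write */
        fwrite(block, 1, got, f);                /* ...write the sector back */
        fclose(f);
        return 0;
    }

    int main(void) {
        return patch_byte("example.bin", 1000, 0xFF) ? 1 : 0;
    }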

Of course, software makers will get either lazy or greedy, come up with an excuse that 32-bit support is slowing things down (as if speed ever worried them), and we'll have to move on to something supposedly faster, only to realize we should have stayed where we were (my personal experience when I moved from 10.5.8 to Snow Leopard).

Score: 5