“I decided to do some simple tests to see how important RAM is when running applications under Rosetta. If you rely heavily on applications that are not yet universal make sure you load that new Intel Mac up with 1.5 – 3.77 times the RAM in your current PowerPC machine.”
The applications are recompiled dynamically to x86 code, so both the old PPC code and the newly generated x86 code have to be resident in memory for the program to function. And depending on how the program laid out its memory, emulating that layout in translated x86 code may require more memory still.
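That dual residency can be sketched as a toy translation cache: the original PPC image stays mapped while translated x86 blocks accumulate alongside it. All names here are illustrative; Rosetta's actual internals are not public, and the 2x block growth is just a stand-in for translation bloat.

```python
# Toy sketch of a dynamic-translation cache. The original PPC code
# stays resident while translated x86 blocks pile up next to it.
# Names and sizes are hypothetical, not Rosetta's real mechanism.

class TranslationCache:
    def __init__(self, ppc_image: bytes):
        self.ppc_image = ppc_image   # original code must stay in memory
        self.translated = {}         # PPC address -> translated x86 bytes

    def translate_block(self, addr: int, length: int) -> bytes:
        """Translate one basic block on first execution, then reuse it."""
        if addr not in self.translated:
            ppc_block = self.ppc_image[addr:addr + length]
            # Stand-in for translation: output is often larger per block.
            self.translated[addr] = ppc_block * 2
        return self.translated[addr]

    def footprint(self) -> int:
        # Both copies count toward the process's memory use.
        return len(self.ppc_image) + sum(
            len(b) for b in self.translated.values())


cache = TranslationCache(bytes(1024))
cache.translate_block(0, 256)
print(cache.footprint())  # 1024 original + 512 translated = 1536
```

The point of the sketch is just that every block you execute adds to the footprint without letting you discard the original, which is one plausible source of the extra RAM requirement.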
Is that PPC (now legacy?) code using 64bit space as well? Could that account for the extra? i.e. 64bit + 32bit + chunking = extra RAM requirements.
Is that PPC code using 64bit space as well?
No, Rosetta emulates a G4, which doesn’t support PPC64.
(now legacy?)
Not yet. Apple still sells PPC, and at least the G5 boxes are still pretty competitive.
Who said it was surprising that running a program in Rosetta will use more memory? It certainly wasn’t that little blog entry, and as luck would have it you’re the first person to comment on this article so it can’t be anyone here.
Since you don’t require testing to know how much more memory will be used to achieve a reasonable performance profile for these applications, why don’t you tell the rest of us, who might otherwise be forced to determine such information empirically? It would save everyone a lot of time and spare the world lots of awkwardly obtained data points.
That is, unless you don’t actually know how much more memory is used.
It’s also interesting to note that it isn’t strictly necessary to keep all of the code in memory. But that takes a back seat to people complaining about the very act of presenting information on a subject they feel is “obvious.” There’s already more than one knee-jerk response here alone. It’s absolutely bizarre that it’s this, and not his methodology, that gets criticized.
The reviewer ran some applications in Rosetta on a Mac with 1GB. The largest single application took up 90MB, the total size of all apps running at the same time was 314MB, and he has therefore concluded that he should upgrade his RAM to somewhere between 1.5GB and 3.77GB.
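The mismatch is easy to check with the numbers as reported: even tripling the combined footprint of the Rosetta apps stays comfortably under the 1GB already installed.

```python
# Figures as reported in the thread above.
total_rosetta_apps_mb = 314   # all Rosetta apps running at once
largest_app_mb = 90           # largest single application
installed_ram_mb = 1024       # the reviewer's 1GB machine

# Even a 3x overhead on the measured total fits in existing RAM,
# so the measurements don't obviously support a 1.5-3.77GB upgrade.
print(total_rosetta_apps_mb * 3)             # 942
print(total_rosetta_apps_mb * 3 < installed_ram_mb)  # True
```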
Genius.
You mean more RAM MAY make OS X run faster?! I wish someone had mentioned this before!!
I wish I could mod this more than +1.
There is a rather negative tone in the abruptness of this submission, but where exactly is the revelation? Is it a gigantic shock that extensive reliance on emulating another complex platform is going to incur a gargantuan overhead? If anything, it is a point in Rosetta’s favour that only “1.5 – 3.77 times the RAM” will be needed for acceptable performance. The value of this story lies purely in its “shock” sensationalism. The idea presented is so obvious to any reader with a clue that it will serve no one besides the angry Apple elitist whose feelings towards Apple’s Intel switch are ones of betrayal and spite, desperately looking for some stick to beat Apple over the head with.
When the time comes, the major applications listed in this article will have native ports made available. Rosetta isn’t going to be what you run an Office suite with; it’s going to be what you use to run that obscure application that will be forever left without an Intel-native port.
Would you expect VMware to run flawlessly with a sub-standard or even average amount of memory? Admittedly, VMware operates on a somewhat larger scale, but not by much.
Yeah; this article’s writer sure is a regular Dijkstra.
…anyone who is a Mac Pro user is just going to stick with a PPC Quad for the next few years until all this mess gets straightened out.
Obviously the next upgrade cycle will require buying all new hardware and software, again.
Businesses can’t keep incurring these sorts of costs and painful changes.
Apple was right about an upcoming slowdown in Mac sales: after demand for the MacBook Pro is satisfied, expect a slumber until September, when Intel releases the 64-bit Core Duo, and even then don’t expect a whole lot of people to jump off their present hardware.
Of course unless Mac OS X runs all Windows software without Windows.
Then watch as the cash will pour into Apple from heaven.
Of course Microsoft won’t allow this to occur with Vista. More likely they’ll release a Windows version for MacTels.
Of course putting Windows on Mac is like letting a retard drive your Ferrari.
(no offense meant to the win loving crowd obviously)
“no offense meant to the win loving crowd obviously”
No, but who you offend grievously is the Acer loving crowd. You see, we also have Macs, just with Acer labels on them, and can’t see why running Windows on them is like doing anything with “retards” and “ferraris”.
You also offend a possibly different crowd who don’t quite know who you mean by “retards”.
And amuse another crowd, who find it charmingly bizarre that there are really circles in which Ferraris strike anyone as aspirational goods! Probably the same circles in which people doze off at night and dream of fine dining one day, off Lenox china?
“Of course putting Windows on Mac is like…”
But in this day and age only running Mac OS X on a Ferrari is illegal ;^)
PowerPC code is quite a bit bigger than x86 code, and the Rosetta-generated x86 code is probably somewhat bigger than directly compiled code as well.
But executable code normally only makes up a small part of an application’s memory footprint, therefore the code alone doesn’t explain those big extra memory requirements.
Did that guy determine the requirements correctly though? Perhaps some of that is temporary stuff generated during JIT translation that just hasn’t been reclaimed yet.
Or does Rosetta employ some memory-busting scheme to work around the endianness problem?
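For what it’s worth, a translator doesn’t have to duplicate data to bridge endianness: it can byte-swap at every guest load and store instead, trading CPU time for memory. The sketch below is purely speculative as far as Rosetta is concerned; it just shows the swap-on-access approach.

```python
import struct

# One way a translator can bridge endianness without keeping a second
# copy of memory: swap bytes at each guest load/store. Speculative
# illustration only; Rosetta's actual scheme isn't documented here.

memory = bytearray(8)  # host memory, as the translated process sees it

def store_be32(addr: int, value: int) -> None:
    """Guest (big-endian PPC) 32-bit store into host memory."""
    struct.pack_into(">I", memory, addr, value)

def load_be32(addr: int) -> int:
    """Guest 32-bit load: reinterpret the bytes big-endian again."""
    return struct.unpack_from(">I", memory, addr)[0]

store_be32(0, 0x12345678)
print(hex(load_be32(0)))   # 0x12345678: values round-trip correctly
print(memory[0] == 0x12)   # True: bytes sit in guest (big-endian) order
```

Under this scheme the extra cost shows up as per-access swap overhead rather than a bigger footprint, which would argue against a “memory-busting” endianness workaround being the culprit.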