Linked by Jordan Spencer Cunningham on Wed 13th May 2009 01:18 UTC
Benchmarks Phoronix, known for its various speed tests and reviews, compared the latest Ubuntu release against what, until recently, was the latest Mac OS X, using 29 different benchmarks. Some of the results were rather interesting.
Order by: Score:
intel on linux isnt a great comparison
by ideasman42 on Wed 13th May 2009 01:29 UTC
ideasman42
Member since:
2007-07-20

Ok, so you can justify testing graphics on Linux with an Intel card because most people have Intel cards, but it's no secret that Intel drivers on Linux (whilst being open source and improving, blah blah) are nowhere near as good as Intel drivers on OS X or Windows.

Even so, it's fair to say that this benchmark shows exactly how poor Intel performance on Linux/Ubuntu is, so maybe this might do some good and kick the xorg Intel driver team into making better drivers.

The thing that gets my goat is that this benchmark, and the comments that go with it, suggest that Linux has poor graphics performance in general, which simply isn't true: animation studios and people doing serious 3D graphics work use Linux, mostly with NVidia graphics, since their support is way better than Intel's and AMD's.

Edited 2009-05-13 01:31 UTC

Reply Score: 8

averycfay Member since:
2005-08-29

They definitely should have used a system that would be decently supported in both OSes (NVidia graphics). Linux lost all the benchmarks involving graphics acceleration, which is unsurprising, I guess, considering the quality of the Intel graphics drivers.

Also, this article reminded me of why I never go to phoronix... page 1 of 34834, sigh.

Reply Score: 7

Sorted
by Gone fishing on Fri 15th May 2009 05:17 UTC in reply to "RE: intel on linux isnt a great comparison"
Gone fishing Member since:
2006-02-22

I open The Register and what do I see

Karmic Koala - the next Ubuntu - will deliver improved video performance, tapping a new video driver architecture from Intel


The thing about Ubuntu, and Linux in general, is that it evolves and improves so fast.

Reply Score: 2

Auzy Member since:
2008-01-20

That might be true, but I have yet to see any graphics card which runs faster on Ubuntu (even the NVidia ones, at least in the past, seemed to be quite consistently slower on Linux).

There are a few problems here:
1) Having a kernel that forces people to provide open source drivers. Whilst Linus tolerates closed-source drivers, it's a risk creating them. And companies aren't going to pour all their optimisations into open drivers from which any other company can simply steal those optimisations. Otherwise it's like handing money to the competition.

There is no good justification for this. If open source is that great, then such drivers will succeed regardless.


2) Ubuntu doesn't develop Linux; it just grabs a bunch of packages which other distros have worked on. Canonical seems to concentrate only on its own projects.

Whilst the foundation of Ubuntu is shaky, Canonical is off spreading its resources further, starting other projects like the netbook remix, which has a shaky foundation too because barely any drivers on either are complete. And I have seen NO evidence of Ubuntu trying to collaborate with other companies to determine their needs. Everything seems based on assumptions.


3) The community. I've learnt from the Ubuntu Brainstorm community that, frankly, the most vocal Linux users are idiots. That's the biggest problem. I've argued with Linux users who believed that time shouldn't be wasted on WYSIWYG editors because grandma should learn markup languages / CSS instead for her site. And I've argued with many users who were totally convinced that DEBs/RPMs are more secure than shell scripts. Ubuntu's vocal population, I think, has turned too much into politicians who care more about spreading OSS than about making the best software.

Compare Qt/Cocoa to GTK, for instance. GTK obviously gets dominated in general cases, yet plenty of people seem to be on a crusade against C++. It's ridiculous.

And because of the community, the end result is that Linux is still too risky to develop for.


I wouldn't blame the xorg Intel team for this. If the community gave up their holy war and started once again writing the best software they can, because they want to (not because of politics), you'd end up with an MIT-licensed kernel that was completely open in all ways, and software developed with users in mind.

Reply Score: 8

lemur2 Member since:
2007-02-17

Having a kernel that forces people to provide open source drivers. Whilst linus tolerates closed source drivers, its a risk creating them. And companies aren't going to pour all their optimisations into open drivers which any other company can just steal the optimisations. Otherwise its like handing over money to the competition.


This assumes that only companies can write optimised software, which is not a valid assumption at all. The one and only advantage that companies have in writing software is access to secret information held by ... companies.

Duh.

OK, so ATI have been good enough to release documentation recently, and even some code and a programming guide.

http://www.phoronix.com/scan.php?page=article&item=amd_r600_700_gui...
http://www.phoronix.com/scan.php?page=article&item=amd_r700_oss_3d&...
http://www.phoronix.com/scan.php?page=news_item&px=NzAxNg
http://www.phoronix.com/scan.php?page=news_item&px=NzE3Nw


Expect the decent, open-source 3D driver for ATI chips to follow within a month or so.

http://wiki.x.org/wiki/radeon
http://wiki.x.org/wiki/radeonhd%3Aexperimental_3D

Clearly not ready yet, but definitely on its way. Enjoy (when ready).

So ATI have released documentation (specifications) of their chips to open-source programmers, who are busily writing an open-source Linux driver for ATI chips. ATI chips will soon become the most powerful graphics chips available with a decent (non-binary-blob) 3D driver for Linux, which will no doubt be supported directly within the kernel, and hence the Linux-buying public will tend to buy ATI chips.

This is giving money away ... how exactly?

Edited 2009-05-13 02:56 UTC

Reply Score: 6

rockwell Member since:
2005-09-13

//3D driver for ATI chips to follow within a month or so. //

Sure. A month or so. Just wait, it's right around the corner.

Reply Score: 3

Tom K Member since:
2005-07-06

Take a look at the sheer AMOUNT of documentation released, and the complexity of such a graphics processor.

Now try to tell me that the open-source community can write a driver as good as ATI's Windows driver in a month's time.

Reply Score: 3

alias Member since:
2007-02-11

Expect the decent, open-source 3D driver for ATI chips to follow within a month or so.


Intel has had open specs and open drivers for far longer than ATI, the chip is simpler, AND they actually have paid developers working on the xorg driver. The Intel driver is probably the best-maintained driver in xorg right now.

The result?

The benchmark here clearly shows that the result is not what you would expect: the performance is awful. Comparing with my HP box at work, graphics performance is actually better in Vista than in Linux with the xorg-intel driver, even after tuning the driver's migration-heuristics parameter (which made a HUGE difference).
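For anyone wanting to try the tweak mentioned above: it is most likely the xf86-video-intel driver's MigrationHeuristic option (an EXA acceleration setting; exact availability depends on the driver version), set in the Device section of xorg.conf. A sketch, with the Identifier being whatever your config already uses:

```
Section "Device"
    Identifier "Intel Graphics"
    Driver     "intel"
    # EXA pixmap migration heuristic; "greedy" is the value most
    # often reported to improve 2D performance on 2009-era drivers.
    Option     "MigrationHeuristic" "greedy"
EndSection
```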

Compare this to the NVidia binary blob, which gives me the opposite result on another box. ATI is in an unusable state at the moment (I'm not taking sides with any graphics vendor; that's just a fact).

Reply Score: 3

uaxactun Member since:
2008-04-17

Ummm...no. Intel drivers were suffering from a *BUG*.

Reply Score: 1

averycfay Member since:
2005-08-29

This whole issue has nothing to do with having an open source kernel or gtk vs. cocoa or whatever.

It has everything to do with market share. Linux will get great desktop hardware support (3D graphics, wireless, ACPI) when it has decent desktop market share. Look at the server market right now: you don't have to hunt for a server that's supported in Linux; every server product is well supported. A manufacturer that put out a server product that wasn't supported in Linux would be laughed at. All of that is because Linux is installed on a good portion of servers.

Reply Score: 2

lemur2 Member since:
2007-02-17

This whole issue has nothing to do with having an open source kernel or gtk vs. cocoa or whatever.

It has everything to do with market share. Linux will get great desktop hardware support (3d graphics, wireless, acpi) when it has decent desktop market share. Look at the server market right now. You don't have to buy a server that's supported in linux. Every server product is well supported in linux. A manufacturer that put out a server product that wasn't supported in linux would be laughed at. All of that is because linux is installed on a good portion of servers.


There is a lot of misinformation being spread right now about Linux having only a small market share. They are actually talking ONLY about the desktop market.

If we are talking about the entire market in which the devices you mentioned (3D graphics, wireless, ACPI) are used, Linux has a very decent share of that market. Perhaps 20% or so.

http://blog.linuxtoday.com/blog/2009/04/windows-owns-96.html
http://blog.canonical.com/?p=151
http://itmanagement.earthweb.com/osrc/article.php/3818696/Linux-Des...

The ONLY viable reason why a device maker would refuse to support Linux would be if they had been paid not to.

Reply Score: 1

dagw Member since:
2005-07-06

The ONLY viable reason why a device maker would refuse to support Linux would be if they had been paid not to.

Yup, that has to be it. It's a big dark conspiracy.

The idea that a company might actually sit down, work out the number of extra sales they'll get by supporting Linux versus the cost of supporting Linux, and conclude that supporting Linux doesn't make financial sense is preposterous. It has to be Microsoft, I tell you; they are EVIL!!!!

Reply Score: 4

DavidSan Member since:
2008-11-18

"The ONLY viable reason why a device maker would refuse to support Linux would be if they had been paid not to.

Yup, that has do be it. It's a big dark conspiracy.

The idea that a company might actuallty sit down and work out the number of extra sales they'll get by supporting linux vs. the cost of supporting linux and come to the conclusion that supporting linux doesn't make finanical sense is preposterous. It has to be Microsoft I tell you, they are EVIL!!!!
"

HA HA HA! I cannot believe how people think these days.

Developing software is a very expensive task. And developing drivers is even more expensive.

Particularly, what Linux has achieved on the desktop with almost no commercial support is admirable, but thinking someone is paying Linux programmers not to develop is really funny.

Reply Score: 1

ssa2204 Member since:
2006-04-22

The ONLY viable reason why a device maker would refuse to support Linux would be if they had been paid not to.


Well, I must say you are at least consistent in your ability to show you have absolutely zero clue how businesses, corporations, economies, markets, or even the world works. But then again, things like facts and truth are really just an obstacle for blind fanboys like you.

Reply Score: 2

Rugxulo Member since:
2007-10-09

Who cares about market share? If something works, good for it. Use it, or not, who cares?

The ONLY viable reason why a device maker would refuse to support Linux would be if they had been paid not to.


Or if they're stubborn, lazy, or clueless. (Note that I don't mean to be insulting/condescending there, I consider myself all of those, heh.) It's not easy supporting everything.

Reply Score: 1

DavidSan Member since:
2008-11-18

This whole issue has nothing to do with having an open source kernel or gtk vs. cocoa or whatever.

It has everything to do with market share. Linux will get great desktop hardware support (3d graphics, wireless, acpi) when it has decent desktop market share.


The problem is, it seems it is never going to happen until Linux gets its act together. I have heard "this is the year of desktop Linux" for the last 10 years.

Why? My guess: too many options, too many opinions, too many liberties, too much configuration.

Have you ever tried to teach a "normal user" the difference between RGB and BGR sub-pixel rendering, with or without hinting, in Ubuntu's appearance settings? People just look at your face and ask: How do I make it look nice? How do I make it look like Mac or Windows?

Edited 2009-05-13 16:37 UTC

Reply Score: 2

SEJeff Member since:
2005-11-05

It's been a while since I seriously posted on OSnews, but this requires a response. In my professional opinion, delivered over the internet: you, sir, have no clue what you are talking about.

Hmmm, let's see, what does Ubuntu develop? How about notify-osd, the new Growl-like notification system, which is controversial but very pretty? How about usplash, the userspace bootsplash (soon to be replaced, however)? How about upstart, the new event-based init replacement, good enough for both Fedora AND Debian to adopt in their default installs? Granted, it is (IMO) crap, but how about the Python ORM Storm? Do you even know what an ORM is used for?

How about patches.ubuntu.com? Yeah they've written a good bit of code. How many patches you ask?
jeff@desktopmonster:~$ wget -qO - http://patches.ubuntu.com/PATCHES | wc -l
2632
jeff@desktopmonster:~$


Seriously, how can you say "Whilst the foundation of Ubuntu is shaky..." when it is based on Debian? Are you saying there is a lack of Debian developers or packages? Sure, a lot of them are not friendly to people like yourself, but they aren't ghosts. Frankly, you are talking out of your fourth point of contact. Call your proctologist and ask him to find your head, then take a shower and come back to play.

If you get the "deb/rpm" from a reputable source like oh your distribution's package repository it will be GPG signed. Yes my friend, that is more secure than a shell script.

Really, I shouldn't have bitten, as you are obviously trolling. The only argument I totally agree with you on is Qt/Cocoa vs. GTK: GTK is crap compared to either of those, but it is being worked on. Until next time, kids...

Reply Score: 5

fithisux Member since:
2006-01-22

I have to agree with you on Ubuntu grabbing packages. Moreover, at least for me, Intel performance on Linux is not that good. Why? On my AMD systems I see great improvements. The quality of packages has also degraded, not due to their development but due to packaging. A comparison with Gentoo would be fairer if Linux itself were the target; if GFX is the target, Ubuntu is fine.

Reply Score: 2

r_a_trip Member since:
2005-07-06

Yeah, in a perfect world everybody sings Kumbaya.

In the real world we really need the GPL for all those individuals who'd like to take for themselves and shove the rest in the dirt.

Reply Score: 3

John Blink Member since:
2005-10-11

I love hearing about movie studios using linux for their 3D work.

It would be great if OSNEWS could try contacting these studios and see if they would be willing to comment on it.

e.g. if they were previously using Mac or Windows on their client PCs, how did moving to GNU/Linux improve things?

What Distro?

Anyway.

Reply Score: 2

ideasman42 Member since:
2007-07-20

Not exactly Pixar, but for Big Buck Bunny at the Blender Institute we ran all 64-bit Linux workstations, with 4-8 GB of RAM and 2-, 4- and 8-core PCs.
Whilst we didn't benchmark the NVidia cards on Windows versus Linux, performance was generally good, and there was no way we were going to use 64-bit OS X or Windows for the short movie.

Time lapse of the studio
http://www.youtube.com/watch?v=6IcLxNVWBX4

Info about the PC's
http://www.bigbuckbunny.org/index.php/maqina-workstations-benchmark...

Reply Score: 6

Lennie Member since:
2007-09-22

I think you are looking for this presentation:

http://www.linuxmovies.org/2008/fosdem.tux.with.shades.2008.pdf

And some people say graphics are shit on Linux. ;-)

"And GIMP doesn't compare to Photoshop", well there is CinePaint ( http://en.wikipedia.org/wiki/CinePaint ).

It does things Photoshop can't, and "CinePaint originated as a rewrite of the GIMP 8-bit engine in 1998 and still superficially resembles GIMP". But it's a bit specific to its field.

Edited 2009-05-13 08:34 UTC

Reply Score: 1

ideasman42 Member since:
2007-07-20

One thing that was obvious from that presentation is that Linux was mostly being used for the renderfarms, which is nice but doesn't necessarily show how 3D graphics performs.

Even so, I have heard that a number of animation studios use Linux on the desktop too, though this wasn't made clear in the presentation.

Reply Score: 3

Hardware was the same
by Beresford on Wed 13th May 2009 01:33 UTC
Beresford
Member since:
2005-07-06

The hardware used was the same Mac Mini. At least, that's the way I interpret it.

Reply Score: 1

RE: Hardware was the same
by GatoLoko on Wed 13th May 2009 02:23 UTC in reply to "Hardware was the same"
GatoLoko Member since:
2005-11-13

From the article:
"Ubuntu was running on this system via Apple's BootCamp."

I don't know whether BootCamp has anything to do with these results or not, but the system hardware is clearly the same.

Reply Score: 2

RE[2]: Hardware was the same
by weildish on Wed 13th May 2009 02:36 UTC in reply to "RE: Hardware was the same"
weildish Member since:
2008-12-06

Oh. So it is. I have no idea how I missed that-- I read that paragraph maybe three or four times. Thanks for pointing it out!

Reply Score: 1

RE[2]: Hardware was the same
by kedwards on Wed 13th May 2009 03:15 UTC in reply to "RE: Hardware was the same"
kedwards Member since:
2009-04-25

From the article:
"Ubuntu was running on this system via Apple's BootCamp."

I don't know whether BootCamp has anything to do with these results or not, but the system hardware is clearly the same.


BootCamp is just a boot loader (and drivers for Windows); it shouldn't have anything to do with the results.

The article was an interesting read, but it's nothing more than bragging rights for a fanboy to say "My OS can beat up your OS."

Reply Score: 2

RE[3]: Hardware was the same
by Lennie on Wed 13th May 2009 08:13 UTC in reply to "RE[2]: Hardware was the same"
Lennie Member since:
2007-09-22

I've not seen this mentioned yet, so I thought I'd say it here.

The only thing you could say is that BootCamp might mean the same hard disk, and certain parts of a hard disk are slower than others; I think partitions start on the outside of the platter and get slower the closer you get to the center, right? And the Apple software was probably installed first. I could be wrong, of course, but it got me thinking.

Reply Score: 2

Apple Hardware
by timalot on Wed 13th May 2009 02:04 UTC
timalot
Member since:
2006-07-17

I am sure OS X is optimised for a Mac Mini. It's easier to get performance when you design the software and hardware together.

Reply Score: 3

RE: Apple Hardware
by stooovie on Wed 13th May 2009 09:34 UTC in reply to "Apple Hardware"
stooovie Member since:
2006-01-25

Yes, and by the same token, Linux is not optimized for anything, so it's slower.

Reply Score: 2

3rdalbum
Member since:
2008-05-26

As I mentioned in one of my recent comments, I've been troubleshooting a problem with a friend's Macbook Air, so I've been getting re-acquainted with OS X.

There are probably real-world performance differences with 3D graphics on Intel, but in my normal use, the Air was significantly less responsive than my regular computer; in fact it wasn't much faster than my netbook.

Embarrassing.

Reply Score: 4

Dasher42 Member since:
2007-04-05

How were you using it? I find my MacBook Pro far more responsive than my GNOME/Linux desktop with similar specs. Did you leverage platform advantages like OS X's consistent app-centric dock? For example, on Linux you wind up quitting the application constantly, just because you closed its last window. On OS X, quitting the application is typically a separate step, which means your commonly used apps are kept effectively preloaded. I really wish this would catch on for Linux.

Reply Score: 1

merkoth Member since:
2006-09-22

In Linux, and most other OSes, recently used stuff gets cached, improving loading times. Just test it with OO.org: the first time, it'll take a few seconds to load. Close it, open it again, and it'll load almost instantly.
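This is easy to see for yourself. A rough sketch (timings vary by machine, and a truly cold first read would require dropping the caches, which needs root):

```shell
# Create a 64 MB file, then read it twice: the second read is served
# from the page cache in RAM and is typically much faster.
demo="$(mktemp)"
dd if=/dev/zero of="$demo" bs=1M count=64 2>/dev/null
time cat "$demo" > /dev/null   # first read (disk, unless already cached)
time cat "$demo" > /dev/null   # second read (page cache)
rm -f "$demo"
```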

The OS X feature you're mentioning is entirely about the document-centric approach of OS X's interface; it doesn't have much to do with the technical aspects of the OS.

Reply Score: 4

Dasher42 Member since:
2007-04-05

Valid point, yes, but having something cached isn't the same as having it loaded, running, and ready to simply open a new window. Even comparing GNOME Terminal with the OS X Terminal demonstrates that: in practical use, even with no terminal windows open I'll have the application running, and a new window is nearly instantaneous.

Of course, you could always use XFCE.

Reply Score: 1

steogede2 Member since:
2007-08-17

For example, on Linux you wind up quitting the application constantly, just because you closed the last window.


If you want to keep the app running, don't close it. It isn't rocket surgery.

Reply Score: 1

rockwell Member since:
2005-09-13

Most dangerous job in the world: Rocket Surgeon ... don't cut the red wire!

Reply Score: 3

Dasher42 Member since:
2007-04-05

See, that's the thing. Just because you close the one window you have doesn't mean you're done with the application, and that's the assumption many Windows and Linux applications seem to make.

Reply Score: 1

r_a_trip Member since:
2005-07-06

Instead of hitting the X, why not hit the _ (minimize) and keep the app in the taskbar? Or put it on a separate desktop and pull it back when needed?

Cosmetic troubles, IMNSHO.

Reply Score: 2

Make a real statement about Linux graphics
by theosib on Wed 13th May 2009 02:33 UTC
theosib
Member since:
2006-03-02

For people who want to really do something about graphics on Linux, consider supporting the Open Graphics Project.

http://www.linuxfund.org/projects/ogd1/

Reply Score: 2

blitze Member since:
2006-09-15

And that is why I can't be bothered with Linux on the desktop. For years there have been projects to do things right, but they never get finished, and then there is a call to arms to do the same task again a different way.

Linux as a backend: no issues, and great. But for general desktop use, I'm over it.

Reply Score: 3

I'm surprised but does this matter?
by bousozoku on Wed 13th May 2009 02:48 UTC
bousozoku
Member since:
2006-01-23

I remember seeing so many comparisons between PowerPC machines and various machines running Linux and the whole monolithic kernel vs. micro kernel argument and how Mac OS X was always severely thumped in every performance comparison.

So, Mac OS X wins a few tests, good or bad, but why are people making excuses for Linux here? Was it fair in the past, when Linux was winning, that the graphics drivers weren't the best or that Canonical didn't optimise anything?

It's not exactly that any of this matters since it's not going to change anyone's mind really. I'd be more interested in seeing how FreeBSD performs against Mac OS X since they share bits and pieces quite often.

Struggle is good. It gives us a constant goal, right?

Reply Score: 4

kaiwai Member since:
2005-07-06

I remember seeing so many comparisons between PowerPC machines and various machines running Linux and the whole monolithic kernel vs. micro kernel argument and how Mac OS X was always severely thumped in every performance comparison.


I too saw those benchmarks, and it's interesting how you ignore the follow-up, which explained that some of them were the result of the default configuration, as was the case with the MySQL benchmark. It has nothing to do with micro versus monolithic versus hybrid versus a chocolate bar with sprinkles on top.

Mac OS X was designed first and foremost as a highly responsive desktop operating system. There are sacrifices when you focus on latency and responsiveness over throughput; and yes, when it comes to responsiveness, Linux doesn't even come close to Mac OS X. If I counted the number of times my netbook became bogged down and poorly responsive with a couple of applications open, versus Windows on the same machine, I'd be here all day.

Reply Score: 1

6c1452 Member since:
2007-08-29

Mac OS X was designed first and foremost as highly responsive desktop operating system. There are sacrifices when you focus on latency and responsiveness over throughput; and yes, when it comes to responsiveness, Linux doesn't even come close to Mac OS X. If I counted the number of times my netbook came bogged down and poorly responsive with a couple of applications open versus Windows on the same machine - I'd be here all day.


It sounds more like you need to upgrade your RAM than get a kernel optimized for desktop use. It's not exactly a secret that current mainstream Linux distributions are less memory efficient than XP.

I find OS X to be downright viscous -- as though there is perceivable latency between the input devices and the screen. It's possible I'm just imagining things because of the way desktop effects slow down some other actions, but that's how it feels.

Reply Score: 1

kaiwai Member since:
2005-07-06

It sounds more like you need to upgrade your ram than get a kernel optimized for desktop use. It's not exactly a secret that current mainstream Linux distributions are less memory efficient than XP


Woah, hang on: it has nothing to do with memory efficiency. This was running Arch Linux, whose total memory footprint was less than Windows XP's, so it has nothing to do with the memory consumed. What it does have to do with is the algorithms used to balance processes/threads so that the end user gets a responsive system.

I find OS X to be downright viscous -- as though there is perceivable latency between the input devices and the screen. It's possible I'm just imagining things because of the way desktop effects slow down some other actions, but that's how it feels.


Pardon? Nothing is slow for me. Maybe it takes a second for a window to display the contents of the drive, or a couple of seconds for an application to load when clicked in the dock, but what I am talking about is smoothness when running 3-6 applications at the same time. I couldn't care less about the speed of one application all by its lonesome self; what I am talking about is a system under a reasonable load still giving decent responsiveness.

Edited 2009-05-13 06:07 UTC

Reply Score: 2

6c1452 Member since:
2007-08-29

If we're talking about running multiple cpu-intensive applications on a single processor core I'll have to concede the point; I don't specifically recall the effect you described, but I can't test it.

Out of curiosity, have you tried switching to a desktop-optimised kernel to see what happens? I would be hesitant to declare that the scheduler is the major factor in performance differences between two operating systems, but I'd be very interested to see what difference it makes.

As for OS X, my understanding is that low latency means a negligible delay between user input and the output appearing, which is exactly what I haven't noticed while using it.

Edited 2009-05-13 06:39 UTC

Reply Score: 1

Thom_Holwerda Member since:
2005-06-29

Pardon? nothing is slow to me; maybe it takes a second to load up the window to display the contents of the drive, or it takes a couple of seconds for an application to load by clicking on the dock but what I am talking about is smoothness when running 3-4-5-6 applications at the same time. For me, I couldn't care less about the speed of one application all by its lonesome self; what I am talking about is a system under a reasonable load and getting some decent responsiveness from it.


Funny, as responsiveness is one of my biggest gripes about Mac OS X. I've used the most powerful Macs you can imagine, and even those that herald the coming of the starborn ones (to paraphrase Yahtzee) have a noticeable delay when launching applications or interacting with them (buttons, menus, etc.).

This is absolutely intolerable. Mac OS X is smooth, yes, but not when it comes to responsiveness.

Before I get the usual group of Mac fans on my bum: the above does not imply, in any way, that Windows does this any better.

Reply Score: 1

DavidSan Member since:
2008-11-18

" Pardon? nothing is slow to me; maybe it takes a second to load up the window to display the contents of the drive, or it takes a couple of seconds for an application to load by clicking on the dock but what I am talking about is smoothness when running 3-4-5-6 applications at the same time. For me, I couldn't care less about the speed of one application all by its lonesome self; what I am talking about is a system under a reasonable load and getting some decent responsiveness from it.


Funny, as responsiveness is one of my biggest gripes about Mac OS X. I've used the most powerful Macs you can imagine, and even those that herald the coming of the starborn ones (to paraphrase Yahtzee) have a noticeable delay when launching applications or interacting with them (buttons, menus, etc.).

This is absolutely intolerable. Mac OS X is smooth, yes, but not when it comes to responsiveness.

Before I get the usual group of Mac fans on my bum: the above does not imply, in any way, that Windows does this any better.
"

I believe you are confusing things. Responsiveness has nothing to do with launching apps; it has to do with how the system takes care of your requests and events.

A slow application does not mean the system is unresponsive, and neither does slow launching.

However, a menu that does not appear when pressed is a responsiveness issue. That is particularly true of applications written in Java, but those are unresponsive on every platform, even Windows.

Windows and Linux are very fast, but when your processors are at top capacity, both systems get very unresponsive, especially Windows. Mac OS X usually keeps receiving events and behaving properly under the same circumstances.

Reply Score: 2

Thom_Holwerda Member since:
2005-06-29

I believe you are confusing things. Responsiveness has nothing to do with launching apps; it has to do with how the system takes care of your requests and events.


Exactly.

So when I click a launcher, the app needs to be there instantly to receive brownie points. When I press the close button, it needs to disappear instantly for brownie points. When I press a menu button, the menu should appear instantly. Etc. Mac OS X simply does not perform optimally when it comes to responsiveness.

I'm from a BeOS world, and anything less than instant responses is evil and bad and should cause people to be fired.

Mac OS X totally sucks in this department, even on very powerful machines. Windows XP and especially Vista sucked balls here too, but Windows 7 seems to have nailed it pretty well (still not good enough, though). Sure, it needs tricks like SuperFetch to get there, but I'd rather have tricks getting me there than not getting there at all.

On 7, all the applications I use appear instantly - Chrome, Office, Miranda, you name it.

Except for blu though. blu's a ridiculously beautiful Twitter client written in WPF, but it's goddamn heavy on resources. In fact, it's my most memory intensive app ;) .

Reply Score: 1

WereCatf Member since:
2006-02-15

I don't count application loading times as "responsiveness." It's how responsive the system remains while loading something that matters. Caching apps in memory does not and will not work in every case so when the system does need to start loading something, possibly even a really heavy app, it should do it in a way that the rest of the system stays responsive and useable.

Reply Score: 2

DavidSan Member since:
2008-11-18



Exactly.

So when I click a launcher, the app needs to be there instantly to receive brownie points. When I press the close button, it needs to disappear instantly for brownie points. When I press a menu button, the menu should appear instantly. Etc. Mac OS X simply does not perform optimally when it comes to responsiveness.


It is true. Mac OS X is not as responsive as it should be in the user interface department.


I'm from a BeOS world, and anything less than instant responses is evil and bad and should cause people to be fired.


That's not entirely true. BeOS was very responsive (I used it), but it was not that responsive, especially considering how dated the graphical interface in BeOS was. BeOS was not as responsive as Mac OS 9, for example. Mac OS 9 had its problems, but responsiveness was not one of them. The user always came first (except when the system hung itself).

Preemptive multitasking operating systems are not as responsive as cooperative multitasking systems, for obvious reasons, but you gain robustness.

BeOS's display technology did not have even a tenth of the sophistication of Mac OS X or Vista. BeOS was pixel-based, very similar to what Mac OS 9 was and Windows XP is. Everything was a bitmap.

Mac OS X, in contrast, is PDF/vector-based, heavily transparent (everything has an alpha channel, even if it is not used), and double-buffered (it has to be slower, because everything is drawn to the screen twice), with heavy anti-aliasing, and so on. How do you think those animations are made? Vista is similar to Mac OS X in that respect.

Reply Score: 1

bousozoku Member since:
2006-01-23


It sounds more like you need to upgrade your RAM than get a kernel optimized for desktop use. It's not exactly a secret that current mainstream Linux distributions are less memory-efficient than XP.

I find OS X to be downright viscous -- as though there is perceivable latency between the input devices and the screen. It's possible I'm just imagining things because of the way desktop effects slow down some other actions, but that's how it feels.


I haven't seen that kind of performance since version 10.4.x, even when it's short on available real memory, though I've had stuttering from the virtual memory system when an application tries to implement its own system.

The graphics card has a lot to do with it, though, since OpenGL is used in many places. The early Intel-based machines with the early Intel graphics chipset lagged a lot but then, they weren't able to access 226 MB of shared RAM in Mac OS X the way that they could under Windows.

Reply Score: 2

bousozoku Member since:
2006-01-23


I too saw those remarks, and it's interesting how you ignore the follow-up, which explained why some of them were the result of the default configuration, as in the case of the MySQL benchmark. It has nothing to do with micro versus monolithic versus hybrid versus chocolate bar with sprinkles on top.

Mac OS X was designed first and foremost as a highly responsive desktop operating system. There are sacrifices when you focus on latency and responsiveness over throughput; and yes, when it comes to responsiveness, Linux doesn't even come close to Mac OS X. If I counted the number of times my netbook became bogged down and poorly responsive with a couple of applications open, versus Windows on the same machine, I'd be here all day.


Until 10.4.x, Mac OS X was never interactively responsive for me. I know that was the intention but it never happened, even on dual processor machines. My Ubuntu machine feels better but even that has some odd performance foibles.

Reply Score: 2

gustl Member since:
2006-01-19

Responsiveness is heavily dependent on the machine load.

Under low-load conditions (less than 100% CPU), Linux is a bit less responsive than Windows. Under full load, Linux is by far more responsive than Windows (at least XP).

I had two machines with 4 cores each, the Linux machine even being slightly slower (2.8 GHz vs. 3.0 GHz).
I ran a finite element calculation on each using all 4 cores; both machines needed 1.5 GB of RAM for this calculation and had plenty of RAM available for other stuff. CPU utilisation was 100% on both machines, and both processes ran at standard process priority. Neither machine had to swap.

On the WinXP machine it was not possible to do anything productive during number crunching. On the slightly slower Linux machine working was slightly less responsive than without load, but still good. And by the way, my work included software like Salome, GIMP and OpenOffice where responsiveness definitely IS an issue.

When looking at desktop performance, the high-load scenario is not the typical one; on the other hand, if you sometimes DO saturate your processor, still getting a good response is definitely a plus.

I think that everybody needs to decide on his own which behaviour is most satisfying to him.

Reply Score: 2

DavidSan Member since:
2008-11-18

Responsiveness is heavily dependent on the machine load.

Under low-load conditions (less than 100% CPU), Linux is a bit less responsive than Windows. Under full load, Linux is by far more responsive than Windows (at least XP).


I have experienced the same. However, Mac OS X under the same circumstances is even more responsive than Linux.

Reply Score: 1

PlatformAgnostic
Member since:
2006-01-02

Some of these tests look like they should be compiler/processor limited so the OS shouldn't really matter at all. I wonder, for instance, if they would have gotten the same result on the OpenSSL test from both OSes if they compiled the benchmark themselves with the same version of GCC.

Reply Score: 3

Urban Terror
by J.R. on Wed 13th May 2009 05:18 UTC
J.R.
Member since:
2007-07-25

Tbh, they both suck if Urban Terror is the best game you can use to benchmark.

Reply Score: 5

Problem with Urban Terror test
by JLF65 on Wed 13th May 2009 06:06 UTC
JLF65
Member since:
2005-07-06

The main issue with the FPS test in Urban Terror is the difference in OpenGL versions. As stated, Ubuntu was using v1.4, while OSX was only using 1.2. Most games, especially ones as optimized as ioquake3 (the engine that drives Urban Terror), will use different render paths for different versions of OpenGL. Most likely, ioquake3 saw OSX was using an older version and turned off some of the features. In Ubuntu, the newer version triggered the newer features, which on Intel graphics is done via software since the Intel GPU hardly does anything in hardware.

So the difference wasn't so much optimized Apple drivers vs unoptimized linux drivers as it was old rendering path with more hardware acceleration vs newer rendering path using software. I'd bet that if they redid the test with UT set to the lowest values for the rendering quality, both would show nearly the same FPS values.

Reply Score: 5

RE: Problem with Urban Terror test
by J.R. on Wed 13th May 2009 08:02 UTC in reply to "Problem with Urban Terror test"
J.R. Member since:
2007-07-25

No, I think the main issue is that they are benchmarking an irrelevant game that hardly anyone plays. What is the point of benchmarking an old game with low requirements anyway? Even if one OS were better than the other, both would still suck. I find the entire test irrelevant.

I guess there is some weird rationale for choosing this game for the benchmark, but while Linux and OS X are arguing over who has the better Urban Terror performance, Windows users are bragging about Crysis performance. See my point?

Reply Score: 2

Tech background ?
by poohgee on Wed 13th May 2009 08:25 UTC
poohgee
Member since:
2005-08-13

It would be really nice if all these test results and the accompanying pictures were set into some technical context.

How do you test ?
What do you test ?
What are the results ?
What do the results say ?

Just publishing test results does not give answers.

Reply Score: 2

SQLite and PostgreSQL Performance
by Abacus_ on Wed 13th May 2009 10:40 UTC
Abacus_
Member since:
2006-12-08

Does anyone have any idea whether fsync() / fdatasync() actually does anything on Mac OS X? The large performance difference in the SQLite and PostgreSQL benchmarks made me wonder about this.

Reply Score: 2

christian Member since:
2005-07-06

It might be that Mac OS X is not turning off the hard drive's write-back cache, whereas Ubuntu is. The Ubuntu figures for SQLite come out at around 112 transactions per second (assuming each insert is one transaction), while Mac OS X comes out at 500 transactions per second. The Ubuntu time is more consistent with a drive that has write-back caching disabled.

Reply Score: 2

The Crafty result looks fishy to me.
by vdbergh on Wed 13th May 2009 11:13 UTC
vdbergh
Member since:
2006-01-31

My experience with chess engines is that, if they are compiled with the same compiler, they run at exactly the same speed regardless of the OS. Could it be that for some reason the Mac OS X version used more processors than the Ubuntu version?

I am assuming both Crafty binaries were compiled with the same compiler. If not, then one also has to take into account that there are large performance differences among compilers.

At least until recently the Intel compiler produced much faster code than gcc (when using both with runtime profiling).

Reply Score: 2

steogede2
Member since:
2007-08-17

I have a friend who built a £600 hackintosh which, when benchmarked, blew away the results of a £1500 Mac Pro.

Okay, the Mac Minis aren't as badly overpriced as the Mac Pro; but why would anyone pay extra for Apple hardware, except to be allowed to legitimately use Apple software?

Edited 2009-05-13 13:37 UTC

Reply Score: 2

uaxactun
Member since:
2008-04-17

Intel drivers in Jaunty are broken:

https://bugs.launchpad.net/bugs/314928

Edited 2009-05-13 17:19 UTC

Reply Score: 1

wrong
by maxjen on Thu 14th May 2009 14:54 UTC
maxjen
Member since:
2007-07-15

"Before giving some detailed results of some of the specific tests, the overall testing showed that Ubuntu was faster than Mac OS X in 18 of the 29 tests. Some were landslides while others were only ahead with marginal differences for both systems."

(Sadly) this is wrong: Mac OS X won 17 of the 29 benchmarks. You probably missed the "fewer are better" labels on some of the tests.

Reply Score: 2