Linked by Thom Holwerda on Thu 25th Apr 2013 14:56 UTC
Ubuntu, Kubuntu, Xubuntu
Ubuntu 13.04 has been released, with the Linux 3.8.8 kernel, a faster and less resource-hungry Unity desktop, LibreOffice 4.0, and much more. Ubuntu users will know where to get it, and if you're looking for a new installation, have fun. Also fun: UbuntuKylin.
...
by Hiev on Thu 25th Apr 2013 15:03 UTC
Hiev
Member since:
2005-09-27

Good, but the bad part is that Ubuntu GNOME 13.04 ships with GNOME 3.6 and not 3.8.

Edited 2013-04-25 15:06 UTC

Reply Score: 4

RE: ...
by UltraZelda64 on Thu 25th Apr 2013 19:08 UTC in reply to "..."
UltraZelda64 Member since:
2006-12-05

Has GNOME 3 even become usable yet, as of 3.6 and 3.8? Last I checked, it wasn't even worth using, but 3.4 was slightly "better" (note: that's not saying much) than 3.0 and 3.2. Last I checked, even if you didn't mind the interface (I didn't like it), the desktop still required lots of memory and some strong, compatible 3D graphics hardware if you wanted any chance of running it.

Have there been any major performance/system requirement improvements, and noteworthy GUI improvements since then? Or is it the same old load of crap all around?

Reply Score: 2

RE[2]: ...
by Delgarde on Fri 26th Apr 2013 02:01 UTC in reply to "RE: ..."
Delgarde Member since:
2008-08-19

the desktop still required lots of memory and some strong, compatible 3D graphics hardware if you wanted any chance of running it.


It needs good drivers more than it needs good hardware. The current Fedora release runs just fine on my crappy five-year-old netbook, with a first-gen Atom processor, 1 GB of RAM, and Intel graphics. I mean, it's not *fast*, but it's adequate, and no worse than any other desktop I've tried on that machine.

It also works just fine on the onboard ATI chip on my current machine... I forget what it is, but it's about three or four years old, a 4550 or something like that. Again, not an especially powerful or memory-heavy machine...

Reply Score: 4

RE[2]: ...
by Hiev on Fri 26th Apr 2013 03:15 UTC in reply to "RE: ..."
Hiev Member since:
2005-09-27

Why don't you try it yourself?

Reply Score: 2

RE[3]: ...
by nej_simon on Fri 26th Apr 2013 14:16 UTC in reply to "RE[2]: ..."
nej_simon Member since:
2011-02-11

Judging by this user's past comments, I doubt he/she wants to try GNOME. ;)

Reply Score: 4

RE[4]: ...
by UltraZelda64 on Fri 26th Apr 2013 17:54 UTC in reply to "RE[3]: ..."
UltraZelda64 Member since:
2006-12-05

Actually, I would consider giving it a try in a virtual machine, *if* the interface and overall environment really have improved/evolved, *and* if it doesn't require fancy hardware acceleration (which makes trying it in a virtual machine... pretty much impossible). If it is the same old desktop with no worthwhile usability/performance improvements, then no, I do not want to waste my time trying it.

I gave KDE4 the benefit of the doubt and tried it later on after it evolved... and right now KDE4 just happens to be what I am running (and has been for quite a while).

Edited 2013-04-26 18:05 UTC

Reply Score: 2

RE: ...
by Savior on Fri 26th Apr 2013 06:51 UTC in reply to "..."
Savior Member since:
2006-09-02

Good, but the bad part is that Ubuntu GNOME 13.04 ships with GNOME 3.6 and not 3.8.


And another bad part is that Ubuntu Unity comes with Gnome...

Reply Score: 2

RE: ...
by dulus on Fri 26th Apr 2013 09:36 UTC in reply to "..."
dulus Member since:
2006-07-14

No no, bad part is that they don't ship it with Gnome 2.32 !

Reply Score: 3

RE: ...
by nej_simon on Fri 26th Apr 2013 14:17 UTC in reply to "..."
nej_simon Member since:
2011-02-11

You can always use the gnome3 ppa. It has some pre-release packages currently but I expect it to be updated to stable packages soon.
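For reference, adding it looks roughly like this (the ppa:gnome3-team/gnome3 name is the one commonly used for this, but treat it as an assumption and check that it targets your release before upgrading):

sudo add-apt-repository ppa:gnome3-team/gnome3
sudo apt-get update && sudo apt-get dist-upgrade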

Reply Score: 2

RE: ...
by cmost on Fri 26th Apr 2013 21:57 UTC in reply to "..."
cmost Member since:
2006-07-16

Really?!? Do you really think a PPA won't appear (if it hasn't already) to update Gnome to 3.8?

Reply Score: 3

RE[2]: ...
by Hiev on Sat 27th Apr 2013 00:18 UTC in reply to "RE: ..."
Hiev Member since:
2005-09-27

There is one PPA already, but it is not complete and has drawbacks.

Reply Score: 2

v Comment by progormre
by progormre on Thu 25th Apr 2013 15:32 UTC
RE: Comment by progormre
by Alfman on Thu 25th Apr 2013 15:59 UTC in reply to "Comment by progormre"
Alfman Member since:
2011-01-28

progrormre,

Ubuntu upgrades have failed me in the past, but in your case, without more details, is it possible that you are judging Ubuntu for what may be a third-party problem? VMware is, after all, a third-party proprietary app you had to install from outside the repos.

The last time I installed VMware on Linux, it needed to compile some kernel-specific extensions before it could run, which would very likely break during an OS upgrade. Of course I can understand why you want it to just work, but did you try to reinstall it? Just curious.

Reply Score: 9

RE[2]: Comment by progormre
by progormre on Thu 25th Apr 2013 16:52 UTC in reply to "RE: Comment by progormre"
progormre Member since:
2012-05-20

The first thing I did was to install a stable version of CentOS (or maybe it was some other stable distro with VMware support, I can't remember), just to find that it did not support the filesystem I had used for the disk where all my files were stored... By that time I had already spent the morning and stress was going through the roof, so I ran to the nearest Apple shop.

Sure, it was a Linux kernel dev who changed an interface, perhaps not knowing it would break apps using that interface... Then all the testers at Ubuntu forgot to test with VMware. That, or they all knew VMware was broken but couldn't care less... I don't know, it just ended up on my desk being broken.

Reply Score: 2

RE[3]: Comment by progormre
by Alfman on Thu 25th Apr 2013 17:44 UTC in reply to "RE[2]: Comment by progormre"
Alfman Member since:
2011-01-28

progormre,

"The first thing I did was to install a stable version of centos (or maybe it was some other stable distro with VMware support, I can't remember), just to find that it did not support the filesystem I had used for the disk where all my files were stored..."

I'm missing something; it's not clear what this has to do with a VMware instance that stopped working after an Ubuntu upgrade. Or if you didn't have an existing VMware instance, how do you know the Ubuntu upgrade broke VMware?


"By that time I had already spent the morning and stress was going through the roof so I ran to the nearest apple shop."

I'm glad it's working out for you now. Isn't it great to have a choice! To paraphrase Voltaire: I may not agree with your choice of software, but I'll fight to the death your right to choose it ;)

Reply Score: 3

RE[3]: Comment by progormre
by Wemgadge on Thu 25th Apr 2013 17:56 UTC in reply to "RE[2]: Comment by progormre"
Wemgadge Member since:
2005-07-02

Although VMware does work extensively with Canonical, it is not their job, or the Ubuntu volunteers' job, to ensure that our product works. It's VMware's job. As I stated above, we do endeavour to support Linux and try to keep WS and Player compatibility as up to date as possible, while at the same time adhering to our own release schedule. We do have an experimental setting in our installer that will attempt to automatically rebuild new modules on a kernel upgrade, but it does depend on having a sane build environment. Hope this is helpful.

Edited 2013-04-25 17:56 UTC

Reply Score: 4

RE[4]: Comment by progormre
by JAlexoid on Fri 26th Apr 2013 11:12 UTC in reply to "RE[3]: Comment by progormre"
JAlexoid Member since:
2009-05-19

Or he could do "the right thing" and stick to an LTS version of Ubuntu.

Reply Score: 7

RE[5]: Comment by progormre
by Morgan on Fri 26th Apr 2013 14:07 UTC in reply to "RE[4]: Comment by progormre"
Morgan Member since:
2005-06-29

I was hoping someone would point that out. Anyone who uses the bleeding edge of a distro is begging for problems and failures. Anyone who does that in a critical production environment is insane. The only exception is when the person is a developer on that specific distro or a derivative.

Even then, keep a second system with the LTS release handy!

Reply Score: 4

RE: Comment by progormre
by groversonus on Thu 25th Apr 2013 17:20 UTC in reply to "Comment by progormre"
groversonus Member since:
2009-03-07

Gosh!!! Only one day and you gave up that quickly... I'm almost laughing myself off my chair.

Seriously, that is nothing..

You must not have worked with MS Windows long enough then.

Aaah.. I should have just given up on Microsoft already and saved myself all that lost time, mental anguish and pain.. Sigh.. all those lost years.....Waiting... Sigh..

:)

Reply Score: 0

RE: Comment by progormre
by ichi on Thu 25th Apr 2013 17:29 UTC in reply to "Comment by progormre"
ichi Member since:
2007-03-06

Which VMware?

I've had VMware break with a kernel update (I'm not sure if it was also an Ubuntu upgrade), but that was back with VMware Server 1.x, where you had to go patching the kernel modules.

VMware Player on the other hand has always run fine, not as featureful but enough for the odd times I need to boot a Windows machine on my laptop at work.

Edited 2013-04-25 17:32 UTC

Reply Score: 4

RE: Comment by progormre
by Wemgadge on Thu 25th Apr 2013 17:47 UTC in reply to "Comment by progormre"
Wemgadge Member since:
2005-07-02

It is expected behaviour for VMware Workstation to break when you move to a new kernel or a new build of a distro, but it is not fatal. WS ships with precompiled binary kernel modules for the most popular distros available at the time of release; however, they can't take into account every Linux kernel update in the repos. In the case of Linux hosting Workstation, the linux-headers are required to re-compile the kernel modules -- in this case, the kernel that ships with 12.10 is supported with a precompiled binary in WS 9.0.2 and its sister product Player, but for the updated kernel, we need to build new modules.

You should be able to accomplish this as long as you have a sane build environment on your Ubuntu: sudo apt-get install linux-kernel-devel fakeroot kernel-wedge build-essential should do it. That will give you enough of a build environment to rebuild the kernel modules.

This is not a VMware-specific issue. You would encounter the same problem if you had manually installed video blobs (ATI or Nvidia drivers installed via SH script instead of using .deb, for example, would also break). Disclosure: I work at VMware and spent 3 years providing support for Workstation.
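For reference, a minimal sketch of that rebuild flow (the headers package name follows the usual Ubuntu convention, and the vmware-modconfig invocation is the one shipped with recent Workstation/Player builds -- treat both as assumptions and check your own version):

sudo apt-get install build-essential linux-headers-$(uname -r)
# relaunch Workstation/Player and let it rebuild its modules, or trigger it from a console:
sudo vmware-modconfig --console --install-all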

Reply Score: 4

RE[2]: Comment by progormre
by Wemgadge on Thu 25th Apr 2013 17:50 UTC in reply to "RE: Comment by progormre"
Wemgadge Member since:
2005-07-02

Also: do a sudo apt-get install libgtkmm-2.4-1c2a to get WS to use your GTK theme. http://kb.vmware.com/kb/2012664

Reply Score: 4

RE: Comment by progormre
by Soulbender on Fri 26th Apr 2013 01:31 UTC in reply to "Comment by progormre"
Soulbender Member since:
2005-08-18

Yeah, I have a grudge cause I lost a day worth of work to the lack of care from their part.


What, it somehow screwed around with your VMware images, causing a day's worth of changes to the images to be lost? A likely story....
More likely is that VMware or, even more likely, you yourself screwed up.

Edited 2013-04-26 01:33 UTC

Reply Score: 4

v Comment by kurkosdr
by kurkosdr on Thu 25th Apr 2013 16:45 UTC
RE: Comment by kurkosdr
by Alfman on Thu 25th Apr 2013 17:12 UTC in reply to "Comment by kurkosdr"
Alfman Member since:
2011-01-28

kurkosdr,

That's just trolling. He exercised his right to install 3rd party software, which he wouldn't have gotten at all in a walled garden. The point was that the responsibility for 3rd party software quality assurance lies with you, the end user, or the 3rd party software vendor. There's nothing wrong with it, it's just the simple fact of the matter.

Reply Score: 4

RE: Comment by kurkosdr
by Neolander on Thu 25th Apr 2013 17:20 UTC in reply to "Comment by kurkosdr"
Neolander Member since:
2010-03-08

+1 from a linuxero who thinks that over-reliance on the repository system is one of the main issues of the Linux world.

In my dream OS, the repository system would only serve to store a collection of well-trusted, "Editor's pick" software in a single centralized place. Packages would be perfectly standalone, relying only on system libraries, and of course secure: nothing like Windows installers, demanding to run arbitrary code with admin privilege just to copy a bunch of files to a given place on disk.

Mac OS X actually got pretty close to that with the bundle system. That's one of the things which I used to like about that OS, before Gatekeeper came around in 10.8, announcing the start of a gradual deprecation of decentralized software distribution.

Edited 2013-04-25 17:26 UTC

Reply Score: 2

RE[2]: Comment by kurkosdr
by moondevil on Thu 25th Apr 2013 19:47 UTC in reply to "RE: Comment by kurkosdr"
moondevil Member since:
2005-07-08

You mean like it used to be in MS-DOS, Amiga and Atari days? ;)

Reply Score: 2

RE[3]: Comment by kurkosdr
by Neolander on Thu 25th Apr 2013 20:18 UTC in reply to "RE[2]: Comment by kurkosdr"
Neolander Member since:
2010-03-08

Might be; I haven't used an MS-DOS computer for a long time now and have never used the other two. Besides, decentralized software distribution was the only choice at a time when the infrastructure for centralized distribution wasn't in place.

A difference, though, is that I'm not hostile to centralized software distribution per se. It's normal for an OS vendor to want to put what he thinks is quality third-party software on display, and to provide a set of trusted mirrors for it. What I'm against is the scenario where centralized software distribution becomes the only way to get software on a given platform.

First, because it ruins the purpose of having a centralized store at all, since the worst crap ends up getting in, while some gems are left out forever for e.g. licensing reasons. Second, because it puts a very large amount of power in the hands of the OS vendor, which no one should feel comfortable with.

Edited 2013-04-25 20:20 UTC

Reply Score: 2

RE[2]: Comment by kurkosdr
by Delgarde on Fri 26th Apr 2013 02:15 UTC in reply to "RE: Comment by kurkosdr"
Delgarde Member since:
2008-08-19

+1 from a linuxero who thinks that over-reliance on the repository system is one of the main issues of the Linux world.


This isn't really a problem with the repo system though... it's a problem with trying to run 3rd-party software that hasn't been tested/certified on the new version of the OS you're running.

Pretty much any big commercial software provider would laugh at you if you asked for help running their applications on an unsupported OS. At best, they'll tell you to wait until they've certified it themselves, or they'll charge you big money for doing the certification ahead of schedule. Just try dealing with Oracle on something like this...

Reply Score: 4

RE: Comment by kurkosdr
by Soulbender on Fri 26th Apr 2013 01:35 UTC in reply to "Comment by kurkosdr"
Soulbender Member since:
2005-08-18

It's not a walled garden, honest. It's not a walled garden, honest.


You speak of a walled garden but I don't think you know what it really means.

Reply Score: 4

Gaming on linux
by judgen on Thu 25th Apr 2013 17:01 UTC
judgen
Member since:
2006-07-12

I am bringing up this quote from the EA dev team mailing list:
"How EA killed their own market:
They tried for years with DRM to protect our multiplayer games, and lost heavily. The new approach is alwasy on DRM and even though we had 4 million pre-orders of sim city 5, we are still losing money ont that turd. Sim city 4 was awesome and is still one of the best selling EA games. EA made the NHL 2003 which is still the best sports game ever made.
Always on is equaling sadness for the consumer, death to free gaming and origin allthough i thought it would just be a steam competitor it has turned in to the worst place to get games, as we can not guarantee our uptime.
Steam DRM might be a teeny bit invasive sometimes, but atleast they are not assholes about their games demanding 100% uptime on the net.

For all the devs, make sure your game is rendering through opengl properly, or you WILL lose all the powervr users (iphone and android stuff) and also all the macs and linux. Linux as a gaming plattform has allready surpassed the mac in steam sales and if your game sells 100 copies it might not be worth it in the first place. But if you are interested in the Linux market: Sell your games through steam until we can get a origin client for linux, 10 cutoff in price is easily worth it. Ask Crotech who sold serious sam 3, a hard title for the PC as a linux game. They have earned more than half of their revenue from the game-starved linux community whilst their competitors squabble over the decreasing windows sales.
"

Reply Score: 7

RE: Gaming on linux
by judgen on Thu 25th Apr 2013 17:05 UTC in reply to "Gaming on linux"
judgen Member since:
2006-07-12

The poor spelling is in the quote, not my doing!

Reply Score: 3

RE: Gaming on linux
by judgen on Thu 25th Apr 2013 17:08 UTC in reply to "Gaming on linux"
judgen Member since:
2006-07-12

Edit: The mail can no longer be viewed, reason: Gone, lost or protected. I can not read the responses.

Reply Score: 2

Comment by Jason Bourne
by Jason Bourne on Thu 25th Apr 2013 17:58 UTC
Jason Bourne
Member since:
2007-06-02

[...] faster and less resource hungry Unity desktop, LibreOffice 4.0 [...]


Where is it? I just tried the LiveCD and spotted a radeon driver issue with Unity.

Reply Score: 2

Re: Re: Comment by kurkosdr
by kurkosdr on Thu 25th Apr 2013 18:33 UTC
kurkosdr
Member since:
2011-04-11

He exercised his right to install 3rd party software, which he wouldn't have gotten at all in a walled garden. The point was that the responsibility for 3rd party software quality assurance lies with you, the end user, or the 3rd party software vendor.


Theoretically, this is true. Practically, Linux Desktop doesn't care about API stability and backcompat, so, for the average user who doesn't want to do things "at his own risk" (aka see things getting broken), it has the effect of a walled garden.

I will use Windows and OS X as examples again. You can run software outside their stores, and it works with remarkable back compat. For most users, such stability for software installed outside the repo (or store) is perceived as preferable to seeing some source code.

Just look at Android. Nobody mentions the ability to see the source as an advantage. They will mention the openness of the store compared to Apple's (emulators etc), the ability to sideload apps bought directly from the dev, the hardware variety, etc., but not the source code. Even if they have an AOSP phone (Xperia Z, Nexus 4, etc).

Browser: Mozilla/5.0 (Linux; U; Android 2.3.4; el-gr; LG-P990 Build/GRJ23) AppleWebKit/533.1 (KHTML, like Gecko) Version/4.0 Mobile Safari/533.1 MMS/LG-Android-MMS-V1.0/1.2

Reply Score: 1

RE: Re: Re: Comment by kurkosdr
by darknexus on Thu 25th Apr 2013 18:46 UTC in reply to "Re: Re: Comment by kurkosdr"
darknexus Member since:
2008-07-15

Theoretically, this is true. Practically, Linux Desktop doesn't care about API stability and backcompat, so, for the average user who doesn't want to do things "at his own risk" (aka see things getting broken), it has the effect of a walled garden.

It's worse than that. Even things in the "stable" repositories will often break, e.g. GPU drivers and audio support. The latter is, in fact, something Ubuntu is rather famous for doing when they modify and customize Pulseaudio all to hell, breaking many of the ALSA drivers in the process even when they are not proprietary. One no sooner gets it working than a "stability update" breaks it again. I think the real problem is backward compatibility as opposed to the repository system. Repos can work well, as Android has demonstrated, but it has to be done correctly and backward compatibility must be maintained.

Reply Score: 2

RE[2]: Re: Re: Comment by kurkosdr
by vivainio on Thu 25th Apr 2013 18:59 UTC in reply to "RE: Re: Re: Comment by kurkosdr"
vivainio Member since:
2008-12-26

Android doesn't really have a repo system (no dependency management)

Reply Score: 4

RE[3]: Re: Re: Comment by kurkosdr
by SeeM on Fri 26th Apr 2013 08:11 UTC in reply to "RE[2]: Re: Re: Comment by kurkosdr"
SeeM Member since:
2011-09-10

Android doesn't really have a repo system (no dependency management)


yum --nodeps install gimp
and still pulling from a repo. ;)

Reply Score: 2

RE: Re: Re: Comment by kurkosdr
by Alfman on Thu 25th Apr 2013 19:32 UTC in reply to "Re: Re: Comment by kurkosdr"
Alfman Member since:
2011-01-28

kurkosdr,

"Theoretically, this is true. Practically, Linux Desktop doesn't care about API stability and backcompat, so, for the average user who doesn't want to do things 'at his own risk' (aka see things getting broken), it has the effect of a walled garden."

The Linux kernel has kept userspace backwards compatibility strong for ages. It's remotely possible an API regression occurred here, but it's far more likely that Wemgadge's explanation was the case, since he indicates breakages are normal for VMware after OS upgrades. VMware is very atypical in that it bypasses the *stable* userspace APIs altogether. It is completely fair to criticize the lack of linux kernel space API & ABI stability for other reasons. However equating that to a walled garden is very...imaginative ;)

Edit: Look, I have no problem criticizing genuine linux distro breakages, they bother me too, but they all lack the most important attribute needed to call something a walled garden - the walls that block you from installing outside software.

Edited 2013-04-25 19:48 UTC

Reply Score: 5

RE[2]: Re: Re: Comment by kurkosdr
by kurkosdr on Fri 26th Apr 2013 09:01 UTC in reply to "RE: Re: Re: Comment by kurkosdr"
kurkosdr Member since:
2011-04-11

It is completely fair to criticize the lack of linux kernel space API & ABI stability for other reasons. However equating that to a walled garden is very...imaginative ;)


Well, the people behind Linux Desktop won't directly tell you "we don't want your proprietary app", they do it in a more subtle way. They will simply make it a PITA for you to ship your proprietary app on Linux. (same for proprietary GPU drivers)

Let's see: Ubuntu has like, 3 major proprietary apps, and during the last couple of releases, Ubuntu broke all 3 of them. VMWare, Steam ( http://tinyurl.com/bpbf6cv ) and Skype ( http://tinyurl.com/b8tnjmv ).

Am I the only one who doesn't get the dream of Linux Desktop? I mean, we are supposed to give up our Windows and OS X boxes, which are binary compatible with apps released in 2006 (to say the least), start using Linux, and confine ourselves to software found in the repos? And anyone who dares to dip their pinky toe outside the repos will have to do things "at his own risk"?

I will get downvoted for this, but I will say it. There are lots of people in key positions in the Linux Desktop community with a downright hostile sentiment for proprietary software, who consider API and ABI instability a feature. This makes it impossible for proprietary ISVs and IHVs to target Ubuntu. If this hostile sentiment towards proprietary software didn't exist, I can easily imagine Linux Desktop having anywhere from 5% to 15% marketshare.

There are of course bright exceptions to this rule (Linus Torvalds and his insistence on API stability), but they are exceptions.

Edited 2013-04-26 09:02 UTC

Reply Score: 4

Soulbender Member since:
2005-08-18

Wow, can you please tell me an OS where there are no issues with any software when upgrading between releases? Maybe you can also point me to an OS that has no bugs in the initial release?
Because I still haven't found either of those two. Ever.

Reply Score: 2

Alfman Member since:
2011-01-28

kurkosdr,


"Am I the only one who doesn't get the dream of Linux Desktop?"

No, and I've never said it's for everyone. It's up to you to decide for yourself, you are free to use whatever works best for you. There are plenty of reasons people might not want to run linux. For what it's worth, I'm not criticizing your opinion, I'm criticizing your misuse of facts.

Reply Score: 4

RE: Re: Re: Comment by kurkosdr
by Kochise on Thu 25th Apr 2013 20:52 UTC in reply to "Re: Re: Comment by kurkosdr"
Kochise Member since:
2006-03-03

Aaah, Fedora, the most advanced unstable distro :-) Shouldn't be called Fedora Core but Fedora pre-alpha.

Kochise

Reply Score: 4

RE[2]: Re: Re: Comment by kurkosdr
by SeeM on Fri 26th Apr 2013 08:19 UTC in reply to "RE: Re: Re: Comment by kurkosdr"
SeeM Member since:
2011-09-10

Aaah, Fedora, the most advanced unstable distro :-) Shouldn't be called Fedora Core but Fedora pre-alpha.


You're just jealous, because you still didn't switch from Ubuntu. >:)

Reply Score: 2

Kochise Member since:
2006-03-03

It must be something like that, especially since Fedora 13's installer Anaconda aggregated all my ext3 partitions together by default, scrubbing out my Ubuntu and my swap partitions, fucking my partition table.

Sure, I should have read more carefully, but I never thought that Fedora would follow a destructive behavior by default. Even Microsoft Windows' installer is by far the most careful and data-preserving installer out there.

Kochise

Reply Score: 2

RE: Re: Re: Comment by kurkosdr
by Delgarde on Fri 26th Apr 2013 02:22 UTC in reply to "Re: Re: Comment by kurkosdr"
Delgarde Member since:
2008-08-19

Theoretically, this is true. Practically, Linux Desktop doesn't care about API stability and backcompat, so, for the average user who doesn't want to do things "at his own risk" (aka see things getting broken), it has the effect of a walled garden.


Can you give an example of that? Because while the *kernel* doesn't care much for backward compatibility, the desktops are pretty good at it.

Prior to the Gnome 3 release, those guys had been maintaining not only API but also ABI compatibility going back many years, and even with Gnome 3, they made sure the old libraries could be parallel-installed. So if by some chance you have some ten-year-old Gnome app you want to run, odds are you still can.

Reply Score: 4

As usual...
by Gullible Jones on Thu 25th Apr 2013 20:10 UTC
Gullible Jones
Member since:
2006-05-23

Better performance but not by much. Needs a bunch of stuff adjusted in order to be usable.

Transparent hugepage support -> OFF. Memory compaction makes desktops lag like mad during I/O.

zram swap -> ON. Much better than swapping (or using up all your RAM and running out of FS cache space).

"laptop mode" support in pm-utils -> OFF, because it screws up important sysctl settings.

Sysctl variables: vm.dirty_ratio -> 2, vm.dirty_background_ratio -> 1. Modern computers have lots of fast RAM and lots of slow disk storage; you don't wait until pending writes occupy 50+ MB of RAM before starting to flush, that's stupid.

Also, zcache -> ON. Not as helpful as zram swap, but can't hurt.

Overall it's not bad. But the enforced 3D desktop still hurts, as does waiting 5 seconds for the alt-tab window to pop up.
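A minimal sketch of applying those knobs on Ubuntu (the sysfs path and sysctl names are standard; the zram-config package name and the sysctl.d file name are assumptions to adapt to your release):

# transparent hugepages off
echo never | sudo tee /sys/kernel/mm/transparent_hugepage/enabled
# zram-backed swap
sudo apt-get install zram-config
# tighter writeback thresholds, applied now and persisted across reboots
sudo sysctl vm.dirty_ratio=2 vm.dirty_background_ratio=1
printf 'vm.dirty_ratio = 2\nvm.dirty_background_ratio = 1\n' | sudo tee /etc/sysctl.d/60-dirty-writeback.conf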

Reply Score: 2

"It's the disk I/O, stupid."
by Gullible Jones on Thu 25th Apr 2013 20:26 UTC
Gullible Jones
Member since:
2006-05-23

BTW, a note on performance. Pretty much every serious performance problem I've run into on Linux has involved disk I/O. Whenever you have
- Thrashing due to low memory conditions
- Large programs being read into memory
- Big disk writes
things slow way down. Especially with the latter.

Current "solutions" on Linux seem to revolve around delaying writes for as long as possible. IMO this is doubly stupid, because
a) Eventually the data must be written, and when it is you want the write to be of manageable size.
b) If your desktop freezes, your kernel panics, or your power fails, you do NOT want saved data to be stuck in RAM.

However, I'll readily admit I don't actually have a clue what would constitute a sane, broadly applicable solution; or if such a thing could even exist. If anyone has ideas on that, I'm all ears.

Edited 2013-04-25 20:27 UTC

Reply Score: 2

RE: "It's the disk I/O, stupid."
by Kochise on Thu 25th Apr 2013 21:02 UTC in reply to ""It's the disk I/O, stupid.""
Kochise Member since:
2006-03-03

Same for Windows: having 8 GB of RAM, I disabled the virtual memory: 1) faster application startup, 2) snappier computer, 3) saved 12 GB on my hard disk.

Kochise

Reply Score: 2

lucas_maximus Member since:
2009-08-18

LOL, you know that a modern OS expects virtual memory.

These hacks from the Windows 2000 days that people used to do are laughable.

Reply Score: 3

Alfman Member since:
2011-01-28

lucas_maximus,

Swap was always a bottleneck; a modern system shouldn't really need swap given how cheap RAM is. Most RAM already goes towards disk caching anyway, so there's usually a very large safety margin on systems with 4GB+.

There's the "just in case" factor, but consider this: a user with 2GB of RAM might be recommended to have an additional 2GB of swap, yet Kochise's 8GB of real RAM without swap still has a safety margin 3x greater than the 2GB swap. Any set of applications that can run on the 2GB system (which is most of them) should easily be able to run with 8GB. I'd say the need for swap is practically obsolete on performance systems.

Reply Score: 2

lucas_maximus Member since:
2009-08-18

The OS still expects swap to be there. Windows 7 and 8 pre-fetch regularly used programs, which is why they appear snappier.

Turning it off doesn't make the system any faster when there is plenty of memory and makes the system more likely to fail when running out of memory.

Turning it off has absolutely no benefit.

Edited 2013-04-27 12:15 UTC

Reply Score: 3

Alfman Member since:
2011-01-28

lucas_maximus,

"The OS still expects swap to be there. Windows 7 and 8 pre-fetch regularly used programs. Which is why it appears snappier."

I'm not sure how MS's implementation works, but swap isn't fundamentally needed for this feature. Pre-loading to ram could work without swap too.


"Turning it off doesn't make the system any faster when there is plenty of memory and makes the system more likely to fail when running out of memory."

Sure, if swap gets zero use, then the performance should be identical. The "benefit" of having swap is having it "just in case" as both you and I agree. But you have to recognize the implicit truth in that if 4GB total ram+swap is enough for your work "just in case", then 8GB total ram without swap would clearly be as well.

The whole reason for swap is to make software work in ram constrained systems. I haven't seen the "your system is critically low on resources" warning on systems with even less total ram than my new ones. You can make fun of Kochise for not running swap, but frankly he doesn't need it.


"Turning it off has absolutely no benefit."

It increases disk space. Haha, just kidding, just be more careful with "absolutely" ;)

Reply Score: 2

Kochise Member since:
2006-03-03

I also disabled swap on my 1 GB Windows XP machine without any problem so far, for 5 years now, running up to 3 or 4 applications at the same time. I used to work on Atari machines with 1, 4 or 14 MB and things worked pretty fine. We now have 1000x more power and still complain?

Kochise

Edited 2013-04-27 20:26 UTC

Reply Score: 1

lucas_maximus Member since:
2009-08-18

There is some swapping going on there. Otherwise the OS would just fall over.

http://lifehacker.com/5426041/understanding-the-windows-pagefile-an...

It has also been known for a long time that a lot of the XP tweaks such as this are bullshit.

Here is a list of some of the other ones:

http://lifehacker.com/5033518/debunking-common-windows-performance-...

TBH, when people claim you don't need swap etc., I don't think they really understand how most operating systems work ... THE KERNEL EXPECTS IT TO BE THERE ... regardless of whether it is needed.

Edited 2013-04-27 20:46 UTC

Reply Score: 3

RE: "It's the disk I/O, stupid."
by Alfman on Fri 26th Apr 2013 01:42 UTC in reply to ""It's the disk I/O, stupid.""
Alfman Member since:
2011-01-28

Gullible Jones,

"Whenever you have
- Thrashing due to low memory conditions
- Large programs being read into memory
- Big disk writes
things slow way down. Especially with the latter."

Actually it depends greatly on whether the reads/writes are sequential or random. Just now I conducted a short test:

I timed the time it takes to read 2048 (10MiB) sectors individually using a sequential access pattern at random starting disk positions. On my system it took 0.03s or 333MiB/s.

I read another 2048 sectors, but this time with a random access pattern. This took 27.71s or 0.36MiB/s.

As you can see, the slowness is almost entirely caused by disk seeking, raw speed is plenty fast and this is on equipment that's 5 years old already. Oh, I would have tested writes as well but I don't have a sacrificial disk to write to in this system ;)
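A rough way to reproduce that kind of comparison is with dd (the device name, block size and skip range here are placeholders to adjust for your own disk; iflag=direct bypasses the page cache so caching doesn't hide the seeks):

# sequential: one contiguous 2048-block read
sudo dd if=/dev/sda of=/dev/null bs=4k count=2048 iflag=direct
# random: 2048 single-block reads at scattered offsets
for i in $(seq 2048); do
  sudo dd if=/dev/sda of=/dev/null bs=4k count=1 iflag=direct \
      skip=$((RANDOM * RANDOM % 2000000)) 2>/dev/null
done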


"Current 'solutions' on Linux seem to revolve around delaying writes for as long as possible. IMO this is doubly stupid, because a) Eventually the data must be written, and when it is you want the write to be of manageable size."

You can disable delayed writes, but think about what that would do in terms of total seeks. I could write each sector immediately to disk as soon as I get it (and cause lots of seeks), or I could try to bundle as many sectors as I can together, and then write them all at once.

"b) If your desktop freezes, your kernel panics, or your power fails, you do NOT want saved data to be stuck in RAM."

Fair point.

"However, I'll readily admit I don't actually have a clue what would constitute a sane, broadly applicable solution; or if such a thing could even exist. If anyone has ideas on that, I'm all ears."

There are some answers to the problem. Servers have battery backup disk cache, but these aren't found on consumer equipment. It seems like it should be feasible on a laptop, since it already has the battery ;) Also, solid state disks don't have as much seek latency as the spinny ones. As for software FS solutions, we'd have to research ways to consolidate reads. If we are reading 2000 files at bootup, that's almost 30s wasted to seeks in my little experiment. The FS should really be able to keep them ordered sequentially, that way the OS could read all 100MB it needs within 2s.

Reply Score: 4

RE: "It's the disk I/O, stupid."
by Savior on Fri 26th Apr 2013 06:56 UTC in reply to ""It's the disk I/O, stupid.""
Savior Member since:
2006-09-02

BTW, a note on performance. Pretty much every major serious performance problem I've run into on Linux has involved disk I/O.


So true.

However, I'll readily admit I don't actually have a clue what would constitute a sane, broadly applicable solution; or if such a thing could even exist. If anyone has ideas on that, I'm all ears.


I don't know if it counts as a solution, but ionice'ing every big "offender", while giving priority to the UI could help. Unfortunately, nothing like this is being done -- whenever e.g. the apt cache kicks in in the background, everything just becomes unresponsive (it took me a while to find out why I experience sporadic lags). Or, we'd need a saner scheduler...
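For what it's worth, applying ionice by hand is straightforward (the PID and the tar job below are just examples):

# demote an already-running offender to the idle I/O class
sudo ionice -c3 -p 1234        # 1234 = PID of the offending process
# or start a heavy job at the lowest best-effort priority
ionice -c2 -n7 tar czf backup.tar.gz ~/data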

Reply Score: 2

Comment by marcp
by marcp on Thu 25th Apr 2013 23:04 UTC
marcp
Member since:
2007-11-23

They should have changed their slogan from:

"Ubuntu - Linux for human beings!"

to:

"Ubuntu - Linux for the advertisers!"

Also, they are actually making their users' lives so much harder and less secure/safe/private. And last, but not least - they break Free/Libre software rules intentionally. They seem to transform to pure OSS model. They don't need human beings anymore - not in the process of deciding to whom this will go eventually. They need ad clickers and app buyers.

It's all actually a pretty sad view.

Reply Score: 1

RE: Comment by marcp
by Soulbender on Fri 26th Apr 2013 01:40 UTC in reply to "Comment by marcp"
Soulbender Member since:
2005-08-18

Also, they are actually making their users' lives so much harder and less secure/safe/private


Maybe it's you who don't understand what users want.

And last, but not least - they break Free/Libre software rules intentionally.


Really. What rules are those, exactly?

They seem to transform to pure OSS model


I'm sure you meant to say something else, because this would actually be a good thing.

Reply Score: 2

RE: Comment by marcp
by tylerdurden on Fri 26th Apr 2013 06:11 UTC in reply to "Comment by marcp"
tylerdurden Member since:
2009-03-17

It's all actually a pretty sad view.


... have you tried looking out of, and not into, your rectum?

Edited 2013-04-26 06:14 UTC

Reply Score: 2

Wow, one downvote already
by kurkosdr on Fri 26th Apr 2013 09:41 UTC
kurkosdr
Member since:
2011-04-11

What the downvoters don't understand is that for most people, a computer is a means, not an end. They buy a computer to run programs on, like they buy a DVD player to watch movies on. They don't just buy Windows or OS X itself (the UI, the kernel etc), [u]they also buy the means to run the awesome apps ISVs have written for Windows or OS X[/u] ("they buy the ecosystem" in marketdroid-speak).
So, there you have Linux Desktop, killing a significant part of its "ecosystem" by making it hard to impossible for proprietary ISVs to ship software on Ubuntu.

But hey, the people in the key positions in Linux Desktop who made the decision to have an unstable API and ABI weren't planning to use that proprietary software anyway, so it "works for them".

Browser: Mozilla/5.0 (Linux; U; Android 2.3.4; el-gr; LG-P990 Build/GRJ23) AppleWebKit/533.1 (KHTML, like Gecko) Version/4.0 Mobile Safari/533.1 MMS/LG-Android-MMS-V1.0/1.2

Reply Score: 1

RE: Wow, one downvote already
by Alfman on Fri 26th Apr 2013 14:42 UTC in reply to "Wow, one downvote already"
Alfman Member since:
2011-01-28

kurkosdr,

"So, there you have Linux Desktop, killing a significant part of it's 'ecosystem' by making it hard to impossible for proprietary ISVs to ship software on Ubuntu."

"hard to impossible"? Seriously?? And what evidence do you have of that? This is isn't the ipad, it's ubuntu. You can install outside software by downloading the appropriate tarball from a webpage, extracting it, and then running it. Maybe the installation process could be improved for software outside the repo, but face it, it's the same as windows. Heck if you have wine installed you can even run a ton of compatible *windows* software too.

appdb.winehq.org/objectManager.php?sClass=application
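Going back to the tarball route, a hypothetical example of that flow (the URL and archive name are placeholders, not a real package):

wget https://example.com/someapp-1.2-linux-x86_64.tar.gz
tar xzf someapp-1.2-linux-x86_64.tar.gz
cd someapp-1.2 && ./someapp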


"But hey, the people in the key positions in Linux Desktop who made the decision to have an unstable API and ABI weren't planning to use that proprietary software anyway, so it 'works for them'."

None of this affects userspace apps, which is most often what the "walled garden" refers to.

However, even if you wanted to apply it to the kernel, the Linux kernel is still more open than Windows. Not only is the Windows source closed, but MS requires developers to buy code-signing certificates to install their own drivers. That policy basically killed the open-source scene in Windows kernel development, since the certificates cannot be shared and all individual would-be contributors need to buy their own certificates to build kernel modules for their own machines.


You are completely exaggerating the difficulty of running proprietary software on linux. If anything, proprietary software doesn't do well on linux because linux users are wary of closed software, not because there's anything preventing users from installing it.

Reply Score: 3

RE: Wow, one downvote already
by tylerdurden on Fri 26th Apr 2013 17:42 UTC in reply to "Wow, one downvote already"
tylerdurden Member since:
2009-03-17

It's ironic, because your need to disclose the inconsequential details of your browser on your posts seems to indicate you view your computer as an end.


In any case, there is plenty of commercial software being developed and deployed on Linux. Also, you keep using that term "ISV", which does not mean what you want it to mean. Linux has plenty of traction among ISVs, especially in the middle and back ends of computing infrastructures.

Yes, Photoshop may not support Linux anytime soon (if ever). But then again, none of the commercial EDA packages we use on linux are going to be ported over to OSX either. And neither are some of the commercial databases, accounting/personnel/financial middleware, industrial control suites, etc, which run on Linux and not on OSX (or even Windows). Applying your own logic, that must mean OSX is hostile towards ISVs then?

Edited 2013-04-26 17:45 UTC

Reply Score: 3

RE: RE: Wow, one downvote already
by kurkosdr on Fri 26th Apr 2013 19:25 UTC
kurkosdr
Member since:
2011-04-11

It's ironic, because your need to disclose the inconsequential details of your browser on your posts seems to indicate you view your computer as an end.


Excuse me? The (retarded) mobile version of osnews inserts this junk at the bottom by default when I post from my phone.

Browser: Mozilla/5.0 (Linux; U; Android 2.3.4; el-gr; LG-P990 Build/GRJ23) AppleWebKit/533.1 (KHTML, like Gecko) Version/4.0 Mobile Safari/533.1 MMS/LG-Android-MMS-V1.0/1.2

Reply Score: 3

RE[5]: Re: Re: Comments by kurkosdr
by kurkosdr on Fri 26th Apr 2013 19:37 UTC
kurkosdr
Member since:
2011-04-11

Wow, can you please tell me an OS where there are no issues with any software when upgrading between releases?


OS X? Windows when upgrading to SPs? Of course there are always chances something somewhere might break, but Linux Desktop is just ridiculous. Sometimes it's *designed* to break, like when PulseAudio got released, or every time X.org breaks Intel and Nvidia proprietary GPU drivers, and it's practically *designed* to break the upgrade and drop you to a blank screen or CLI.

Anyway, Ubuntu's problem is that even on a clean installation, sometimes old app binaries just don't work, because the API is not stable.

And don't get me started how most distros get upgraded on a breakneck pace. I can do a format and clean install every time a new Windows version rolls out, but every 6 months?

Browser: Mozilla/5.0 (Linux; U; Android 2.3.4; el-gr; LG-P990 Build/GRJ23) AppleWebKit/533.1 (KHTML, like Gecko) Version/4.0 Mobile Safari/533.1 MMS/LG-Android-MMS-V1.0/1.2

Reply Score: 0

Soulbender Member since:
2005-08-18

OS X?

No.

Windows when upgrading to SPs?

No.

but Linux Desktop is just ridiculous.


Ah. Lets look at the products you mentioned.

Steam has only been in Ubuntu for a few months so it hasn't even broken "in the last couple of releases".
Steam is also incredibly buggy even on Windows so it's not surprising that Valve would screw something up at some point in the Linux version. Yes, screwing up a dependency is Valve's fault, not Ubuntu's.

As for VMWare, well, the guy who works there already said it's not Ubuntu's fault if/when it breaks. That's not even mentioning how suspect your described problem is.

Finally we have Skype. Skype is, and always has been, badly engineered to begin with, so I'm not really surprised it breaks now and then. This is the first time it has been broken in a new release though, and I'm sure a fix will be out from Skype or Ubuntu soon, and there's a workaround, so it's not like the sky is falling.

Ubuntu's problem is that even on a clean installation, sometimes old app binaries just don't work, because the API is not stable.


Sometimes old software doesn't work in a new/clean Windows or OSX either. Also, there's no Ubuntu API so I don't know what API you're talking about that's supposedly not stable.

And don't get me started how most distros get upgraded on a breakneck pace


So, uh, don't use something that's obviously not suitable for you? Just because it doesn't suit you doesn't mean there's something wrong with it.

Reply Score: 2

lucas_maximus Member since:
2009-08-18

Steam has only been in Ubuntu for a few months so it hasn't even broken "in the last couple of releases".
Steam is also incredibly buggy even on Windows so it's not surprising that Valve would screw something up at some point in the Linux version. Yes, screwing up a dependency is Valve's fault, not Ubuntu's.


Steam isn't that buggy on Windows; the problem was that Steam expected certain dependencies that happened to be present on an older version of Ubuntu.

Ubuntu broke Steam.

Finally we have Skype. Skype is, and always has been, badly engineered to begin with, so I'm not really surprised it breaks now and then. This is the first time it has been broken in a new release though, and I'm sure a fix will be out from Skype or Ubuntu soon, and there's a workaround, so it's not like the sky is falling.


It probably broke because it was compiled against a certain library that is either in a different place or a different version. The problem is that, unlike Windows or OS X to a lesser degree, libraries in the system change.

Getting Spotify working on Fedora required me to symlink libraries, and it still wasn't stable.
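The usual workaround of that era looked something like this (the library name and path are illustrative, not necessarily the ones involved here):

sudo ln -s /usr/lib64/libudev.so.1 /usr/lib64/libudev.so.0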

Sometimes old software doesn't work in a new/clean Windows or OSX either.


Doesn't happen nearly as often, which is the core of the argument. Yes, there are workarounds, but you just don't have to do it on Windows.

Reply Score: 3

Soulbender Member since:
2005-08-18

Steam isn't that buggy on Windows


TheDailyWTF tells a different story.

Ubuntu broke Steam.


No. Things changed and Valve wasn't on the ball. Ubuntu is their only officially supported Linux distro so we should really expect them to handle stuff like this.

Doesn't happen nearly as often. Which is the core of the argument


I can't remember when a program I use broke between Ubuntu versions and I've been using Ubuntu a long time.
Of course, I can't remember the last time it happened in Windows either.

Reply Score: 2

lucas_maximus Member since:
2009-08-18

TheDailyWTF tells a different story.


Oh, come on... most of those screenshots were actually content update errors from the Content Management System on their webstore. So their webstore had errors, not the application itself.

There were 2 actual application errors.

No. Things changed and Valve wasn't on the ball. Ubuntu is their only officially supported Linux distro so we should really expect them to handle stuff like this.


Well, we'll have to differ in opinion on that one. I don't have to worry about Steam breaking with Windows Updates.

I can't remember when a program I use broke between Ubuntu versions and I've been using Ubuntu a long time.
Of course, I can't remember the last time it happened in Windows either.


If it is open-source I doubt it will break, but for anything outside the supported repos, you are on your own.

Edited 2013-04-28 09:24 UTC

Reply Score: 3

Soulbender Member since:
2005-08-18

I don't have to worry about Steam breaking with Windows Updates.


Perhaps, but let's say, for argument's sake, that a Service Pack changed some functionality and broke Steam. I'm sure you would expect Valve to handle that, not Microsoft.

Reply Score: 2

lucas_maximus Member since:
2009-08-18

Perhaps, but let's say, for argument's sake, that a Service Pack changed some functionality and broke Steam. I'm sure you would expect Valve to handle that, not Microsoft.


Depends whether it was documented functionality or not.

Reply Score: 2

lucas_maximus Member since:
2009-08-18

And don't get me started how most distros get upgraded on a breakneck pace.


Only if you choose a distro with such an upgrade cycle.

Slackware is yearly, CentOS and Scientific Linux are far longer.

Also, I have upgraded Fedora quite a few times between versions and most of it has been okay. While it isn't Windows levels of support... it isn't quite as terrible as you are making out.

Reply Score: 3

Update to Ubuntu 13.04 guide
by maknesiumblog on Sun 28th Apr 2013 06:17 UTC
maknesiumblog
Member since:
2013-04-27

Hi,
I wrote a little step-by-step guide with screenshots that illustrates the upgrade process from Ubuntu 12.10 Quantal Quetzal to 13.04 Raring Ringtail very precisely.
I hope this screenshot guide is of use to those looking to upgrade to the latest version of Ubuntu.

I am looking forward to your feedback!

http://www.maknesium.de/upgrade-to-ubuntu-13-04-raring-ringtail-in-...

Reply Score: 1

RE: Update to Ubuntu 13.04 guide
by Kochise on Sun 28th Apr 2013 12:41 UTC in reply to "Update to Ubuntu 13.04 guide"
Kochise Member since:
2006-03-03

You mean, it's not as straightforward as "click next, click next, accept EULA, click next" like in Windows?

Kochise

Reply Score: 2