

This just treats the symptoms of a wider problem: Linux distributions don't have a sane way to install applications that aren't part of the core distribution.
The next question is how long you will keep providing these 'backports' (because that's all this is), for how many releases, and who is going to quality-check them. It just isn't scalable.
This just sounds like rearranging some deckchairs to be honest.
Edited 2010-11-24 18:13 UTC
So you're telling me the Windows way of maintaining 3rd party programs is BETTER than the repository method used by SUSE, Red Hat, Debian, Ubuntu, etc.?
That sir is Moo - as in Udder Nonsense.
What's your definition of maintaining? I think he's referring to straight-up installation. You want the new Firefox? Go to firefox.org and download it. You don't have to wait for your flavor of distro to roll it out via the repo. And you don't often have to worry that any prereqs for the app are going to break your other apps, just as it's rare for a Windows Update to break things. Updates and service packs might break custom or in-house enterprise apps, but I don't think it's an issue most general users face.
If you want Chrome on Ubuntu, for example, you can get the .deb from Google, double-click on it and it will install and automatically hook up with apt so that you get automatic updates (unlike on Windows, where a lot of apps have their own update application - I have about 5 running now: "Adobe Updater", "Apple Software Update", etc.). Is that really much more difficult than running an installer on Windows?
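For the curious, the mechanics are roughly the following (the exact file name and repository URL are from memory, so treat them as approximate rather than gospel):

    # Download and install the .deb Google provides (filename/URL approximate)
    wget https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb
    sudo dpkg -i google-chrome-stable_current_amd64.deb

    # The package's maintainer scripts drop an apt source entry, something like:
    #   /etc/apt/sources.list.d/google-chrome.list
    #   deb http://dl.google.com/linux/chrome/deb/ stable main

    # From then on, Chrome updates arrive through the normal channel:
    sudo apt-get update && sudo apt-get upgrade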
Mozilla prefers to have the distros package their own builds, but they could have it working in a similar way if they wanted to; you can't blame Linux for the fact that they don't.
1. Developers have to provide backports for every single release. When they can't be bothered you're out of luck.
2. It's Ubuntu specific.
3. There's a reason why the update menu item in Firefox was disabled on Linux when it wasn't on Windows and Mac OS. It just looks stupid.
"1. Developers have to provide backports for every single release. When they can't be bothered you're out of luck.
2. It's Ubuntu specific.
3. There's a reason why the update menu item in Firefox was disabled on Linux when it wasn't on Windows and Mac OS. It just looks stupid."
1. Developers have to provide "backports" for Windows XP, 2003, Vista and the various editions of them too. It's the same thing for Linux: just choose the oldest release you want to support and make packages for it, then fix any problems when installing them on newer releases. If you require any newer libraries on Linux they can be bundled.
2. Ubuntu was just an example; Fedora and openSUSE work in a similar manner. Covering them is probably enough.
3. As I said, they should hook it up with the built-in package management; there is no need to provide an included update application when the OS already provides the required functionality. The technology is there, if they choose to use it.
Well, let's look at the number of applications that you can install on Windows, and the fact that you can install OpenOffice on a nine-year-old OS like Windows XP but not on a nine-year-old Linux distribution.
The only thing that Windows lacks is a means for applications to have their own update repositories and systems.
Well you can call me sir all you like, but no it isn't, unless of course your goal is to make it as difficult as possible for users to get updated versions of applications and for developers to get those applications to users.
The reason why Ubuntu is looking at this approach of providing continually updated applications is because the 'better' way you describe really isn't.
Edited 2010-11-25 13:25 UTC
"Well let's look at the amount of applications that you can install on Windows and the fact that you can install Open Office on a nine year old OS in Windows XP but not on a nine year old Linux distribution.
The only thing that Windows lacks is a means for applications to have their own update repositories and systems."
You forgot that Windows development pretty much stagnated for some years before MS released Vista in late 2006. And Vista sucked. Hence a lot of people still use XP. How many people do you think are still using a nine-year-old Linux distro on a desktop?
And besides, XP has received 3 service packs and hundreds of updates over the years. Do you think a pre-SP1 XP from 9 years ago would actually run modern software?
"Well you can call me sir all you like, but no it isn't, unless of course your goal is to make it as difficult as possible for users to get updated versions of applications and for developers to get those applications to users.
The reason why Ubuntu is looking at this approach of providing continually updated applications is because the 'better' way you describe really isn't."
I disagree. Having a central package management system is far superior to having 100 different installer and updater applications. How is that "as difficult as possible"?
The problem is that a lot of developers leave it to the distros to make packages rather than making their own.
Edited 2010-11-25 18:38 UTC
People still use XP because there is a large, stagnant installed base, especially in businesses, who can't and won't upgrade as fast as some think they should. A nine year old Linux distribution should be so lucky, and the reason why people don't use one that old now is simply because few want to run Linux.
The rest of your comment consists of unrelated criticisms of XP and I don't see them as relevant.
Because people want to upgrade their applications, get new versions and actually keep their system relevant to the work that they want to do. Upgrading it every six months to do that is just plain stupid, hence why Ubuntu is looking at doing this.
Unfortunately, trying to do it through a central repository system is a gross duplication of manpower and resources where they will have to backport to each and every single release and there will inevitably be a delay until new applications appear. They'll also have to work out how long they will provide backports for. Most upstream developers refuse to support many distribution packages as well.
It's swings and roundabouts, pros and cons, and simply making a sweeping statement that a central repository system is the best way is just nonsense. It isn't. There are just glaring disadvantages that people paint over.
Chicken and egg. There is no sane installation and configuration system for third-party software in any Linux distribution that doesn't interfere with the distribution itself. A package management system isn't enough. When someone has come up with one they have been consistently told that they're stupid.
Look at how easy it is to configure MySQL through a configuration wizard on Windows versus the hoops you jump through when you install on Linux. That's just the tip of the iceberg.
You might think that, but I'm afraid a sign of success is that you have a lot of people using a wide variety of older systems. The reason why no one runs a nine-year-old Linux system, on their desktop anyway, is simply that few people run Linux on the desktop at all.
I'm sure there are nine year old Linux server systems that haven't been upgraded simply because they'll be running applications quite happily, and there isn't the time or the urge to mess with them because someone on OSNews thinks they should be upgrading continually.
It doesn't work like that.
How is mentioning the fact that MS has maintained and updated XP over the years "unrelated criticism"?
That's the actual problem.
I see more advantages than problems.
"Look at how easy it is to configure MySQL through a configuration wizard on Windows versus the hoops you jump through when you install on Linux. That's just the tip of the iceberg."
Yes there is. Packages can trigger post-installation processes, including starting a wizard or whatever. If you install the Dropbox client on Ubuntu, for example, it will ask you to restart Nautilus and to start the client. When you do, it detects that it's running for the first time and shows a wizard that lets the user configure it, just as on Windows. MySQL could do the same thing. The technology is there, it's just up to developers to use it.
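To illustrate, here is a generic sketch of a Debian maintainer script for a hypothetical package "myapp" - not Dropbox's or MySQL's actual postinst:

    #!/bin/sh
    # debian/postinst -- runs automatically after the package is unpacked
    set -e

    case "$1" in
        configure)
            # Seed a default config on first install so the app can offer a
            # first-run wizard when the user launches it (hypothetical paths).
            if [ ! -f /etc/myapp/myapp.conf ]; then
                cp /usr/share/myapp/myapp.conf.default /etc/myapp/myapp.conf
            fi
            ;;
    esac

    #DEBHELPER#

    exit 0

Anything genuinely interactive belongs in debconf or in the application's own first-run check, which is what the Dropbox client described above does.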
Edited 2010-11-26 17:44 UTC
Well, actually, OpenOffice 3 runs on Win2k SP2 or greater. But yes, just about everything runs on XP SP2, and SP2 came out in 2004.
"I disagree. Having a central package management system is far superior to having 100 different installer and updater applications."
You can have a single installation and updating service without having a bunch of OS/application interdependencies. See the iPhone as an example.
"The problem is that a lot of developers leave it to the distros to make packages rather than making their own."
Oh, so it is the fault of developers now. Is it the fault of developers when a distro breaks a program with a system update? Leaving it to distro package managers is how (open source) developers deal with the mess. Some don't have the time and others simply don't want to waste it testing and packaging.
2000 was until recently still being maintained and updated. XP still is, and it's still used a lot. Nine-year-old Linux distributions aren't in most cases. That's the difference. But technically you could probably make OO 3 run on an old Linux system; there is just no point in doing so.
You could have that in Linux too. Compile static binaries or bundle required libraries. Or use something like Java. The .tar.gz distributed by Mozilla runs on pretty much any Linux system with no need to install dependencies.
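As a rough sketch of the bundling approach (the app name and paths here are made up, but Mozilla's official builds ship a launcher in the same spirit): put the bundled .so files in a lib/ directory next to the binary and start it through a small wrapper:

    #!/bin/sh
    # launcher for a hypothetical self-contained app unpacked from a tarball
    APPDIR="$(dirname "$(readlink -f "$0")")"

    # Make the dynamic loader prefer the bundled libraries over system ones
    LD_LIBRARY_PATH="$APPDIR/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
    export LD_LIBRARY_PATH

    exec "$APPDIR/myapp-bin" "$@"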
Did I say that? Where?
Anyway, if you're targeting a stable release series chances are slim that an update will break the application.
"2000 was until recently still being maintained and updated. XP still is, and it's still used a lot. Nine-year-old Linux distributions aren't in most cases. That's the difference. But technically you could probably make OO 3 run on an old Linux system; there is just no point in doing so."
It has nothing to do with being maintained and updated, it's called a stable interface. Of course you can get OO 3 to run on an older system but it requires far more steps than just going click-click-click. Why does everyone have a hard time admitting this is a problem? Forget even comparing Linux to Windows. Both FreeBSD and OSX are much better at maintaining binary compatibility.
"You could have that in Linux too. Compile static binaries or bundle required libraries."
It's a PITA compared to Windows and OSX and there are unstable components of the system that cannot be added to the package. Ubuntu broke some statically compiled games by screwing with the sound API. What were developers supposed to do in that case? Distributing software outside the package management system is a major annoyance and has held Linux back.
"Anyway, if you're targeting a stable release series chances are slim that an update will break the application."
Someone in this thread already pointed out how you can't install Postgresql 9 on 8.04 through the packaging system. 8.04 LTS came out in 2008. Is this acceptable to you? Requiring a major system update to install a freaking command line database program?
Note that this was filed on 2008-10-14:
Please backport OpenOffice.org 3 to Hardy
https://bugs.launchpad.net/hardy-backports/+bug/283137
"It's a PITA compared to Windows and OSX and there are unstable components of the system that cannot be added to the package. Ubuntu broke some statically compiled games by screwing with the sound API. What were developers supposed to do in that case? Distributing software outside the package management system is a major annoyance and has held Linux back."
"Someone in this thread already pointed out how you can't install Postgresql 9 on 8.04 through the packaging system. 8.04 LTS came out in 2008. Is this acceptable to you? Requiring a major system update to install a freaking command line database program?"
Note that this was filed on 2008-10-14:
Please backport OpenOffice.org 3 to Hardy
https://bugs.launchpad.net/hardy-backports/+bug/283137
I didn't say the situation today is great. Only that a central package management system is better than 100 different installer and updater apps. And of course this would work better if Linux distros provided stable interfaces, but that's beside the point.
Developers could provide packages that work across different versions and fix them when things change, but in most cases they leave it up to the distros to make their own packages. Hence the current situation is not because of the way Linux distros handle software installations.
And what would be the better alternative? Like Windows, where there are 5 billion installer systems, few companies provide automatic updates, and those who do all use their own update system?
Read what Ubuntu is proposing here and you'll realise that the status quo is not acceptable. They know there is a massive problem where applications are tied to a particular distribution version, meaning that you need to upgrade every six months if you want to get a new application version. That is just plain stupid.
The Windows system at least provides the means to install a wide variety of applications, and you can install OpenOffice on a nine-year-old system like XP, which you can't do with any Linux distribution of that age. What it lacks is a sane update and remote installation mechanism, but that's because software installation on Windows has existed for a very long time.
Scale this up for Ubuntu and they're going to have to maintain an extremely long list of backports, the quality of which will inevitably be compromised. Where applications are concerned that just shows you that it's the developers and users who should be responsible for maintaining and installing the software that they want to use.
Eventually they will end up realising this, after another ten years maybe, but until they do they'll have to jump through hoops such as deciding just where they will draw the line as to what they will update in a distribution and what should be kept static.
Edited 2010-11-25 13:40 UTC
- they vary in flexibility and functionality (e.g. some operate via scripts, some can install the application on a per-component basis) but all of them basically do the same thing: extract the application files to the installation folder and set some registry keys
- the underlying system (as far as applications and installers are concerned) has been for all intents and purposes the same (thus, a unified platform compatible with itself across releases) for over a decade - thus allowing nearly any combination of <<arbitrary application for "Windows">> and <<arbitrary Windows version>> to work
Scenario: You are presented with two systems with equivalent functionality: Office suite, multimedia, graphics applications (such as photo management, raster & vector graphics editors), CD/DVD burner, Internet suite (email, IM, browser), PIM, etc
If you have one which is a Windows system that is "stale" ... hasn't been touched in a couple of years ... and the other a Linux system which is also stale (also hasn't been touched in a couple of years) ... and you are asked to bring them both right up to date without losing any user data ... it is absolutely a given that the Linux system will be done in less than a quarter of the time of the Windows system.
That is a reason to care.
Edited 2010-11-26 02:47 UTC
The app bundles of Mac OS X are, in my opinion, the best compromise to date for installing new software easily and removing it without leaving junk in system folders.
Updating is an issue, though. Apple should provide a standard update procedure, allowing third-party software to be updated along with system packages through the Apple Software Updater tool.
But looking at how much they care about their users keeping their systems up to date, I wonder if this is going to happen before the only way to install and update software on a Mac is via a "Mac Store" using Apple's repositories, effectively replacing ASU with something that can update all software.
Edited 2010-11-25 20:28 UTC
Superficially the OS X system looks great... until people actually start developing real applications on OS X that have shared, logical dependencies on one another.
It is based on the assumption that most non-system applications, developed independently from each other, don't have such dependencies on each other. That Photoshop does not depend on Office to work, that Pro Tools does not depend on Corel Painter, etc...
This assumption works very well on the desktop, as long as we don't have a lot of apps depending on several hundred MB of common non-system libs, in which case sharing is best.
So...
-The set of system libraries must address all common needs
-Developers must use it.
Sounds reasonable to me.
Edited 2010-11-26 05:54 UTC
Just silly.
Hold on a second. Nobody has to reinstall "Linux" every six months to have their applications upgraded. They have to reinstall Ubuntu and that's quite a difference.
My old desktop has Debian Sid running from the same install since 2004, survived several parts replacements and it keeps going without missing a beat. My laptop has been running Sid since 2007 without any problems either.
I am now running Sid on my newest quad core desktop since two months ago and it flies on it. Something tells me that it is going to be there forever, too!
And I am increasingly looking towards Arch to satisfy my urge to stay on top of the latest KDE improvements before everybody else knowing that I won't have to reinstall anything at all and the fact that Arch does not stray too far away from upstream - if at all - just sweetens the deal.
That people have to put up with Ubuntu's weaknesses because they don't know better is one thing. But do not lump all the Linuxes together with Ubuntu just because one does not want to look elsewhere.
I am really hopeful that Canonical can pull this off, though. They already have the blueprints (hint, hint, nudge, nudge) so it is just a matter of following them...

Edited 2010-11-26 12:21 UTC
Bogus; the Linux software management system is *amazing* and far better than any of the comparable alternatives. yum / apt / zypper are all good tools. More projects should use the OBS; then they can offer one-click installs that automatically add the repo and install the packages. This works VERY well. Check out installing MonoDevelop and Banshee as good examples.
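And even without the one-click route it's only a couple of commands to add an OBS repo by hand (the repository URL below is indicative only - the real path depends on the project and the openSUSE version):

    # Add a Build Service repository, refresh, and install from it
    sudo zypper addrepo http://download.opensuse.org/repositories/Banshee/openSUSE_11.3/ banshee
    sudo zypper refresh
    sudo zypper install banshee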
Of course Ubuntu is the worst for this because it is more of a fork than a distro - but it still works reasonably well. Switch to openSUSE for a distribution "for humans who need to get work done".

It does, somewhat. Stability means finding a combination of packages that works, and sticking with it. And typically, that means that those concerned with stability tend to be one or two releases out of date - upgrading only once the version they're on is end-of-lifed by the vendor.
"It does, somewhat. Stability means finding a combination of packages that works, and sticking with it. And typically, that means that those concerned with stability tend to be one or two releases out of date - upgrading only once the version they're on is end-of-lifed by the vendor."
I think what fepede was getting at was that it's not a one-to-one relationship. IOW, most (all?) stable software is "old", but not all "old" software is stable.
If they would like to go down the path of rolling releases, or just more frequent updates to packages such as Firefox and LibreOffice, then they should implement delta updates with a fallback to normal packages (in the event that the delta cannot be applied for some reason).
Currently the smallest change to the Kernel or OpenOffice can mean that there is over 50MB to download.
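Debian already has a tool in this space, debdelta, which fetches binary diffs and rebuilds the full .deb locally, falling back to a normal download when no delta exists. Roughly (command names from memory, so double-check before relying on them):

    # Fetch deltas for pending upgrades, rebuild the .debs, then upgrade as usual
    sudo apt-get install debdelta
    sudo debdelta-upgrade      # downloads deltas and reconstructs full packages in apt's cache
    sudo apt-get upgrade       # anything without a delta is downloaded in full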
Edited 2010-11-24 18:36 UTC
I guess I'd better upgrade from Woody then.
But serially, folks, this is a good move. One thing my son really disliked about Linux was how long he had to wait for the latest Firefox or OpenOffice. When Firefox releases a version with a 10x improvement in javascript performance, you don't want to wait 6 months for it. I know you can manually do some of this, but this defeats the purpose of the repository concept.
I assume there will still be an LTS release (that does not do this) for the millions of enterprise customers...
It doesn't. That's the central point behind the central repository model - that you can control what applications get installed.
Unfortunately, that means that there is still a relatively long lead time between a release and when it appears in a new distribution or in umpteen backport repositories - as opposed to developers shipping one sane installation package, which they can support, to users as soon as a new release happens.
Edited 2010-11-25 23:29 UTC
This is a sensible approach AFAICT - and very close to my idea of a multi-tier distribution.
1) Stay rock-solid in regard to the toolchain (GCC, glibc, binutils and such) - updates every 18 months, except for critical bugfixes and security issues
2) A 6- or 12-month update cycle for lower-level libraries above the toolchain (the kernel (not part of the toolchain), GTK+/Qt, WebKit and such) - of course with the usual quick releases when fixing critical bugs and/or security vulnerabilities
3) End user applications and libraries specific to these applications (bygfoot, firefox, libreoffice, chromium, epiphany blahblahblah) can be updated at will.
Another option is to make sure that API and ABI are not broken when releasing new versions. In this regard the *BSD's tend to be much ahead of GNU/Linux.
Agreed. A step in the right direction would be a central "repository" for patches used by the different distributions. Today it's a true PITA to search for all the patches used by different distributions, comb them for "useless" (read: distribution specific) patches and apply those which one can use.
LSB is kind of a step in the right direction, but then again - it's really not.
True. And it's quite annoying that API and ABI are constantly broken. ABI breakage can be fixed through recompiling and/or relinking. API breakage is much worse.
I've played a lot with LFS, BLFS and recently CLFS (and CBLFS) in an attempt to create a multi-tier distribution, but it's painful to maintain. Of course, with me deviating a lot from the Linux FHS, it's kind of my own fault.
I've tried applying different package management systems, but neither .deb nor .rpm seems to be adequate, and attempting to use slapt-get creates a dependency on a rather old tar package. I like the latter approach though, since the extended Slackware format is less obnoxious than .deb or .rpm.
This is the way it should be done, for a desktop distro, and Red Hat and the RHEL clones do this already, except in a very extreme way.
The kernel is locked to a version, and RH backports improvements to keep the API/ABI stable. I haven't paid as much attention to the toolchain, but I believe they follow a similar philosophy. Applications get updated with patches and revisions, but maybe not version numbers. Once again, I haven't looked at the applications that closely.
What you could do is use the RHEL kernel and toolchain as a base, and create a repo which provides source-built software. Maybe have something like pacman and yaourt, like Arch Linux has.
Edited 2010-11-26 17:38 UTC
As long as all the proper testing is done before releasing the software for update.
I like incremental updates. Both Mac OS X and Windows 7 do that - and you have the option to NOT update the apps or system components if you would rather wait and see all the posts in forums the next day from the people who installed the updates then ran into trouble.
[edit]
Also, when I ran Ubuntu I tended to go out and get the latest versions of software (non-Ubuntu repo) and install it manually anyway. This way people would be less tempted to do that and not risk messing something else up with shared libraries or something. Tho' in the case of Firefox I'd install into my $HOME directory and just set my path and start icon[s] accordingly.
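For anyone wanting to do the same, the $HOME install is roughly this (version number and file name are only illustrative):

    # Unpack Mozilla's generic Linux tarball somewhere under $HOME
    mkdir -p ~/opt
    tar -xjf firefox-3.6.12.tar.bz2 -C ~/opt

    # Put it first in PATH for this user (add the line to ~/.profile to keep it)
    export PATH="$HOME/opt/firefox:$PATH"

    # Then point the menu/panel launcher at ~/opt/firefox/firefox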
Edited 2010-11-24 18:41 UTC
Which means you never saw the following in the theatre:
http://www.imdb.com/video/screenplay/vi2750742809/
It was the saddest and most boring movie I ever saw. I haven't seen it in 28 years, but I'm pretty sure I don't want to again, without the ability to fast forward or stop. But I don't know maybe your unicorn-itis will be cured, or sated, or embiggened.
I've always been a fan of ArchLinux and its sane rolling system. Ubuntu should do this, IMHO. They could release a snapshot every 6 months to supply clean installations.
(And, yes, by that it could be the end of the ridiculous out-of-nowhere animal names and weird inflated release versions)
It seems to me that Canonical is trying to flee from quality issues in their latest releases. Using PCLinuxOS atm, I can only imagine how demanding it must be to continue the rolling release method without breaking anything. Canonical, looking for a similar release scheme, has to improve their QA drastically IMO.
I've said it before, and I'll keep on saying it until some Linux distro clues into it: you need to create a clear separation between "base OS" and "user apps". And they need to be developed separately, but in tandem.
Windows does this.
All of the BSDs do this.
MacOS X does this.
It's only the Linux distros that don't.
You can install Windows XP today, and run the latest version of Firefox on it. Or the latest version of OpenOffice.org. Or the latest version (with a few exceptions) of AppX.
Same with the BSDs. You can install version Y from 3 years ago, and still install the latest (with a few exceptions) version of AppX.
Same with MacOS X.
But it's almost impossible to do that with a Linux distro. Want the latest Adobe Flash 10.1? You need to upgrade GTK+, which means you have to upgrade glib, which means you have to upgrade half your installed packages.
Want to install the latest Firefox? You have to wait for your distro to include it, then upgrade a bunch of inter-related packages. Or download it from Mozilla, and have it poorly integrated.
We're fighting with this right now with Debian. Even on our 5.0 (Lenny) boxes, we're stuck with Flash 10.0 due to the GTK requirement being higher than what's available in the Lenny repos (no, we're not going to install GTK from the backports repo, as that requires upgrading some 100+ packages). And we're stuck with 9.0 on our Etch boxes for the same reason.
But, I can install Adobe Flash 10.1 on FreeBSD 7.0, released how many years ago? And on Windows XP, over a decade old. Without having to upgrade half of the installed OS.
Repos are good for package management. But the same repo shouldn't be used for the core OS and the user apps.
Indeed.
I like the MacOS X / BeOS way of dragging a file to a location to install it; removing the file removes the software (well, most of it - there can still be configuration files left behind).
Windows is worse in some respects because records in the registry get left behind when removing software. Also, a lot of software providers think that it is an excellent idea to have their separate update notification programs load automatically upon login and stay loaded. However, your point remains true: it's easy to install programs in Windows without issue compared to Linux distributions if the software you want is not in the repositories.
There would not be so much of a problem in Linux distributions if the community actually agreed on and stuck to a set of standards. The distributions could use their own internal format, like DEB for Debian and Ubuntu or RPM for SuSE and Fedora, but they should still be able to install, for example, a PBI file if that were the agreed standard.
OpenSUSE has been doing this for some time now. The build service allows for the creation of separate repos containing software that is automatically built against specific versions of the underlying OS.
I've been easily able to stay synched with the latest versions of KDE, firefox and OOo etc. by adding the appropriate repos, without having to worry about the underlying OS breaking.
Although the build service was designed to support additional distros such as Fedora, Debian and Ubuntu, I thought that the other distros were picking up on this as well... Isn't that what the Ubuntu PPAs are all about? Honestly don't know, I don't really follow Ubuntu.
The biggest problem I've always had with the Linux world is the enormous number of distributions using custom patches that unleash hell onto the end user and the upstream project being inundated with bug submissions caused by those stupid patches.
Could we please have distributions stick to vanilla builds that don't use insane optimisations? I'm not really asking for much - stick to the pure source provided by the upstream project and when it comes to compiling I'm quite happy with the distribution sticking with -Os instead of the alphabet soup of tweaks and optimisations afterwards.
We're in this situation because with each patch and each optimisation the distributions drift further apart in terms of compatibility. A bug that might appear in one distribution doesn't appear in another, a problem with one distribution might not appear in another, thus making upstream projects like Chrome, Firefox and so on pull their hair out dealing with distribution-specific bugs.
I've always been tempted to, one day, just create such a distribution that doesn't attempt to be exciting but merely a stable platform that gives end users what they need but without all the patching and fanfare that sometimes occurs by distributions.
I can't mod you up, so I'll just say: Amen, Reverend.
Particularly the parts about stupid patches and sticking to vanilla builds.
Oh yeah, and the part about merely being a stable platform instead of exciting.
Well, really - all of it.
EDIT: Missed the part about your thoughts on creating your own distribution (part of merely being stable). Been there, tried it. Might try again, if anybody's with me *nudge nudge wink wink*
Edited 2010-11-25 03:24 UTC
Obviously
I can't remember the last time gentoo was "exciting". All the odd stuff happens in other places mostly - except for assogiate :p
Of course one can make gentoo very "exciting" - but you can also make it very stable and bland. I'm doing the latter, though I admit to be running Compiz.
Oh I probably shouldn't forget paludis, but it's not in "stable" so I haven't switched to it. I almost never go past "stable" when it comes to system-stuff.
Finally, something Ubuntu has done recently that I agree with. It seems that for every release the last few years, Ubuntu has made some annoying, stupid, or outright braindead decisions with seemingly no real/good reason. They have lost me as a result of putting forward crazy ideas that come off as sounding like they just want to dumb their distro down in the ways Microsoft and Apple have done with pretty much every version of their own systems since... well, I lost track, it's been so many versions ago.
But I can't deny, this is a step in the right direction for those who would like to keep up to date without... eh, updating the whole OS every six months, fetching third-party packages constantly, dealing with many third-party repositories, or even worse--having to go the trial-and-error route of building from source and hoping it works (hint: more than half the time, in my experience, it doesn't).
This was always one of the benefits of Windows, which I somewhat miss. Sure, its "software management" if you can even call it that is a joke, but it works, and it allows you to download an installer of a program the second a developer releases a new version on their site and install it. If you don't like it--no problem, keep the old version's installer and revert. Linux simplifies package management by centralizing it, but it's horribly limiting when you're pretty much forced to stick to the same old, often-outdated packages that came out with the version of the distro that is running.
In reality though, this is not surprising given that Ubuntu already made this change for Firefox not too long ago. I was glad when they did that, and this really seems like the next logical step to continue what they started with the Firefox exception. Hopefully this is done well enough that an Ubuntu version can be chosen in the future, and if it works well enough, then only basic applications can be upgraded to newer versions, not compromising the stability of the underlying OS. And if a version doesn't work very well... a slightly older version could be used but with common programs updated.
As a wholehearted user and lover of Debian Sid, I see this latest move from Ubuntu as something mostly positive, except that I don't really see the point as far as their current user base is concerned.
I mean, you see these people bitching and moaning everywhere on the Internet that Ubuntu is slightly behind - one or two versions - for some popular software and that it is a bitch to reload everything every six months, at the same time that they sing praises to the fact that each iteration is more or less not a moving target and therefore they don't have to update it as often as, say, Sid or Arch users.
I've recently set up Linux Mint (the regular Ubuntu-based one, not the Debian-ish flavor) on my brother's machine - he was somewhat wary initially but fell in love with it once I installed MediaTomb for his PS3 and a few other goodies - and after having tried it for a while, the one thing that stood out for him was a remark as to why I put up with so many updates on my machine whereas his only gets them every now and then.
I did point out the advantages of having patches and new features available on a nearly daily basis, but it was clear that I was speaking gibberish to his ears. These people simply don't see the point in updating their systems - hence the large number of users that disable the automatic updates on Windows despite every warning not to do it - and strangely appear to be averse to the idea.
Furthermore, it is clear, at least to me, that at this point in time the QA work done by Canonical is nowhere near the minimum acceptable for such a thing to work properly, and they tend to patch things unnecessarily. A LOT! To such an extent that their distro usually does not survive an upgrade - and it is not even recommended - despite its Debian roots. Even a distro like Debian, which has an overwhelming number of developers, will let a few annoying things slip through the cracks every now and then, so I have a heck of a hard time seeing how Canonical expects to pull this off.
Experienced users that want a rolling release lifestyle would probably look elsewhere anyways.
I am seriously looking forward to seeing how this will work out.
Edited 2010-11-25 11:55 UTC
Near enough this one.
http://www.tmrepository.com/trademarks/linuxforgrandmas/
PostgreSQL 9.0 is better in every way compared to PostgreSQL 8.4, but you can only get it in 10.04 and 10.10, and that's through a PPA. Want to have the latest, greatest, and stable PostgreSQL on your 8.04 LTS server? No such luck.
Same goes for Python. Python 2.6 is better in every way compared to Python 2.5, and can run all Python 2.5 code without any problem. But you can't have it in 8.04 LTS, unless you compile it yourself.
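For comparison, on a release that does have a PPA carrying the newer packages, it's only this (the PPA name below is a placeholder, not necessarily the one that actually ships PostgreSQL 9.0):

    # Add a PPA and install the newer package from it
    sudo add-apt-repository ppa:example/postgresql
    sudo apt-get update
    sudo apt-get install postgresql-9.0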
I hope Mark is not removing the ridiculous policy for desktop applications only.
I see I was modded down for that comment.
I may have to do a blog post that shows how LTS is a joke compared to FreeBSD.
Maybe I'll also point out how Linux had some serious kernel exploits this year that left a lot of websites hacked. FreeBSD has a history of being more secure and stable, that is a fact that can be buried but not denied.
Nice idea. I really hate the fact that I have to stick with the not-at-all-fast Firefox 3.6 for another 5 months. Of course there is Mozilla's PPA, but it is buggy as hell.
If only someone could solve the other problem of deb-style package management: the lack of incremental updates. Why does at least 100MB have to be downloaded every time I type 'sudo aptitude safe-upgrade'? That generates lots of excess traffic, which can be quite a nuisance on a wireless connection.
They can't get upgrades to work. They break systems on each upgrade, yet they want us to believe they'll be able to backport apps (for how long?) and keep everyone happy.
If only they could guarantee that they can provide non-breaking upgrades, this wouldn't be much of an issue. I would gladly wait for the next release if I knew I wouldn't have to re-install because the upgrade won't work.