KDE is working on some interesting stuff. Wayland support, for one, but they’re also going to work on the frameworks for… KDE 5.0. Yes, I’d say KDE 4 is still far, far from done, but 5.0 is already on the horizon. This time around, though, they’re not going to pull a KDE 4.0, since this is mostly going to be about lower-level changes.
Min-KWin the core of KDE 5.0?
“According to a new report, there are as many as 6,000 references to Min-KWin in an internal KDE 5.0 repo. This may provide more clues and validation of KDE’s Wayland plans for the coming K Desktop Environment release.”
Dear KDE developers.
Please make KDE 5 reliable. I wasn’t able to use any version of KDE 4.x, because even 4.7 has memory leaks and consumes 100% of one of my cores.
Take your time, do what you have to do, but please don’t let users down. I know you can, and I trust you.
Please install valgrind and the faulty application’s debug symbols (and those of glibc, kdelibs and Qt). Then run a leak check and report the result. This should be quite straightforward to fix with a proper bug report. You can also use callgrind, part of valgrind, to produce the equivalent for the 100% CPU problem.
Bugs with that kind of information usually get fixed quite fast. I submitted a bug to Xorg a few weeks ago and it was fixed within 24 hours. These bugs happen, and you’re probably not the only one, but it does not happen to everybody, so if none of the devs can reproduce it, none of the devs can fix it. But with a complete valgrind memory leak report and callgrind output, the problem will be easy to spot.
I know generating that kind of data is quite challenging for a normal user (but quite rewarding too), but you know, it’s open source, someone has to do it.
*Some applications need to be called with --nofork to prevent them from going into the background; this is needed to produce a good valgrind output.
**The valgrind manpage is quite interesting, but to make a long story short: --leak-check=full --track-origins=yes --show-reachable=yes
***For callgrind: --tool=callgrind
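Putting the footnotes together, an invocation might look something like this (kmail is only an example target, substitute the misbehaving application; this assumes its debug symbol packages are installed):

    # leak check; valgrind's report goes to stderr, so capture it
    valgrind --leak-check=full --track-origins=yes --show-reachable=yes \
        kmail --nofork 2> kmail-valgrind.log

    # CPU profile for the 100% CPU case; writes callgrind.out.<pid>,
    # which can be browsed with kcachegrind
    valgrind --tool=callgrind kmail --nofork

Attach the resulting log / callgrind.out file to the bug report.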
In order for the OP to do this, there would actually have to be a real bug.
Are you saying KDE 4.7 *doesn’t* have bugs which cause one’s system to grind to a halt? That would be at odds with my experience.
Any time I’ve tried KDE4 as a whole I experience occasional freezes and lockups, plus excessive CPU usage, requiring me to occasionally kill various processes (plasma stuff mostly). I could provide a ton of anecdotal evidence about how e.g. running Kmail can cause runaway CPU usage.
It’s not my hardware and it’s not my imagination; it’s KDE, and I say this as a fan of KDE. I can’t even begin to debug whatever it is that’s wrong because I can’t ever isolate where the problem is coming from. I have on occasion had to kill and kill and kill KDE-related processes before I get normal non-KDE behavior back. Running Kmail shouldn’t require me to occasionally open a terminal and start issuing killall commands.
If I wanted to hate my computing experience, feel powerless to understand much less solve my problems and get told that “Shrug, it works for me, must be you” I’d use Windows.
My anecdotal evidence differs from yours and I use KDE heavily.
I guess the lesson here is no software is perfect (which is obvious to anyone who has the slightest idea about IT) and that anecdotal evidence is usually easily contradicted.
Anecdotal evidence doesn’t contradict anecdotal evidence, that’s the problem. It never proves anything.
I’m almost convinced that people who report no problems and who use KDE all the time have simply grown accustomed to deficient behavior. They probably don’t experience the worst problems (input locks, computer unresponsive, one finger salute to resolve) but the “minor” annoyances where stuff is awfully slow or excessively memory hungry are shrugged off as normal.
I’m still open to the idea that it depends on your distribution. What distribution are you using? I base 100% of my experience off of the Debian packaged version of KDE.
I’m telling you, I’ve been running the KDE 4 series for two years at work, home, and on the go. I’ve experimented with Gnome, LXDE, xmonad, awesomewm, XFCE, fluxbox, openstep. KDE (with KWin’s graphical effects off) is as fast as any of them on my modest hardware.
KWin’s composite renderer seems to have various issues with *one* of my graphics cards, which has caused it to crash and run slow as dirt, but I’m hoping 4.7 will fix that. Ironically, it happens with my most powerful graphics card, which works great under GNOME 3 and/or Compiz.
I’ve installed different distros on 3 kinds of laptops and one desktop computer with the same bad luck: different video cards, with open and proprietary drivers, KWin slowdowns, KNotify using 100% CPU, and other glitches. I know I’m not alone here; that’s why I’d like to see a reliable version of KDE.
Give me your recommendation, what distro should I try?
I’d recommend Slackware or Mandriva. But the problem may be linked to a feature you are using, or a set of features that others don’t use. Did you try disabling the GL effects?
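(If you want to test that quickly: in KDE 4, compositing can be toggled on the fly with Alt+Shift+F12, or switched off persistently from a terminal. A sketch, assuming KDE 4’s config group/key names:)

    # disable compositing in KWin's config...
    kwriteconfig --file kwinrc --group Compositing --key Enabled false
    # ...and tell the running KWin to reload it
    qdbus org.kde.kwin /KWin reconfigure

If the slowdowns disappear with compositing off, that points at the driver/GL path rather than KDE itself.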
I’m sorry if you misunderstood my reply. There may be configurations that KWin does not work well on under any circumstances. I’m not disputing that. I was disputing your implication that everyone who says it runs fine is delusional and simply ignores the problems. I’ve run a variety of DEs and WMs, so I think I would notice a difference between a really slow KDE and a fast XFCE.
I run Fedora. I’d guess that you’ve probably already tried it, due to its relative popularity. It may work for you, it might not. My particular box at home is anti-Ubuntu, or at least it has been for a couple of releases. Ubuntu simply won’t install on it. Don’t know why, haven’t really looked into it, but I’m assuming it’s the motherboard or video card. Unfortunately, sometimes even the best Linux distros have trouble with particular hardware. But then again, so does Windows.
The KNotify thing is because one of the packages is outdated. I can confirm some combinations of libraries do cause it; I even managed to replicate it, but it’s fixed.
However, finding the mispackaged component can be challenging. It is more or less what happened with KDE 4.0 on Kubuntu. Features that were stable in openSUSE were broken in Kubuntu for that very reason.
And I can tell you that what’s kept me from being a KDE user is that I can’t keep the system up and responsive for more than a couple of days at a time. This has been true for every release of KDE4 since 4.1, which is when I started experimenting with switching. I can say for sure that just using KDE applications is enough to bog my system down enormously.
I’m glad KDE works so satisfactorily for you, but it certainly is not the smooth and trouble-free experience you describe.
You’re joking, right? I switched away from KDE/GNOME several years ago because they are so damn slow. I don’t see why I should use a desktop environment that presents me with a loading progress bar during startup. Linux and X11 do the driver stuff; the apps do the actual work. The desktop’s only task is to make apps accessible and manageable. If I can write it myself in a few weeks, the startup time should not be noticeable on today’s systems. Awesome does that for me.
Just look at Okular for an example of how not to write software. Starting Okular takes considerable time compared to other readers, especially if not cached. Okular requires about 100 MB of KDE libraries and several KDE services at runtime. It is very sad that no standalone version exists and that it can’t even open files without those KDE services. But then, those KDE/GNOME people never seemed to understand the Unix design philosophy, only striving for the Windows eXPerience. Well, they got too close for my taste…
Ok, I wasn’t talking about loading times for the environments, just using them when they’re open.
Personally, I love Okular. It’s the best PDF reader I’ve found. It loads faster than Adobe’s, and searches are faster than in any other. Memory is something you don’t notice, unless you do. If I have 2 GB of memory, do I really care if it’s 100 MB of libraries or 20? For my systems, that’s a non-issue for speed or performance. That doesn’t mean it’s a non-issue for everyone or every use case, or that work should or should not be done to reduce memory footprints or speed up application loading. I’m not trying to prove a universal law here, just trying to discredit the belief that anyone who uses KDE is somehow ignoring problems that exist with their use of it.
You do, because the bandwidth of the HDD that stuff is loaded from is around 30 MB/s, meaning 100 MB of libraries will take over 3 s to load, which is arguably quite a lot when you just double-clicked a PDF and want to see it right away.
Often, when people complain about memory&HDD space usage, they actually want to complain about how much stuff needs to be slowly loaded from disk before the requested application is available.
A possible solution, on modern computers with crazy boatloads of memory, would be to load everything into RAM. You take a serious performance hit at boot time, but then the system is very snappy. Puppy Linux (AFAIK) does that successfully.
Windows Vista and Windows 7 use a technology Microsoft calls SuperFetch to pre-load stuff into RAM after boot, so that commonly-used applications (such as MS Office apps) will load faster on demand.
http://en.wikipedia.org/wiki/Windows_Vista_I/O_technologies#SuperFe…
The equivalent in Linux is called Preload.
http://en.wikipedia.org/wiki/Preload_%28software%29
I don’t believe it is possible to disable SuperFetch on Windows. To compare apples with apples regarding application start times, one must therefore install and run Preload on Linux. I don’t know whether Mac OS X has similar technology or not.
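(On a Debian-based distro, for instance, trying Preload is trivial; the package and file names below are the Debian/Ubuntu ones:)

    # install the adaptive readahead daemon; it starts automatically
    sudo apt-get install preload
    # it is tunable in /etc/preload.conf and logs what it prefetches
    # to /var/log/preload.log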
I have not installed Preload under Kubuntu Linux on the Acer Aspire One 522 netbook in my earlier anecdote.
http://www.osnews.com/permalink?484159
Nevertheless this under-powered netbook still loads and runs LibreOffice under Kubuntu Linux (without the benefit of running Preload) noticeably faster than MS Office 2010 loads and runs under Windows 7 (even with the benefit of SuperFetch).
Needless to say, Linux boots and is ready for use at the desktop far faster than Windows 7 is.
PS: Okular is far faster to load and start running than Adobe Acrobat.
Am I mistaken? I thought modern HDDs were more like 100 MB/s, which would make it a second. In any case they are probably preloaded for me by Fedora, or I typically use Okular so much that it’s sitting in the fs cache. I also leave the system up for weeks at a time. Maybe the first PDF I load after a fresh boot takes 3 s and I didn’t notice. That’s not really much to complain about.
For external HDDs, which are pretty much laptop HDDs in a new casing, I can attest that transfer rates are around 30-40 MB/s. Maybe higher-RPM desktop HDDs are faster, though.
I have to date been extremely disappointed with Debian versions of KDE. It seems to me to be one of the most problematical.
PCLinuxOS … rolling release, well tested before it is released, fairly large selection of packages, 32-bit only.
OpenSuse … reportedly a great KDE distribution, but not a rolling release. This one probably gets the most recommendations.
Mandriva … a very good solution normally, vastly under-rated. Also not a rolling release.
MEPIS … based on Debian, but fixed somehow. A very stable distribution, but don’t expect the most recent levels of performance or function.
Arch … a cutting-edge rolling release which still manages somehow to be pretty stable and reliable. Not recommended for newbies, it doesn’t do any hand-holding. No GUI package management solution for KDE.
Chakra … Based on Arch, also rolling-release, but with more GUI help, perhaps a better option for newbies than Arch itself. Repositories are far fewer.
Slackware … I haven’t used it, but reportedly it is a good option for KDE. Not cutting edge. Package management would be the weak point.
Kubuntu … it seems to attract a lot of heat, but I haven’t had too much trouble with it. Avoid version 10.10 (KDE 4.5), and disable Strigi and Nepomuk.
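(If you’d rather do that last bit from a terminal than from System Settings, something like this should work on KDE 4; the service and config names are the KDE 4 ones, so treat it as a sketch:)

    # stop the running Nepomuk server (Strigi indexing runs under it)
    qdbus org.kde.NepomukServer /nepomukserver quit
    # keep it from starting again at login
    kwriteconfig --file nepomukserverrc --group "Basic Settings" \
        --key "Start Nepomuk" false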
All I can say is that I keep hearing many stories of KDE problems that I have never encountered myself. KDE is never slow or lacking in performance for me.
My recommendation: avoid proprietary graphics binary blob drivers, or ones based on reverse engineering, or running Linux at all on systems using the Intel GMA500.
I promise you I don’t suffer from those.
I won’t deny there are occasional bugs (usually momentary rendering glitches that most compositing WMs can suffer from).
I will also admit that KDE4 is one of the most memory-hungry DEs, but then you get a lot more bang for your buck. Some might call that bloated; however, as I use many of those features, it’s not bloat for me. But then, if you really want a low-resource DE, KDE isn’t designed for you in the 1st place, thus your expectations are unreasonable (there’s LXDE, XFCE and a whole host of other DEs and tiling WMs that better suit those needs).
I’m using ArchLinux 64bit.
I’ve tried Ubuntu’s KDE4 distro (albeit quite a while ago now) and that was awful. I’ve heard Kubuntu has progressed significantly since, but if my experiences of the two are any gauge, then maybe you’re right about KDE performing differently on different distributions.
If ArchLinux’s install is too labored for you to test KDE4 with (which I could understand), then perhaps give Chakra a go ( http://chakra-project.org/ ), as that’s a live distribution based around KDEmod on ArchLinux.
(reply to both sides)
Please don’t troll. I am a dev (not a kdebase dev; I spend my time on “extragear” apps and some KDE SC apps) and I see these things happening too; they exist.
There are many possible explanations for why these bugs happen:
–Linux: Some Linux kernel components carry huge performance penalties for I/O-intensive applications. Linus Torvalds himself has one of these computers. We are not talking about a niche here, but EM64T and AMD64 computers with a bunch of chipset combinations. Apparently, it’s not even a driver bug. They slow down under certain workloads (mostly I/O), and nobody knows why.
–packaging: This is a bad one. Kubuntu already gave us a bad reputation with KDE 4.0. About half of the crashes were due to mispackaged libraries. If a symbol is not found, the application will crash. If two packages are not at the same version, compiled in the same circumstances, they may behave somewhat differently (infinite loop, 100% CPU). If package A is compiled against library C and then used with library B instead of C, because C was not packaged while B was, it may not work. Note that in this example, B and C are the -same- library, but compiled in different circumstances. (A quick way to check a binary for this is shown after this list.)
–dodgy patching: Remember, the distribution has the final say when it comes to the code that gets compiled. If they apply a patch to code that does many things in order to fix one of those things, who knows what side effects it will have on the other use cases. Even in good faith, adding a patch without a complete understanding of its side effects can be very damaging to the good behaviour of the application.
–drivers and X: When talking about KWin, this is a big issue. Many drivers are in a terrible state and “provide” features they don’t actually have. In other cases, the code path is just so terribly unoptimized that it turns the whole thing to molasses. The KWin developers are aware of this and are trying hard to fix it, even if they need to troll upstream. KDE 4.0 was not working on NVIDIA; you could not move a plasmoid without X crashing or freezing for an extensive period of time. That was because one of the compositing extensions was software-emulated. It eventually got fixed because NVIDIA took the time to make KDE4 work fine. To this day, it is still the best driver for KDE, because it was actually TESTED with the OpenGL features KDE uses. We did not create OpenGL; drivers claim to support it (2.* for Mesa and 3.3-4.1 for proprietary drivers). So hearing complaints that KDE is slow when it’s the driver that does not work is frustrating. That said, writing a driver, especially from video dumps, is incredibly hard.
–options: As everybody knows, KDE has a lot of options. Nothing new there. There are even more than in KDE3. Suppose every option is a boolean one (on or off), and KDE has, let’s say, 500,000 options that can be set in any possible combination. That makes 2^(500k) possibilities, and so 2^(500k) different code paths. Having flexibility has a cost, and this is it. There is no way more than 95% of these combinations can work flawlessly. There will be some issues; bug reports are welcome.
–cosmic rays, dark magic, unicorns, –assumptions–: those and many more.
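(As a quick check for the packaging case above: ldd can perform the relocations and report unresolved symbols; the binary path here is just a placeholder.)

    # -r asks ldd to do data and function relocations and report
    # any undefined symbols the dynamic linker can't resolve
    ldd -r /usr/bin/someapp 2>&1 | grep -i undefined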
_____
Saying that KDE4 is perfect is wrong. Blaming the devs for every bug is also wrong. I can understand that, as users, you need to blame someone, and the devs are the most obvious targets. After all, even I can’t tell whether I’m facing a real bug or whether my build system / Gentoo is just misconfigured/outdated. But trolling for the sake of trolling hurts more. Do you really think anyone loves to be blamed for something he is not responsible for? How do you expect to always get a nice, clean and gentle reply from him/her? Everybody is not aseigo.
This is a great little rundown on how to help the devs, thanks for posting it!
I would also note, though, that if the bug is intermittent/hard to replicate, the valgrind route is pretty tough, due to the performance hit it imposes on the running application. A bit like wading through molasses to find that needle in a haystack.
Dear Hiev.
Please buy a working computer and stop flaming every single KDE news item.
I did that, same results.
Maybe you should turn your computer upside down.
Can someone please explain to me what is the advantage of Wayland?
A clean code base and API, for one thing. Also, Wayland allows clients to control the rendering themselves, so we’ll never see tearing, lag, redrawing artifacts or flicker.
It will feel a lot faster than X, since all the X overhead and cruft will be gone. Think of it like a small, next-generation and very slick graphics stack.
Thank you, that was the information I was looking for.
For more information, you might want to have a look at this : http://www.phoronix.com/scan.php?page=article&item=linuxtag_2011_wa…
Well,
1) You will lose the network transparency X gives you by default; there will probably be replacementS, but they will be incompatible and buggy at the beginning.
2) New code --> new bugs
3) Currently the Wayland developers seem to prefer that a program totally manages how it displays itself, so if you have a stuck application, you won’t be able to kill/iconify the window.
So as you can see, Wayland is much better than an evolution of X which would remove unused features and add an option to share GPU memory!
Isn’t that the job of the window manager and compositor under Wayland?
The Wayland devs have already indicated that they prefer client-side decorations, something the KWin devs do not like at all and hope to counter. Nevertheless, it seems to be the only objection the KWin devs have against Wayland. Network transparency can still be achieved if necessary, but the future will show whether the X type of network transparency is required to have a good functional system.
Indeed, those are valid concerns.
I have my own worries too.
I understand now why they are doing Wayland. There are some problems with Xorg. I was always frustrated by the lack of vsync to vblank for one thing. I understand that they want to do something about that.
My fear is that they may throw the baby out with the bathwater.
Xorg is already dumping a lot of useful things while trying to improve. They dropped XEVIE because xtst and xrecord provided similar functions. Then they broke xrecord. I understand it’s hard to maintain such a big project and shit happens, but it’s frustrating. My project depended on xevie and then xrecord. It makes my project harder to maintain, and I spend more time trying to keep up with the deprecation of the underlying technologies than improving the project with new functions.
In an ideal world, people using bleeding-edge software would expect the shit and not complain, while the masses would use stable and working software. The bleeding-edge users would help us while the software matures. In reality, the masses use bleeding-edge software and they complain that it sucks.
My fear is that Wayland will be pushed too soon into the hands of clueless users. Wayland should first be an experiment for tinkerers to play with, to see whether it is worth dropping Xorg for it.
I hope they will give it time. I will certainly look at it and see what can be done for my software to work with it, once I’m done fixing it for GNOME 3 and the new AT-SPI over D-Bus.
This does not currently work with X11 when using heavy OpenGL applications anyhow.
Other methods like VNC still work with Wayland.
In fact, newer X11 servers have more new code than Wayland. So the odds of bad new bugs are on the X11 side.
In fact, this is an outright lie.
Wayland applications render to a buffer; the Wayland compositor decides whether that buffer ever sees the light of day. Yes, an application on Wayland may be completely in the dark about whether it was displayed at all.
Wayland gets rid of the divide between windows manager and compositor.
In fact, what you have just described is what the Wayland project is: X11 stripped of all the old features, with GPU memory sharing. That is Wayland.
So? Network transparency doesn’t work for 1% of applications (mostly games), so this means that network transparency is useless?
Eventually yes, but it’s quite likely that at the beginning it won’t work as well as what we have currently with X: note that this “stabilisation” time can be *very* long.
Uh? Wayland + the toolkit adaptations are both new, and they will do exactly the same thing as the X server + the toolkits do currently, so I find that difficult to believe.
Instead of calling me a liar, you should read the Wayland mailing list; see http://lists.freedesktop.org/archives/wayland-devel/2011-May/000988… for example.
Kristian Høgsberg is “only” the *main* Wayland developer, and he is pushing for client-side decorations in Wayland, which means that when you click on a ‘close’ decoration, it is up to the application to handle the close request. So if the application is stuck, you need a workaround “à la Windows”.
Of course this is only what Wayland would provide by default, KDE on Wayland could change this (by implementing their own window manager).
These “old” features include: network transparency, server-side window management, and compatibility with existing software. All of these “old” features must be added on top of Wayland (and each toolkit will do it differently), which makes it nearly certain that they will be quite fragile. That’s what I don’t like about Wayland.
Certainly, features such as network transparency are useful. But X is getting far too crufty and incapable in the modern world. You cannot get dynamic GPU switching because of X, and it is notoriously hard to maintain; its codebase is supposedly utterly hostile to newcomers.
Personally, I wouldn’t really mind if X stuck around, but modernizing it would take the same or more effort than starting from scratch.
Because of the X specification, or because of the current X implementation?
That’s a big difference.
I’m not sure that Wayland will be much simpler here; for example, input management must still be done, and I don’t see why it would be simpler in Wayland than in X.
I don’t know… One puzzling thing is that there has already been some modernization of X, such as XCB, but the toolkits are still using Xlib??
At this point, I think that changing the implementation would require major surgery inside X.
I’m sure that even developing for Wayland will require a rather specific skill set. However, so many parts of X are so ancient that modifying them requires being able to understand something that has been around for decades.
TBH, the X protocol isn’t the best solution these days. I find RDP more usable (especially over high-latency connections) than running remote X. Also, due to the nature of some toolkits, running remote X programs is really slow.
You should try FreeNX. It’s really fast. RDP is nice, but there are some limitations that keep it from being a replacement for X. It only runs a full desktop; you can’t run a single app over RDP. Citrix is the only option for remote apps on Windows, and it costs money. I hope the Wayland devs don’t forget that some of X’s features are really used, even if they don’t use them themselves. Accessibility comes to mind when thinking about things that are easily dismissed and forgotten on new systems.
How about SPICE?
http://www.spice-space.org/
RDP, as part of Terminal Services in Windows Server 2008, allows for remoting individual apps. Pretty much the same way that remote X works.
http://technet.microsoft.com/en-us/library/cc753844%28WS.10…
So? That’s just because “dynamic GPU switching” is a feature *much more recent* than the current X implementation. I don’t understand why this is a criticism of X: if Wayland had been implemented before the “dynamic GPU switching” feature, it’s quite likely that it would have required major surgery too!
X is for LANs, not high-latency connections; there are some “add-ons” such as NX which are better for high-latency connections. I agree with you that this isn’t a good situation: NX should be merged into X so that by default you can have a good remote connection both for LAN and WAN.
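(For readers who haven’t used it: the network transparency in question is running a single remote application with its window displayed locally; the user, hostname and file here are placeholders.)

    # run Okular on the remote machine, display its window here;
    # -X enables X11 forwarding, -C adds compression for slow links
    ssh -XC user@remotehost okular report.pdf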
Well, given that Wayland doesn’t provide network transparency, it will be up to the toolkit to implement it, and as you say, some toolkits don’t work well with X for remote display, so I’m not optimistic that the situation will be better when the toolkit itself implements the remoting!
Certainly, but implementing it on Wayland could be done in a far simpler and cleaner way than in X.
Yeah, but even on a LAN (granted, I’m using wireless) I notice the speed difference between X and an application running inside an RDP window. Both can be used quite easily, but the faster response time of RDP is more comfortable.
This was more an extension of my previous point. If toolkits are slow with remote X, what have I really gained? After all, nobody uses Motif anymore. If the toolkits implement it, it means they can dynamically change their drawing method or switch to themes that do not require hardware acceleration (much like Vista and Win 7 do with RDP). FreeNX does not require an X server to work on Windows, so a solution like that would be better than full-blown X.
They have a very vibrant community
No, no, I like to use it as it comes by default, I don’t tweak anything at all. I’ll try Mandriva.
Edit:
I see Mandriva 2011 will be released in 19 days; I’ll wait.
I suggested Mandriva because they have developers working on KDE, their primary desktop is KDE, and I have first-hand experience with it. Their KDE implementation is better than on Debian-based distros (whose primary desktop is GNOME).
However, I cannot recommend Mandriva 2011, because I have not tried it, because it’s not released yet. I can only guess that they will have a good KDE implementation, based on their previous releases, but it’s only a guess.
For those missing remote X sessions: you can get the same thing by running an X server as a client on Wayland. Problem solved; no need to discuss the issue of network transparency.
Pretty much the same way OSX uses X then…
P.S. It works quite well.