Windows Server 2003 will bring an end to one of the biggest headaches for Windows users and administrators, according to Microsoft. The problem, which relates to Dynamic Link Libraries–software modules that can be shared by several different applications–has only grown worse over the years.
A very misleading article. Windows Server 2003 ships with the new 1.1 version of .NET. .NET DLL hell is ameliorated through some trickery in the .NET Global Assembly Cache.
It doesn’t sound like Windows Server 2003 does anything for non-.NET DLLs.
And allegedly, this new version of .NET does away with the registry, allowing “xcopy deploy”. However, whether or not this works in practice will be interesting to see. Most Windows apps, .NET or not, contain lots of little moving parts strewn about.
Two steps in a good direction. It’d be nice if there were some competition so that Microsoft would have to include more useful features in their software. Having to wait 3 years just to get ‘strong naming’ in the Global Assembly Cache is criminal and shows an immense disregard for human beings, aka customers, aka users, aka people.
…and they were supposed to get rid of that Blue Screen a while back too.
.NET certainly solves one DLL dependency problem, but it does nothing for all of the non-.NET shared libraries. From what I hear, the new version of Windows will still support all the regular DLLs for legacy apps. Besides, it’s not likely that major third party applications will use the .NET environment anytime soon, nor will .NET be used for all the drivers, DirectX and the rest.
Saying that Windows Server 2003 solves the dependency problem is the same as saying some future Linux distro that includes DotGNU Portable.NET or Mono will solve the Linux dependency problem.
>…and they were supposed to get rid of that Blue Screen a while back too.
I have never seen a blue screen on XP. So for me, they have got rid of it (for the largest part at least).
Microsoft claims the same thing about Windows 2000:
http://www.microsoft.com/windows2000/techenthusiast/features/wfp.as…
And Windows XP:
http://www.microsoft.com/technet/treeview/default.asp?url=/technet/…
I guess they’re still trying…
I’ve seen blue screens. Or rather, I should say, crash screens in XP. It dumps you to a “terminal debug” kind of view and then it reboots automatically (unless you configure this behavior otherwise). Granted, I’ve not seen it more than a few times and this was related to a driver installed into the system by an early version of TASCAM/Nemesys’ Giga Studio/Sampler. The driver was “filespy.sys” in case anyone is curious (XP crashed hard whenever anyone tried to delete a file). So, credit to MS, I’ve only seen crashes like this in the case of foreign drivers causing problems.
DLL Hell… the description of this “feature” was given out when WinMe and WinXP came out, was it not? I swear I remember this same nonsense about a “smarter” Windows system that “automatically heals” overwritten system files and such…
My solution to DLL hell? On ALL operating systems… Stop using them!
“My solution to DLL hell? On ALL operating systems… Stop using them!”
Not going to happen. Programs need to “share” code to keep from having redundant code, taking up disk space.
If you know of a better way to deal with them, let us all know…
I like the part where the guy says “You just copy applications instead of re-installing them”. Right… This is MS we’re talking about. Haven’t they gone to great lengths in the past to prevent people from copying application installations from one machine to another? I wonder if the xcopy command is now equipped with Product Activation features so that if you don’t register the copy of the application you xcopy’ed, it will delete the files after 15 days?
I have. It occurred after the first reboot when installing a wireless card on my notebook (d-link card).
Blue screen of death and then 0.5 seconds later an automatic reboot.
They have improved the font used in the death screen though.
I am sure it happens very infrequently now.
I have been developing on and using NT since 3.51 up to and including NT 4.0, Win2K, and XP, and the only time I have ever had a Blue Screen was due to a bad memory module on NT 4.0. These Operating Systems have not been perfect, but some of you seem to think that Blue Screens are a regular occurrence. They are not. I really don’t understand this.
Here’s an idea for you. Get some decent hardware, install Win2K or XP Professional, buy a copy of ‘Inside Windows 2000’ (it’s an internals book) and maybe a Richter book, and really hack at it for six months. Then come back and give some solid technical reasons why Windows sucks instead of spewing the uninformed Blue Screen FUD.
Disk space is cheap and we have far more of it than we need. Developers need to statically link everything and stop sharing code when they don’t know how to do it right.
To share code properly and make sure it doesn’t cause “DLL Hell” requires an intelligent programmer, obviously not your average programmer. You must make sure all software packages come with all the libraries required to use them. And you must make sure all new libraries maintain backward compatibility with previous APIs. Then whenever a program needs to install itself, it just has to check whether it has a recent version of the library and, if not, upgrade it (a rough sketch of that check follows below). It’s that simple. If you don’t break your API, don’t release premature code under the production library name, etc., then you avoid DLL Hell. But how many programmers honestly use common sense? How many have ever been a systems admin? My guess is not very many.
If they don’t even understand how the system works I don’t see how we can expect them to write code for it without breaking it.
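As a rough illustration of the install-time check described above, here is a minimal sketch in C; the struct, function and version numbers are all invented for the example:

    #include <stdio.h>

    /* Minimal sketch of an installer-side check: only replace the shared
       library if the copy bundled with the package is newer than the one
       already installed. Everything here is made up for illustration. */
    struct version { int major, minor, patch; };

    static int version_cmp(struct version a, struct version b)
    {
        if (a.major != b.major) return a.major - b.major;
        if (a.minor != b.minor) return a.minor - b.minor;
        return a.patch - b.patch;
    }

    int main(void)
    {
        struct version installed = { 1, 2, 3 };  /* what the system already has */
        struct version bundled   = { 1, 2, 1 };  /* what this package ships */

        if (version_cmp(bundled, installed) > 0)
            puts("upgrade: copy the bundled library over the installed one");
        else
            puts("keep the installed library; never downgrade");
        return 0;
    }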
“Not going to happen. Programs need to “share” code to keep from having redundant code, taking up disk space.
If you know of a better way to deal with them, let us all know…”
I really don’t understand why it should be such a problem. In AmigaOS, each shared library has its version number embedded in the binary. Each program (or version of a program) requires version X _or later_. It never happens (in my experience) that replacing a library with a later version causes any problems. The calls are the same, new ones can be added to the list without affecting a program that doesn’t use them.
When you run a program, it loads the required libraries into RAM (if they are not already there) and checks that the versions are OK as it does so. The version number required is always the oldest that will work.
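For readers who never used it: from memory, that version-or-later check looks roughly like this in C on AmigaOS (a sketch only; asl.library and the minimum version 38 are just examples):

    #include <proto/exec.h>
    #include <stdio.h>

    struct Library *AslBase;

    int main(void)
    {
        /* Ask exec for asl.library at version 38 *or later*; the call
           returns NULL if only an older version is installed. */
        AslBase = OpenLibrary("asl.library", 38);
        if (AslBase == NULL) {
            printf("This program needs asl.library v38 or newer\n");
            return 20;  /* RETURN_FAIL */
        }

        /* ... call into the library ... */

        CloseLibrary(AslBase);
        return 0;
    }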
A problem that does happen is that bad install scripts can overwrite a library with an older version. If you use the correct command in the install language, this can’t happen.
Why all these problems in Windows and Linux? Bad planning?
someone said :”I have. It occurred after the first reboot when installing a wireless card on my notebook (d-link card).
Blue screen of death and then 0.5 seconds later an automatic reboot. ”
Please, please, you’re showing your (lack of) knowledge! Drivers work in kernel mode, so any memory problem in a driver can cause a blue screen on any NT kernel, or a black one under Linux. If a driver is crap, Linux doesn’t do any better, nor does FreeBSD, nor Mac OS X. None of these systems has strong memory protection for code that works in kernel mode.
I crashed Linux 10 times today just because of module errors.
By Eugenia …and they were supposed to get rid of that Blue Screen a while back too.
I have never seen a blue screen on XP. So for me, they have got rid of it (for the largest part at least).
I thought that it was a thing of the past too, but last week I finally managed to crash XP. I was working with some large Photoshop files and the computer just restarted on its own (it did it about 6 times in one day). The BSOD is now merely a slight blue screen flicker (BSFOD?) and then the computer just restarts.
How about the brain-dead installer app writers stick the DLLs (however redundant) they need in their PROGRAM DIRECTORY instead of the system32 directory? Then other apps wouldn’t get confused. Sounds like a simple enough solution.
Glad they have some plan for .NET’s hundreds of DLLs, but that doesn’t fix it for the old stuff, so I don’t see how it puts an end to DLL hell.
Re: Not universal
“It never happens (in my experience) that replacing a library with a later version causes any problems.”
Well, this does happen. MS has become better about it of late, but it still sounds like it can occur. The best example that comes to mind is from a while back. I forget the exact names, but here goes: an object/method used to retrieve file names returned full paths as part of the file names. An upgrade to a newer DLL now returns only file names with no path info. Any application relying on the earlier version is now broken. This is why strong binding is desired.
Re: XP crashing
I have never gotten a bluescreen on XP. However, on numerous occasions I have gotten XP and 2000 to become so hosed and unresponsive that a reboot was required to get anything done. This is just as bad IMO.
I’ve had Explorer tank a couple of times when I did lots of file moving around, like organizing my hard drives and stuff; after a bit it slows down and then doing something like a delete kills Explorer. But fortunately it comes right back up, so I don’t care too much. I’ve had XP just up and close like twice (as in, shuts down instantaneously), but that’s no blue screen. Also, I doubt it was XP, probably hardware. I’ve actually tried to cause blue screens just messing with stuff but have yet to succeed. I don’t want to kill my system, I just want to see if I can get XP to BSOD without doing something that obviously will, like bad drivers.
As far as DLLs, I too was one that thought they did something with XP, and maybe further back, to fix a lot of the DLL hell. I guess I’m fortunate I have never had a problem with DLLs, but it would be nice to know there is no potential for trouble.
I agree with others. There is no reason I can see to be using shared libraries and such. Just make big uber-file applications. We have more storage than we know what to do with. For applications like embedded and handhelds, shared still makes sense. But in a day when 80 GB drives are normal, there’s not much point. It also seems like an uber-file setup would get things back to being more like BeOS and, I believe, OS X, where an app is a single file and I can run it from anywhere and there is no install. This is how everything should be. I don’t care if it’s big, I want it to be clean and work without hassle. I have zero problem with apps being big. I’ve got 120 GB in this computer, a CD-RW, and a 750 MB Zip drive. If I need a DVD to install an app, then fine. Let’s go to the level of DVDs being standard, and make the jump we did from diskettes to CD-ROMs back in ’95.
a) This GAC assumption will only work when all apps installed on the server are written in .NET; older code which still calls COM can still get into DLL hell (maybe worse than before, maybe less).
b) They solved DLLs by versioning them, so they’ll install DLL after DLL with a different version. I can see how the GAC will grow and steadily slow down the system. (They have possibly created a new issue similar to the registry problem.)
So they actually solved the problem by
– not removing or overwriting the old DLL
– installing new DLLs with another strong key (some kind of versioning)
Nevertheless, the GAC is a nice thing for programmers, which allows them to share code across different applications. It might actually be pretty fast and stable if you don’t install too much on those servers.
I hope that maybe, if you’re a bit careful with your servers, they might hang in there for quite a while.
By the time Microsoft had a “system-wide scripting language” working in the Windows/DOS/NT world, people had poked so many holes in the security system (such as it was) that I’m left with several hundred virus messages a day to worry about, and a whole “cottage industry” to both develop those viruses and defend against them.
Microsoft will never “put an end” to anything until they have *auditable*, refereed source.
When I can stick a diskette into a computer, load enough marbles to restore a full backup, and restore a server, then they can come talk to me about “xcopy-deploy.”
I can’t remember the last time I was missing a DLL!
Windows is the best desktop OS…. and yes I’ve tried Linux and BeOS….
I agree with others. There is no reason I can see to be using shared librarys and such.
Riiiiight, let’s just statically link DirectX to every app or game that uses it. Not a very good solution if you ask me.
“I can’t remember the last time I was missing a DLL!
Windows is the best desktop OS…. and yes I’ve tried Linux and BeOS….”
I have, a couple of weeks ago. There was a DLL missing for no apparent reason.
I went to http://www.dll-files.com, downloaded the appropriate dll file and put it in my %system dir%/system32
Problem solved.
…man I was just making a joke about the Blue Screen.
I remember MS explicitly saying that they were doing away with the Blue Screen, although it looks the same shade as it always has.
Linux doesn’t have DLL Hell; it has dependency hell, which is a different (but related) thing.
On Linux you don’t get apps breaking because you upgraded a library, unless the library maintainer seriously screwed up, which is rare and when it happens we all go shout at them. The rules are pretty simple, if you release a new version that breaks the ABI, bump the soname version, if it doesn’t, up the minor version.
So you can have many versions of GTK installed at once, and things should go OK. But now you have to properly manage the installation and checking of all these things… and you get dependency hell.
MS has started using a system a bit like Linux lately; for instance, COM sort of has versioning now, and you have, say, RICHED3.DLL, RICHED4.DLL, which is sort of like libfoo.so.2, libfoo.so.3, etc.
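To make the parallel concrete, on the ELF side those major versions are just different file names, so two of them can be loaded side by side. A small sketch (libfoo is the hypothetical library named in the comment above; compile with -ldl):

    #include <dlfcn.h>
    #include <stdio.h>

    int main(void)
    {
        /* Ask the dynamic linker for a specific major version by soname.
           libfoo is made up; both majors can coexist on disk and in memory. */
        void *v2 = dlopen("libfoo.so.2", RTLD_NOW);
        printf("libfoo.so.2: %s\n", v2 ? "loaded" : dlerror());

        void *v3 = dlopen("libfoo.so.3", RTLD_NOW);
        printf("libfoo.so.3: %s\n", v3 ? "loaded" : dlerror());

        if (v2) dlclose(v2);
        if (v3) dlclose(v3);
        return 0;
    }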
“This is a classic problem,” said Ivo Salmre, .Net and developer tools and technology manager at Microsoft, speaking to ZDNet UK. “We have been beaten over the head about this for years now. You ship an application that uses component A. Someone else writes an application that also uses component A, but installs a newer version, and this breaks the first application.”
This very problem was the bane of my existence for 4 years straight. Installing GroupWise on a system with Outlook on it, or vice-versa, was a never-ending complication of conflicting hell.
The culprit? MS’s “updated” MAPI32.DLL from Office, which overwrote the standard one that shipped with Windows. Yeah, MS overwrote its own code, which broke applications that relied on that code to begin with — antitrust material, if you ask me, especially considering that MAPI was supposed to be an open standard for everyone to follow.
The solution? Remove the updated MAPI32, re-install the original from the Windows CD. Worked like a charm. And never, EVER install Outlook on a work machine again.
>> I agree with others. There is no reason I can see to be using shared librarys and such.
> Riiiiight, lets just statically link directx to every app or game that uses it. Not a very good solution if you ask me.
The problem with static linking is not just about hard disk space. Shared libraries are loaded into memory and can be used by many programs at the same time. If they’re statically linked, you’ll load the same thing into memory several times when different programs that use the same libraries are running.
Apple has one:
1) Shared libraries are called Frameworks and are stored in their own directory.
2) Each directory has a sub-directory where the actual library is stored.
3) The sub-directory has a Major version number in its name.
4) Each sub-directory contains the Major.Minor.Bug version so the installers can update the library.
5) If a new major version is released (i.e. not backwards compatible), then it creates a new sub-directory and installs there.
Thus, you get the best of both worlds; you always have the latest minor.bug release for your application and it will always be for the major version you’re expecting.
So you can have many versions of GTK installed at once, and things should go OK.
I think Don’s point is that you didn’t even need many versions of (say) MUI. Whatever the most recent one was, would do. I don’t know how libraries are set up in Linux or Windows, but IIRC on AmigaOS they start with a jump table, and programmers are strongly encouraged by guidelines to add new functions at the *end* of the source, so the compiler would add them at the end of the jump table, and it would ensure that you didn’t break older dependencies. (If anyone remembers this more clearly, I’d appreciate enlightenment, because I’d like to know why I’d have to keep several versions in Linux.)
Of course, this worked only because Amiga programmers generally had the self-discipline to follow the guidelines. Even so, I seem to recall reading that after Commodore went splat, some chaos erupted with the datatype libraries.
Sounds like they’ll try to implement something like what MacOSX is doing.
OS X keeps track of all versions of the same DLL. When an application is built to use a certain DLL, the version of the DLL it’s using is burned into the app’s executable.
Then when the app gets executed, it loads only the version of the DLL it used when it was built.
I thought that it was a thing of the past too but last week I finally managed to crash XP.
Just as a matter of interest, you’re not using an ATI Rage Pro-based graphics card are you?
This used to happen a lot to me on my work laptop in XP, I’d be editing some large files in Photoshop and all of a sudden it would bluescreen and spontaneously reboot. Lost a lot of work that way the first few times it happened.
Eventually I tracked it down to the Rage Mobility M1 (which is basically a Rage Pro) on the laptop, I couldn’t find more recent drivers from the laptop vendor (Sony) but I tracked down some unofficial updates and this solved the problem.
Sorry if this has no relevance to you, it just sounds very… familiar.
Linux uses something like this M$ promise: you can have many versions of a shared library ( *.so.number ) and symbolic links pointing to some ( *.so -> *.so.3 ).
I think that M$ is only trying to copy (steal) Unix ideas.
1) Add program access rights to the OS so that they are ANDed with the user’s rights. i.e.: Netscape is forbidden from writing executable files (every app short of the system’s linker shouldn’t write executables without explicit permission anyway; that would stop 99.9% of viruses in the first place) even if the root user has the right and is using it.
2) Every program is run under a virtualized filesystem and is on a need-to-know basis. i.e.: an IRC program doesn’t need to access or even know about your tax report files; that way, even if the program is compromised, it will only be able to do limited damage.
3) When a program copies a file into the system directory, the OS fools the program by updating the program’s virtual file system, but actually puts the file in the program’s directory (hidden from the program, but not from the (super/root) user and maintenance (super/root) programs) and replaces the file’s data (deletes it, does not store it, whatever) with previously identical files at the file system level, with a “copy on write”.
4) When a program requests the file, the OS hides the fact that there are (virtually) N versions of that file (DLL) with the same name and gives it the right version as if that were the only one installed.
that way:
– you have only one copy per version of the DLL,
– programs cannot overwrite others (virtually no viruses), or the OS can let them think they do (to find out about viruses / spyware / backdoors / intrusions)
– you can cleanly uninstall a program by deleting all its related files (decreasing the merged file data’s reference count) in one OS-provided operation.
– you can create virtual file systems for yourself (ie: sort your MOD/XM/S3M/IT/MP3 collection by song name, author, file type, in virtual directories)
Maybe I’m just dreaming… maybe I’m working on it…
Linux uses something like this M$ promise: you can have many versions of a shared library ( *.so.number ) and symbolic links pointing to some ( *.so -> *.so.3 ).
And indeed it goes further than this: individual symbols within a library can have versions. So as long as a library uses versioned symbols, a program (or other library) can depend on a specific version of a function and it will get that version and that version only, even if the library has been upgraded.
If only this was more widely used… I’m still smarting from the dismal way the libpng3 transition was handled.
I think that M$ is only trying to copy (steal) Unix ideas.
If it’s a good idea, it’s a good idea, even if Microsoft does it…
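As a concrete illustration of the per-symbol versioning described above, glibc’s dlvsym() extension lets a program ask for one specific version of a function. A sketch (the version string “GLIBC_2.2.5” is only an example and differs by architecture and glibc release; compile with -ldl):

    #define _GNU_SOURCE
    #include <dlfcn.h>
    #include <stdio.h>

    int main(void)
    {
        void *handle = dlopen("libc.so.6", RTLD_LAZY);
        if (handle == NULL) {
            fprintf(stderr, "dlopen: %s\n", dlerror());
            return 1;
        }

        /* dlvsym() is a GNU extension: it resolves the symbol bound to one
           particular version node, even if newer versions of the same
           symbol exist in the library. */
        void *f = dlvsym(handle, "printf", "GLIBC_2.2.5");
        printf("versioned printf symbol: %p\n", f);

        dlclose(handle);
        return 0;
    }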
No bull: My blue screen of death in the afternoon
http://www.chron.com/cs/CDA/story.hts/tech/weekly/1800158
Please stop asserting these never happen, they do.
What a mess… and all this trickery to plaster over the fact. And this trickery itself will consume CPU cycles and will itself crumble…
… and since when did 3rd party application vendors have the authority to over-install a core system library? I don’t expect an install of Mozilla (even as root) to update my glibc.
The only sensible way forward for Microsoft is not to plaster over these design faults but to fix them – and if the undisciplined 3rd party coders don’t like it – tough!
Keep it simple… and keep it intelligent. I’m quite pleased with the Linux/Unix way… numbered libraries… a major number change may mean binary incompatibility… a minor number change _must_ ensure compatibility… example: libpng.
1) I hate Microsoft’s naming conventions. This “strong binding” thing is basically the static linking that has existed in UNIX since, oh, the 1980’s! Read the PE (Portable Executable) spec sometime. They take the exact same concepts that have existed in other executable formats for years (hell, PE is based on the UNIX COFF format!) and then rename them just for the hell of it. For example, they call “relocation” (patching a binary to run at a different virtual address) “rebasing.” It’s simple: if a word for something already exists in a field, don’t rename it just out of sheer jealousy. [rant off]
2) It’s impossible to go back to static linking. First, while disk space isn’t a big deal (though, with the KDE libraries hitting dozens of megs, I’d rethink that assumption), memory and cache still are. In particular, most applications spend a large part of their time making library calls, and if those library calls exist as a whole bunch of separate copies, you easily trash the cache. With a good versioning system, like in Linux, the necessity of static linking goes away. Unfortunately, RPM is an utterly braindead system, which makes a “dependency hell” out of a non-issue. People using distributions based on APT (or something else entirely, like Gentoo or Slack) don’t have this problem.
3) I honestly don’t like OS X’s application mechanism. It’s too low-level of an abstraction. It’s suitable for an intermediate user, who understands that programs are a collection of binaries, libraries, and data, but its attachment to the filesystem makes it confusing for new users. In this respect, the Linux distros have it right. The application repository is a black box. The user doesn’t (and doesn’t need to) know where applications are stored. They could be stored in XML files on a remote server for all they care. They simply use a command (or a nice GUI tool like Synaptic or KPortage) to select the applications they want to install or remove, and everything else, including putting an entry in the main menu, is done automatically. Now, existing implementations do have some issues with this mechanism; in particular, installation of menu entries is not standardized. However, the freedesktop.org people are moving towards standardizing that, so these rough edges should be resolved sooner or later.
>I really don’t understand why it should be such a problem.
>In AmigaOS, each shared library has its version number
>embedded in the binary. Each program (or version of a
>program) requires version X _or later_. It never happens
>(in my experience) that replacing a library with a later
>version causes any problems. The calls are the same, new
>ones can be added to the list without affecting a program
>that doesn’t use them.
Indeed, this is one of those things I liked about my Amiga. The copyver command never let a library get overwritten by an older library. I installed hundreds of applications over the years and never had a library conflict.
Once again, it is welcome to 1982 for Microsoft.
Cheers
David
>Well, this does happen. MS has become better about it of
>late but it still sounds like it can occur. The best
>example that comes to mind is from a while back. I forget
>the exact names but here goes…. an object/method used to
>retrieve file names returned full paths included in the
>file names. Upgrade to newer dll now only returns file
>names with no path info. Any application relying on the
>earlier version is now broken. This is why strong binding
>is desired.
Perhaps they should hire programmers who understand what a library is. Or perhaps the writers of the applications should learn to read the library documentation and rely on its documented API rather than its observed behaviour.
You see, libraries are not just a bunch of methods to use. The library’s API is an agreement between the library and the application. You don’t change the way an API call works. It should always be possible to move up to a newer library version. Don’t like the old API and the way the methods work? Then create a new library.
You can add methods and even fix bugs but you must not change the behaviour.
Cheers
David
1. Windows COULD be stabilised by simply exploiting the 4 ring levels that x86 has at its disposal. HOWEVER, the net result is that it makes it non-portable to alternative architectures that don’t have this scheme, mainly RISC platforms; about the Itanium I am not too sure, but assuming Intel kept its idea of 4 rings, I am sure Itanium has it as well. SO this is what Microsoft has to weigh up: a VERY portable operating system with reasonable stability OR a semi-portable operating system with awesome stability. Maybe in the future Microsoft will do that, I’m not too sure.
2. Windows XP/2000 does NOT crash all the time. I’m running Windows 2000 right now and have gone a couple of weeks without a reboot (the last one was due to patching a security issue); with Windows XP, again, I’ve yet to see a BSOD (yes, I have disabled the automatic reboot). 99.9999% of the time, when there is a BSOD, or the XFree86 server crashes, or you get unexplained issues, it is related to faulty hardware. I’ve repaired computers for family and friends, and the overwhelming problem has always been people buying cheap, crappy hardware. Cheap, crappy hardware results in poor system stability. If I were to get two motherboards, one no-name and one from ASUS with a ServerWorks chipset, which is more likely to bring down the system? You’re right, the no-name one.
3. There is an easy way: each vendor names its DLL like so:
[dllname]-[version]-[vendor].dll
for example:
foobah-1.2.3-microsoft.dll
meaning when corel installs theirs
foobah-1.2.3-corel.dll
the two can sit side by side without any issues.
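If installers actually followed a scheme like that, loading the vendor-qualified copy would just be an ordinary LoadLibrary call. A sketch using the made-up name from the example above:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* "foobah-1.2.3-corel.dll" is the hypothetical vendor- and
           version-qualified name suggested above; nothing real ships under
           that name. Each vendor's copy can sit next to the others without
           clobbering them. */
        HMODULE h = LoadLibraryA("foobah-1.2.3-corel.dll");
        if (h == NULL) {
            printf("LoadLibrary failed, error %lu\n", GetLastError());
            return 1;
        }

        /* ... resolve entry points with GetProcAddress() and use them ... */

        FreeLibrary(h);
        return 0;
    }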
Not going to happen. Programs need to “share” code to keep from having redundant code, taking up disk space. If you know of a better way to deal with them, let us all know…
The disk-space argument is now, itself, redundant. It is an old answer to an old question. The answer is no longer valid. I wish people would stop using it.
To share code properly and make sure it doesn’t cause “DLL Hell” <snip> You must make sure all software packages come with all the libraries required to use them. And you must make sure all new libraries maintain backward compatibility with previous APIs. Then whenever a program needs to install itself it just has to check if it has a recent version of the library, if not, upgrade it. It’s that simple.<snip> But how many programmers honestly use common sense? <snip> If they don’t even understand how the system works I don’t see how we can expect them to write code for it without breaking it.
Once again, we have a complex system when there need not be one. Look at all the issues here. Forwards compatibility, backwards compatibility, record keeping, packaging, etc. Not using libs at all solves all of this.
How about the brain-dead installer app writers stick the DLLs (however redundant) they need in their PROGRAM DIRECTORY instead of the system32 directory? Then other apps wouldn’t get confused. Sounds like a simple enough solution.
This is how I solve the problem before it starts when I use BeOS and when I use some Windows apps (because only some will do it right). A program folder should contain everything the program needs to run, which is not included in the OS, with the exception of its settings file which is best stored in a central user-accessible repository as a single file. (don’t get me started on the Registry – another design idea for which someone needs a good beating).
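For what it’s worth, the Windows loader already favours such a private copy: by default it searches the directory the executable was loaded from before System32, so an app-local DLL wins. A small sketch (somelib.dll is a made-up name):

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* With a plain LoadLibrary call the loader checks the application's
           own directory before the system directory, so a copy shipped next
           to the .exe is preferred over whatever sits in System32.
           "somelib.dll" is invented for the example. */
        HMODULE h = LoadLibraryA("somelib.dll");
        if (h == NULL) {
            printf("not found on the search path (error %lu)\n", GetLastError());
            return 1;
        }

        char path[MAX_PATH];
        GetModuleFileNameA(h, path, MAX_PATH);  /* which copy did we get? */
        printf("loaded: %s\n", path);

        FreeLibrary(h);
        return 0;
    }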
The problem with static linking is not just about hard disk space. Shared libraries are loaded into memory and can be used by many programs at the same time. If they’re statically linked, you’ll load the same thing into memory several times when different programs that use the same libraries are running.
Okay. So what? Same problems caused here, same solution to it, and the same argument for not implementing that solution. How many apps use the same non-OS libs? How many of them do you run at one time? Let’s say you are running three apps at the same time that all use the same libs. How big are those libs? Do each of those apps require the same version of the libs?
You want to complain about wasted hard drive space and wasted memory space, but you still want to use Windows, Mac OS X, Linux, etc. The waste is caused more often by bad design than redundancy. Besides, even with the libs in use, there is still redundancy caused by apps that each demand their preferred version of a lib.
Riiiiight, lets just statically link directx to every app or game that uses it. Not a very good solution if you ask me.
The easy answer to that is, shared libs must be made a thing of the past for 3rd party, non-OS software/components. The OS ships with its services, such as DirectX, and any third parties use those provided services. Anything else they need, they include in their binary and they do not install their own libs. This solves a great majority of the problem right there and still allows for the use of system services that 3rd party apps should be able to get from the OS.
Joe P’s mention of the Apple solution in OS X… it sounds like a working system… that completely defeats the supposed purpose of shared libs to begin with. If you have to keep track of multiple versions of shared libs, the point in having them has been defeated. Once again, elimination of at least 3rd party libs is the solution.
Contrast:
Then when the app gets executed it loads only the version of the DLL it used when it was build.
with:
Programs need to “share” code to keep from having redundant code, taking up disk space.
and it sounds to me like we have one big idiotic design. A design that demands a workaround which defeats the original benefit of the design. Round and round we go and how many people out here get it yet?
In the end, we can have all the “solutions” to the problem you want, and the only actual SOLUTION (that is, something that is NOT a mere workaround, adding more CPU cycles, more complexity and more and more and more etc.) is to define a cut-off point where ALL third-party libs are made “illegal” and no apps are allowed to use anything other than the OS-provided libs. All else is bound to the application binary. This is the only SOLUTION.
<snip>
Disk space is cheap and we have far more of it than we need. Developers need to statically link everything…
<snip>
Hell yes. I will say that in windows, I only encountered .DLL hell once, between a version of GameSpy and Netscape 4.7something. One of the programs wouldn’t load, I don’t remember which, and I fixed the problem by reversing the order I installed them(I was building a new system).
I have, however, COUNTLESS times had drivers disappear, and spent many, MANY hours fixing those kinds of problems on mine and other people’s computers… Sometimes the only way to fix it was to reinstall Windows. That was in Win9.x. I hear 2000/XP is better, but that just shows IBM’s engineering expertise in the “NT kernel’s” bastardized OS/2 heritage.
>…and they were supposed to get rid of that Blue Screen a while back too.
>I have never seen a blue screen on XP. So for me, they have got rid of it (for the largest part at least).
I have seen many XP bluescreens. You know what? The “if it doesn’t work, make it look pretty” approach took place. XP bluescreens are in framebuffer mode!
Yeah, and they also have advertising:
http://www.bbspot.com/News/2002/10/bsod_ads.html
(not serious btw; Damien is correct)
Notice a totally inappropriate somewhat adult version of Windows BSOD:
http://www.victoriaspanties.com/errorwear.htm
Please people read the url before you click.
I would hate if anyone got offended.
“The disk-space argument is now, itself, redundant. It is an old answer to an old question. The answer is no longer valid. I wish people would stop using it. ”
Programs are a means to an end. People buy big drives so they can fill them with data. Besides, even in this age of giga-this and giga-that, what we do will soon find a way to occupy the space (like a gas). The “answer” will become truly invalid when either we have unlimited machines or our demands stop growing. I don’t see either happening, and I hope it never does, for that path leads to progress, instead of “‘180 GB drives’ is good enough for everyone”.
Besides, the best plans of mice and men are laid foul by the slip of an individual’s fallibilities. (Discipline, people.)
There is one thing in common about dependency and DLL hell. Both are typically caused not by evil programmers, bad companies or those evil hippy opensource bastards who are all out to get you. :->
They are caused by bad packaging. I am in Software Configuration Management and System Administration. I can pretty much make any valid piece of code run anywhere but that is not hard. It is hard to make a package that will run anywhere and not screw everything else up. If you don’t believe me try packaging up a few opensource packages for windows and making all the registry entries work. It is not impossible but doing it right is an art that no one in IT or software development seems to give a flying flip about.
My personal experience with dll hell.
A business-guy project manager updates Office XP, but his ass-old version of Visio would not work after that. Office had updated a couple of DLLs and, well…
OK, a developer wants to use this icon editor program written in Visual Basic. The program uses an older version of the VBRUN DLLs. The deal is that he refused to overwrite his current version of the DLLs. The icon editor would not work.
A friend of mine had this old game he wanted to install on his new XP box. The installer had the DirectX install box checked by default and he did not notice. Not only did his old game not work that well, but his new games were hosed until he reinstalled DirectX, of course.
I could go on but that is silly. Why? I can list a bunch of situations where I ran into dependency hell on a linux box. Systems that use shared libs or dlls will have this problem as long as the packagers of some projects small and large open and closed do their jobs badly.
OpenOffice handles its stuff smoothly, acting and behaving like you would expect from any closed project. This is true whether you download the installer or the OpenOffice RPMs from Ximian, for example. It does not matter. Unless you go and update glibc on the fly, you are good to go.
Even with GnuCash’s laundry list of dependencies, the project includes, alongside its own package, all the libs that were updated from the distro’s versions.
There are bad examples like Gnumeric, where the project does not (as far as I can tell) even bother putting together RPM packages, installers or anything, and the thing has a zillion dependencies. In one way I can’t fault the project, because they are providing software for freakin’ free (lots of people get real critical of open source like they forget this). However, if you want a new version of this thing, wait for your distro to update, or use apt.
*smile*
I’m waiting for the “kernel panic” line to come out.
Listen, static linking is necessary and good in some cases.
Bad in others.
Your new version of your package now needs the functionality available only in version 2.3.1.1.1b of some damn system lib where almost all the distros are back on 2.2.
Static link the damn thing.
I wanted an XFT Mozilla. I wanted to pass it out in my office. I built a statically linked XFT version of Mozilla. Why? We use SuSE in the office. SuSE does NOT have XFT, not even 8.0.
This is fine. Opera wants to build a statically linked version of Opera so you do not have to have QT on the box. Fine. I think that is a GOOD thing.
Should every app link every lib? No.
Should packagers be forbidden to statically link? No.
Extremes are stupid and I think many projects nowadays act all religiously opposed to the idea of statically linking and I think that is as silly as could be. Sometimes it is the only way to ship a project without burdening the end user with updating 12 different packages just so they can get a new email program (Evolution).
But to statically link for every project and every lib is just as unnecessary.
Why does everybody have to think in extremes?
“Why does everybody have to think in extremes?”
You missed this part of my reply.
“Besides, the best plans of mice and men are laid foul by the slip of an individual’s fallibilities. (Discipline, people.)”
In other words, a lot of the problem isn’t with the tools or the methodology, but with the human element.
There has never been a blue screen on any of my W2K boxes either… so wake up, my friend.
I have never seen a blue screen on XP. So for me, they have got rid of it (for the largest part at least).
I have seen it many times. It’s that memory exception thing (I do not remember exactly). As far as I remember, one of my friends tried to transfer pictures from his Olympus camera and poof… BSOD. Then again the same thing, and again… (not every time, but too frequently).
> Eugenia:
> I have never seen a blue screen on XP
Lucky you!!! Do you ever do any software development under XP? If not then I can understand your position. XP seems a lot more stable to end users. Having said that, we had a Dell Laptop BSOD here the other day whilst stripping it down to be shipped back to a leasing company. That was under XP, and Pro too iirc.
There has never been a blue screen on any of my W2K boxes either… so wake up, my friend.
I am with wibble here on this one. I do software builds on a W2K box, lots of compiles and such. Any badly enough built piece of crap software (alpha stuff from my developers) can bring that system to its knees. Actually, my worst problem is the way NTFS handles file locking. There have been files locked after a build that I cannot remove without killing and restarting Explorer itself.
My bud working in the education sector has a program running off MS SQL Server on W2K, and under a heavy enough spike load it will BSOD every other time.
Other people I know have problems with odd hardware drivers and other issues.
I don’t honestly know about XP.
They have gotten a lot better.
If you are a developer and are able to make efficient use of the registry it can be used to your advantage and be a brilliant tool.
The only issues with the registry are when individuals try to make illegal copies of software onto another machine; in this way the registry is brilliant for software developers. Even BeOS PE had a registry entry.
But still, it’s possible to completely copy a piece of software if you know the registry entries. I’ve got a bad feeling about removing DLLs completely; I think M$ will probably try to shift the functionality into the registry. There needs to be better management of the registry… possibly by the OS, or some 3rd party developers.
The registry is a misbegotten hack. It’s not a fundamentally bad idea, having all configuration information in one place. However, the implementation sucks. It’s incredibly opaque and prone to error. It serves as a single point of failure for the machine. It makes it unnecessarily difficult to move Windows installations between machines. It jumbles together system configuration and individual application configuration. It’s impossible to edit without a special program.
Windows embeds the DLL version in the DLL; it is just that no one really respects it, and it is a pain in the ass when you need features from a newer version of a DLL that are not included in the older one.
Unix just sticks numbers at the end of the actual shared libraries for versioning, so that multiple versions of the same library can be installed on the same computer. Example: libGL.so.1, libGL.so.2, etc.
I never understood why Microsoft didn’t just follow Unix here, seeing that it would be the easiest thing to do. Shared libraries are great, and allow programmers to make changes to parts of programs without having to recompile the whole thing again.
“My solution to DLL hell? On ALL operating systems… Stop using them!”
So, every time you write an application, you want to have to compile all of the GUI and network code right into the application? Real smart. So we will have huge bloated applications that do not benefit from the changes that the rest of the applications on the OS get.
Here is a great Dr. Dobb’s article that dismisses the supposed advantages of DLLs and recommends static linking. As noted above, I think static linking is especially important when using third-party libraries, including/especially Microsoft’s MSVCRT and MFC DLLs. When you statically link, you don’t have to worry about whether the user has those DLLs, about installing those DLLs, or about whether the user has the wrong version.
“Windows DLLs: Threat or Menace?”
http://www.ddj.com/documents/s=909/ddj9875i/9875i.htm
So, every time you write an application, you want to have to compile all of the GUI and network code right into the application? Real smart. So we will have huge bloated applications that do not benefit from the changes that the rest of the applications on the OS get.
Okay Chris Parker, read the other thing I posted. I said that it makes sense to eliminate all 3rd party libs and keep the System (OS) libs available. These are really the only ones that you know for sure most apps will use and these will only change when you change your OS version. AND, if the OS is designed properly, nothing else will modify these libs.
I know there are a lot of posts here, but if you’re going to comment on someone’s posting, check to see if there’s more info related to it further down…
MS promised to end DLL hell with Windows XP. Before that, with Windows 2000. And even before that, with the transition from WinNT 3.x to 4.x.
Oh great. Microsoft is now going to go down in computing history for its innovation of the copy command. I can see the headlines: “Microsoft finds forward-thinking solution to software woes!” They will create a utility called System Utility Copying and Maintenance Environment, or SUCME.
Gee, I have been deploying *nix installs with tar and ftp for 15 years, but it’s only clever if MS does it.
Microsoft itself developed, supported and promoted dynamic link libraries as the desired way of code sharing. I can’t even imagine how some of the old COM objects could be used on future Win* platforms without them.
Even more, if you don’t buy Microsoft Visual Studio Enterprise Edition (C)(R)(TM)(etc. etc.) you are not able to produce standalone code. Please correct me if I’m wrong.