So, you want to use Windows 2000 in 2021? Well, you’ve come to the right place, although we’re not the only place you’ll want to keep handy. You’ll find some great tips, software advice, and know-how at the MSFN Windows 2000 Forums. Special thanks to @win32, who provided many of the pointers and suggestions used in this guide.
This place is a message… and part of a system of messages… pay attention to it!
Sending this message was important to us. We considered ourselves to be a powerful culture.
This place is not a place of honor… no highly esteemed deed is commemorated here… nothing valued is here.
What is here was dangerous and repulsive to us. This message is a warning about danger.
The danger is in a particular location… it increases towards a center… the center of danger is here… of a particular size and shape, and below us.
The danger is still present, in your time, as it was in ours.
The danger is to the body, and it can kill.
The form of the danger is an emanation of energy.
The danger is unleashed only if you substantially disturb this place physically. This place is best shunned and left uninhabited.
Why did I immediately recognize the message?
What I find fascinating about Windows 2000, despite its old-fashioned look (Windows 95/98 theme), is that it was pretty darn good (SP4 + patches). Quite as good as Windows XP SP3 with the “Classic” theme. Had it had multiprocessor and 64-bit support, it would still be considered a viable option today for many (even though it is not updated any more). Almost like Windows Server 2008.
https://www.youtube.com/watch?v=DjLcGbSzuPE
But what fascinates me even more is its desktop experience, which is still somewhat better than the Linux desktop experience, even 20 years later.
https://www.youtube.com/watch?v=TtsglXhbxno
(and I know there were many flaws in Windows 2000 too, like… printer and network drivers)
Win2k Advanced Server does support 8 CPUs… Win2k Pro, which is probably the minimum version anyone would have, supports 2…
Thom seems to have quite the double standard for old non-Windows computing like Amiga and whatnot, vs people running old Windows OSes.
Thom has a gripe against Microsoft management, whereas their technologies can be very good (because they hired some of the best people in their respective fields). Windows NT/2000/Win32 came out of VMS for a reason.
https://www.itprotoday.com/compute-engines/windows-nt-and-vms-rest-story
https://www.howtogeek.com/676095/remembering-windows-2000-microsofts-forgotten-masterpiece/
https://news.ycombinator.com/item?id=23529918
Agreed. Not sure why this contradiction exists. I presume he thinks that interest in Windows 2000 would only come from users who wished to run it full-time as a daily driver. However, with the OS being 21 years old, there may be younger folk who are interested in it as a historical artifact, much like BeOS is to Thom.
Windows XP is so much more suitable for modern HW because:
1) It supports more than 2 CPUs/cores
2) It has AHCI drivers
3) It supports the HD Audio codec and has relatively up-to-date Realtek drivers
4) It has decent VESA support
5) It has a native 64bit version
6) It boots a LOT faster (parallel services loading)
7) It can be made to look almost exactly like Windows 2000.
Both however contain outdated SSL support and are basically unusable for browsing the web out of the box.
Windows 2000 is more lightweight, like taking 40 MB of RAM after boot, perfect for “embedded” systems. XP takes around 180 MB on start.
Windows 2000 had a 64-bit version! It was awesome (as long as you had hardware with drivers) and to this day I think it was the best version of Windows. All the versions after had either crappy themes or garbage interfaces…
There was a beta version of 2000 running on Alpha. But 2000 was not a 64-bit system.
2000 was a 32-bit OS running on 64-bit-capable hardware during testing, just like NT before it. The release version of 2000 only supported 32-bit x86 code.
He is probably thinking of Win2K3; many of us used Win2K3 as a desktop OS and it frankly kicked serious ass. It was lightweight while giving you full 64-bit support for large amounts of memory (128GB of RAM IIRC), and it was rock solid stable.
I think you must be as drunk as the other fools that loved XP.
XP was the OS we did not want to use because we were running 2000. Some idiots stayed on 9X tech and used ME, and it was understandable. 2000 was out before ME and was better. Unless you clung on to hardware that was in dire need of a bin.
But more to your sins.
> 5) It has a native 64bit version
No, it had a tech demo that was never updated and made me go back to normal XP (as 2000 was no longer usable due to the DirectX upgrade not applying to it). As soon as I could I moved to MS’s first real 64-bit OS: Vista. No, it is not as bad as you recall. SP1 is pretty much 7. Well, 7 is Vista SP2, just because people needed a rename. (People place too much on “names”.)
And then you say:
> modern HW
Great I’ll use 11, then, failing that 10. Or does modern mean something different to you?
I’m a fool, nice. It’s the Internet right? Being anonymous and crap?
Considering the amount of spelling errors, looks like one of us cannot even use a spell checker. So much for insulting others and feeling righteous.
My “sins”? You’ve crossed the line here.
Sorry won’t bother replying to this. You have no manners, no respect and feel it’s OK to insult others.
It’s a sad fact that when people write like the commenter did, hardly anyone will read it. I read the first two sentences and bailed on the rest.
Windows 2000 has the same version of DirectX that Windows XP does (9.0c).
Yes, it supports multiple logical processors per physical processor, and is licensed by physical processors, so the practical number of logical processors is unbounded. Unfortunately it has some fairly inefficient I/O APIC behavior that means running any multi-processor VM configuration is very inefficient. 2003 does this much better.
Does it matter? I mean, would you actually run a real system on VESA? The good thing about VMs is the environments provide drivers that can run higher resolution displays, and if running on a real system, you’d want hardware drivers.
Except that’s 2003 claiming to be XP 🙂 And if you’re going to run 2003, you may as well run “real” 2003 if only because it has another year of updates and a few nice server features like multi-session RDP.
XP had boot prefetching, but parallel service loading happened in Windows 7. Honestly I’ve often found 2000 boots a lot faster – the prefetcher feels like it can optimize for a specific thing but degrades badly when you fall off it, so XP’s boot is either great or awful whereas 2000 is consistent.
Up to a point. It’s possible to turn off themes, turn the start menu into a classic mode…but it’s obvious this isn’t the environment that was being developed/tested/polished, so there’s plenty of aspects of it that just aren’t as clean as 2000.
I think the biggest benefit of XP’s VESA support was that it was much better at handling DOS games. Win2k wasn’t terrible – games using VESA graphics would still run, and VDMSound was available to provide SB16 emulation so audio in DOS games would work (XP included SB16 emulation for DOS VMs) – but DOS games could still be wonky in 2k, and were much less so in XP.
The Universal 9x/NT VESA driver by BearWindows is recommended.
I have a blazing fast Ryzen 7 5800X along with 64 gigs of fast DDR4 RAM running in dual channel mode and I can install both XP and 2000 into a RAM disk.
XP is a whole lot faster than 2000 at booting. XP here boots in a mere four (!) seconds (cold boot) in a standard configuration, i.e. I stripped nothing and didn’t disable any built-in services. 2000, on the other hand, takes at least 20 seconds to boot.
People are so full of myths and crap, and so unable to take 20 minutes to stand behind their words, that I have a strong desire never to open my mouth. Each point on my list is not an opinion; it’s a bloody fact which I can demonstrate and provide proof of, if needed, under a controlled environment.
Except, if you have a brain on your shoulders, then you’ll remember that pretty much all NT-based desktop OSes so far have had server counterparts and shared pretty much everything except server-specific components. Drivers for Windows XP and Windows Server 2003 were interchangeable. This has been the case for over 20 years now, and it started with probably NT 3.51, if not earlier.
So, claiming that “XP 64 was Server 2003 in disguise” is … I have no mild words for that, only anger and resentment.
Right, that was my experience too. It’s what happened after I installed drivers and updates that it fell off a cliff – on the system I was using at the time (Athlon64, rotational hard drive), XP clean install boot took around 20 seconds, 2000 took around 35, and XP after drivers and updates took 50.
People have different experiences. One person having one experience does not invalidate a different person having a different one. I’m glad you had a great experience with XP.
(Would you want to live in a world where we all had the same experiences? Wouldn’t it be a tad dystopian? But if we don’t have the same experiences, how should we interact?)
…except this one. NT 3.1, 3.5, 3.51, 4, 2000, Vista, 7, 8, 8.1, and 10 all shared builds between client and server. The kernels were the same, and the same service packs applied to both. Windows XP was the odd one out, where the client release happened in 2001 as 2600, then an extra 18 months went into the server release, which arrived in 2003 as 3790, and the two never again converged. This is why there’s an XP SP3 from 2008, but XP 64 only has SP2 from 2007. Because these were different source trees with different backports, they didn’t always get the same updates.
There’s one more release like this – the recently released Server 2022 (20348) is a build not shared with a client release.
They were highly compatible, that’s true. But it’s also true that XP and 2000 drivers were highly compatible. The delta between 2000 and XP is very similar to the delta between XP and 2003 – it’s about the same amount of development time. From a kernel point of view, things like the I/O APIC changes are quite visible to drivers – see things like KeEnterGuardedRegion, which necessitates things like KeAreAllApcsDisabled, which must be called for certain 2003 drivers to be correct but doesn’t exist in XP at all, so the driver needs to probe for it dynamically.
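For anyone curious what that dynamic probing looks like, here is a minimal sketch of the general technique in plain C for a WDM driver. MmGetSystemRoutineAddress, KeAreApcsDisabled, and KeAreAllApcsDisabled are the only real kernel names here; the typedef and wrapper names are my own, and the XP fallback is an approximation of the 2003 check, not an exact equivalent.

```c
#include <ntddk.h>

/* Sketch: resolve KeAreAllApcsDisabled at load time so one binary can run
 * on XP (where the export does not exist) and on Server 2003+ (where it
 * does). Wrapper/typedef names below are illustrative, not DDK names. */
typedef BOOLEAN (NTAPI *PKE_ARE_ALL_APCS_DISABLED)(VOID);

static PKE_ARE_ALL_APCS_DISABLED g_pKeAreAllApcsDisabled;

static VOID ResolveOptionalKernelExports(VOID)
{
    UNICODE_STRING name;
    RtlInitUnicodeString(&name, L"KeAreAllApcsDisabled");

    /* MmGetSystemRoutineAddress returns NULL when the export is absent,
     * which is exactly what happens on XP. */
    g_pKeAreAllApcsDisabled =
        (PKE_ARE_ALL_APCS_DISABLED)MmGetSystemRoutineAddress(&name);
}

static BOOLEAN AreAllApcsDisabledCompat(VOID)
{
    if (g_pKeAreAllApcsDisabled != NULL) {
        return g_pKeAreAllApcsDisabled();   /* 2003+ path */
    }
    /* XP fallback: KeAreApcsDisabled only covers part of what the 2003
     * routine checks, so this is an approximation. */
    return KeAreApcsDisabled();
}

NTSTATUS DriverEntry(PDRIVER_OBJECT DriverObject, PUNICODE_STRING RegistryPath)
{
    UNREFERENCED_PARAMETER(DriverObject);
    UNREFERENCED_PARAMETER(RegistryPath);

    ResolveOptionalKernelExports();
    DbgPrint("All APCs disabled: %d\n", AreAllApcsDisabledCompat());
    return STATUS_SUCCESS;
}
```

The point is that the probe happens once at load, so the hot path pays only a pointer check rather than a version check.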
I’m really sorry you feel that way. I appreciated your post, and agree that it’s quite informative. No post can convey all of the facts of the universe, and it’s quite possible to add facts to earlier facts. I certainly didn’t mean to be disrespectful with my post, and I’m sorry if it came across as dismissive.
XP had its ups and downs.
Its memory manager was optimized for RAM constrained systems (256MB back in the day). It would swap out the current program on task switch, even when memory was available.
(Cannot find references anymore, with modern Google/Bing, my search skills have deteriorated).
It is also the last version before the major driver ABI changes in Vista. Thus, modern hardware has limited support for those systems. (Anything newer than the GeForce 600 series does not seem to have Windows XP drivers.) So, practically nothing from the last 10 years in terms of video or audio.
Not to mention terrible security. I remember getting my Windows install hacked during a clean install. They did not even wait for me to reach the final desktop before injecting rootkits over the network. (Yes, I know, my fault.)
There are some legitimate uses for old versions, like legacy system support, or just nostalgic curiosity. But all those other benefits can be replicated with a modern Linux desktop with a proper Wine/Proton stack.
Even including the looks: https://www.xfce-look.org/p/1118738
Windows 2000 allowed me to develop games software without having to reboot every half hour, while also being a user-friendly experience that let me play retail games too. I was never a fan of Windows XP and only moved to it when forced obsolescence kicked in. Produce an OpenGL driver for Windows 2000 and pretty much every current game would run on Windows 2000.
If the developer has built the graphics pipeline in a scalable way (which they should do every single time) you don’t need half the nonsense on graphics cards. There’s very little reason to make a game only playable on a ninja machine. I’ve even seen 2D games run like slugs because of this. There’s also the design aspect, as well as creative assets, which play into the need for a ninja machine. Developers’ unconscious biases fill out the design space until they bring computers to their knees when there’s no real need to. That said, I remember other game developers “back in the day” saying XYZ wasn’t possible “just because”, so they imposed artificial limits on their design vision.
Here’s one thing: not one single game engine today employs full scalability. Not one can scale from a sprite to a fully rendered and animated model with leeway for 8K content. If you take a really close look at how parallax and distance and perception work, you can see they are chewing up processing power to render street scenes. There’s a long list of other stuff but I’m just focusing on the one illustrative example. They also create a hard floor on processing power which isn’t necessary.
Even database programmers and business application programmers miss optimisations. You don’t have to recalculate everything every single time one variable changes…
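As a toy illustration of that last point, here is a hedged sketch in C (the invoice fields and names are made up): cache the derived value, mark it dirty when an input changes, and only recompute on demand instead of recalculating everything on every change.

```c
#include <stdio.h>

/* Dirty-flag caching: recompute a derived total only when an input
 * has actually changed. All names here are illustrative. */
typedef struct {
    double unit_price;
    int    quantity;
    double tax_rate;

    double cached_total;   /* derived value */
    int    dirty;          /* set when any input changes */
} Invoice;

static void invoice_set_quantity(Invoice *inv, int qty)
{
    if (inv->quantity != qty) {
        inv->quantity = qty;
        inv->dirty = 1;     /* mark stale instead of recomputing now */
    }
}

static double invoice_total(Invoice *inv)
{
    if (inv->dirty) {
        inv->cached_total = inv->unit_price * inv->quantity
                            * (1.0 + inv->tax_rate);
        inv->dirty = 0;
    }
    return inv->cached_total;   /* cheap on every later call */
}

int main(void)
{
    Invoice inv = { 9.99, 3, 0.20, 0.0, 1 };
    printf("total: %.2f\n", invoice_total(&inv));   /* recomputes once */
    printf("total: %.2f\n", invoice_total(&inv));   /* uses the cache */
    invoice_set_quantity(&inv, 5);
    printf("total: %.2f\n", invoice_total(&inv));   /* recomputes */
    return 0;
}
```

The same dirty-flag idea generalizes to whole dependency graphs, but even this one-flag version avoids recomputing on every read.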
Apart from video (or games which I don’t play) there isn’t a single task I do today which couldn’t run on a Windows 2000 class machine. Not one. Bloated web pages which don’t add any meaningful value don’t count.
HollyB,
I’d say it depends on what you mean. Games tend to scale better than other types of GUI applications because they can just scale the entire canvas without having to worry about the complexities of “reactive design”. As long as the hardware is capable of driving the pixels you can often just up the resolution and it will work. That said, many old games don’t look great scaled up because they used low resolution textures and low polygon models.
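To make the “scale the entire canvas” idea concrete, here is a minimal C sketch under assumed numbers (a 640x480 virtual resolution and illustrative struct names): game logic stays in virtual coordinates, and a single uniform scale plus a letterbox offset maps them to whatever the display provides.

```c
#include <stdio.h>

/* Resolution-independent rendering: fixed virtual canvas, one scale
 * factor per frame. The 640x480 virtual size is an arbitrary example. */
#define VIRTUAL_W 640.0f
#define VIRTUAL_H 480.0f

typedef struct { float x, y; } Point;

/* Map a point in virtual (game) coordinates to physical pixels,
 * preserving aspect ratio with letterboxing. */
static Point to_screen(Point p, int screen_w, int screen_h)
{
    float sx = (float)screen_w / VIRTUAL_W;
    float sy = (float)screen_h / VIRTUAL_H;
    float scale = sx < sy ? sx : sy;            /* uniform scale */
    float off_x = (screen_w - VIRTUAL_W * scale) * 0.5f;
    float off_y = (screen_h - VIRTUAL_H * scale) * 0.5f;

    Point out = { p.x * scale + off_x, p.y * scale + off_y };
    return out;
}

int main(void)
{
    Point center = { 320.0f, 240.0f };          /* virtual center */
    Point a = to_screen(center, 1920, 1080);    /* 1080p display */
    Point b = to_screen(center, 7680, 4320);    /* 8K display */
    printf("1080p: %.1f,%.1f  8K: %.1f,%.1f\n", a.x, a.y, b.x, b.y);
    return 0;
}
```

Of course, upscaling the canvas this way does nothing for low-resolution textures or low-polygon models, which brings us back to the asset problem.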
One way to address these limitations would be to procedurally generate resources including textures at arbitrary scales. But I think the industry is actually approaching diminishing returns with textures that are already high enough quality such that increasing resolution doesn’t noticeably increase visual fidelity anyways. It’s not like the old days when each new generation of hardware resulted in drastic improvements to output.
Do you have something specific in mind?
Obviously it depends on what you’re using the computer for. 2005-2010 era machines can still be quite usable for some productivity software, though I would replace the hard disks at least. The web has gotten much more bloated and I don’t relish using older machines to browse it today. Also, apart from low resolutions, my computers from that era had trouble playing videos… this would undoubtedly cause problems for some. As a developer, though, older computers can be downright painful. I still support software written in Visual Studio from the 90s. VS2003 was quick and snappy, and as the company upgraded VS every few years things got slower and slower. Now you need a monster machine just to have a reasonable experience building software from the 90s. It really is insane, haha.
Speaking as a game developer who actually designed and coded for this to somebody who has never been a game developer it would be better if you just took what I wrote as read.
There’s a lot in what I said covering everything from high level to low level design decisions, forward planning, backwards compatibility, performance, artistic choices and asset creation and asset management, engine reuse and code maintainability, and project management, and a lot of other things.
You aim for this or you don’t. I couldn’t care less what choices other developers make because that whataboutery can go on forever.
Procedurally generated content, unless you know exactly what you are doing, is utter garbage for a lot of reasons, beginning with artistic integrity and the psychological responses of users, through asset quality, to qualitative evaluation of the end product. It’s as garbage as the iPhone 13 camera…
I have no comment to make on post 2005 Microsoft compilers other than coding in Textpad (or equivalent) and clicking the button for compile wasn’t sluggish. And yes because my portability layer was actually useful I could compile with Borland C++ Builder too (which was miles faster than VS for compiling) and any other compiler or compiler version number or SDK I had installed. I never used the Visual Studio IDE because I coded to Win32 or had my own tools. Post Visual Studio 6 I never bothered with VS online help and used Google search engine instead. It was and still is from what I hear 100 times better.
Just for once can you find it anywhere in yourself to go “hey that’s cool” and actually get it?
I also hate useless nosey questions you should know yourself or find easily in comments on The Register if you bothered to read them. It’s just another form of control I almost always see coming from the direction of men and it gets tiring fast. If you need to know that badly do your own learning, or just fess up you’re wasting my time and energy.
HollyB,
I know much more about graphics than you are assuming and you aren’t making it clear what, if anything, we disagree on…?
You didn’t go into any of those details, but I encourage you to do so!
I think procedural generation of content has a lot of merit. Even back when I did graphics demos in DOS I created some cool procedurally generated fire effects that were more fluid and variable than what you could get with static texture mapping or even 2D video, and with far less disk space. As sophisticated as manual 3D modeling has gotten, the result is often “canned” motion and graphics that tend to get repetitive. Real-world textures, bump maps, and motion are imperfect and can benefit from procedurally generated sources. Advanced 3D graphics packages like Blender today include support for procedural shaders, and they can improve quality without increasing model complexity and file size. I think there could still be room for improvement by modeling physics in ever smaller detail, with objects defined by physical simulations rather than models, especially for things like cloth and even mechanical devices. Procedurally generated terrain can also produce good results with lots of variability.
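For readers who never saw those demos, here is a rough sketch of that classic fire effect in C. The buffer size, cooling constant, and text-mode “palette” are my own illustrative choices, not the original demo’s values.

```c
#include <stdio.h>
#include <stdlib.h>

/* Classic fire effect: seed the bottom row with random "heat", then each
 * cell becomes a slightly cooled average of the cells beneath it, so the
 * flames appear to rise. Rendered here as ASCII instead of VGA. */
#define W 80
#define H 25

static unsigned char heat[H][W];

static void fire_step(void)
{
    int x, y;

    /* Seed the bottom row with random heat. */
    for (x = 0; x < W; x++)
        heat[H - 1][x] = (unsigned char)(rand() % 256);

    /* Propagate upward: average the three cells below, then cool. */
    for (y = 0; y < H - 1; y++) {
        for (x = 1; x < W - 1; x++) {
            int sum = heat[y + 1][x - 1] + heat[y + 1][x] + heat[y + 1][x + 1];
            int v = sum / 3 - 4;                 /* 4 is the cooling rate */
            heat[y][x] = (unsigned char)(v < 0 ? 0 : v);
        }
    }
}

int main(void)
{
    const char *shades = " .:*#@";   /* crude "palette" for a text render */
    int frame, x, y;

    for (frame = 0; frame < 3; frame++) {
        fire_step();
        for (y = 0; y < H; y++) {
            for (x = 0; x < W; x++)
                putchar(shades[heat[y][x] * 5 / 255]);
            putchar('\n');
        }
        putchar('\n');
    }
    return 0;
}
```

The appeal is that a few dozen lines and no stored assets produce motion that never repeats exactly, which is what made these effects so cheap on disk.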
BTW I’m really not saying this to be difficult, I’m saying it because I think it’s a cool topic to dig into.
I do think it’s cool you worked on games.
That’s not reasonable. I asked you a question specifically relating to what you said. You are the only one who can answer because you are the only one who knows what you had in mind. If you provide a link to the specific comments you are referring to, then I will be happy to read them.
To be clear, you’re referring to the IDE, not the compiler, right?
The compiler has gotten slower but nowhere near at the rate of hardware getting faster. Looking now, Visual C++ 6 compiles my project in 7 seconds; 2022 takes 22 seconds. That’s a 3x regression, but hardware in the last 23 years has improved way more than that. Today I work on 90s code on very low end machines, because today’s dual core Atom can compile 90s code much faster than the highest end devices on the market in 2000. Obviously I’m not running the IDE on an Atom though.
Personally I ended up avoiding Windows C/C++ code in the mid-90s just due to performance. There was a time when asking Visual C++ to build its default (aka template) AppWizard project took over a minute. I really don’t know how developers got anything done – compiler performance was one of the main reasons for Borland being so successful, until compiler performance stopped being a major problem.
malxau,
A bit of both but yes I’d say the IDE has fared far worse in terms of resource consumption.
The counter argument is that the IDE is more sophisticated, has more features, or whatever, but given that it’s the same project and not actually using more features, the overhead and requirements to work on the same project today are so much higher than they need to be. I shouldn’t be surprised though because this is the new normal.
Yeah, C/C++ are flawed languages. Include files are a poor substitute for the proper interfaces that most other languages have. This often leads to bad compile-time performance due to the frequent need to re-parse files. C declarations are very sensitive to ordering issues, which is just archaic after using other languages. The sizes of integer data types are not well defined and can cause frustration when porting code.
Untyped macros are bad practice and yet remain common in C code. Sometimes macros are unavoidable because C “constants”, which are properly typed, cannot be used as constant expressions… ugh, what a stupid limitation! Visual C/C++ and Win32 take this bad practice to a new level, using macros everywhere, even in reference to function names.
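A small example of the constant-expression limitation being complained about, assuming pre-C23 C (the identifier names are illustrative): a const-qualified object cannot size a file-scope array, while a macro or an enum constant can.

```c
#include <stdio.h>

#define BUF_SIZE_MACRO 16          /* untyped macro: works, but no type */
enum { BUF_SIZE_ENUM = 16 };       /* int-typed alternative that is a
                                      genuine constant expression */
static const int BufSizeConst = 16;

/* In C (unlike C++), a const-qualified object is not a constant
 * expression, so it cannot size an array at file scope: */
/* static char bad[BufSizeConst];     -- does not compile in C89/C99 */

static char ok_macro[BUF_SIZE_MACRO];   /* fine */
static char ok_enum[BUF_SIZE_ENUM];     /* fine */

int main(void)
{
    printf("%d %zu %zu\n", BufSizeConst,
           sizeof(ok_macro), sizeof(ok_enum));
    return 0;
}
```

The enum trick is the usual workaround when you want a typed constant without reaching for the preprocessor.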
A lot of people do recognize the flaws but like me have to keep coming back because of its dominant position.