Grizzled Unix vet Paul Venezia tips his cap to the Windows Server crew, suggesting that the lessons of Unix history have not been lost on Microsoft — and that’s one reason why Windows Server has become so complex. ‘The Windows Server of today has more in common with Unix than many people want to admit. The upside: more stable servers, greater scope of services, better adherence to standards, and Microsoft’s newfound willingness to work with its competition. The downside is that Windows has become more complex than Unix from a management and administration point of view,’ Venezia writes, even if he still sees some Windows admin practices as prime examples of how not to administer servers.
What a bunch of malarkey, yes you heard me, malarkey.
Been saying it for years.
“Given enough time and money, Microsoft will eventually invent UNIX….and call it innovative.” – me.
And I’ve been saying this for years
Microsoft has always tried to build the software people needed at the time. They didn’t pretend to have perfected the OS in 1970.
And by and large, they succeeded.
That meant heavy tie-in with the graphics stack back when GUI speed was important on low-end computers.
That meant DOS integration with further versions of Windows.
That meant low/no security in desktop versions to make it consumer friendly back when most home computers were not networked or connected to the internet.
And as the world’s needs increased and computer horsepower rose, they put many things in, always trying to maintain backwards compatibility.
I remember having to write a utility program to run on some lab computers at university. They had old computers that they wanted to use to connect to some oscilloscopes. We’re talking 386s and 486s. I did it in Qt, but I couldn’t find a usable version of Linux that ran well on those machines. The GUIs were ghastly slow. Win95, though, did the trick.
If you wonder why Windows did so well as opposed to Unixy systems, it is for reasons like this. They got things to work. Meanwhile, a Unix person would probably have said: we can only get it to run this fast due to X Window, message passing… blah blah.
I read an article somewhere about MS actually writing custom memory-handling code to cope with buggy software like SimCity so it wouldn’t crash.
I have little doubt the folks at MS could have done things like Unix. They have always had many technically smart people there. Yet, they chose to do their own thing to make their product succeed by designing it for users at the time and getting their own API for lock-in.
I just don’t buy the attitude that MS was ‘stupid’ and couldn’t figure out how to build an OS, and that eventually they’ll learn all the lessons and become a Unix.
Rather, I think they knew all about Unix. They saw that it didn’t fit the current customer needs nor their own business model, so they did something else.
They’ve been very successful at it. And now as the world moves towards servers, thin clients… MS is once again adapting… providing the ‘server’ like behaviors that have been in Unix since day one.
Who knows if they’ll succeed or if their old baggage is just too much.
Until recently, MS did not know how the Windows kernel actually worked, according to Windows kernel expert Russinovich. No one knew what was going on inside the Windows kernel. I don’t know if this is still true, but Russinovich said MS was working on this in 2010, hoping to be finished in 2011. I don’t know if anyone yet knows what is going on in the Windows kernel. It is a total mystery to everybody. The Windows kernel is huge, maybe 100MB or so, a black hole. Unix kernels are typically a couple of MB.
Regarding SimCity: yes, MS worked hard to keep backwards compatibility, but failed big. For every new release of Windows, there is a huge list of software that does not work with the latest Windows and needs to be rewritten. Just look at MS’s own homepage; there is usually a list of software that does not work with the latest Windows.
So, many say Windows is a mess and is not fit for demanding, large-scale enterprise work. There is a reason the London Stock Exchange threw out their stock system (running on Windows) and installed Linux instead.
Nothing stated is especially Unixy. Yeah, WS has been getting better; that doesn’t mean it’s more similar to Unix.
When I could redeploy any of my WS servers from scratch, from a version-controlled registry and full backups, in a few keystrokes, without the pain in the ass of BSODs because of disk controller changes, reactivation nonsense, a cluttered registry, and so many more stupid little things, then it would feel a bit more Unixy.
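On the Unix side, that workflow is almost boring. A rough sketch, assuming the etckeeper tool (which keeps /etc in git) is available through your package manager:

    etckeeper init                          # put /etc under version control
    etckeeper commit "known-good baseline"  # snapshot the current config
    # on a rebuilt box: reinstall packages, restore /etc from the repo, reboot

Not bulletproof, but it beats re-clicking through a hundred dialogs.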
I always thought that is why virtualization is so popular: because Windows is such a mess.
If it wasn’t for Windows, we’d be using a container technology (OpenVZ, Solaris Zones, Linux Container, Linux VServer) because that just makes a lot more sense in most cases.
Because virtualization is about consolidation of resources and efficiency. Well, if you want separate ‘machines’ running on the same hardware, there is nothing more efficient than just creating separate namespaces like these container technologies do.
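OpenVZ makes the point nicely. A quick sketch (the container ID and template name here are made up; use whatever templates your host actually has):

    vzctl create 101 --ostemplate some-distro-template   # new container from a template
    vzctl start 101
    vzctl exec 101 ps aux    # the container sees only its own processes

No emulated hardware and no second kernel, just a walled-off view of the same machine.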
Virtualization is popular because it increases portability and security.
Both *nix and Windows seem like a mess if you don’t read a good book first.
Virtualization does not increase security; that is not true. It adds complexity and more code, thus more bugs, thus less security.
The stability part bothers me the most. Having worked with almost 15 different UNIX/Linux flavors, I simply can’t agree.
How would you know if that 15 years didn’t include Windows?
Sorry, where did you read “15 years”?
It probably bothers you because you hold a dated view of Windows Server.
Windows Server 2008 is a refined OS. There are plenty of web hosts that guarantee 99.9% uptime with it.
There was even a Windows 2000 server that went over 2 years.
http://www.neilturner.me.uk/2003/01/21/33rd_longest_uptime_windows_…
Oh man, does this mean that the Windows guys are going to start this pointless uptime penis-measurement thing too now?
No, it means that the derisive attitude towards Windows Server, as seen in the parent’s comment about stability, is unfounded.
When SLAs are measured against this at least in part, they really have no choice.
Yeah, I heard it too! There was one Windows server that served some basic web sites, and it had an uptime of a couple of years! This really shows the stability of Windows, and how it is fit for highly demanding large Enterprise servers such as London Stock Exchange.
Also, I can add that I know of one Windows server that did print jobs and it has an uptime of several years today. This is obvious proof that Windows is fit for large demanding Stock Exchange Systems and other Enterprise tasks.
(Your argument is flawed)
All he was saying is that Windows Server is usually not thought of as being as good as Unix, and uptime is one indicator of server stability.
We have one Web server that can deal with 100% CPU load at peak periods and it doesn’t fall over.
The London Stock Exchange’s problems probably had more to do with the fact that it was running on SQL Server 2000, which is far slower than newer versions, than with anything else.
I can run numerical computations at 100%; if the computer runs without crashing, it proves nothing.
The major problems with the LSE were that it crashed a lot and that the latency was too high. Exchange systems need low latency. I’ve heard from several sources that Windows’ latency is too high (something to do with the TCP/IP stack).
The only thing I think we have found out is that you have an attitude problem.
The article talks about trying to find the right dialog, settings, command, etc.
I’m sorry, but Unix-admins don’t need to search (by hand), they have enough tools available to find what they need.
They can even use find in /etc if they don’t know what file they should change.
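For example (a sketch; the setting name is made up):

    grep -rl "MaxClients" /etc          # which file under /etc mentions the setting?
    find /etc -type f -name "*.conf"    # or list candidate config files by name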
You forget about /usr/local/etc.
That’s the thing about Unix… it is good at what it does, but it’s a relic in many ways, and hard to get a handle on. For example, I know just enough about Unix to know that a lot of important stuff is in /etc. But why the hell is it called /etc? Actually, that’s a rhetorical question, because I’m sure it makes perfect sense if you explain it in a historical context, such as ls being shorthand for ‘list’. But it’s about as intuitive as the Chinese alphabet. If common sense were applied, you’d think /etc would be the LAST place to look for anything important. But that’s what Unix is… the exact opposite of common sense.
When I first started playing around with Linux, I had no idea what the hell to do, because ‘help’ didn’t work; I had to buy a book to figure out what ‘man’ was.
In some respects, Windows is better because if you don’t know what the hell you’re doing, at least you can sit down in front of it and poke around in dialog boxes and stuff. I mean, you’re gonna be like Ray Charles in a strip club… just trying to feel your way through, but at least you can probably get to what you’re looking for.
Of course, Windows is far from perfect either. I guess my point is that they’re both a huge pain in the ass, and it’s a crying shame that these are the only two real options that we have.
Well, we’re talking about sysadmin work there.
If you have to poke around and click haphazardly to find out how to set up your server, you should probably stop immediately.
Administration of a server is exactly the kind of situation where you should RTFM. There is no doing it otherwise.
“Trial & error” is not how a sysadmin should work. 🙂
In relation to RTFM: UNIX administrators traditionally know certain terminology; they know how things are named, even if the name on system A differs from the name on system B. The same is true for locations of files, and even for maintenance procedures. It’s their job to know them, and the differences.
In “Windows” land, you can’t apply established terminology, as MICROS~1 invented its own words for things. Technical terms are standardized even among totally different operating systems; in “Windows”, they are replaced by abstractions (“symbol for”, “represents a”, “is a picture of”). Procedures and locations change with every release of “Windows”; sometimes there is a “legacy way”, sometimes not. The picture-centric interface emphasizes what things look like instead of what they are and therefore what they are called. A pictorial search cannot be automated, like “dear computer, find me the icon with the blue ball and the upside-down letter ‘i’ on a yellow towel with a bunny next to it”. The only search that works is the “pop-out phenomenon” of visual recognition, which has nothing to do with the cognitive concepts that make you remember and locate a certain word. Of course this pictorial approach is a no-go for blind system administrators, as they rely on information presented as text, either by synthetic voice output or Braille readouts.
A good sysadmin would follow the advice “think first, then act”, so poking around with a stick in a pile of garbage simply looks wrong. 🙂
Anyone remember ‘ancient windows server history’ about…
Skype??
AT&T??
HotMail??
and about 4 other large events in the last year??
Knocking down Massive PRIME customer business…
THAT is a Feature Microsoft ‘diners at the table’ from the media fail to mention…
Editable Text Configuration
Interesting. Has it always been the case? I remember having /etc/GETTY, /etc/wall, /etc/upkeep, /etc/adduser, /etc/mount or /etc/fsck, as well as the important ones /etc/INIT and /etc/rc (on a UNIX System III compatible) in the 80s… Per its manual, /etc was a directory for additional system-level tools (as you can see from the examples), carrying the name “et cetera”.
But I like the explanation above. 🙂
Why is rather unimportant. All you need to remember is that configuration files are in that directory. Equally non-obvious naming conventions can be found on any platform, including Windows.
True but in my experience *nix is less of a pain in the ass. As an example, just yesterday I had to create some scheduled jobs on Server 2003. Imagine my surprise when I found out that it is *IMPOSSIBLE* to set a scheduled task to be run by the System user using the GUI. It can only be done using the schtasks cli command. Wtf?
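For the record, the working incantation looked something like this (the task name and script path are made up, and the flags are from memory, so check schtasks /? before trusting me):

    schtasks /create /tn "NightlyCleanup" /tr "C:\scripts\cleanup.cmd" /sc daily /st 02:00:00 /ru SYSTEM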
In my personal experience, Windows (and Windows apps) works the opposite way to Linux. Initially you feel great because it’s all so easy and simple to do. Click there, check a checkbox here, wooooh! done already. Over time, though, the more you work with it the more frustrating it gets, as you discover more and more shortcomings and inflexibilities, and the insane complexity of the registry starts to wear you down.
*nix, on the other hand, feels overwhelming at first and you’re not really sure wtf you’re supposed to do and how. Where the hell do I configure network interfaces and what the heck is all this stuff in /var about? However, the more you work with it the easier it gets and the more you come to appreciate the flexibility and simplicity.
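(Case in point: on a Debian-style system, “where do I configure network interfaces” turns out to be one readable text file, /etc/network/interfaces. Roughly, with made-up addresses:

    auto eth0
    iface eth0 inet static
        address 192.168.1.10
        netmask 255.255.255.0
        gateway 192.168.1.1

Once you’ve seen one such file, you’ve seen most of them.)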
This is so true. I inherited a WS 2003 install a year or two ago.
It is so frustrating wasting time trying to find the right GUI screen on WS; it’s tedious in the extreme even when you have an idea of what you’re doing. It constantly winds me up.
I’ve gradually migrated services away from the box with a view to decommissioning it, but I fear I may never be rid of it; I dread the idea of trying to shift years’ worth of shared folders and user setup.
You could get Server 2008 and run without a GUI at all.
The OP isn’t talking about GUI vs. non-GUI; he’s talking about Unix being a really elegant way of designing a system by not overdesigning it. In Windows, settings reside in the registry, INI files, and binary configuration files, and are scattered about the filesystem without any sense of organizational structure.
The Unix *culture* on the other hand is that of building on top of existing infrastructure. Unix has a fairly well designed and simple file system, so you just put conffiles in /etc, just like everybody else does. Of course there’s the odd misfit who doesn’t play nice, instead scattering their crap around in /opt/vendor/package or /usr/package – those are then ridiculed in the Unix culture, rather than glorified.
I’m pretty sure he wasn’t saying that either. No, I interpreted it as “I don’t know my way around windows server and I’m having to do some work to figure things out therefore it sucks.” Funny thing is, there’s probably some windows server admin saying the same thing right now about a [insert unfamiliar system] they inherited responsibility for.
Allow me to quote from the article “Nine traits of the veteran Unix admin”:
Veteran Unix admin trait No. 8: We know more about Windows than we’ll ever let on
Though we may not run Windows on our personal machines or appear to care a whit about Windows servers, we’re generally quite capable at diagnosing and fixing Windows problems. This is because we’ve had to deal with these problems when they bleed over into our territory. However, we do not like to acknowledge this fact, because most times Windows doesn’t subscribe to the same deeply logical foundations as Unix, and that bothers us.
*** end quote ***
From this idea, I would assume that a capable UNIX admin is able to gain knowledge of ANY operating system, because he is able to learn new things, to abstract, to conclude – those are the basic means of his everyday work. On the other hand, a “Windows” admin may not be able to do so, as the mentioned abilities are usually not required for his work and do not conform with MICROS~1’s way of doing things. Searching for logical structures in a system that does everything in an arbitrarily designed way won’t work.

I also found from my own experience that UNIX admins don’t mind learning new things, as this is also part of their job, and they are usually able to put newly adopted knowledge to use very fast; the “Windows” admins I met insist on doing things the one way they initially learned (although they traditionally claim they didn’t have to learn anything, which is untrue), and they refuse to change things. Those patterns of behaviour can be compared to the academic type, who uses scientific approaches and seeks to minimize manual work, versus the consumer way of buying ready-made solutions that can be used out of the box, requiring virtually no knowledge of how they work internally. This approach of course involves either higher costs or more manual work. Maybe this is due to the fact that they needed much time to get things working? Manually? Or it’s caused by business environments that have to pay attention to legacy, as they’re still using outdated systems that will lose the manufacturer’s support in… 2014 I think… and still don’t see any need to take a new view on things – a catastrophe waiting to happen. Then it will be the UNIX admins keeping their stuff on life support. 🙂
Please don’t get me wrong: This is just my very individual observation and conclusion which I do not claim to be valid and mandatory everywhere. And I didn’t want to say that handcrafting is any bad. A quite inaccurate comparison, I know…
It is pretty easy to see that one should take notice of what Windows Server does, just from the fact that it is ridiculously successful, which is a relatively recent phenomenon. Windows Server has grabbed most of its market share in just the last 10 years, at a time when Linux had a huge head start with a great pricing advantage, and against UNIX as the established leader.
Then there are things like Windows Server running on 5 of the supercomputers on the top500.org list. Certainly a very small share, but who could have imagined even a single Windows machine in the top 500 supercomputers of the world 10 years ago? Overall it is easy to go along with the pervasive idea that Microsoft has done nothing but live off their desktop/office monopoly for the last 15 years, but the Windows Server side is really a rather recent success story.
The technical reasons for this development aren’t exactly crystal-clear to me. The UNIX vendors clearly failed because they were useless dinosaurs, ridiculously overpricing specialized hardware and, to a great extent, archaic software stacks (HP to this day sells the compiler suite for HP-UX for thousands of dollars, and makes debugging without it artificially difficult). I would suspect it is hasty to assume that Windows Server succeeds only through business strong-arming, however; there is always a lesson to learn from every newcomer.
You have to be careful when talking about Windows Server market share, as there are very different and distinct markets. Microsoft isn’t only the OS provider but also one of the main application providers for its servers. It’s not easy to tell whether the popularity of the server is due to the design of the OS or the applications that sit on top. If Microsoft offered Exchange, Active Directory, SharePoint, or IIS for Linux or other non-MS operating systems, then we could attribute the OS’s popularity to its design alone.
It is quite simple, really.
The secret of Windows Server is that server software written in Visual Studio runs only on Windows Server unless the developers made special efforts to target something else. At the end of the day, it matters more to businesses what the software on the server enables them to do than what OS the system administrators prefer.
One of Microsoft’s key secrets to success has always been that they’ve gone out of their way to please software developers by virtually giving away all development tools so they can make the money back on that server it has to run on. Just imagine what Oracle or SAP would charge for access to MSDN.
No, it is not that simple.
Neither Linux nor Unix vendors have cared about the small and medium business markets, or about working to make it easier to integrate Unix services into Windows networks. Samba has been underfunded for years, and companies like Novell don’t seem to care.
MS keeps improving software like SBS, Exchange, SharePoint, and AD, while companies like Red Hat continue to focus on big Unix migrations.
Red Hat has even stated that they aren’t trying to convert MS shops.
You chose the one company in the world that generates hundreds of millions of dollars from selling the only commercially viable alternative to Microsoft’s directory/network services to criticize for not funding an open source alternative to those same services. NetWare is slowly dying on its own; the last thing Novell needs to do is accelerate the process.
And despite that, the lead Samba developer was in fact a Novell employee. At least until he quit and went to Google, after the whole Microsoft deal thingie.
I don’t disagree with your sentiment, but really.
The title of the original article is “Why the Windows Server crew deserves respect, too” which is not the same as Windows Server deserving respect. In fact, nowhere in the article is it said that Windows Server deserves respect. What the article does say is that Windows Server *admins* deserve respect. You know, just what the title implies that the article is about.
I believe that, but only because you have no control. A lot of the things I am presented with in a Windows environment are usually driver issues. The other issues are usually not technical; they are mainly cost and licensing impacts in a disaster recovery context, which is brutal with closed proprietary software.
UNIX/Linux systems are complicated to administrate because you have much more control, and therefore many more possibilities to contend with.
Particularly if it is LINUX.
If you are a good admin, you wield LINUX using source code as well as knowledge of the OS and administration software.
-Hack
Yea, it’s too bad that when you turn on Windows Server it starts setting up websites for you and randomly makes group policies. I’ve tried asking into the microphone for control over the system but it just laughs and keeps working. I even caught it looking at porn one evening.
I guess I should have gone with Linux. I heard it runs forever, never gets hacked and never has any problems with updates.