Microsoft plans to build more Unix features into future versions of its Windows Server operating system and cease work on its separate Services For Unix product. Microsoft plans to include some of those features in Windows Server 2003 R2, an update to the server OS due at the end of this year. At the same time, the company said it is not planning any further releases of the standalone Services For Unix product. DiStasio, a director in the Windows Server unit, said the plan is to build Unix tools into releases beyond R2 as well, but he did not rule out that some tools might be offered separately from the OS.
I wonder what they are really up to.
Yet another UNIX®?
“Microsoft plans to build more Unix features into future versions of its Windows Server operating system and cease work on its separate Services For Unix product.”
Does that mean SFU will not be supported on Microsoft workstation OSes such as Windows XP Professional (or Vista Professional)? Someone please clarify this. It would be a great mistake on Microsoft’s part if they made SFU Windows Server only.
Currently, SFU runs on both desktop and server versions of Windows as a separate package.
In the future? Can’t say. I hope they don’t do anything silly….
As time goes on, having more Unix-like capabilities in Windows by default is a good thing.
With this announcement, do they plan on dropping SFU for the desktop, or will it still be available? Personally, I need it on the desktop (or Cygwin) and server use is secondary.
Good point. I had the client installed at one point, but almost a gig’s worth of install just for the NFS client (I did install loads of other stuff too, like the shell and the Unix tools) was too much to ask.
A separate, standalone, and free NFS client from MS would be awesome.
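For what it’s worth, the NFS client in the current SFU already works once you get it installed; if memory serves (the exact syntax may differ between SFU versions), mapping an export to a drive letter from a command prompt looks roughly like this:

mount -o anon \\nfsserver\export Z:    # map the export to Z: (anon = anonymous access)
dir Z:\                                # use it like any other drive
umount Z:                              # unmap when done

The -o anon option is only for exports that allow anonymous access; drop it if the server maps your Windows account to a Unix UID. The gripe is the near-gig package you have to install to get those few commands.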
‘A separate, standalone, and free NFS client from MS would be awesome.’
Yeah, me three.
I doubt it’ll ever happen, because then people might start to wonder “Hey, why are we using this SMB crap?”
I’d like that too.
Are they going to license parts of or all of the Interix package (www.interix.com), or is Microsoft going to “natively” code the Unix tools? Consider that Microsoft’s own tools in the Windows NT/2000 era (the POSIX 1003.1 subsystem) were pretty lame: basically all they let you do was run emacs on your Windows machine. And for a company that is “focusing on security,” why do they insist on using NIS and telnet?
Interix (an NT-native Unix subsystem, meaning it sits atop the NT native API as its own subsystem) was purchased and integrated into SFU quite some time ago.
Yes, it is likely the Interix subsystem will be included with R2.
Strangely enough, this would make MS one of the biggest ‘nix vendors.
Microsoft might still be a large company, but their proprietary, virus- and spyware-laden, insecure, and crappy Windows OS is going downhill very quickly.
Linux is already the dominant player in the server market and now it has started to become a major player on the corporate and home desktops. By next year Microsoft will be mostly associated with their Xbox 360, if anything.
Ummm… Linux is not dominant anywhere…. Is it being considered in corporate environments? Yes. Dominant? Not by a long shot, even on servers.
Ummm… Linux is not dominant anywhere….
Are you absolutely sure about that? :}
http://www.top500.org/sublist
Dominant? Not by a long shot, even on servers.
You keep using that word…I do not think it means what you think it means.
Linux is already the dominant player in the server market and now it has started to become a major player on the corporate and home desktops.
Slow down there. We still have a long way to go before Linux is “dominant” anywhere. Linux is doing very well in the server market in general, and it has been a major player in the webserver market for years now, but the desktop is still new territory. The desktop hasn’t really been a focus of Linux development until recently, and it will take a long time to dethrone MS no matter how crappy Windows is. I use Linux on the desktop and have for years, but in terms of market share and mind share Linux still has a lot of catching up to do. The good thing is that Linux seems to have already captured the “power-user” market and is now creeping its way down to the average Joe. I’ve encountered more than a few non-technical people who are now giving Linux a try.
Linux is already the dominant player in the server market and now it has started to become a major player on the corporate and home desktops. By next year Microsoft will be mostly associated with their Xbox 360, if anything.
Interesting, considering I work for a Fortune 500 company in the top 20 that just completed a six-month Linux pilot program.
Let’s see what the results of that pilot were.
Server side: they stated that while Linux shows promise, it does not scale well versus what we use now (Solaris, various IBM mainframe systems, HP-UX, AIX).
Since it cannot replace these systems, it means added support, documentation, and standards processes that they do not feel are justified.
Desktop: it does not fit well with our currently standardized software applications. We use Exchange servers heavily, plus Office (Word, PowerPoint, Access, and of course Outlook) and a large assortment of custom applications that have been developed over time for everything from manufacturing to network operations.
To cut it all short, they found that at this time the only thing Linux will do for us is increase support and operational costs, with very little benefit over our existing infrastructure.
In the end they stated that Linux is not approved for company use beyond the 3 test servers that we use. The 3 Linux servers we do have are not production systems and do nothing more than serve a few non-critical internal websites.
Somehow I don’t see Linux taking over the world in 12 months.
“And though the basic purpose of having Unix services is the same–to help make Windows work in mixed computing worlds–some of the needs of customers are shifting. ‘Initially it was about converting (proprietary) Unix to Windows,’ he said. ‘It is transitioning a lot into coexistence with Linux.’”
That’s at the end of the article. I think MS may be getting the point. Coexist, don’t proselytize.
Certainly a good thing, and it would be great if more advocates of free operating systems focused on the same point.
OK, sounds like your company did:
a) a very poor comparison of Linux vs. established *NIX – not scalable? Linux runs on most of the top supercomputers in the world. WTF?
b) a desktop evaluation that had nothing to do with Linux. Their “study” determined that they didn’t have the resources to move off the established applications, not anything to do with the quality of the applications available on Linux.
In short, if the company paid for this study, and that was the sum of what they found, they didn’t spend the money very wisely.
a) a very poor comparison of Linux vs. established *NIX – not scalable? Linux runs on most of the top supercomputers in the world. WTF?
Yes, that is what they concluded. I am not sure of the specific applications that they felt were not scaling as well on the Linux systems they tested, but we have applications distributed on the network that literally 140,000 people access each and every day.
b) a desktop evaluation that had nothing to do with Linux. Their “study” determined that they didn’t have the resources to move off the established applications, not anything to do with the quality of the applications available on Linux.
No, the problem was outlined as very specific applications that had no equivalent on Linux. If the plan includes throwing out 5 years of development on custom in-house production software, there had better be one hell of a gain in productivity, or at the least comparable software available with major benefits in other critical areas.
The big issue is “what do we gain by spending the money to move?” Apparently not enough is what they discovered.
In short, if the company paid for this study, and that was the sum of what they found, they didn’t spend the money very wisely.
No, this was done in-house by a group of people who run our current infrastructure. Who do you think originally proposed the idea of Linux? Our management is clueless about this stuff; that’s why they pay for an IT staff.
Most of our best stuff was written at the company. We have a development team. Hell, we even have our own OEM release of Windows.
Our IT guys use and love Linux. They pushed for this pilot program.
They set an entire room up with various servers and workstations. They even reached a point where they had a good set of Linux systems online and functioning with our current network, as it would be deployed if the company went that direction.
In the end, they were unable to prove to the company that moving to Linux would be of enough benefit to warrant the cost and man-hours. Even a few people on the pilot team were surprised once they got past the ‘hype’ and actually started doing the job of making it all work.
It did do one thing for us, from what I’ve heard. Once some of our existing vendors got wind of what was happening, we got some sweet discounts on the latest systems we purchased and brought online. This was stuff from the likes of Sun, Dell, and MS.
So if anything, Linux is great as a bargaining chip. *shrugs*
Well, Mr. Anonymous, thank you. Your analysis was very interesting because it pointed out real facts in a real business case.
(BTW, I’m amazed by the behaviour some people have. Really, no trolling, I’m very surprised. How can you tell someone working for a Fortune 500 company that they made a wrong analysis, without even knowing what the hell they analyzed? I guess they didn’t become a Fortune 500 company by making mistakes in their analyses… really…)
As I said elsewhere, Linux COULD be the right choice depending on circumstances. In your case, it wasn’t, and people should understand that companies don’t make their moves (and spend their money) based on how much they LOVE a platform but on whether that platform is actually worthwhile in their case.
Linux people should understand that and provide ways to mitigate such problems instead of dismissing them (and bitching 😉)
OK, sounds like your company did:
a) a very poor comparison of Linux vs. established *NIX – not scalable? Linux runs on most of the top supercomputers in the world. WTF?
b) a desktop evaluation that had nothing to do with Linux. Their “study” determined that they didn’t have the resources to move off the established applications, not anything to do with the quality of the applications available on Linux.
In short, if the company paid for this study, and that was the sum of what they found, they didn’t spend the money very wisely.
Can’t you people just admit Linux isn’t the best thing since sliced bread? I use it at home exclusively, and would surely use it as a server OS if/when I start my own company. But most companies have investments in other solutions, and aren’t going to ditch them and spend millions on porting/moving to Linux just so their tech guys can run around saying:
OMFG!!! We run Linux, we are so L33T. lolololol
Yes, I’m really bad at this geek speak stuff, so that was a poor example, but you get my point.
If you honestly think Linux is the best in all situations and the cheapest, you really need to actually get a tech job and get out a little more.
I find it funny people think this, yet Linux is only sitting at around 25% market share on servers. That is an incredibly high number (probably higher than AIX or Solaris) but it obviously shows other solutions are better in some situations.
Also, if you have a home-grown application tied to a platform such as Windows, the cost of porting to Linux may nullify the money you could save in the future using Linux.
In other words, open up your mind a bit to other solutions.
You know nothing about TCO.
What your study found is that it is expensive taking your existing applications and moving to new ones. And that, having locked themselves into a proprietary world of hurt, they can’t get themselves out of their predicament without re-inventing some of the middleware. SURPRISE! But that has absolutely nothing to do with TCO or the topic at hand.
As for scalability, it sounds like the testers were incompetent, too. Or they were trying to fit a square peg into a round hole. I have a mail server here that is flat-out unbelievable in terms of performance. 11,000 mail users, most of them using IMAP4, with antivirus and spam scanning, plus serving up SQL and RADIUS logins, along with about 1,000 RRD data queries every five minutes. All that with only four 15K RPM drives on the mail spool and 3.5G of memory – with dual Athlon XP 2000+ processors. And the load is very low at 0.4-0.8 most of the time. Most of those users are long-standing and there is plenty of mail on the server that is over five years old, not to mention that there is a domain-name abstraction layer for virtual domain name support. Scalability? Well, I would say, since it sounds like I could take a few of the minor tasks off this same old machine and run Internet e-mail for your 140,000 users with a few more disks and a gigabit ethernet card.
You know nothing about TCO.
I was not aware we were talking about TCO. I thought you were simply spouting off that MS would be known only for the Xbox 360 twelve months from now and that corporate America was moving away from Windows.
What your study found is that it is expensive taking your existing applications and moving to new ones.
To a degree, yes, and I’m sure some people expected as much. I think for others the whole idea of moving over was more of a romanticized notion than the reality.
And that, having locked themselves into a proprietary world of hurt, they can’t get themselves out of their predicament without re-inventing some of the middleware. SURPRISE!
I don’t think it’s considered a ‘world of hurt,’ as the pilot program was not done to ‘get out of’ anything. It was done to see how Linux-based systems might fit with our business processes.
But that has absolutely nothing to do with TCO or the topic at hand.
Again, I had no idea we were talking TCO here. The topic is actually Services for Unix, but the post I replied to simply said that MS did not matter anymore. I was relating the most current information on Linux that I have received at my job regarding this.
As for scalability, it sounds like the testers were incompetent, too.
They only build and manage Solaris, HP-UX, AIX, IBM OS/390, z/OS, Windows 2000/2003 and XP client machines on a network that spans multiple continents.
I’m sure they are morons of the highest degree.
Or they were trying to fit a square peg into a round hole. I have a mail server here that is flat-out unbelievable in terms of performance.
I’m not sure of the server side application specifics. I’ll try to get some of that information if you are really curious.
11,000 mail users etc.
Sounds impressive.
Scalability? Well, I would say, since it sounds like I could take a few of the minor tasks off this same old machine and run Internet e-mail for your 140,000 users with a few more disks and a gigabit ethernet card.
I think it would be difficult to do on one machine, as our organization is divided up between multiple business units, and some groups which work on government contracts are in blacked-out areas where their computing environment is handled in a completely different manner than for the rest of the users.
Not that I doubt you could pull it off if all of our users had the exact same basic email requirements.
Funny. Athlon XP doesn’t do dual.
Linux scaling issue:
It would be nice to study SGI’s Altix experience in this regard, especially how they fixed the Linux memory subsystem and made it scale well beyond 16 CPUs (512 CPUs in a single node). So saying that Linux does not scale as well as Sun’s Solaris is inaccurate, to say the least. Solaris might win on some mission-critical features, but Linux is catching up fast.
SGI’s Altix supercomputer: http://www.sgi.com/products/servers/altix/
>They only build and manage Solaris, HP-UX, AIX, IBM OS/390, z/OS, Windows 2000/2003 and XP client machines on a network that spans multiple continents.
I don’t see Linux in that list. But more importantly, I doubt that the people who build and manage those systems were the leaders in this test you claim to have been part of. They have things to do – like run that multi-continent network you’re so proud of. Not that people don’t do that with Linksys routers these days…
Regarding the mail server, it was a bit of a test for you… it’s doubtful that old machine would do what I’m describing. While it’s probable that it would run out of processor and I/O, that’s not the big problem – having as many as 30,000 connections open at once during the day (bank on about 20% of the users checking their mail at once, during peak) would cause problems for any single system with a standard kernel (of any flavour), not just disk I/O and processor utilization issues. But that’s easily fixed with a 64 bit box (PowerPC/Opteron?) and a few IMAP proxies as front-ends for around $10K installed (you might even get away with CDs on the proxies). Or just load-balance a couple machines like it.
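To put a number on the “standard kernel” part: with tens of thousands of simultaneous IMAP connections, the first walls you usually hit are plain old file-descriptor limits. A rough sketch of what I mean (the values here are illustrative, not tuned recommendations):

ulimit -n                             # per-process descriptor limit, often 1024 out of the box
cat /proc/sys/fs/file-max             # system-wide ceiling on open files
ulimit -n 32768                       # raise the per-process limit before starting the IMAP daemon
echo 131072 > /proc/sys/fs/file-max   # raise the kernel-wide ceiling (as root)

And that is before you get to memory and I/O per connection, which is why IMAP proxies or load balancing in front make sense.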
Anyway, the problem that I see here – and I doubt very strongly that I’m wrong – is that the company in question did not think outside their own rigid boxes of experience and software, nor did they consider things like standardized protocols, ownership, and cross-platform compatibility to be important. I do. Most people who SHOULD be making such decisions do, but my impression of most Fortune 500 back-room workers is different from that of those posting above… most of them are looking for job security, and the hellish nightmare that is Windows provides them with it – because nobody competent would have let Windows into a server room in the first place except to dish up files or interfaces for relatively dumb client interface machines. Even then, I’d have found another way, personally.
Fortune 500 companies are Fortune 500 companies despite their compute networks, not because of them… I have friends working at Alcan (Novelis), DuPont and others who tell me stories of things they consider day-to-day business while I cringe in horror at the nightmare that is the status quo from the late 90s to today. NetBIOS over WANs, anyone? “Wheee!!! Throw more money and people at it, we’re in the Fortune 50, we can afford it!”
I not only have a tech job, I have many technicians underneath me. And Linux is often the best solution – and it’s almost always the cheapest. Sounds like you’re the one who needs to get out a little more.
Regarding market share for Linux: what a load of crap! How would *anyone* know what the Linux market share is? I have almost 25 Linux systems on the Internet (0 break-ins in 13 years, FYI) under my control and only one of those has a purchased copy of Linux on it, because we got it cheap (and needed RedHat for an application). I’m willing to bet a lot (a LOT) of money that there are a lot of shops running Linux servers that are in the same boat. What odds will you give me?
>In other words, open up your mind a bit to other solutions.
Good advice. How about opening your mind up to the concept of open source, where those home-grown applications would not be tied to one operating system or product from a single vendor? Tying your data and middleware to a single platform is STUPID. Trust me, the surest way to find yourself cleaning printers and installing OS patches in my IT shop is to recommend a solution to customers that ties them to the vendor but not to us.
“I have almost 25 Linux systems on the Internet (0 break-ins in 13 years, FYI) under my control and only one of those has a purchased copy of Linux on it, because we got it cheap (and needed RedHat for an application).”
Based on this statement, I can safely say you do not run any machines that are mission critical. I wonder if you are even out of college. This is the major problem with OSNews, Slashdot, and the like. You Linux fanboys think that Linux would solve all of the world’s problems if only everyone would quit being stupid and start using it. Unfortunately, some of us have businesses or jobs that strictly require we keep costs justified and servers running. The reason that people buy packaged versions of Linux is for support. In my business it is nice to know I have someone with more experience and resources to help me solve a problem if I need it (don’t tell me you know everything and never need help?). Do I use Linux? Yes, I have 2 Linux machines at home and I run 7 Linux servers at work, along with 6 Windows 2003 servers and 11 NetWare 6.5 servers. What’s the solution? Just shut up and use what works. Quit trying to convince me that Jesus christened Linux to save the world. Keep in mind this is only my opinion; YMMV.
My sentiments exactly! I bet the developers at RedHat, Novell, and Mandrake love guys like this “and only one of those has a purchased copy of Linux on it, because we got it cheap”. And while this guy blasts people who don’t use OSS for everything, is he supporting OSS or just freeloading?
I’m not sure you quite got what he meant.
>> The reason that people buy packaged versions of Linux is for support.
When you have a product that is not supported and you can’t figure out the problem, you have no one to turn to; the problem continues, and they replace you with someone with better decision-making skills.
It’s not like in IT we end up with a ratio even close to one person per application/OS/database to support, so having support you can rely on is essential.
Having said that, the gist of your reply is still valid: few companies have any issue with paying for support for anything that their business applications rely on. This does fund Linux-based companies.
As far as I’m concerned, it’s a good thing if future versions might actually include basic tools like grep, cat, wget, and zip. I’m not even that much of a *nix geek, but I still find I have to grab UnxUtils.zip in order to do much in the way of useful work via the CMD.exe shell or batch files.
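A trivial example of the gap: counting error lines across a directory of logs. With the Unix tools on the PATH (UnxUtils or SFU, say) it’s a one-liner; the closest stock-CMD.exe equivalent I know of is the clumsier findstr/find combination:

grep -i "error" *.log | wc -l
findstr /i "error" *.log | find /c /v ""

Both count matching lines; the second just takes longer to remember, and anything fancier (say, fetching a file first with wget) has no obvious stock equivalent at all.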
>Based on this statement, I can safely say you do not run any machines that are mission critical.
And I can safely say that you are talking out of your ass. I do not need to depend on any third party for my administration, which is why I don’t use any packages or software that would require me to. I’m 15 years out of college and make my living with the aforementioned servers.
You’re the kind of person I was talking about – you need hand-holding and a safety net. I don’t. Perhaps I’m just more comfortable with my competency.
> Funny. Athlon XP doesn’t do dual.
zeus# cat /proc/cpuinfo
processor : 0
vendor_id : AuthenticAMD
cpu family : 6
model : 6
model name : AMD Athlon(TM) MP 2000+
stepping : 2
cpu MHz : 1667.062
cache size : 256 KB
fdiv_bug : no
hlt_bug : no
f00f_bug : no
coma_bug : no
fpu : yes
fpu_exception : yes
cpuid level : 1
wp : yes
flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 mmx fxsr sse syscall mp mmxext 3dnowext 3dnow
bogomips : 3276.80
processor : 1
vendor_id : AuthenticAMD
cpu family : 6
model : 6
model name : AMD Athlon(TM) MP 2000+
stepping : 2
cpu MHz : 1667.062
cache size : 256 KB
fdiv_bug : no
hlt_bug : no
f00f_bug : no
coma_bug : no
fpu : yes
fpu_exception : yes
cpuid level : 1
wp : yes
flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 mmx fxsr sse syscall mp mmxext 3dnowext 3dnow
bogomips : 3325.95
First, Linux (both 2.4 and 2.6) scales as well as or better than Solaris 10 in concurrent connections and on MP architectures. Solaris has some theoretical advantages in I/O, for instance the cyclic page cache, but the real-world impact of these features hasn’t been well tested.
It’s universally true that proprietary code sucks. That doesn’t mean it isn’t faster, more functional, or more feature-rich than open source alternatives in many cases. What it means is that the code sucks. You look at the code and you laugh. This is a problem for portability, especially for in-house software solutions, which tend to suck even more than commercial proprietary software.
In a free market model, you assume that any barrier to switching suppliers is negligible. In this situation, market share is indicative of the relative value of the products of competing suppliers. When the barrier to switching suppliers, let’s say, from IBM to Red Hat, is large because your payroll system only compiles on XLC, then the relative value of Red Hat’s products must be vastly higher than IBM’s.
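To put that in crude arithmetic (numbers invented purely for illustration): a shop switches only when the ongoing value gain outweighs the one-time migration barrier over its planning horizon, i.e.

(V_new - V_old) x T_years > C_migration

If porting that XLC-only payroll system costs $2M and the new platform saves $250K a year, you need an eight-year horizon just to break even. That is what a large barrier does to the required relative value.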
In the corporate server space, any movement in market share is indicative of a strong value proposition, and the value gradient increases as the migration barriers are increased. For instance, it would require more benefits to migrate from Windows Server to Linux than from Solaris to Linux. Luckily for the FOSS community this is pretty much the case for many workloads.
The barriers are even higher on the corporate client, and higher yet on the home client. These markets will move glacially slowly, if at all. For how long have we been at roughly 95/3/2% Windows/Mac/Linux on the PC client? When will Mac or Linux reach 5%, or Windows fall to 90%? Not for a few years at least, even if Vista is a monumental flop (as in, mandatory SP0 to fix huge problems).
Would you mind actually backing up your assertions with some data? So what you are saying is that all proprietary operating systems suck (Solaris, AIX, HP-UX, Unicos, Windows, etc.) and only OSS operating systems (Linux) are better? As I have asked you before, do you have any experience with any operating system other than Linux? Because if you had, I think you would see real fast how “trollish” your comments look to someone who does have the experience with “other” operating systems. At a minimum your post looks and smells like another pro-Linux, anti-Sun FUD fest. Keep in mind that some of the proprietary vendors (IBM, HP, SGI) are dedicating talent and code to “your favorite OS”. So are you saying that all of that code sucks too?
Quite frankly, I think you’re full of shit!
This announcement means: “We at Microsoft think that it’s important to be able to run UNIX/Linux applications on Microsoft servers, but we don’t see a market for Windows clients interoperating with UNIX/Linux servers.”
If SFU were included in Microsoft’s client versions of Windows, then sysadmins would realize that they can now deploy applications (possibly even FOSS applications) running on a Linux server to a large number of Windows clients, which could in turn mount a remote NFS volume as their data storage. All this can be accomplished with full support for ACLs and other corporate necessities.
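The server side of that picture is a couple of lines of configuration on any Linux distribution; a minimal sketch (the path and subnet are made up for the example):

# /etc/exports on the Linux file server
/srv/data    10.0.0.0/16(rw,sync,root_squash)
# then republish the exports:
exportfs -ra

Every Windows client with the SFU NFS client installed could then map that export to a drive letter with the mount command shown earlier in the thread.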
But then there would be no need for Microsoft servers, and pretty soon the workforce would be able to smoothly migrate to Linux clients as well. So this move is a strategic initiative to disguise a plan to control interoperability as a press release touting a commitment to interoperability. This is a staple of the MS playbook, like running a draw play on 3rd and 22.
What is striking is that while other players in the server market are spouting high-level things like “virtualize everything” and “open standards,” MS is still talking about security and reliability. They’re a whole generation behind!! No failover, no hypervisor, no crash dumping. They offer none of the features that IT customers demand today. They have a server platform that is a poor excuse for a web server, a poor excuse for a mail server, a poor excuse for a database server, and pretty much nothing else.