There are enormous amounts of information now available about evaluating, and examining Linux for the desktop. Almost every vendor/distribution is making pitches for the desktop. The quality of the software has improved, and continues to improve. In my personal tests, there are still some missing elements that I thought I would convey to you the reader. Some of my points may have answers, and solutions available. I may not be aware of them
however, so be aware of this and I look forward to your responses in the comments area below the article.
Most of what I have seen in terms of Linux on the desktop, is the rough
guide to removing an OS, with a view to replacing with Linux/Other OS or
duel booting a box, and running the system. There is really no coverage in
the areas I am about to mention, and perhaps it is overlooked by many. But
it is very key to how successful Linux will actually be in the longer term.
In a moment I will provide a kind of rough overview about how companies
handle Windows, and installation and rollout of Windows, both to user and
customer. This is a vital element which I think has been overlooked and from
a business point of view, I think its one of the reasons people set aside
the plus points of Linux and stay with Windows.
Lets start with the suppliers. Lets see how the industry works currently.
You have the dominant player in the market.
You have the people who provide the systems.
Dell, HP, OEM, Other.
Microsoft works with a carrot and a stick method of getting the synergy so
it works in their favour. They offer the system builder the chance to use a
product they want to sell. But in addition, they offer extensive, and huge
assistance with handling of such products. They offer the tools, and
software that allows system builders to work with the OS. In terms of
special builds, drivers, specialist support and other areas such as joint
marketing, and product development. In return you will see aggressive
tactics such as not allowing the OEM to ship boxes without an OS. One of the
key aspects is imaging of the software, and the replication of an OS. The OS
is built for the consumer. (This applies to desktop Linux distributions).
Its easy for Dell/HP/OEM companies, and others to create a range of
computers. These computers are built and tested, and then the OS is built
and tested. Its a framework that works easily for those who then produce
thousands of systems, and simply make one or more software images of the
system that fits the need. This might include various OS options, and may
include office productivity tools, and other items.
I suspect that these companies use Norton Ghost/Other tool, for imaging, and
the excellent SYSPREP utility amongst others that Microsoft provide. SYSPREP
lets you build or install the OS and tools you desire. It then lets you
reset the machine to a factory state, you can on reboot select a plug and
play check for new hardware and regenerate the system SID, if you wish. In
additions, Microsoft also provide a bootable OS called WinPE. This is
basically the equivalent to a 32bit Windows bootable system with a shedload
of network and other drivers. Along side things like PXE enabled network
cards, BOOTP, and other standards, this gives very easy to use
packages/solutions, that allow the two parties who need these tools the
most, access to the tools that both system builders, and businesses need.
So, the OEM/Dell/HP/Other companies can build their OS, and replicate the OS
in a simple – easy to manage way. This also allows them to build the rescue
CD’s, and updates to the very images themselves in a simple, easily
As for business, many windows specialists simply work on the same basis. If
I have 100 users on site. And I wish to make a change to their systems, lets
say, I want to replace 3 year old systems. I talk to a supplier, Dell, HP,
whoever that may be. I talk to their business support teams. Within a few
minutes, I have a spec of machine I want, with the OS I want, with a
specific build I want. Lets say with a spec of 1 Ghz/40Gig/256mb
ram/gfx/sound/Lan + Windows XP, and Office XP. I talk to Microsoft and
within a few days I get a site license, which includes the system images of
Windows XP and Office XP that have no need to be activated. I supply the
information to my supplier and they build the boxes. Now whether I decide to
let Dell/HP/Other do the imaging, or I make a new image for my company which
we apply to our systems does not actually matter. What does matter, is that
in business I will be licensed. I have a manageable solution. With the tools
I have available, I can build a system image or update an existing system
image, and have it ready in minutes. In addition, if I do the extra legwork
myself, I can have the image so it logs on to the domain, and has all its
programs, all the domains printers setup ready from the get go. Further, I
can carry through a standard registry. If I want to lock down security,
desktop settings, internet settings, software settings, domain settings all
in a central build. I can kill of MSN messenger. I can lock the user down to
the corporate level of agreed services and systems. I can go as far as
making each users ‘My Documents’ folder reside on a server, or group of
servers, instead of the local machine.
Lets take this further, I can have the ‘My Documents folder’, and other
folders stored off the local machine. I can run an LDAP mail or MS Exchange
system which again resides off the local machine. I and the user therefore
benefit IF that local PC ever fails. By reloading an image to a new PC, or
the same PC with replacement parts I can get the user back to working
condition in very short time. You can go further. You can have roaming
profiles. The list goes on and on.
So, to look after my hundred users+ is straightforward. Yes, Windows does have some issues as all OS’s do. But that is a side issue in the real world. For the average user working in an office, they know Office, and they know
Windows. As an IT person supporting those people, it is my duty, my professional reputation that hangs in the balance. Not providing a good solution is simply not an option. But add to this, the various tools and weapons that Microsoft provide to me, my business, and my suppliers, and it is an excellent package. Its also the very same reason why I am sitting here and saying ‘Sorry, I can’t do it.. at least not yet’. Could it also be a
reason why most OEM/suppliers simply don’t offer Linux?
Maybe I am wrong. Maybe Linux has all the Ghosting, Imaging, Business and software tools to do the same as windows offers. But its not visible. It looks like its missing in action. It looks to me an outsider like it doesn’t exist. And while it doesn’t exist, at least in my sphere, its hard for me to understand just how Linux will make it fully onto the desktop, beyond the area of the enthusiast, or hobbyist. Beyond the odd server, or workstation. So you have my comment. Now I hand over to you for a moment and ask some questions:
How would a company like Dell/HP/OEM/Other support a Linux build, its updates and support at least as well as those it gets when it works with Microsoft?
How would companies who then buy those systems from the supplier be able to do the same kind of images and updates in a similar painless way?
Why does there seem to be no Linux RISprep/SYSPREP tools available to both builders and business/users with strong support from the Linux Distributions, to do this level of support?
In addition, Windows XP is remarkable in its level of recovery during replication. You can change the hardware to a greater degree than anything I have seen, and by and large it works, or can be corrected. For me, it would
take days to prepare a Linux system to such a level, and worse still, there would seem to be no easy way of offering the positive benefits of running Windows, a Windows domain, and the tools my users would wish to use, along with the problem /system recovery I get with my current setup. Do you know of a way I am not aware of?
My main caveat with the Linux desktop idea is that I am not dealing with one particular computer. And neither are hundreds and thousands of IT veterans and specialists who work with Windows, often not because its the best
technically (even though its very good), but because it is the most manageable. IT managers, IT Directors, Company bosses and boards have to deal with the reality. They have to comply with the law. They want simple,
straightforward solutions. They want business solutions and providers who do that. They want solutions and provisions that work. I have not seen a Linux
desktop that would be acceptable for me to even try and roll out in any area of the desktop. Its not that any of the software is not good enough. Its not
that its not capable. Its not that it can’t do specific targeted work. It can probably do all that. Nothing I have seen even indicates tools, and management, recovery, centrally held data storage, user control, management
and integration that I can get with Microsoft Windows.
Everything Microsoft have done ties together. From the desktop, through to domain and server. It ties in and it works. That is why people use Windows. That’s why Dell/HP/OEM/Other sell Windows. That is why most of the
corporations round the globe choose Windows. Until that is addressed, I do not see Linux making inroads on the desktop. If it was easy, it would be occurring by now. Its lack of uptake seems at least to me to be an indicator
as to why. Perhaps we need a new distribution. Lintigration might be a good name. And what would it be ? An integrated Linux solution, that ties in to a
server, offers bootable network installation, package and management solutions, user handling, data placement. Everything a ‘Professional ‘ would need from Server, Desktop, Printer, Network and integration perspective.
Linux replacements for areas like:
Central server or domain creation.
Then the ability to create images that can be delivered to any PC on the network. SYSPREP/Automated install/update across network.
Specialised routines for locking down systems, desktops, tools, data storage on the ‘domain’. System failure/recovery.
The tools for building and working with Windows desktops would be of great benefit. Its time Linux stopped looking at the desktop as an individual issue, and looked at a far more complete solution. If you can get that, the
desktop comes in range.
Myself and my fellow staff continue to evaluate Linux, and its various distributions. We have had to swallow Microsoft’s License changes and are very unhappy about it. I think that is repeated up and down companies around
the globe. But Linux doesn’t give me the tools that I need to handle the ‘desktop’ – tied and operational INTO our and other peoples business systems. Its falling short, and that’s why I thought I would post this article.
About the author:
Darren Stewart is the network manager at the Gray Cancer Research Trust. He has worked in IT for the past 10 years with systems ranging from AS/400, Unix, Microsoft Server and Desktop OS’s and systems, and for notable
companies as AXA Equity and LAW, Old Mutual, MCTWorld, Circle International, Finance and IT Expertise Ltd. He is married with an 18 month old daughter.”
.. though I think the author was overly blathering about sysprep he`s definitely hit the nail on the head. Just having a workable desktop that people want or can use doesn`t cut it in todays corporate desktop. The backoffice of the corporate network is where the desktop really happens. I`ve just rolled out Captive Directory to two offices. I ghost a machine, join it into the domain, add it to 1 group and all the software that user needs is installed and configured automatically. If I want to change everyones wallpaper, I`ll do it on the server and the change propagates out. If a user starts deleting .dll`s, next reboot the software is automatically repaired and noone notices a thing. I can repackage and distribute any software, changes, anything I like, without visiting 600+ desktops.
This kind of corporate control exists nowhere in the Linuxland, and until it does (and I hope it does one day) Linux will never make an entry into todays corporate user market.
1. Do kickstart-style installs, where the configs are also on one central server ( for static ips via mac or something )
2. You can export user directories via NFS .. people have done that for ages ( i think )
3. Most distros have pretty good hardware detection, you can go from 1 box to the other without too much hassle ( some XFconfig stuff maybe )
4. the list goes on and on 😉
Gernerally I think this guy just needs some justification for not being willing to learn something new.
I will take your points one at a time
1. Kickstart style. I do not know what it is, and I have been around for years. Perhaps you can supply mopre details.
2. Certainly NFS is an option. But does it offer the same benefits as the options in windows, and have you compared them ? Are the tools available and workable for hundreds of users across a domain, etc etc.
3. Most images of Linux will crash n burn if you change hardware. That was one of my points. Remember, I was making a statement about Making an image that WORKS when you change the hardware, and on the occasions it does not work, its simple ro resolve. ie – I could take an image and it would run on k2, k3, p2, p3 processors using an XP image.
Occasionally I have found the IDE hardware can cause issues but you remove that before you go to image and set a standard IDE before the reboot.
Have you ever compared how to roll out these systems, or are you merely sitting there assuming Linux cuts the mustard without inspecting. Let me know what roll outs you have carried out.
I have been there W2K network with 200+ computers to manage. The reason you don’t see the tools is because they are different from Windows. Using Redhat’s kickstart utility you can script installs and do network installs. Linux has a small PXE boot version to use to boot the client for network access.
MSI motherboards use a version of Linux for there bootable bioses. Find a Solaris Admin and ask him how they do it. *nix is different from Windows. The problem is that as a Windoze user you have to learn a lot about *NIX networking solutions before you can implement them. I have been working hard on being able to provide a total Linux solution and the more I learn the more I need to learn to do it effectively. Keep reading sites like this and do some google searches on Linux Networking and you will be amazed at what is possible. NFS and LDAP are your friends.
>>I dont get the point .. maybe its just me
Me either…you can setup thousands Linux machines using
ftp or nfs install, scripts etc.
How do you think I instaled 25 Linux desktop on our company..doing one at a time, manually?? no way!
more info here:
If you didn’t talk to Microsoft/HP/Dell, you wouldn’t have learned all that ghosting/prep stuff. If you don’t talk to redhat or ximian you wouldn’t hear of the alternatives either. Maybe you should follow some courses on the subject? After all, it’s only fair that if you learned stuff about windows, that you should also learn stuff about linux, if you want to make a comparison.
The bootmanager Grub has optional network support, or you can use bootp/tftp directly.
You can install a whole bunch of computers using kickstart (or you can use images).
You can update and install software on them using up2date or redcarpet.
Use ldap, nfs/afs for login and home directories. Or you could go for the X client-server approach, where all those 100 computers just act as a thin client for a couple of servers. (so you only need to worry about a couple of machines)
Do massconfiguration using cfengine, or ssh, or a bunch of perl scripts and cron jobs.
I have seen countless times I had to rebuild a machine (when I worked as a W2K Network Admin) and it died when I moved the install over. W2K will not move between machines when the chipset is different (Intel–>VIA or vice versa). Redhat will move anywhere you want it to go. And with Grub I can even move it from IDE to SCSI Hard drives.
I agree about LDAP, and NFS. But your comment re’the more I learn the more I need to learn to do it effectively’ speaks volumes.
I somehow don’t see Dell /HP/Other doing this for the customer.
SYSPREP is great but Perl and Shell scripts are more powerfull.
At the end of a kickstart install you can run any perl or shell scripts you come up with.
Plus you can’t install windows from FTP.
I agree about LDAP, and NFS. But your comment re’the more I learn the more I need to learn to do it effectively’ speaks volumes.
I somehow don’t see Dell /HP/Other doing this for the customer.
I am not sure how you moved your install over. But Ghosting and following certain guidelines works.
Most images of Linux will crash n burn if you change hardware.
Check out kudzu, detects and configures hardware which has been added or removed. Installed by default on redhat for example.
Btw, if you recompile your kernel with optimisations for Athlon, and leave out a whole bunch of drivers/modules you can offcourse expect trouble if you switch to a 486 with hardware you didn’t compile the modules for. You will be fine with stock kernels.
Unix/Linux is an entire different operating system with different design decissions. It’s kinda obvious you will have to learn new stuff and put some effort in it.
I freely admit a lack of knowledge in the area. The article was written due to this, BUT also due to the lack of coverage in many of the articles regarding new linux distributions, AND the fact that I want to know more.
One thing I would like to say is much of the comment here is welcome, and I am reading your comments and responses. I think many others will as well. Please keep them coming.
Unless people KNOW and have a way to roll out Linux desktops, it can’t happen. And its not discussed enough.
I have still not seen enough to persuade me that you can build a linux desktop, with the settings I would look for, and with data settings and users settings, registry and security settings, and the ability to change and update central images, and the support of vendors and suppliers to the same level as I can get from MS and suppliers.
But.. I may be wrong Lemme have it
Just plug src and destionation harddisks and do
cat /dev/hdc > /dev/hdb
if the harddisks are different, just make a script that partitions the destionation hard disk, copies the files over, and sets up the bootloader, it’s not really hard.
Most Linux installations do not require the same frequency of a ‘clean re-install’ that Windows does, hence little work has been done on imaging software. Don’t layer common Windows practices onto a Linux network, and then complain that Linux does not have the tools to fix this non-existant problem. Dropping in a CD and re-installing without formatting the /home partition will repair your box without damaging user data (you do have a seperate /home partition don’t you?).
And when push comes to shove, rarely have I run into a box that I have configured that I could not just drop a pre-installed harddrive into to recover. The most I’ve had to do is reconfigure X. In my experience, Linux handles switching of hardware much better than any other OS I have used, since the pre-installed harddrive was usually not done on the same configuration. This is common practice. I’ve done many an ftp install in a box with lots of memory, and then dropped the harddrive into a box with 32Meg (FTP installs requiring more than 32), and have kudzu/harddrake detect the changes on boot.
Linux is different but not all that difficult.
Try setting up a central server and export the dir’s that are common, like the /usr, /lib, …
so you can share all the apps.
also try out knoppix, this gives you a good idea on how to make your own custom bootable debian cd.
just remember it’s different but not all that hard.
I somehow don’t see Dell /HP/Other doing this for the customer.
Different operating systems, different customs, different companies. Talk to Red Hat, Ximian, Suse. But I don’t know why you believe that HP/Dell/IBM wouldn’t do this. Didn’t IBM help out the German government to switch to Suse, including a couple of desktops?
Do you really think that, while reading all those stories about governments and companies that put linux on their servers and bunch of desktops, that they did all the installing one by one manually? Or configure them one by one? Or that they didn’t have any sort of support? Offcourse not.
The tools are there, talk to someone who knows those tools. Just like you did when you had to install 200 windows machines when you first started working.
*nix Networks are just different. They are very server-centric.
You really do not do things on the client machines you just change settings at the server. The real cool thing you can do in linux is mount system directories across the network.
As far as I know you can not mount C:Program Files across the network. You can however mount /usr/local/ across the network allowing you to centralize your applications and your updates.
Thsi requires Computing Horsepower but that has become much more affordable in recent years
Guys, too many of you are posting regarding setting up a copy of machines or disks.
Remember, I was looking for something server side managed, server based data storage, server based mail, roaming profiles, settings, security, image maintanence and updates, and machine recovery.
The first comment here mirrors my general comment exactly. Is he wrong ? Am I wrong ? If we are why is it so many companies feel as I do that its a road you just cannot go down ?
Even I am capable of copying a disk from A to B
If you roll out linux in your shop, please say how you do it, the tools you use, the basic methods, where you keep your data and why, how you recover from a failed drive or machine, AND USER training and acceptance. Remember, I would not be able to just GIVE 100 users a whole new environment without huge nightmares. Each user would have to go through retraining and would need help. Remember that if you did roll it out in many businesses you may well have to cater for user retraining, and persuading management about the pros/cons of such changes.
There may be many people reading this topic who perhaps like me are curious, yet either do not know its possible, or maybe remain skeptical about its reality in the trenches.
Let me put it another way. You have Linux distributions. Why when they offer the ability to install and create a system using a simple GUI have they not done more work in this KEY area
You really will have to cater for people like me if you want to win the desktop. You’ll have to build amd maintain, and improve tools that MSCE admins can run and work with IMHO in this area.
Problem is that linux people need to get of their high horse and make Applications easier for end users to use and install.
End User: I cannot install xyz
Linux Guru: Oh you are missing abc and need to go to http://www.abc.com
End User: Ugh ok.. Why didn’t it come in the installer
Linux Guru: You get what you pay for
End User: I cannot install xyz
Linux Guru: Oh you need to edit all of then and run make again
End User: I don’t know how to do that
Linux Guru: Or you don’t know what you are doing thats the problem.
For Linux to be on the desktop it has to be easier than Windows
Perhaps I just don’t know enough about linux but I work as a student tech in a university computer lab. We have 178 some computers to administer and the thing that makes our job so much easier is a program called deep freeze (link below). Basically it creates a static image of the drive and any/all changes made to the computer are gone after reboot. This is a *Very* nice feature.
Why? Well we can give the users administrator privleges, let them install whatever crap they want and then just reboot the computers and see all those changes go away and the computer revert to a clean slate. This allows the greatest flexibilty possible for the students and the techs. We don’t care what they install and they don’t have to worry about security programs getting in the way of installing programs (you would be amazed how many college textbooks come with cd-rom programs anymore). Also, it solves numerous headaches of students installing crap like bonzai buddy and us having to spend alot of time removing it.
I mean, ASFAIK, you cannot do something similar with linux. Sure you can lock down the computer but you can’t give them root nor can you make all changes go away on reboot and start with the same fresh image – once again, AFAIK.
Note: not that we don’t have *nix computers, we do, but they are locked down very tight and most students prefer not to use them – same with the mac’s.
I’m curious as to which distributions you have been looking at. Not to start a my distro is better than yours, but until recently, Redhat and Suse have concentrated on the server side. Mandrake on the other hand has emphasized the workstation. It comes with more gui administration tools than any of the others (of course, if you’re a die hard command line user, you can edit the conf files by hand, too). There are gui tools for user administration, group administration, network administration, printers, file sharing, internet connection and probably a dozen more.
As for graphical installs, AFAIK all of the main distributions have come with graphical installers for quite some time now. Most of them even have where you can select a “class” of installation such as workstation, server, desktop publishing, etc.
Since you’re looking specifically at a Linux desktop solution, it might be good to focus on a desktop oriented distribution.
I’m not sure about where you work, but at the company I work for, we would consider it a plus if an end-user couldn’t install package xyz. And while I admint that dependencies aren’t pleasant, they exist in both Windows and Linux. Also, Linux allows multiple versions of a library to co-exist, Windows doesn’t, or don’t you remember DLL hell (at least XP has tried to fix that).
Finally, if you work for a company that does want users to install software without the IT staff’s knowledge, must modern distributions use some type of package manager (rpm, urpmi, apt-get) that automatically installs the dependencies for you.
BUT also due to the lack of coverage in many of the articles regarding new linux distributions
I also never read anything about “SYSPREP” in any article regarding new Windows versions. Your article is the first I read about it. But I did read articles about kickstart, cfengine, up2date/redcarpet, ssh, afs,.. It’s just a matter of which sites you go to.. I doubt you will find the information you are looking for at http://www.winplanet.com 🙂
Your main problem is that you are mixing up your problem with a solution. Describe your problem without any references to microsoft tools, or the things you used to do.
While you describe your problem, you talk about sysrep, images, profiles, and whatnot. Don’t do that! 🙂
You have a hundred computers. You want easily install an operating system on it, and automatically do custom configuration on that. No problem, you can use kickstart. You want to update and install the software on those computers, no problem, up2date can do that. You want to be able to dictate what users can change about their desktop, and what is mandatory. No problem, gnome2 and gconf can do that. You want to be able to dictate what programs they run, no problem, use basic permissions or acls. Use pam to impose extra restrictions such as preventing them to login before 8am or using to much resources. You want the user data and login information to be centralized? No problem with LDAP and nfs or afs. What else?
I just couldn’t agree more with the author’s opinion. of course for some issues there is an answer but
> … also due to the lack of coverage in many of the articles regarding new linux distributions
This is one of the main issues. A lot of solutions presented in the comments are barely documented. As a “Windows Engineer” (:$) I feel like I have to re-invent the wheel for each obvious thing that Windows can by default.
> Keep reading sites like this and do some google searches on Linux Networking and you will be amazed at what is possible. NFS and LDAP are your friends.
Sorry, but I need to find portal sites that discuss such technical issues in detail. And I said portals where people share their knowledge, not general HOWTO pages. (Altough I have to admit I don’t always find that stuff for Windows either). NFS and LDAP? Add Samba and build your own W2K domain controller? OK. Show me a page who describes which distro to use and without programming all that stuff myself. Where I just have to provide a Kerberos Realm name, then add some users and add clients to that “domain”.
I would believe you CAN do all that stuff with Linux, but there isn’t a distribution who provides this as one integrated package.
My customers are often smal companies with typically 30 users. When I setup a new Small Business Server, I can’t afford to spend more than 1 to 2 days on that. So I need a solution that implements fast.
Oh, also: such a customer doesn’t really care about a license cost of one server they buy every 3-4 years. What is say 1000$/€ every 3 years compared to the cost of hiring an Admin (for support) on average 50 to 100 days a year? Think total IT budget.) So even then, Linux should make you deploy faster to really make a difference! If I have to spend 1 day extra to implement it, I loose the bargain of Linux license cost. (Please don’t tell me that Windows needs more support. It does on a certain degree, but those NT servers really don’t crash that often.)
This is the approach that Sun is taking with their own Linux distribution and workstation product(s) coming in 2003. There will be a backend server which maintains config info, profiles, data stores, etc. The workstations will be “fat” clients (local apps if needed, some customizability, etc), but can be reimaged or net-booted (kickstart) as needed to reset to a default config. You’ll also be able to put “thin” clients (SunRays) on the network for people who truly don’t need “their own” copy of apps, etc.
The concept is aimed at large deployments of workstations (call centers, large corporations, etc) where the admins need to leverage themselves across many hundreds of systems.
A version where your existing hardware is used is being investigated, but like the author mentions, having many different makes/models/etc makes automating/standardizing more difficult.
When you first mentioned “imaging” in your article I thought that you were going to talk about the marketing concept of product imaging. In my opinion, product imaging has as much to do with what you are taling about. I will try to explain.
Ask your self two questions: 1) What do I think when I hear “Microsoft Windows”? 2) What do I think when I hear “Linux”? I almost guarantee that what you think of is mostly what has been imprinted in you mind by what you have heard (marketing, ads, etc.) about the product. In your case (the author) it sounds like you have a great deal of experience in Windows administration, so this will help in keeping you down to earth.
It turns our Linux does have imaging capabilities, which are quite easy from my perspective. I commonly tar up a Linux installation and uncompress it to a new partition and it works just fine. Though there may be issues with this; tools such as kudzu (mentioned above) help in detetecting hardware, and I know from experience that there are issues like this in Windows too. Another tool that I found great that actually detects hardware is the program used in Knoppix (also mentioned above).
Back to product imaging, perhaps it is also the responsibility of Linux companies to reflect a good product image of what they are trying to sell. Linux is know to be hard to install… Windows is know to just work. I don’t see it as your fault (the author) that you haven’t heard of the tools to do your job under Linux, to be surprised I’m not surprised. It took me years before I even heard of Linux. Then again, how do you expect any company to rival a marketing giant like Microsoft.
Indicate that in some areas, its not that the tools exist, its that we don’t know about them.
Could it be that we need to get some Windows engineers, and some Linux engineers on a duel project where people can compare the tools, methods, successes and failures
A test implementation if you like
I think its very fair to say that Windows admins may struggle to roll Linux, and Linux admins may struggle to roll Windows, and that seems to show here with neither side really being able to understand where the other is coming from, at least so far.
Is there anyone out there who works on both sides and can assist with our current thoughts ?
I have not seen a Linux desktop that would be acceptable for me to even try and roll out in any area of the desktop
But DeadRat v8 is the most desktop friendly os yet! It has clear fonts, gnome, kdzu-ntite …. what did you say? They are basically following everyone else and not moving forward in hard areas, such as what was said above? Geez, i wonder who is?
Mandrake… no. Lindows … no. Slackware … different market. Debian … different market. Lycoris… maybe. Xandros … i wish i had 30mil to make a couple beach houses and a clone of Corel Linux. Suse… yes.
When will the leeches of the linux community (lindows, xandros, DeadRat ™ ) abandon dead or un-needed standards (ie gnome and other projects made as protest to other applications that had once had a non-free licence) and start moving towards proper competition with each other and with windos?
I dont care if this release of DeadRat(tm) includes clear fonts and X v4.1, i already have those ELSEWHERE, and 3+ MONTHS AGO!!
If you like Debian, FAI (Fully Automatic Installation) seems to be a nice tool for doing mass installations.
They’ve ghosted linux on thousands of servers.
Kickstart is to Redhat as Jumpstart is to SUN.
It gives you the ability to install the os on a system with predefined answeres to the install script.
I use Jumpstart at work for Sun Servers and workstations and it saves A LOT of time.
Kickstart could do the samething in the Corprate desktop world. RedHat put’s out Kickstart, I don’t know if you can use it with other distro’s or if they have there own.
RedHat Kickstart Manual:
While this is OSnews, most of the talk here seems to be about USING the desktop, what you want to learn about is administering the desktop. This will require lots of learning, and that’s why UNIX admins are usually paid well.
You seem to be a very knowledgeable windows admin, which is good. But I’ve(as a CS student) heard of many of the aforementioned tools for Unix, but I’ve never heard of SYSPREP. My point is that just as I am not a Windows admin, you aren’t a unix/linux admin, and you don’t seem to have experienced the culture/environment.
I would recommend you try it on a spare desktop, and then either read newsgroups/web articles, or go talk to Redhat or some other vendor like others suggested.
I think the biggest reason you don’t hear of these tools for Unix/Linux is because most have been around for long enough that they are just expected, by default.
If you decide to try linux seriously, even just for yourself you may be surprised just how much you learn about windows in the process. (Similar to how you can learn about your own culture by going elsewhere for a while. Hence the popularity of going abroad during college.)
I work with hpc clusters more than the desktop, and I know of some tools that might help you.
Imagine/Ghosting Try the System Installation Suite (SIS) at http://www.sisuite.org This tool combines the functionality of VA Linux’s SystemImager and IBM’s LUI tools. I have only used the imaging part, and then only as part of the OSCAR http://oscar.sf.net cluster distribution, but I know it works with at least RedHat and Mandrake.
Terminal ServerSpeaking of Mandrake, I know they now include a XWindow terminal server configuration utility. I haven’t used it yet, but check out http://www.mandrakeforum.com/article.php?sid=2279
Security I have used the Bastille http://www.bastille-linux.org tool to tighten security. I like this tool because it teaches you about security as it is doing its job. (Mandrake sponsors this project, but it is not limited to one distribution.)
One thing to note about allowing people to install programs…it often doesn’t require root. A decent program will allow itself to be installed just about anywhere, including the user’s home directory.
Forums almost every piece of software you find on a Linux box has mailing lists for developers and users. Look at Freshmeat http://freshmeat.net and Sourceforge http://sf.net (and Google of course) if you have trouble finding the software’s home. Mandrake has http://www.mandrakeforum.org and http://www.mandrakeuser.org where they provide both free (I have spent some free time answering questions, as do other users) and paid support.
Even though I keep mentioning Mandrake, I am not stuck on them. I am simply most familiar with that distribution at this time.
I hope this helps.
Well. I’m by no means a sysadmin. However windows sysadmins with attitudes like this piss me off. You had a buttload of ms tools that you got to learn when taking your MSCE and some other ms courses. About all the registry tweaks, bootflags..config settings elsewhere. After a few years you get so used to your tunnel vision world, that you forget about even thinking about alterantives. Well get this: Linux is a different OS. It has different tools. It has different needs.
For example it has far more flexible network installation..Can windows do http/ftp/nfs/hd/cdrom/iso-on-hd..etc install? Can windows do diskless thin-client? Oh guess what that roaming stuff..works just dandy with nfsmounted home. Pair that up with something like LDAP for authentication and you get everything that you need. Oh whats this? need to install software remotely? well there is about half a dozen tools to do that…Perhaps the dumbest would be to run a script to automagicly ssh into box..rpm -Uvh needed stuff and logout.
Also, regarding setup time. Just in the time that it takes to install windows/Backoffice server/whatever, you could have a linux system up and ready(assuming that you have some experience in the field and setup similar servers before).
And lets not forget the wonderful “call microsoft to disable activation part”, you don’t worry about that in linux, its non-existant. Stuff like disabling MSN messenger equivalents is also dirt easy in linux. Oh what about having a nice tight firewall script(on clientside, so the server doesnt even have to worry about it) on the client machines to restrict the sort of network services they can use? Can you do that in windows?
Also, you can even setup a samba server and have the xp machines roam using it
Anyway, sorry for the big rant. I guess the point I’m trying to make is that you are very biased about ms administration and refuse to open your eyes to alternatives. linux is capable of at least matching your setup feature-by-feature. Now regarding straightforwardness of windows servers: I can’t figure them out. I can setup IIS, etc. But for example trying to get ip aliasing to work..where you have 2 ip addresses on a nic and one of them is a dhcp one..can’t figure that out for the life of me on windows. So now I have to 2 static ips on 1 nic..192.168.x.x and 10.0.x.x networks..For some reason filesharing only works over 192.168.x.x even tho I’d love to make it work only for 10.0.x.x but I have no clue where I should even begin looking in windows. I googled and googled and googled and came up with nothing.
oh right..imaging software->linux has lots of that. its called cat & dd & friends.
If you need support for your Linux try mailing lists
Oh what about having a nice tight firewall script(on clientside, so the server doesnt even have to worry about it) on the client machines to restrict the sort of network services they can use? Can you do that in windows?
Yes, you can, starting with W2K: Ipsec.exe. Not technically comparable to ipchains/iptables, but there is are similarities.
I’m a TechAsst-Admin for a hosting company. We use linux boxes exclusively for our hosting solutions (custom tweaked slackware). How do we do production machine “image” installs? We have our generic “image” tar’d up on a server. The generic kernel has our most common chipsets, cpu, etc. We update the image every few months. We do a network mount & run our own “install” script. Done. It even partitionsformats the drives for us All in about 10 minutes. We’ve done this in various co-locations to the total amazement of the win2k guy next to us doing their 80 update reboots. After it’s up-n-running, we’ll do a kernel recompile if needed for special hardware…
Hello, in my old university (upc, barcelona, spain). We used to have that same system, but we used Linux.
The computers booted with REMBO:
And restored a linux or Windows98 image everytime. Then when it booted you just had to login, and the servers automatically restored all the user preferences (mail, browser, adress book, personal folders). It didnt matter if you booted linux or windows, you always had your things wherever you were.
We also had Sun Ray’s thinclients all over the campus. And of course, you could also store anything (mail, adress book, programs, projects), and restor them in any thin client.
You could also froze your sessión using an smartcard, and restore it in another thin client.
At least it can be done in a couple of ways.
Install Windows .NET Enterprise server RC1 that looks like XP and it eat about 84 Mb of RAM
Install RH 8 with Gnome 2 and it goes for 150 Mb of RAM.
When linux fix its XFree problem will be ready for desktop.
1. “Image rollout:” Install Mandrake or RedHat. Make an autoboot disk and perform automated installs on multiple identical machines. Or, do an install, and then use Partimage or beloved Ghost to copy that partition(s).
2. Protection of system files: Mr. or Mrs. Desktop Luser has an account and password to log into the machine. But not root access… so they cannot mess up their system. And if they do, just restore the backup of their home dir that you did last night during the sceduled cron job, possibly even residing on a CD-RW right in their own desktop machine. You did backup, didn’t you?
3. Remote administration: Log in via VNC, X, or (shock, gasp, CLI panic) SSH.
4. Configuration changes: Linux uses text files to store cofig info. I know that the format of these files sometimes changes from program to program and between distribtions, but if the super ballsy sysop/bofh got down and learned the formats(or tweaked the source code a little), changed could be made via the above methods, or via a shell script and scripting language, although I know nothing of those two things.
5. There is no operating system that I have ever used that is easier to use as far as device driver support. If the kernel comes with what you have in the system, just type modprobe (kernel driver name). Then, if it cannot load, it just won’t load. No making winblows try to uninstall that driver for that question mark device in the device manager, or any other of that plug and pray bs. Only DOS and OS/2 are easier, and they lose out to Linux mainly because you have to reboot them, and Linux you don’t.
6. Centralized storage of documents: mount -t smbfs \profile-server\%username%documents /home/documents or something like that. Ever heard of SAMBA? Windows-style file and printer sharing?
Linux has some of these things, maybe all of them. Just take the time to learn them. I was glad when I decied to learn how to make DOS do just what I wanted, and amd glad I made the same effort to do the same with OS/2, Linux, and even windows 9.x. Just invest a little more time, and get more in depth.
First, Windows .NET enterprise RC1 eating only 84MB RAM? ehem…
X is now eating 35MB on my box, and Metacity around 9MB…
Second: You don’t even need to install a graphical environment to get a linux server.
I think what you’re searching for is exactly this:
yes, .NET only eat 84 Mb in a new instalation, and also all the critical services are disable by default.
Anyway, i like linux but i hate XFree because is big, slower and very limited.
You could say me that MS products are insecure, some are unstable and so on, but one think that MS build fine is GUIs.
Linux need a more flexible engine to get the same power of MS products in the desktop.
And i agree with you, i don’t install graphical environment in linux servers, SSH is good enough.
no more posts. 🙂
In the end you won’t need to do as many reinstalls of Linux to “fix” problems. You can justify having fewer personnel on staff for support and saving money or you can keep the extra personnel on staff to research how to do all the stuff you are complaining about. Spending a little more now can save you a lot later. Just because you think the Windows install/fix will be faster now doesn’t mean it will save you time in the future. I think the old saying goes, “Haste makes waste.” Don’t waste your companies money on the new licensing scheme just because you didn’t want to spend the money on the research to find a way linux can do the same thing.
With a little tweaking you can store the installed packages and the config in a database, so every say developer can have updated “image” IF his machine would break ( which isnt really a BIG problem ) and about changing the background. You could write a script that finds the ie gnome config file and changes them. You can do everthing that windows 2000/XP does for years in Linux and more.
You just have to be willing to learn how the system works and what it can to and how to automate/script. And i admit the learning curve is still not where windows is, BUT in the end it is the better solution to be flexible and not to be locked into the “easy” MS way of doing things.
The other point is that the MS way is a way into total submissioin.
Free your mind and rest will follow 😉
Install RH 8 with Gnome 2 and it goes for 150 Mb of RAM.
And how much of it is shared memory? And how much do you know about how Linux manages memory opposed to Windows? A couple of days ago there was a link to the pretty dinosaur book (Operating System Concepts) here on osnews, maybe you should read it…
And how much do you know about Xfree86 4 and it’s extreme modularity and extensible protocol?
If you want centralized administration and ease of deployment, investigate thin clients. Specifically:
From reading your editorial i seem to get the feeling that you want it easy, without any learning curve. All things you mentioned are possible on linux/unix often better somethimes a bit harder or maybe even worse, but still everything is there.
But what isnt there are nice gui tools to administrate a Domain like situation or a nice update tool to distribute new software or update configuration files on the clients, its all possible with some nice scripting and stuff. Yes it has a learning curve but somethings get easier. For example most software thats says its windows 2000 complaint isnt like it needs Administrator rights to run, most of the time with hardwork you can get it to work with maybe just some open keys in the registry and some files in the winnt dir, but this is not good. Most linux software works good in the multiuserbased system and needs no tweaking at all just a plain install, which is nothing more then pushing the default packackes to a client and installing them, very easy to automate.
What you say is true, thats why they use windows, they are scared to learn…to try and too believe that its easy to work with the tools that are availeble on the linux platform. After you get used that creating users is done on the command line and that you cant make mistakes and just click some more buttons will fix it you tend to take more care and do it right the first time, which in the end will save you time and will give your users less problems. MS admins tend to think its all so easy but administrating a pretty big MS shop can be hell if you let your admins fool around cause it all looks so simple.
Under linux you will have too learn, cause trail and error can be vert damaging to your system, now under windows click here click there, hmmm what did it do, nothing ? did it solve it ? hmm maybe that button will fix it…hmm no…lets look in the registry in all those undocumented config stuff…hmm now thats broken too, lets revert everything and try again. While under unix nearly everything is documented so its just a thing of good reading, most of the time reading is enough to solve your problem, though reading is very hard.
Wahhaha mega ranting
I think making linux luser friendlier will kill it in the end, since when it looks hard it can only get easier in the end. And my MS windows lessons are OWH it looks so easy but in the end arrrg it so fucking hard!
Linux is weighed down by too much legacy material for it to be truly considered an excellent desktop/workstation replacement. Microsoft has been wise enough to make core changes when need be, but Linux suffers from too much tacked on without a re-engineering.
Administration of Linux is far harder than it needs to be due to the absurd Sys V runlevel garbage. Even the Windows Registry is more logical. Why there hasn’t been a switch to an easy to use BSD style is beyond my ken.
There are various ways to replicate an installation.
Mandrake for instance, will make you a disk that you can use to replicate the install of one PC, unattended on other, somewhat similar PCs. Differences are handled automatically by Kudzu.
So, then you just plonk the DVD onto the cup holder, shove the floppy in the slot and power up.
After that, for maintenance, you can ssh into any user machine anywhere in the world, without getting up from your machine.
So, I don’t see any obvious way in which it can be made any easier.
I read the article, and It did not appear to be anti-linux to me. It was a simple proposition:
This is how I do it in Windows. I think this is easy, it works well, and it’s well known to Windows admins.
Can I do this in Linux, and if so, why is the process not so well known?
Is Bill Gates paying antagonistic geeks to show attitude to this sort of request in an attempt to put people of using Linux?
Some other people above hit on most of these points that you can do this stuff with Linux. What’s more important is that it really isn’t neccesary. Issues that you have with rolling out NT based solutions just don’t exist.
For example reading your article you seem to want to be able to:
1) Roll out inexpensive non custom desktops
2) Backup the user’s information
3) Be able to remotely administer the whole thing
How about this strategy; the “desktop” is just running XServer software. Everything is running remotely on your servers which means:
a) The individual user’s home directory is being stored on a server so its very easy to back up
b) The individual user’s data files are being stored on the server so they are very easy to back up
c) The individual user’s desktop is just running a simple XServer package so it doesn’t get corrupted and if it does that is usually a result of a hardware problem. In any case you can just blow the image away in seconds; heck at $200 (not including monitor) you might just toss and replace the whole computer rather than have your guys try and fix stuff.
Remember all those apps are X apps which means they act as if they are running remotely all the time. They are also Unix apps so they are designed to support many simultanious users. There aren’t any extra issues (like you have with Windows apps) running on Citrix.
How would a company like Dell/HP/OEM/Other support a Linux build, its updates and support at least as well as those it gets when it works with Microsoft?
The exact same way they would as when working with MS. They’d have an agreement with a vendor (SuSE, Mandrake, or Red Hat, for example). The vendor would assist them in creating customized, workable systems, and provide whatever tools the OEM needs.
How would companies who then buy those systems from the supplier be able to do the same kind of images and updates in a similar painless way?
Red Hat has Kickstart. For documentation on Kickstart, visit their site. It seems very similar to Solaris’s JumpStart, which I do have a little experience in. Essentially, on bootup, the machine asks a central server for a temporary install IP address and networking info, as well as where to get updates/install. It then goes to this install server and proceeds to install/update according to the rules the sysadmin gave the server. I jumpstarted a Netra with zero prior knowledge in a couple of days. Do some training first (you did some training in the MS ways, no?) and it doesn’t even take much more than a couple of hours to write up a whole new set of rules/packages for a special situation (e.g. new machines to consider).
Why does there seem to be no Linux RISprep/SYSPREP tools available to both builders and business/users with strong support from the Linux Distributions, to do this level of support?
You don’t seem to have talked with a vendor. There are. See the above. Additionally, you can roll your own. I had a floppy and a zip disk that I used to install 70 or so workstations back in around ’98. I rolled my own then, unfortunately. It’s not terrible, but it is involved. OTOH, you can do this and, though it, have the option of having full control over your system and installs from the ground up. MS presents you with no such option.
Do you know of a way I am not aware of?
Most everything can be done in a server-centric way. Authentication can be done via PAM modules (e.g. an LDAP-based authentication or authenticating to a Novell server). Home directories and applications are not installed on each machine; they can be installed locally or mounted off of a server. Since there is no registry, each program stores its global config files in a specifc location (which is also mounted off the server), and stores its user-specific config information in the user’s home directory. Since you mount the home directories off the central server, the data follows the user around wherever the user goes. No need to worry about “roaming profiles,” it roams already!
Additionally, one can have “thin clients” which have either only a windowing system running localy, and everything else runs off of a central server transparently. My former university did this. Alternativey, you can have the entire filesystem mounted off the remote server and have nothing actually stored on the machine. Finally, you can go to a nice extreme of this, and have Sun’s newest thin clients, where a central server stores/runs their session, nothing is stored locally, and the user carries around a smart card with only enough info to authenticate themselves to the user. If they move around, they just take out their smart card, move to the new location, plug in their smart card, and their session comes back up exactly as they left it.
Many of the problems Windows solves with those tools you mentioned have been solved long ago by Unix developers, and incorporated into the system. I highly recommend that you have a long, serious chat with a RedHat, SuSE, Mandrake, and take the time to learn the system. It’s similar to Windows in some respects, and vastly different from Windows in other respects. Yes, there’ll be a learning curve, but how long did it take you to get to the level of Windows knowledge that you have? Please, take the time to really learn the system through and through. It’s not the text-based system of the ’70’s, nor is it the standalone box you seem to think. It’s a highly modular and flexible system from the ground up, and I think you’d really like it when you get to know it well.
>This is how I do it in Windows. I think this is easy, it >works well, and it’s well known to Windows admins.
Linux != Windows
Linux should never be turned into another Windows.
Just becaues that’s what you do in Windows dosn’t mean
that is how it should be done in other operating systems,
especially ones that are vastly different. It also dosn’t
mean that it’s the best way to do it either.
Like I always say…if you don’t want to LEARN a new OS then use the one you are used too and stop complaining!
The OS shouldn’t change just becuase it’s not what you are
used to and you are too lazy to learn something new or different.
Umm, the second “user” should have read “server”, in the sentence fragment, “and the user carries around a smart card with only enough info to authenticate themselves to the user.”
The resent posts have managed to attack windows, windows admin, windows tools. Even posters attacking windows tools and systems because they don’t work as *they* expect.
I went off and examined the Redhat Kickstart docs. Not bad. Not good either. I am sure the high level of configurability is useful somewhere.
Some of you have attacked me/other sys admins on a basis that we should go away and learn more. Or that we should spend some more time studying the fundamental differences in architecture, and the reasons why things are different. You are making a mistake that someone has the time to go to great depths in these areas. Sorry, real world means most departments have pressures that work against this, and thus, we are back at simple, easy to administer system and integration.
Let me remind you, I wrote my comment based on the idea that LINUX is now pitching for this AREA. I have no problem with that. I like Linux. But what seems to be missing here is the basic understanding that whatever you may think, companies are not likely to go and retrain their entire technical teams, and all their users based on your comments.
If Linux was pitching for server space and its normal areas I would have no problem. But its recent foray into desktop territory is worthy of discussion.
What is also clear, and I mean no disrespect to you guys who favour Linux, but Windows is compared to your suggestions even better than when I first made my comment. Looking at suggested pages, tools and options, its a disappointing mish-mash of variable tools, all created without a vision or any unification in mind.
Many of you even attack the end user, the very people whom would be your customers. That in itself is a cardinal sin.
No one has really dealt with the issue of retraining the users. Very few people covered the intergration of office tools, and the interoperability with other companies who would be still working on MS Office based solutions.
Just as I lack an understading as to some aspects others have mentioned, others also misunderstand how things work. One comment ‘And lets not forget the wonderful “call microsoft to disable activation part”‘, shows an unbelievable and stupid lack of understanding.
When you get enterprise licensing with MS, there IS NO activation. Another stupid comment is ‘We’ve done this in various co-locations to the total amazement of the win2k guy next to us doing their 80 update reboots’ even though we have already said we run a central image where updates are done and its handled once for the enterprise.
Another comment” 2. Protection of system files: Mr. or Mrs. Desktop Luser has an account and password to log into the machine. But not root access… so they cannot mess up their system. And if they do, just restore the backup of their home dir that you did last night during the sceduled cron job, possibly even residing on a CD-RW right in their own desktop machine. You did backup, didn’t you?”
Basically I know its a shock horror, amazing idea, but Admin access is usually replaced with ‘power user’ which acts the same way in the windows environment. I then have the suggestion that I run local Cron jobs on users machines backing up their /Home drives to a local CDRW. And then asking me or the user in sarcastic comment if we backed up.
So let me get this right.. either myself or the user is going to walk round the entire enterprise each night, and mount/unmount linux CDWR drives and do local /home backups even when I have said storing data on a local machine in todays age of one year HD drive warrant’s is the most braindead, stupid, dumb assed lame LAME LAME way of handling this, and its something that was clearly stated in my comments originally. At this point my regard for ‘Mr or Mrs Desktop Luser’ as stated is higher than for this comment.
Many of you have suggested a remote NFS /home, which I would have on a server running a nightly backup.
I am not sure how to say this, but its important to remember you have to convince, and operate a system that starts and ends with the user, and the company. Some of you have stated good ideas and tools, and methods. No one as yet has supplied me with an all round idea of how an all Windows business could successfully carry it off.
Thats the market Linux is aiming at. At least with the desktop sales pitch. Whatever solution it is, its got to be a unified workable and downright simple solution. Both for technical staff and the end user, and with no, or at least limited loss of functionality. Its not the Linux market guys. Its a whole new ball game.
Most windows shops have specific software, bespoke to windows. Many tools and utilities, licenses and assests tied into what they have. When you have made your grand switch to Linux, and the users can’t use the system you have put in, and none of their software runs even if they could use it, you’ll have to find answers and solutions.
Sorry to rant, but some of you simply are not being realistic. You are going to have to have a solution that is better, simpler, cleaner, easier and with lower cost, and that before people will even consider it.
Everything a ‘Professional ‘ would need from Server, Desktop, Printer, Network and integration perspective.
well, i’d say linux already has ‘server’ and ‘network’ covered, its far more flexible and powerful than windows in that regard. as for ‘desktop’ and ‘printer’, well, we’re working on it, and its getting better every release. ‘integration’ is a misnomer entirely, the unix philosophy is completely against integration – it provides a single point of failiure and leads to things like the registry which can’t easily be administered from a boot floppy in the case of utter failiure.
Central server or domain creation.
kerberos, nis/yp, ldap and many others, including your own personal scripts, if you so prefer.
Then the ability to create images that can be delivered to any PC on the network. SYSPREP/Automated install/update across network.
many people mentioned bootp/tftp, nfs, afs, coda, kickstart, red carpet, apt-get/dpkg. and once again, you can always write your own stuff and release it – thats how we all improve
[i]Specialised routines for locking down systems, desktops, tools, data storage on the ‘domain’. System failure/recovery.[i]
backing up your central ldap or nfs/other fs server to a worm is the best way, but there are distributed fault-tolerant fs too. bastille will lock your system down and the intrinsic user system of unix means that you dont have to lock down the tools – theyre secure already.
unix has the best error reporting logging systems i have seen – you can have upwards of 1000 machines all reporting their syslog over ssh to a central server(s) which write them immediately to worm or tape. you can have custom log scripts checking these and tie that into your IDS. unix makes you the spider in the web – you can feel the vibrations over your entire network.
DS, you are asking for experience from someone who has actually done or been in an envionment where large scale rollouts where done, I’ve been there. Personally I worked as part of a team that automated HP-UX/Sun/Linux mass deployments, centralized configuration management, cloning, etc, for a large (or once large) telecom co. (I’m not there anymore though)
Here is how the environment was setup.
1) Nothing but the OS is local to the disk. That way, if the machine dies, you can clone and put into place a new system in about 15-25 minutes. (from the largest server to the smallest linux machine).
2) Centralized network storage. We had a SAN solution with NFS servers exporting volumes from the SAN to the rest of the network. Access is controlled by using netgroups.
3) NIS. All of our unix users have NIS accounts. However, my brother implemented a systems where linux users could log in using windows userid’s and passwords from a windows DC. There are two ways to do this. a) use MS services for unix in AD to run the DC as an NIS master. Works for all unix’s b) use samba and winbind in the linux image.
4) Hardware is compliant. You have to make sure that you have an image that will work with the hardware you buy. (of course). You have to do this with Windows too. XP on a machine w/ a bunch of ISA hardware, etc, won’t work too well.
5) For linux you do a customised kickstart installation with scripting in the post install section to initiate the post install procedures. In the kickstart configuration file, you can specify the software packages to install, and where to get them, etc. Your distribution lives on an NFS exported filesystem (which is actually on the SAN). Kickstart does an install from the NFS server, installs it’s packages, and then starts your post install.
6) cfengine is your friend. We used a smart cfengine setup to manage all of our HP/Sun/Linux systems. With intelligent use of variables and filesystem layout, cfengine can autodetect the configuration files and tree that it needs, and can carry out specific tasks which you of course have to specify.
7) Centalized patching mechanism. We had a system of scripts that hid the OS specific details for patching. That way, each system used the same interface. This interface was then called from cfengine.
Each OS has it’s own details for initiating an install. HP-UX is the best, you just ignite it from across the network, and it goes. Sun, you go to the console and do a boot -net install. Linux you put in the kickstart floppy/cd/etc (could be by network booting if you have the right boot image in the nic), and boot that way. After that you walk away and let the rest of the process take care of it.
I’m not going to get in a discussion about how great the tools are for Windows or begin to point out the equivalents in Linux. That is the wrong premise to start with. Your view point is not from the direction of whether the Linux desktop is good enough for the average user, your view point is as an IT person and a Microsoft oriented one at that. You’re talking about whats easy for YOU and what saves you time. That is not what is going to decide whether Linux desktops and Linux software begin to spread. What will be the deciding factor will be first and foremost COST. There is no doubt that Linux is a far less costly solution. The fantastic tools you mention were not there when Windows began its climb to dominance, they came later as a result of IT people and companies trying to become more efficient when rolling out solutions for users and customers.
Linux will do the same thing as it progresses.
What will start that process? COST! Linux is already happening on a daily basis around the world. You, as an Microsoft oriented IT person, will not make that decision. Your PHB employers will make that decision and TELL you to start using it because it saves THEM money. If they hold off on the decision to switch, their competitors will force the decision to switch on them simply because their competitors will be able to lower their costs and undercut your less enlightened boss. You may be able to convince your boss that Linux is not for your company, but eventually, after your company gets beaten in the market by those smart enough to see the savings in Linux, your boss will come back to you and your shortsighted opinion and perhaps ask to find find the door…out.
That’s a fair better suggestion and helfully information than many have offered. Would you consider it a big ask for a bunch of MSCE’s to roll what your team did ?
I agree re the hardware specs by the way, most beneficial to go that route. To be honest, many have pointed out Imaging and network role out. Seems no problem there. Its the backside setup that is concerning me. The NFS ideas are fine, as are security and other aspects. office intergration would have to be workable, and I like your brothers work re the Wind DC
I now suspect from all the comments that the systems and admin can be done by scripting and other methods. More work than Win, but all quite possible (least for me). I just doubt I would ever be able to allocate the time for it all….
ghost works with linux,and as of ver 7.3 even supports grub
there is a native *nix tool called DD
just like your windows network you can create your images and roll them out, almost all distros will pickup new hardware changes.
you can even script it with a small boot disk if you want.
Policeys and user startup scripts and Domains;
here, linux has more choices and flexiblity,
you can setup an LDAP domain with samba support built in,
openAFS you can have a network like system of user directory and workgroups accross the entire domain with user and group permssions.
with openAFS all windows cliets and all Nix are supported, i think even mac is supported.
you can even add SELINUX, to the domain and have major secuirty policey type controls over the entire domain.
and all the intergrated into an LDAP based domain means many
software and hardware will work with it , even win2k AD is based on LDAP.
with the openoffice and abiword projects office software is not an issue.
groupware and imap:
there are many imap/exchnage replacements out there and many will intrgrate into your ldap domain, the only hanging issue here is calender support is kind of weak but , you can get it to work.
becuase linux is a text based system you could have starup script check for updates with either up2date or atp-get(rpm ver or deb ver)
you could even roll out new software by just adding something like:
apt-get install openoffice
to the startup script
and everyone will get it, this could be used for updates too.
all in all this covers, most of you topics, i think.
its just a matter of knows the tools and how things work sure the howtos and docs in some projects are weak, but the IRC chat channels and web forums /news groups can offer you a ton of support.
Thanks, that kind of information is great. Thanks for coming back with it
Well, when it comes to free advice, you get what you pay for ;o).
I’m sure that there are a few enterprise-level sysadmins here (see Joseph’s post a few spots above), but I really wouldn’t expect too many to actually be hanging around here. You talk about going to Dell/HP/etc. for solutions for you Windows network, but you expect a hobbyist site to give you a migration plan for Linux? Get real. Read a few other discussions here, and you should realize that you’re not going to get consultant-grade advice here. If you’re really serious, these are the people you talk to:
Once you’ve talked to these guys, *then* you can come in here and tell everyone how Linux isn’t ready for the enterprise desktop.
i know several microsoft admins….they have been doing microsoft for YEARS….back to dos/3.1 days…now moving on to XP and 2k domains/active directory.
learning one more microsoft thing is like taking another breath of air. it’s natural.
if you think you are going to skim over some manuals, ask a few questions, futz around a bit with some distros…and in 6 months be ready to roll out hundreds of linux workstations ….you are in for a big shock.
i know a few linux people who could manage it easily. (not me)…but these guys have been unix/linux admins for years.
too many microsoft dronez forget that.
the challenge in the change to linux is huge for microsofties..it’s a whole new way of thinking/doing.
many will give up. and that’s fine…it’s not for everyone.
and openAFS is headed by IBM and it scales to something like over 10,000 users or something.
“Linux should never be turned into another Windows”
Who said it should be? Business people are wanting to know cost-effective ways of implementing Linux.
“if you don’t want to LEARN a new OS then use the one you are used too and stop complaining!”
Who said we don’t want to learn a new OS? The inquiry is about how to implement Linux, not dump on it.
(BTW Which of the several OSs I use are you referring to?)
Try reading all the words, and understanding the whole of the post before going off at the mouth like an immature C/PM OS fanatic raving against the new wonder DOS.
To business people, an OS is not a religion substitute, it is simply a platform for running the tools of our business. Factors in that assessment are upfront costs and running costs. The assessment is binary – it is cost-effective or it isn’t.
‘if you think you are going to skim over some manuals, ask a few questions, futz around a bit with some distros…and in 6 months be ready to roll out hundreds of linux workstations ….you are in for a big shock’
Yes, maybe. But thats what the pitch is, right ? Linux for everyone and everything ? So I am putting my toe in and I am finding out.
I’ll tell you this, If I decide I will do this, then I will. The question is more, how long would it take, and whats required, and is it achievable.
‘many will give up. and that’s fine…it’s not for everyone.’
Erm, if you aim at the desktop, is’nt the point that you ARE trying to make it for everyone ?
Someone accused me of being some kind of MS zealot. My background is more AS/400 systems, and like many people, I have done MS sysadmin through a requirement, less than through choice. After a while you can get to operate the system without issues IMHO.
Why should I hate MS tho ? Its straightforward, it works, it pays my bills. But that’s a side issue.
There are lots of good comments so far. I can tell you what I have researched, but not implemented in a large scale yet.
I’ve set up small networks with NIS as a central password server and NFS for central file sharing (including home directories). This works very well, but NIS is not considered a highly secure solution since it advertises the user list to the network. As long as you don’t let users have local root access, and have a solid firewall, it should be fine.
LDAP is a more promising solution with better security and performance. However, the client side software for authentication is not easy to set up, at least about a year ago when I was seriously investigating it. Maybe it’s better now. One of the drawbacks (as of last year) was that LDAP did not support netgroups which are critical for wide spread NFS sharing. This makes NIS more attractive.
Of course, you could also look into Novell’s NDS (yes it runs natively on Linux now), but I prefer to stick with completely open source solutions.
Finally, there is Kerberos which is probably more complex to set up than NIS or LDAP.
As far as roll out tools. the other comments cover it well. Kickstart for automated installs, and ghost or drive image for clones. I have succesfully cloned many Linux workstations and they are much easier to deal with than Windows clones.
With Windows NT and above, you have to worry about unique SIDs when you clone. There is a SID changer that comes with Drive Image and Ghost that works fine on WinNT4, but not with Win2K or XP. As far as I know, you have to be an enterprise customer to quality for the official cloning tools from MS. Now, there may be some unsupported (by MS) cloning tool for 2K/XP but you are voiding your support if you use it.
How many licenses do you have to have to get a version of Windows XP and Office XP that do not require activation? Again, I thought it was 10,000+ desktops to qualify. I haven’t read up on all the latest MS licensing so I am not sure.
I was a Novell CNE from 1993-2000 and I am currently a Microsoft MCSE since 1997. I’ve done large scale Windows domain rollouts (500+ nodes). However, I think MS is making things harder with their ever stricter licenses and product activation, etc.
I think the Domain model is a great one for small to medium sized businesses (<500 desktops), but AD is a horror. Way too complex unless you have a mammoth network. MS is forcing that on everyone. Also, unless you clone Windows,
installing the OS is not even half the battle. You still have to install Office and configure it, and Outlook and configure it, and apply all the service packs for the OS (then reboot) and office (then reboot) and IE (then reboot), then install the critical anti-virus program and hope it doesn’t break something else (and update it every week!), then install other apps. With a Linux install, you usually get all the apps installed at the same time, and no anti-virus.
I can’t in good conscience recommend MS solutions to my clients any more.
I am currently converting every MS server and workstation I can to Linux. There are some vertical apps that have no replacement in Linux, so some machines have to stay Windows.
However, the next large network I build will be Linux based.
>Guys, too many of you are posting regarding setting up a copy of machines or disks.
That’s good. Kickstart is a great tool.
>Remember, I was looking for something server side managed
>, server based data storage,
> server based mail,
SMTP/POP, Openmail, Insite, Domino
> roaming profiles,
Define “security”. Do you mean “firewall” or do you mean “lockdown”? If you mean the latter chown and chmod will serve you well.
> image maintanence and updates,
NFS and Shell scripts.
> and machine recovery.
Add a grub entry for “recover”, and boot a custom image designed to reload the system. If you have NFS, and permissions configured properly you can actually pull this off with *NO* data loss.
>The first comment here mirrors my general comment exactly. Is he wrong ? Am I wrong ? If we are
Are you wrong? Yes.
>why is it so many companies feel as I do that its a road you just cannot go down ?
They don’t feel as you do. Many companies are switching EVERY DAY.
>Even I am capable of copying a disk from A to B
Then you can learn DD, NFS, NIS, chmod, chown, and basic shell scripting.
>If you roll out linux in your shop, please say how you do it, the tools you use, the basic
I rolled my own tools and pushed updates using cron, and basic shell scripts. I upgraded an entire farm of RedHat 6.0 machines to 6.2 in under 5 hours (230 servers, all in locations across america)
> methods, where you keep your data and why, how you recover from a failed drive or machine, AND
You keep your data on a server, if you lose a drive on a workstation have one prepared or use a boot CD or network boot to load it.
>USER training and acceptance. Remember, I would not be able to just GIVE 100 users a whole new
What user training? Sure they will be afraid of the new machine for a day, but it’s not as tough as it sounds if it’s configured correctly.
>environment without huge nightmares. Each user would have to go through retraining and would need >help.
Granted, which is why you train SPOCs (Single Points Of Contact) and leverage them during the migration. If you have upgraded users from 95 -> 98 -> NT -> 2000 you will be familiar with the process, it is no different when changing to Linux.
> Remember that if you did roll it out in many businesses you may well have to cater for user >retraining, and persuading management about the pros/cons of such changes.
I keep hearing “training” come up in the list of why nots, however it never seems to be a problem when a company decides to upgrade their OS, or their Email software.
>There may be many people reading this topic who perhaps like me are curious, yet either do not >know its possible, or maybe remain skeptical about its reality in the trenches.
It’s not that tough. I’d say that any 6 month Linux newbie on a mission could provide a reasonable solution.
>Let me put it another way. You have Linux distributions. Why when they offer the ability to >install and create a system using a simple GUI have they not done more work in this KEY area
Have you installed RedHat Linux? Why does everything require a GUI tool for management? Which is faster over a remote link, SSH or PCAnywhere? LOL
>You really will have to cater for people like me if you want to win the desktop. You’ll have to >build amd maintain, and improve tools that MSCE admins can run and work with IMHO in this area.
Why? The tools are already out there, I’m not going to “cater to you” if you can’t get with the program then get left behind. The tools are there, if you are too “good” for them then that’s your problem.
This is just one way to create your “image”. It uses two tools, mindi to create a mini-distro bootable CD for rescue, and mondo to backup your system.
You pop in the CD, boot from CD and type “nuke” and it restores the entire system from as many CDs as needed. Can backup/restore over a network. You can repartition and resize partitions during the restore, change from IDE to SCSI to RAID, etc.
Thanks for the article, lots of *lively* discussion. I’m gearing up to replace a client’s Windows desktops with Linux sometime down the road – they don’t know it yet 😉 So it’s good to read about so many options.
By the way, several other things to point out:
Webmin – another *great* remote admin tool. The Samba or SWAT (samba config) module allows you to set up a Samba server as a Windows PDC or server with just a few button clicks 😉 You owe it to yourself to try Webmin, even if you’re proficient with vi or emacs (like me)…
>>>Why should I hate MS tho ? Its straightforward, it works, it pays my bills. But that’s a side issue.
hey dude, i never meant you personally. i was trying to communicate that switching to linux is not easy.
also, i don’t hate MS software. I just don’t trust the company.
I have deployed and managed 120+ Ghosted Win98se systems for years, using seven different images for the various hardware and departments, it now works so smoothly that I did myself out of a full time job. Now I or my replacement can log in via modem from home to fix 95% of problems that crop up.
Managing multiple client Linux deployments does not have to be any different, and if you are willing to learn a little, is actutaly a hell of a lot easier to manage. You can either use Kickstart based deployments, or as per Win98se, Ghosted image deployments.
Symantec Ghost has supported Linux based Ext2 filesystems deployments for years, and the latest version, Symantec Ghost™ Corporate Edition 7.5
even now supports Linux EXT3 providing “Comprehensive native support for the newest Linux Journal filesystem”
Individual client custom configeration is even easier than with Win98se/Win2k. Once the client has logged in via DHCP and is allocated a name based on it’s MAC address. The client local init script can mount a remote NFS/SMB( Via SAMBA ) filesystem and run any scripts from a directory address ending in the clients DHCP allocated name. The /home partition can be hosted on a server and even distributed on a peer-peer basis via the CODA filesystem. Every thing in the Linux/X11 Desktop enviroments is build to be remotely operated and can be managed the same.
Even all of the above can be automated and professionally packaged, as it has been done in Ximian – Ximian Red Carpet™ Enterprise™
The solutions are already there, the solutions are getting easier to use, the solutions cost less to aquire and operate over time, the solutions can be greatly customized to enhance perfomance of the enterprise.
The problem is NOT the lack of a solution, but the lack of looking for one.
>than many have offered. Would you consider it a big ask for >a bunch of MSCE’s to roll what your team did ?
Once it’s implemented anyone can deploy it.Implementing it takes time though. Implementing an environment where both unix and windows can co-exist requires people from both sides. You need unix experts and windows experts. An there-in lies your problem. I will readily admit that unix people are rare compared to mcse’s. As an aside, lets be realistic, designing an environment is no easy task period. Designing a heterogeneous operating environment is much harder and requires more work. There is no way around it, and it has little to do with the relative difficulties of the OS’s you are adding to your environment. It could be a very easy to use system like MacOS, or and easy to deploy system like HP-UX. It would be a big ask for a bunch of unix guys to *properly* implement a windows environment. Implementing and properly implementing are two completely different things. However, learning the in’s and out’s of another system as versatile as linux definatly has it’s benefits. As a system’s engineer and architect, the more tools I have in my toolbox, the better chance I have that I can choose the right tool for the job. If you only have one tool…..
>Its the backside setup that is concerning me. The NFS ideas
Like I say above, backside setup in a heterogeneous environment takes more work than a homogeneous environment and requires different design decisions. Our environment consisted of thousands of servers and thousands of workstations. Setup correctly (the unix side) could be supported on an operational basis of one person to every one to two thousand workstations.
>are fine, as are security and other aspects. office >intergration would have to be workable
As is control. As a windows administrator I doubt very much that you allow users to install whatever they want on their machines. Managing access controls is an area that will quickly make or break your environment. (The later Mandrake distributions have been nice from an interoperability point of view because they come with support out of the box for NT ACL compatibility (if you use XFS and Samba)).
>More work than Win, but all quite possible (least for me).
I would challenge the comment about more work than Windows. Having worked with very experienced and respected MCSE’s responsible for the windows integration and automation, I would say that the effort is comparitive. There might be more thought required when designing a unix/linux environment than windows, but that’s usually because you need to plan for windows interoperability b/c windows does not normally come with facitilties to readily support *nix systems. The unix/linux systems do have those facilities and their installation and setup can be automated. Also, I have found that usually the file management jobs fell to the unix machines (even if the filesystems were for windows users) because the effort required to script and automate such solutions is far less under *nix. (Also, many MCSE’s that I’ve met don’t have great familiarity with the windows scripting facilities, whereas it’s a required skill for any unix admin).
If you want to carry this conversation on, please feel free to email me.
Yet Sla7er, the village idiot completely forgets about shared resources.
So, his assumption:
<voice of a moron>
Dah, 10 desktops = 10 sessions = 1500 MB of memory used, dah, GNOME 2.0 is bloat
</voice of a moron>
Also forgetting that alot of the features in .NET have been disabled by default, meaning, you have a bare bones systems loading up using 84MB, not an achievement in my books.
Lets start activating the sound server, the terminal services server, and numerous other services required to see what the REAL requirements are.
Who said Linux exclusively used SYS V run levels?
Geepers, get out of your house a little more often. Slackware have been using the FreeBSD scheme since God was a teenager.
Na, that isn’t it. Microsoft goes around giving out certificates to pimply faced teenagers who have never done any formal study and expect them to work miracles like securing a server.
I know alot of MSCE’s who have computer science degree’s who simply got MSCE’s to prove that yes, they do have that skill as a degree is normally to generalised. However, unfortunately, the number of good MSCE’s are normally drowned out by bad MSCE’s.
How to correct this? bring in the UNIX/Matthew motto, “You have to earn the right to use a computer”, ensuring that those who are simply in because its the “in thing” are purged out quickly.
Lets look at the “ease of use approach”, it now gives admins the ability to think they know more than what they really do where by Linux/UNIX ensures, through the use of “tough software” keeps the admins on ground 0, realising they don’t have all the answers and therefore ensurign they RTFM (Read the Fabulous Manual) and learn the basic fundamentals.
Read the post, read the comments. I am shocked at the commentary/responses. The author has bought up some valid points as to the usability of the platform AS A DESKTOP environment, indicated some areas where he’s run into difficulty, and has asked for some help. About five people tried to help, most of the rest of you merely sat back and attacked the author, his skills, Microsoft, and anything else you could think of.
What the hell is the matter with you people? The man’s asking for some HELP, here! Jeez.
The way I see it, the desktop is already there, except that it is too cluttered, though.
Also, the definition of what a desktop is still undefined. Simply ask two people what they think of a desktop and these two will probably give out diffent views of the same thing.
The situation, most vendors are stuffing all the decorations into one box, resulting into a very complicated set of tools that only a few will ever use.
It would probably be good to group common pieces and spin them off to another world, though. Example is completely separating desktop toolkits, like KDE, GNOME and [insert your favorite here] will have their own distribution. Independent and away from each other’s reach.
I cannot agree more. I actually bothered to READ all the comments before posting (which is, apparently, more than most have done) and give serious thought to AdmV0rl0n’s questions. I noted that less than 1-in-7 responses were appropriate answers that were not simple restatements of previous comments. Approximately 5-in-7 responses were insulting in one fashion or another to AdmV0rl0n or MS admins in general. Somewhat less than 1-in-7 comments were either direct assaults of previous commenters or AdmV0rl0n or were way OT.
I just completed a MCSE course several months ago and found the whole thing very frustrating for me, especially since I had not used an MS OS seriously since WinFW 3.11 (having since discovered Linux and Macs). When I questioned my instructor, who is very knowledgeable about Linux, concerning why he could not recommend Linux for the workstation desktop, he stated that it was because he posted a similar question to a newsgroup and received little useful information and LOADS of flames. “If the Linux community is that hostile to outsiders, they’re not going to get me or, my network, as converts.”
As to AdmV0rl0n’s questions, I cannot be of any more assistance on the OS end of things. I can, however, give some words on your office suite choice…
I would stay away from the completely open-source solutions on this one for two reasons (bring on the flame war):
1 – SUPPORT ! By paying for your office suite and signing a support contract, you can rest assured that YOU don’t have to learn all the nitty-gritty on the applications. You can concentrate on keeping the network up.
2 – DATABASES ! While the variety of open-source databases are good and the quality high, this is where most of your retraining problems and expenses are going to occur. This is really the primary reason for reason 1. Many support contracts will include some training sessions (or discounts or at least opportunity) for key personnel who can then train their co-workers. Take advantage of this.
Make sure you get a suite that can handle MS file formats. StarOffice can do this. Not sure about Hancom or others. You need MS file compatibility to be able to trade files with vendors/partners/gov’t. We don’t have to like it, but it’s a fact of modern business.
Anyway, I really hope this helps some.
I think you’re missing the point here, AdmVorlon.
You want nice nifty integrated tools to do everything. But, GNU/Linux is _not_ integrated. This is not MS Windows, where the GUI is the OS – this is UNIX, where the GUI is the front-end to the OS (for the most part). There is no central ratifying body (MS) to determine how everything is going to look. _If you want the advantages of Linux, you will have to learn to Think Different ™. Don’t look for “comparable tools”. Linux is not the same as Windows. and thus does not need “comparable tools” for many things. There’s no defrag tool because ext3 doesn’t fragment (yes, I know that’s an oversimplification). There’s no wacky security options to figure out, because the OS enforces them by default.
I understand your defensiveness regarding some people’s complaints, but look at it this way: how many hours have you spent learning to admin MS systems (I count “experience learning” in this as well)? Now, compare that to the number you’ve been admining Linux systems. You’ve gone to great depths to learn about MS stuff – why won’t you at least give a few dozen to learning about Linux? I don’t think that’s a harsh complaint at all.
Step one: go to a vendor. RedHat will bend over backwards if you express interest in switching to Linux for anything (they smell the blood of fresh sales). They will answer your questions as concisely as you want.
Step two: Test it. You might find that your pre-conceptions of what Linux can or cannot do easily are significantly off. I know mine were a ways off – now, I can’t function without Linux. Give kickstart a try, in particular – it only requires two computers, neither of which needs to be all that powerful. RedHat even includes a tool which makes setting up a kickstart config stupidly easy.
Have you heard of the dd command available in all unixex including linux. you dont need a few thousand $ ghost or sysprep. Roaming profiles, that is for single user systems made to act liek mucti user. just share your /home on nfs. what more. all profiles and more settings are roaming, rotating and anything more you need.
it is more easy to maintain hundreds or more systems on linux networks than windows as you have more choice of remote admin to imaging to managing useras centrally than just one fixed with absolutely no flexibility way of doing things in windows world.
want to restrict you users desktop env or anything else for that matter, take a look at kde kiosk mode or just the power and flexibility of PAM is enough to acheive this.
The issue is that you guys dont see anything beyond the biscuits M$ throws at you and you are happy to be a loyal slave to them than go out and see the real world.
Just a search on google should give you all the answers you need. or ask your friendly neighboorhood linux guru or a professional consultant. I only request you and people like you to do you homework before you comment. dont exepct linux inc to market a utility like dd with brouchers and a few thousand $ price tag. Linux is a community effort and not a corporation selling some product. ofcourse there are companies which does these services if you want it that way. consult with them and get support from them.
Having just waded through p’haps 30 pages of comments, I’ve a few to add myself. I am the sysadmin of a _small_ network (<50 desktops) so I can’t comment on enterprise issues, but I can offer some general suggestions.
First – “nevermind the spam” – thanks for being willing to come up, say you’re interested, and ask about this stuff. Unfortunately this is probably not the right forum, its a bit like asking slashdot. Good responses, but high noise level and a common lack of reading ability among the posters. This is not a pro forum. You’re used to going to the pros like MS and the OEMs – it might be an idea to do so here too, since the magic words “1000+ desktops” will have their attention quite nicely.
No disto I’m aware of is currently integrated and wrapped up to the degree you’re looking for. Windows will be easier for now. Of course if more people like you are interested then that could change – and the more enquiries people like RH get, the quicker it will. You’re currently unlikely to be able to cope w/o hiring an experienced *nix guy who has worked with linux, your expertise is in too diffferent an area. MCSE means much what it says. I’m not bagging you here, its just like asking a guy who has been doing, say, solaris networks for 10 years to do a winxp rollout. It won’t happen w/o training.
That said, I have a few suggestions on the more technical side if you’re still interested.
Roaming profiles: All user-specific settings on a linux box are stored in the user’s home directory, usually in hidden directories eg “.gnome” (the leading . makes it not show up in file listings). As a result, all the users settings and files can be easily shared by using a network-mounted home directory. It is not overly challenging to set up nfs-shared home directories (just mount /home on each client over nfs) but it IS NOT SECURE. There are secure alternatives emerging, including AFS, Coda and (amusingly) CIFS. DO NOT USE NFS, ESPECIALLY READ/WRITE, FOR ANYTHING IN A SECURITY SENSITIVE ENVIRONMENT. And of course any admin worth his salt knows that security matters _everywhere_.
IMAP is wonderful and the _only_ useful option for server-side mail IMHO. Mozilla mail supports imap very well, and I believe Evolution (an Outlook-compatable app) does too.
There’s currently no easy way for a user without root access to install software. I consider this a good thing. Currently all the “package” formats (rpm, deb etc) require root and do machine-wide installs. There is work progressing on a standardised UNIX package manager format that will allow user-local installs of software but I believe this is still in planning.
The 2.6 linux kernel will improve support for network filesystems, crypto, filesystem ACLs, and much more – hopefully once the distro makers start integrating that it’ll help a little.
what it comes down to though is that the tools you want, in the nicely bundled up and integrated form you are used to, do not exist for linux. I expect the distro vendors are working on them – ask, and tell us! Remember, though, that 99% of admins don’t have access to the MS tools you do and its a very, very different story managing a smaller MS network where MS is hostile, not co-operative.
Finially – I suggest that you try out Red Hat on a home desktop and spend a little time learning, if you can. Unless you’ve used the environment you don’t have a hope of managing it. Imaging asking a mac user to do a windows rollout….
Good luck, and thanks for asking. Hope you’ve got something out of this besides a negative impression of linux users ;-/
Yes, you can do perfectly good image copies of linux systems with dd, or even tar! But, there is a project which I have been impressed with lately that does do very good cloning of hard drives in a manner reminiscent of Ghost. Check out http://www.partimage.org
My comment on the article is that in the windows world we tend to get locked into knowing about a suite of apps that we use on a regular basis, each one of which serves a particular function, eg. windows plus ghost plus … However, for linux you don’t need to go hunt down these sort of apps. They come standard with the system, and often are best if used from the command line or as part of
a powerful script. That is how you’re meant to do it, and in order to know what to do, I am afraid the solution is to learn and learn. It is a very different way of looking at it, but the linux way in the long term will mean that you will have fewer proprietary products to baby sit, and the the system just works.
Let me repeat that one. The system just works. Period. And it won’t get wiped out by a Code Red or Nimda. Less support required.
Of course if Dell had to preinstall linux, for them it would not be as much as a problem as for you, in their case it would just mean hiring the appropriate geeks…
I have a mixed Win/Linux installation and I will tell you how we did it, so bare with me. Please note that it is a fairly small company (50 users) but I am sure that what we did is easily expanded.
We have 3 Linux machines in the backend. One called Public wich is an Athlon 2000+ 2 GigRam and 160 GB RAID 5 storage.
This machine exports with NFS and Samba a folder called public which is for every one to use and abuse:) Of course we have quotas for every user. The other Linux machine does NIS and Samba as a Primary Domain Contrloller.It exports the home directories for every user so every Linux user in the company has automatically those so called “roaming” profiles. This machine also does DHCP and Squid. The other Linux machine is just our gateway to the Net, so it acts as a firewall.
Note that we use Suse 8.0 for the servers and the desktops.
We have created a so called “info file” wich Suse checks everytime you start a new installation and looks for the default answers to the setup questions.
Our Windows users use Win2k and XP, and are all logging in to the domain wich is provided from the oforementioned Samba service.
All the users are created through Yast at the NIS server. The password file is repicated to the other server with the help of a script.
Note that Samba uses the Unix passwords and users, so any Windows or Linux user can seamlessly log in to either Linux or Windows machines transparently.
Also all the Linux machines are installed through nfs. We have copied the Suse DVD to that Public Server and with the help of a boot disk + the info file” disk we can setup any workstation regrdless of harware changes.
Well that is the basic idea. Excuse my English, but I am fro m Greece and English is not my native language.
Hop this helped you a bit.
Maybe it’s a problem of documentation. Whether it’s possible or not, at least it’s not visible! Maybe one of the guys giving solutions in the comments should write a howto. I can’t, cause I don’t know anything about the subject.
There is nothing wrong with NFS provided it has been configured properly, please give detail in how CIFS is more secure. Any admin worth a grain of salt is restricting ports on his switch to prevent unauthorized access, and tunneling using NFS and NIS over SSH in addition to using allow lists. There are ways to make NFS secure, don’t discount it so quickly.
It makes me laugh reading articles like this. Having been a SunRay user for over two years these arguments aren’t even relavent.
SunRays are ultra thin clients with zero state. All admin is done on the server and when you are running the Gnome / Mozilla / StarOffice combo life is sweet. If my ‘PC’ ever breaks down the mail boy runs up with a new one and plugs it in, no rebuild, no nothing.
And being involved on the support side I know that you only need 1/3 of the sysadmin resource to run a SunRay setup as you do PC’s.
Step into the 21st century guys. No noise, no maintenance, no hassles.
Okay I am not saying that Red Hat is here to save the world but I can roll out workstations faster in RedHat than I can in Windows (and yes I do this for a living) All with all their own customized name IP and config as I desire (and all off a network server. As for the My Docs being on a network server, thats funny I support many Windows shops and have seen some really tight security but the users still manage to save docs to the local machine, now under Linux I just mount their /home/username dir to the network server and they have no idea that they are living the safe way because 99% dont have an idea they are being mapped. (I can just make the rest of the file systems read only<with a few minor exceptions>) This is all next to impossible with NT or W2K, and should I get started on interoperability, Samba integrates a Windows Network like a Windows 2000 Server never will (Windows, Unix, Linux, Novell all without huge added price tags) So in short YAWN I could do that years ago Windows is still trying to catch up
Some far more intelligent comments recently. Much more useful. Thanks
Craig: NFS shares have always concerned me, but it is an option.
My current network guys is 110 PC’s and Macs, + 4 Sun workstations, and several PC servers running NT4. To correct a few thoughts people expressed:
MS supplied us with the enterprise versions of XP office and XP pro, no questions asked, once we signed up for the new license. These require no activation and include the tools you need for prepping systems.
To be totally fair to MS, while their screwing us on the license side, the tech side is fully workable. There are few caveats and it does just work. I may provide a future post covering this to help in terms of understanding.
I have worked with MS products since pre windows days (yep I set up a bunch of DOS stations using a serial network) I even came close to working on my MCSE. Delayed! I will still get it but my focus is Linux now. Linux has proven to me over and above stability, configurability. The tools are out there, but I would not recommend sticing a Win Admin in front of a Linx box and say go to work any more than I would take a Unix Admin and stick them in front of a Win box and say do your magic. (although that could be an amusing prospect)
Linux and Windows are totaly different beast (yes they are both beast, cant be helped at our level of technology) Rebutal .Net RC-1 takes 84MB and RH 8 takes 150 <apples to oranges> .NET will take a lot more memory in the final release(early releases are always shallow on memory usage) and Linux architecture is built on the principle to try to speed things up <hence it loads as much of the common Libraries and apps into unused memory so it can service request faster-even if they are not used; later when memory gets low it dumps unneccesary stuff to laod the necessary stuff–and yes it is quicker than the windows way of doing it> I have worked primarily with Windows up till a few years ago. Yes Linux will force you to relearn and yes MCSE will be worthless <except for the networking stuff>, but Linux is much more rewarding and I hate to beat a dead horse but my daughter is Three and I love to go home on time at night to see her <with my Linux servers thats not a problem, however my Windows servers really tick me off to often and make me stay late (and yes I may not have an MCSE , but dont hold that against me I work with it more than most MCSEs I know and usually end up finding the solutions for the said MCSEs <yep I work at a VAR> Thye initial reason for my interest in Linux was years ago when every other MS solution for the servers was decimate your security and the problem will go away. (Okay not fair NT 4.0 days-and I still support those too—OS overload)
Anyway hard subject but truth is dont jump in just cause its popular, Linux will require relearning, and the items that the author mentioned are covered in almost every newer System Administration book for Linux and are also part of te RHCE course (okay my quals RHCE, CCNA, Linux+, APS, LCP, Server+so yes I can also back myself up in paper)
1. Use samba — I have both linux and windows boxes going to my Solaris boxes using shares that smb based. No voodo here.
2. Kickstart is one of the primary reasons that a large company I know moved from a server farm of 100+ Solaris servers set up via Jumpstart to RedHat set via Kickstart. No other Linux company seems to give a damn about this and it gets very little press.
3. I have myself and about five programmers working from Linux everyday.
4. The funniest part of the article was about roaming profiles. If you use TCPwrappers on all your machines, set up NIS. It is not that hard and by god all my users have one home directory in one central NAS server and if we wipe their box out they reboot that box and start off exactly where they left off. Also, like you mentioned, the user A can log into user B’s box in a pinch and -Boom!- they are sitting on their desktop user A has access to all their things. Look at webmin for a gui approach to server configuration. It really rocks for DNS/NIS administration IMO.
Linux is not for everyone and has IMO at least another year or two before it can be widely used beyond the Unix geek population.
Nobody has really spoken to the crux of the issue. If orgs move to linux then Windows folks have to re-learn old habits. Not only will it be hard but it simply might not be the thing for every company.
First PC i used was Sinclear Spectrum 48kb, there was types and just basic software, than i bought an Commodore Amiga500 on 1992-93… Gosh, it was the most sophisticated PC i have ever used, there everything i needed on it, games, software, workbench etc.. i never worried about a system crash, i never worried about reinstalling the system, virus etc., it was faster than any PC of its time, it had the best sound, graphics for years till PC’s catch with it, and never had the same pleasure and fun with any PC i bought after Amiga500.. and none was that innovative or sophisticted. Amiga 500’s advantage was its architecture, hope one day some company would come out with something like Amiga500 again, and i would throw my PC to trash..
Hope Sony would do something like that, and make the next genereation of playstation consule more like personal computer, than we would enjoy the sophisticated architecture on sound and graphics and use it as office applications and graphics as well..
i don’t think that anything so good would come out with recent PC architecture, we need an innovational thing like Amiga500 of its time.. And i hate HP !!! ask them what would we do if we loose our HP installation cd? and if we don’t have an internet connection? HP’s r the worst PC ever been built!!
Have you checked into OpenOffice or Sun’s Star Office for an MS combatible Office suite??
As for Email solutions how about Ximian Evolution and Connector or a Qmail or other MTA based mail rather than saying oh no there is not Outlook for Linux.. You can you Samba/NFS or even FTP for your installs and “roaming desktops” LDap/NIS for Authenication.. Webmin for remote administration which runs on SSL for security.. You can change permissions to allow/disallow services and the firewall that come built into Linux to prevent MS Messenger/Yahoo/AIM and also prevent outside filesharing programs like Kazaa/Morpheus from being able to get out.. how about a cetralized communtication messenger like Jabber so that people could communicate faster..
And If you need remote access you could use Freeswan/VPN all of which have manuals and Documentation that comes with your install.. And No I wouldn’t suggest that a bunch of MCSEs try it without doing some studying and as with any OS update or crossover always do a test network always whether it is MS, Mac, Or Linux it is just smart administration..
Could someone (Eugenia?) update the article to guide potential readers to the comments section? Since most of his questions and doubts have been answered here, it would be a nice information source.
Check out Bynari’s Insight Server at:
Basically, it can do anything that Exchange can do. The server is free (they offer a paid version with support), the connector is free for Linux, and it supports MS Outlook and Ximian Evolution email clients.
So using Bynari can be one big piece of the puzzle to migrate from MS Windows desktops with Exchange to a pure Linux network. You can mix and match Windows/Outlook and Linux/Ximian on the same network as you migrate.
Yes, yes yes. Look. I have tested from Mandrake, 6.0, 7.2, 8.0, 8.2, 9.0 and I KNOW about the APPLICATIONS.
I know I can do email, FTP, shared data, etc etc. My basic comment was built around how do people pull it altogether in a larger environment.
Its very easy to take one linux station, be it redhat, mandrake, yellow dog, Suse, Others, and get it so it is a workable system. I have no problem with that at all. Its fine for the single machine/single user/single task PC. Its not right for rolling 100 machines and ensuring everything works, integrates, and is able to operate.
People have now supplied me with plenty of information regarding the many tools available. I’ll happily go further afield and do more research
I agree with comments that WindowS just work (and might be easier for rookie) but it is unfair in my opinion to just saying Linux not suitable for large environment. Yeah it might be a bit complicated at first but if you willing to spend about the same amount money and time to study the advance feature of Linux as what most Windows admin spent to get expert (maybe an MSCE), both system are comparatively equal in capability. In the long run, Linux is much more viable solution since you can tweak here and there.
I would suggest Mr. AdmV0rl0n to try get more info on remote application function (and maybe diskless terminal) for easier setup. Many project that focus on this such as http://www.ltsp.org and many others. And, Linux is still flexible enough so that you yourself can choose how to setup you system, maybe dedicated authentication server that can access application on other server or just simple diskless X terminal that run on the powerful server but display on outdated hardware terminal (at the speed of the server) or you can run diskless X terminal on local machine if your network and hardware is powerful enough. The best thing is that (as previously mentioned by sombody previously) the user can use any of the machine within the intended group to access his/her files, application, setting etc. without losing their favourite setting.
But of course, I did agree currently GNU/Linux is quite behind in the field of graphical preferences setting. Sometime you have to edit config files or running Perl or other scripts. But the truth is that, Linux still can do the task that Windows can do but as a cheaper alternative especially on the long run.