Linux and UNIX-like operating systems are generally regarded as more secure for the common user than operating systems with “Windows” in their name. Why is that? When I get into a dispute on the subject with a Windows user, the most common argument fed to me is that Windows is more widespread and therefore more vulnerable. Apart from amusing myths like “Linux is only for servers” or “does it have a word processor?”, the issue of Linux desktop security is still seriously misunderstood.
Be it market share or inherent security, the bottom line is that you are less likely to get infected with Linux.
Honestly, I don’t care about this nonsense because, much like in politics, both sides just put out numbers that benefit them.
Microsoft is at the unfortunate disadvantage of having everything developed under one roof, so all bugs are summed up as Microsoft Windows errors, except for anything not included in the default install. That still leaves IIS, the network stack, the communication stack, I/O, and anything else you can think of, even drivers.
Linux, on the other hand, seems to benefit from not-under-one-roof reporting. After all, Linux is just the kernel, and if we had to compare kernel to kernel, I am sure the numbers for bugs and vulnerabilities would be about even. However, when you combine such systems as Apache and the Linux networking, communication, and I/O stacks, you run into a similar number of vulnerabilities.
The point is that neither OS is secure when run by a person who doesn’t know what they are doing. I do believe that hackers specifically target Microsoft because they always know a core set of components is going to be on the system. And I do believe that Linux is more secure in the sense that the combination of programs is usually haphazardly put together, meaning that a hacker cannot figure out what is on the system to exploit. With popular brands like Ubuntu, however, I believe this trend is going to change.
Apple is already starting to see this with their Mac brand.
It is quite the opposite. Linux distributions ship with lots of userspace programs covering a wide range of genres, from office suites to games.
Most times measurements are made, all bugs “in Fedora” are counted, as opposed to all bugs “in Windows XY”.
Secondly, this “everything is insecure, it only depends on your numbers” myth is what MS tried to tell the public for years with their advertising (“Windows is more secure”) and their paid-for studies. With quite some success: they knew nobody would buy their “Windows is more secure” shit, but by flooding the market with these claims, they got the “nobody knows what’s more secure” claim accepted.
The truth is, it’s right that number counting is not that relevant. Just look at how Windows systems are set up compared to Linux systems: how every Windows machine wastes resources on virus scanning and so on, and still you had these massive worms. People in this industry seem very quick to forget things. Then look at the architectures of Unix systems and Windows systems. It’s true that Windows has gotten more secure in recent years, which is a very good thing and long overdue. But there are still massive problems inherited from backwards compatibility. The Windows security model grew very complex compared to Unix/Linux, and it’s far easier to find holes in a complex system that is full of intended corner cases (kept for backwards compatibility).
Edited 2008-07-20 16:15 UTC
The truth is that this is not a technology problem. Windows is targeted because it makes good business sense to target Windows over Linux and Mac OS X, and I would venture to say that none of the reasoning for targeting Windows is about security.
http://blogs.zdnet.com/security/?p=135
Bot networks equal big money today. So which would you rather focus your efforts on, in a constantly changing environment: the 10% that amount to Linux + Mac, or the 90% that amount to Windows machines?
It is similar to starting a coffee shop: where do you think you are going to reach a wider range of customers, New York City or Elmira, NY? It is obvious, as a business decision, to start your business in New York City. There are more potential customers, there is more money, and you have a better chance of doing well.
I am really getting tired of this argument, because it is obviously a business problem and not a technology problem. But you guys are fighting it almost like somebody asked you to whip them out and measure for biggest.
I just don’t get all this arguing. I approach all operating systems as insecure, and that forces me to protect myself in more reliable ways. In fact, I have it down to such a science that I don’t even run anti-virus on my Windows Vista x64 anymore, and I have been virus-free for almost 2 years now.
All my mail goes through Gmail, which is scanned. I don’t install any software that doesn’t come from a trusted vendor. And I am running x64 which is outside of the current target of Trojan writers, because they tend to focus efforts on the mass market of Windows XP and Windows Vista 32-bit.
Plus, if what you are saying about hackers going after insecure operating systems were true, Mac OS 1-9 would have been swamped with viruses.
I also never had virus problems with Windows, although I haven’t used it since 2002.
Still if you followed the history of the industry in the last 10 years you found many technical aspects which _indeed_ made a difference in this issue. I would also claim that Windows, but much more than that Internet Explorer, even made this big malware industry possible and that without those products we would have a different security culture today.
If you just have a look at ActiveX, its design and then its outcome, you will see that it stands for itself, it is a big security nightmare which other platforms just never had.
I see a platform which was very insecure and vulnerable for over 10 years. It was outstanding in that regard. And _apart_ from that, it was also the market-dominating one. You can say this is history, but things didn’t change as much as you might think. For example, a worm recently spread which infects WMA files — simple audio files! It gets WMP to download it. This is the same lesson MS failed to learn a thousand times before.
What I want to point out is that neither of those (security from hell, market dominance) could have had the same impact alone. It’s an issue which is both technical and non-technical. In the early stages almost no hacker wrote exploits for financial reasons. How much you can earn with that was discovered later, in fact after a very long time. It would have been much easier to build a botnet in 2000 than today, yet in 2000 nobody was talking about botnets. Your market-share argument holds truth, but it doesn’t make much sense historically. There are indeed other (technical!) reasons why Windows was always the main target, at least if you measure by success. Do you really think that in 2000 it wouldn’t have been much more fun to break into some big web servers instead of attacking your neighbor?
And apart from that, I am not fighting anything or anyone. Or can you point out where I am?
Edited 2008-07-20 20:29 UTC
What makes you think that such a worm would be impossible if you put a fuzzer to an OGG vorbis or OGG theora file?
Heck, BIND, which is internet-facing software that accepts much simpler requests than the average media file, had exploitable buffer overflows for a number of versions.
Microsoft got the security religion rather late, but we’ve been pretty darn good at it for the last 7 years. As nbernardi said, it’s a commercial enterprise now. Vista exploits go for $50,000 a pop… that’s not chump change so there are many people looking. And it’s a pretty asymmetric game… we have to release a lot of stuff on a deadline and make sure it is functional, secure, reliable, usable, localized, and everything else whereas the attackers can sit for a long time without any particular deadlines looking for one chink in the armor. And these days, attackers don’t even bother going after the OS or even the Applications, but instead just ask users to open executable trojans… there’s nothing an OS can do against a program that a user willingly launches.
Re: The ActiveX issue, how is ActiveX different in vulnerability from the Netscape/Mozilla plugin model that every other browser uses? It seems like the same attacks are applicable to both.
If the OS X market continues to grow, perhaps we shall see a similar set of attacks against that system… I mean, getting a user to click on a malicious program is not a particularly OS-specific attack (a trojan doesn’t need root to do most of its useful dirty work).
You are correct in that both ActiveX and Netscape-style plug-ins are native code that can have the same flaws.
But the main difference is the packaging and installation of that code:
With a plug-in, the installation is very obvious. A specially-formatted plug-in file needs to be put in a special location for the browser. And that process is generally done by a plug-in installer application.
With ActiveX, however, *any* Windows application is likely to install ActiveX controls. They can be in any location, and they don’t even have to do anything related to your web browser. If a component is packaged up as a COM object (very common on Windows), then IE can “use” it.
http://www.kb.cert.org/vuls/id/680526
The installation of an ActiveX control can happen natively through the web browser, or through installing any application (internet-related or not). e.g. Winzip:
http://www.kb.cert.org/vuls/id/225217
The end result is that there are LOTS of systems that have LOTS of ActiveX controls that they may not even be aware of.
These are fairly irrelevant differences, though, because the basic idea is that someone can install native code which runs in both Firefox and IE. It really doesn’t matter whether that code runs from the Firefox plugin directory or from some random place on the hard drive. Maybe it makes you “feel better” to think that the code is somehow sandboxed in the plug-in directory, but it can do just as much (and more) damage as any ActiveX control. I know that people find plug-ins/controls useful; however, the only really secure approach is to turn them off completely. Which will (understandably) break some usage scenarios. But those kinds of tradeoffs are the price for better security.
The browser does not load controls unless they are marked as safe for web use. But as you say, it is unfortunate that it was so easy to get confused and accidentally mark a control as ‘safe.’
Thanks for joining the discussion as an MS employee.
I just want to answer your questions on WMA/OGG and ActiveX. I don’t believe that MS produces more buggy code than others. I know there are many talented people working in your company, and I guess they are all well aware of buffer overflows and other exploitable bugs and avoid them. Sure, there could be exploitable input processing in OGG just as likely as in WMA. But we are talking about a different issue here.
The problem here is philosophy.
Neither OGG nor MP3 nor any other sane media format includes the possibility of defining a website from which a decoder should be downloaded and then instantly run. WMA includes this, so people who want to listen to a WMA file are asked by WMP to “install necessary codec?”, say “Yes”, and there they have the virus. The worm even silently transcodes MP3 files on the user’s machine into (infected) WMAs, just because MP3 doesn’t come with this “feature”.
ActiveX objects have the same power as executables. But they are not treated as such by MS’s software; instead, they can be distributed in various ways which I would call unsuited at best. A website can deploy an ActiveX object which Internet Explorer is more than willing to install (it’s like “install this ActiveX component?” Yes). Because MS wanted to market ActiveX and use it as a “killer feature” to dominate the web (after the failed attempt to take over Java; the court case was won by Sun), it was in the vendor’s interest to make it as easy and unquestioned as possible for the user to say Yes to just about every ActiveX control around. There were some rough security bounds (“zones”), but hundreds of ways were found to get beyond those borders. Some years ago you could read about new ActiveX holes at least once per week!
A program can be as robust as it likes; it doesn’t help if the philosophy behind it is weak. Every sane person knows that it is never a good idea to automatically download and execute code from the internet. As we saw, software like Windows Media Player still does exactly that. MS obviously still hasn’t learned the lesson, or perhaps just refuses to (we have virus scanners everywhere for that now). This is no overlooked security problem. It is insecurity by intention, albeit for other reasons. It doesn’t help the user much.
Honestly I am not pointing fingers, because I know Microsoft has been mostly at fault because of its lack of focus on security.
That being said, even if Microsoft had the great security history that Linux has, it would still be the ideal platform for malware developers to target, because it is a business decision and nothing else. Like it or not, there are holes in Linux that could be exploited, a ton fewer than in Windows, but they aren’t, because the malware industry has to focus somewhere, and Microsoft is the biggest and easiest target right now and for the foreseeable future.
Yes, but why did hackers pick Microsoft technologies over the vulnerabilities in iTunes? Because it was a business decision on their part.
What you posted shows that you did not read the article, or, if you did, that you did not understand it.
I have to tell you my experience with Windows XP.
At home I use Linux; at work I have to use Windows. I once transferred a file via USB stick from a company laptop to a customer’s laptop.
Two weeks after that, I plugged the same USB stick into my desktop computer at work.
I opened Windows Explorer, clicked on the drive letter, and BAM – a virus warning popped up.
I removed the USB stick, took it home, plugged it in, and found an autorun.inf file in the stick’s root directory. The only reason the virus did not get onto my work computer is that the antivirus software caught it. The next virus might not get caught.
The company I work for is VERY security conscious but this might be something they overlooked.
Why on earth is the DEFAULT setting on Windows XP (installed one year ago) to AUTOSTART stuff from any pluggable device? That is plainly insane, and I do not know of a single Linux distribution which autostarts anything from a pluggable device. I had NO chance to see what I was starting before Windows started the thing itself.
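For context, that autostart behavior is driven by a plain-text autorun.inf file in the device’s root, which is exactly what the poster found on the stick. A minimal example of the kind of file such malware drops might look like this (the executable name and path here are hypothetical):

```ini
[autorun]
; Run this program automatically when the device is inserted
open=recycler\update.exe
; Also hijack the double-click and context-menu "Open" actions,
; so even browsing the drive launches the payload
shell\open\command=recycler\update.exe
; Disguise the drive with a harmless-looking stock icon
icon=%SystemRoot%\system32\shell32.dll,4
```

Nothing in this file is privileged; it simply asks Explorer to run a program, which is why the default-on autorun setting was such a large attack surface.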
There is no doubt that Windows is securable; you can do it. But by default it is MUCH less secure than any Linux I know of, which in turn means that most installations will stay that way.
You have to break into a Linux system manually; a worm or virus does not get far, simply because the user action required to execute a program is much harder to obtain than on Windows. On Windows, an executable just needs the .exe (or one of several other) extensions; on Linux you have to make the file executable first, and if this is done as a user, the virus cannot spread across the whole filesystem but stays in the user’s area. It cannot modify a system file to keep itself from showing up in the filesystem.
The only actual information in that article is:
* Linux users less likely to run as root (hardly news…)
* Linux users are more technically savvy and less likely to fall for social engineering – hardly a genuine reason.
* Linux apps are delivered as source (because Linux is used so much less, so fewer commercial desktop apps are developed for it)
* There are a variety of different distributions and architectures – making it harder to write viruses, and coincidentally harder to develop commercial software targeting Linux
I’m sure there are much *better* technical reasons why the Linux OS is more secure, and I would like to read about them. Unfortunately they aren’t in this article.
It sounds like you’re just trying to flop the article’s meaning around and take its points from the exact opposite point of view that they were meant to be taken. Yay… what fun.
That said, the article wasn’t that great compared to some I’ve read on the topic, I admit – but seriously, quit trying to take it out of context. The article was about security – NOT your own problems with a lack of commercial software. Keep using Windows if you need that software, who cares, but the entire point of the article was clearly security. I don’t know where you pulled a commercial-software argument from, but I would guess you’re sitting on it right now.
Anyway, here’s an article I like on the subject. Much more in-depth and interesting.
http://www.theregister.co.uk/security/security_report_windows_vs_li…
The article pointed to by that article is heavily outdated, and some of its information was approximate to begin with.
Plus, the statistics are heavily outdated, especially considering that in 2004 Windows 2003 was only about a year old.
That’s not a very good (or up-to-date) source of information.
If the article’s points are that easy to reverse, then maybe that’s a good indication that the original article was nothing more than an exercise in presenting widely-known facts with a particular spin?
What warranted that assumption? The OP makes no statement indicating that he considers the situation to be *his* problem, or even *a* problem in general.
“It sounds like you’re just trying to flop the article’s meaning around and take its points from the exact opposite point of view that they were meant to be taken. Yay… what fun.”
Just pointing out that some of the things that [this article claims…] make Linux more secure also make it less useful on the desktop – which is where viruses are most likely to be found (because most require some user interaction). A less useful OS suffers from fewer viruses – hold the front page…
“The article was about security–NOT your own problems with lack of commercial software.”
Not a problem for me – not sure where you pulled that argument…
“Keep using Windows if you need that software”
I use Mac OS X at home and Windows XP at work – if it’s any of your business. 🙂
“but the entire point of the article was clearly security.”
And as I said, that is not a point I thought it made very well. The arguments it used can arguably be called *problems* with the platform, not security advantages. I wish the article *had* been about security.
Like all Vista users by default
Yer, and it’s taken them umpteen years to get to that point. Even then, like XP before it, there is still some software, and some tasks, that you have to run as an administrator because of that legacy.
I still believe that when someone asks whether Windows or Linux was more secure that they are essentially asking the wrong question. With enough knowledge about how a computer works you can configure both Windows and Linux to match your security needs. Without such knowledge, you are doomed.
This is a valid point, but it doesn’t change the fact that
1) Given typical Linux and Windows installs and usage patterns, Windows is way more vulnerable
2) More malicious software exist for Windows than Linux in the first place, at least by a factor of 10000:1
You forgot this very important one:
3) Windows decides whether a file is executable based on its file extension, so a file is immediately executable. Unix files have to be made executable first. UAC in Vista may make the user think about it, but it’s still a big, big design flaw to have a file be executable based on its extension.
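The distinction in point 3 is easy to demonstrate. A small Python sketch (Unix-only, since it exercises the Unix permission model): a freshly created file cannot be executed until someone explicitly sets the execute bit, whereas on Windows the .exe extension alone would be enough.

```python
import os
import stat
import tempfile

# Create a small shell script; like any freshly written file on Unix,
# it starts life without the execute bit set.
fd, path = tempfile.mkstemp(suffix=".sh")
with os.fdopen(fd, "w") as f:
    f.write("#!/bin/sh\necho ran\n")

# Not executable yet: script content (or a suggestive extension)
# does not make a Unix file runnable on its own.
before = os.access(path, os.X_OK)

# Only an explicit chmod makes it executable for the owner.
os.chmod(path, os.stat(path).st_mode | stat.S_IXUSR)
after = os.access(path, os.X_OK)

print(before, after)  # False True
os.remove(path)
```

This is the extra user action the poster refers to: a mailed or downloaded file cannot run until someone deliberately marks it executable.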
If you’re talking about Windows XP, then I’d agree with you. But not with Vista. User accounts don’t run as ‘admin’ by default under Vista, and privileged operations require explicit user approval (UAC).
Again, it depends on whether you’re talking about Windows XP or Vista. Vista has effectively shut down the attack vectors which targeted XP, and malware authors have been forced into moving up the food chain to target applications such as Adobe Acrobat, Google Desktop, etc. This is actually a good thing. It means that Windows OS security has gotten demonstrably better.
1) Given typical Linux and Windows installs and usage patterns, Windows is way more vulnerable
So it’s a configuration issue, not a technical issue, and *apart from* the default configuration, Windows is as secure as Linux?
If not, what are the *technical* reasons Linux is more secure.
2) More malicious software exist for Windows than Linux in the first place, at least by a factor of 10000:1
Which probably mirrors the usage patterns of the two operating systems on the desktop. Linux is more secure because on the desktop it is so insignificant that no-one writes malware targeting it? Again not a technical reason, more of a failing…
Run Windows as a normal user and it becomes much more secure. That’s the real difference between Windows and Linux security, especially with Vista and UAC. It stops most malware dead in its tracks.
The problem is that applications are designed to be run by a user with ‘administrator’ rights in Windows.
The biggest problem is that Microsoft is going to have to work with vendors to write applications whose functions all work for a regular user.
I have been using Red Hat since the 6.0 Professional version they had back in 1999 or 2000. However, it has taken me many years to really get a good understanding of how a Linux distro actually works. I am still learning new skills on a weekly basis.
The design of a Linux-based distro is more secure in that you do not run as root. You can modify the sudoers file to grant ‘sudo’ access while still requiring a password.
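The sudoers setup mentioned here might look like the following fragment (the group name is illustrative; always edit /etc/sudoers with visudo):

```
# Allow members of the 'wheel' group to run any command as any user,
# but require them to enter their own password first
%wheel ALL=(ALL) ALL

# The NOPASSWD variant below would skip the password prompt; shown
# commented out because it weakens the protection discussed above
# %wheel ALL=(ALL) NOPASSWD: ALL
```

The point is that elevation is opt-in per command and gated by a password, rather than the whole session running with administrative rights.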
I do not think Windows will be able to overcome the problem of applications requiring administrator access until they force application developers to code them correctly.
“The design of a Linux-based distro is more secure in that you do not run as root. You can modify the sudoers file to grant ‘sudo’ access while still requiring a password.
I do not think Windows will be able to overcome the problem of applications requiring administrator access until they force application developers to code them correctly.”
It’s not the design of Windows that is at fault, it is the defaults. They should have been changed a long time ago, and UAC is the first step. It’s not going to happen overnight, because MS unfortunately values backward compatibility too much.
I’ve been running Windows as a normal user since NT, and while it may be tricky, sometimes a real PITA, there hasn’t been much I haven’t been able to get working.
Wow. Every time I seriously try to lock down XP, I give up. It’s a losing battle. I install XP and create an admin account myself, as it requires. I go through the install, switch the login window to the classic one so I can also select Administrator, and try to change the (admin-level) user account it forced me to create into a Limited User.
“Sorry, you must have at least one other Administrator account to change this one to a Limited User” [paraphrased]
What? Then what the hell is the administrator account aptly named “Administrator” there for? Looks? Whatever. So I created another admin-level account, named “Admin,” and was finally able to change the account created during install to a limited user. After finally making it this far, I found out that I was able to send system files I shouldn’t even be able to touch to the Recycle Bin, but when I wanted to undo that or restore them: access denied – log in as an admin to do that. WTF?
I won’t bother going in-depth on all the problems I had running programs as a limited user, but I saw such ridiculous things as Winamp being unable to “uninstall” plug-ins. Why? They’re just .dlls located in C:\Program Files\Winamp… off-limits. If there were a “home” directory concept in Windows, each user could add and remove their own plug-ins, but no. I understand why this is, but it all boils down to single-user design decisions which should be stuck in the past, and to each program storing all of its files in its own directory… yet… they’re still dragging Windows down.
It was after this XP test install that I decided to finally re-partition my hard drive and re-install my Linux-distro-of-choice on it by itself (previously set as the default of a dual-boot setup). Needless to say, after install, I was running everything I wanted as a normal user, with root locked away for system changes, with no stupid WTF moments like XP’s you-can-delete-but-not-restore crap.
Edited 2008-07-20 05:01 UTC
Hmm, I think that “downgrading” your own account from administrator to limited is not the correct way to go, and I actually agree that you shouldn’t be able to do it.
The simple answer to your predicament is to log on as Administrator and change your account from administrator to limited; then you will have only Administrator and your (limited) account to log on with. I’ve done that tens of times.
“Sorry, you must have at least one other Administrator account to change this one to a Limited User” [paraphrased]”
Of course it says that; you need ONE administrator account. Just create your own user as a normal user, not an administrator – do not downgrade it. Do all your installs as the “real” Administrator, then use your “normal” account for day-to-day stuff. Runas is there if you need it.
The point is, there already *is* an Administrator account of the same name, and has been for a long time now. I remember quite clearly (a million times over) being asked by the installer what I wanted to make my Administrator password. Why should I litter my system with redundant “administrators” just because the installer sets one up that apparently the “users” control panel applet doesn’t seem to even know exists? Why is Administrator “hidden” by default to begin with, when in fact the first user you create is an admin itself? “Security” reasons? LOL. I’m sorry, but none of this makes any sense, and quite literally–it’s inexcusable, downright *bad* design. It seems like a joke.
Edited 2008-07-20 20:22 UTC
It’s not bad design, it’s bad defaults. Yes, Windows creates the first user as an admin, but that doesn’t mean you have to use that admin account, or keep it as an administrator.
The “real Administrator” doesn’t show up in the Users control panel applet, but it does show up in the Computer Management MMC applet. The reason it doesn’t show up in the control panel is to leave a clueless user a way out if they delete or disable their own account. It makes sense: more advanced users know where to find the Administrator account, and can even make it show up in the control panel if they want (it’s just a registry setting).
You don’t have to litter your computer with administrator accounts; you just need one. Sure, the XP defaults suck, it’s true, but you can make it work with a normal user account: just delete the account you made when you installed Windows and create a normal user account instead.
Windows forces you to have at least one administrator account, which is pretty sane, since someone has to administer the box. You can, though, disable the admin account via the local security policy or an MMC snap-in.
There is a home-directory concept for per-user data; it is called AppData. You should have sent the Winamp guys a nasty email explaining the insanity of not developing software for a least-privileged environment, and asking them to please use what has been considered best practice for almost a decade now.
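As a sketch of the per-user data idea (the function name and directory layout here are illustrative, not any real Winamp or Windows API): a well-behaved application resolves a writable directory inside the current user’s profile instead of writing to its install directory under Program Files.

```python
import os
import sys

def per_user_data_dir(app_name):
    """Return a per-user writable directory for an application's data.

    On Windows this would live under %APPDATA%; on Unix-like systems,
    a dot-directory under $HOME. A sketch only; real applications
    typically use a platform library for this.
    """
    if sys.platform == "win32":
        base = os.environ.get("APPDATA", os.path.expanduser("~"))
        return os.path.join(base, app_name)
    return os.path.join(os.path.expanduser("~"), "." + app_name.lower())

# A plug-in dropped here is owned by, and removable by, the current
# user, with no administrator rights needed.
print(per_user_data_dir("Winamp"))
```

Had Winamp stored per-user plug-ins this way, the limited-user uninstall problem described above would not arise.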
The way to work around badly written software is to grant your user write access to the folder you need to write to. Yeah, it sucks and is messy, but it is way better than the alternative. I can’t even imagine running Windows as an admin all the time.
Edited 2008-07-20 22:51 UTC
How about right-clicking and selecting Run As (not a great workaround)?
I agree it’s annoying that Windows 5.x forces you to create at least one other administrator-level account in addition to the hidden Administrator account, which IMO should stay hidden!
As for plug-ins, that’s a problem created by the developers of Winamp: building a system-wide program and allowing plug-in writers to wrap their binaries in installable executables. The same goes for a lot of games.
Windows has a home directory – c:\documents and settings…
The problems that you describe are down to crappy programming from 3rd party software vendors. Nothing more, and nothing less.
Dave
Yeah – and if you are not from an English-speaking country you have two home directories:
“C:\documents and settings” and “C:\Dokumente und Einstellungen”
Consistency is a NIH (Not Invented Here) problem for Microsoft.
On my Linux install (German, subregion Austria) the home directory is still called /home/myusername/ despite being German.
Some things are not meant to be translated, and a filesystem layout for the essential parts of the operating system is definitely one of those things.
“If there were a “home” directory concept in Windows”
There is. If Winamp doesn’t use it then that is its failing not the OS’s.
Unfortunately, that isn’t helped by the fact that even Microsoft’s own software isn’t written so as to run smoothly under a limited user account. Run Office 2003 on Windows Vista and you’ll see what I mean.
At the end of the day, software companies take their lead from the operating system vendor; if the operating system vendor isn’t interested in making its own software use the new APIs, or in updating its software to the new security model, why should other vendors jump through all the hoops?
It reminds me very much of the complaints that no big names are using the new Windows APIs like WPF and WCF. When Microsoft’s own operating system ships with bundled applications which don’t use WPF/WCF (which CAN be called from native code – they DON’T need to rewrite anything in managed code), how can they expect third parties to make that investment when they’re not willing to do it themselves?
Please don’t encourage other vendors to take their lead from the OS vendor in this instance, particularly in respect of security.
This particular OS vendor has a back-door into the OS such that it can be changed (“updated” is the euphemism they use) regardless of the settings or wishes of the owners of the machine on which it is running.
This OS vendor also makes an add-on, afterthought scanner product in the hope of detecting breaches after they have already got in, but the scanner provided by the vendor is among the worst products available.
As I said, when Microsoft can’t be bothered to get Office 2003 running flawlessly on Windows Vista by releasing an update, why should any other vendor bother? Why spend the extra money when the operating system vendor, which is also the largest office suite vendor, can’t be bothered to put in the extra investment?
The way the operating system vendor conducts itself demonstrates how much confidence (or the lack thereof) they have in their own operating system. If they don’t use the features of their new operating system, it tells the software ecosystem, “we have no confidence in our new operating system”.
Yes, security is important, but like I said, when Microsoft can’t even get its own software to use the security features of the operating system, what does that tell the rest of the marketplace?
True, but it is the old story of appearing to do something rather than actually doing something – and when the excrement hits the fan, the blame game is of greater interest than addressing the shortcomings of their products.
Edited 2008-07-20 08:51 UTC
“Run Office 2003 on Windows Vista and you’ll see what I mean.”
Ummm… I know several people doing that without any problems. Care to give some details on what you’re referring to?
Try the fact that every time I load it up, it keeps asking whether I accept the EULA; here is the original post I made on the Windows Vista newsgroup:
http://www.microsoft.com/communities/newsgroups/en-us/default.aspx?…
If someone were to describe such a thing but for Linux not Windows, you would be amongst the first to jump all over such an observation with a claim that Linux wasn’t usable by average users.
Is there a quick way to switch user under Windows in a window, such as sudo or su?
Is there a quick way to switch user under Windows in a window, such as sudo or su?
I think runas is what you are looking for
Edited 2008-07-20 12:39 UTC
Backwards binary compatibility to the death is what I do not understand about Microsoft.
With virtualisation technology everywhere, it would be easy to make a completely new, mean and lean operating system without much backwards compatibility, and run the old XP inside a virtual machine. If the new system is asked to start an old application, it can do so automatically and transparently.
Then they would be free to put really good security in place.
I agree, to a point. The overall effect of that is a clean Windows install with a broken, virus-infected VM running inside it.
Good security practices have to be included from the start, including the hypothetical VM, otherwise, the problem is not fixed, just hidden.
Wrong problem. That was true of Windows XP, but not Vista. Users don’t run as ‘administrator’ by default, in Vista.
Unless you want to game online and the anti-cheat software forces you to run as admin, or you can’t game.
I’m sure people can come up with more scenarios.
Would be nice if third party people would better integrate their software with least privilege in mind.
Most users just click OK anyway
The same users will put in their passwords on Linux when a sudo prompt appears…
Edited 2008-07-21 14:47 UTC
You do realize that there’s a significant difference between clicking a button and providing a password, right?
There is absolutely NO difference for a user that doesn’t understand the implications of the choice, either way. Cut loose an uneducated user on Ubuntu, and you’re going to see them entering their password, without regard for the consequences. The only thing that will prevent this problem is education. It isn’t a UI problem.
Edited 2008-07-22 01:27 UTC
Oh well, I run all my mail through my ISP’s checkers, so that removes one vector.
I turn off all unneeded daemons or services and sit behind a firewall. So that removes another vector.
All my programs come from the Debian ftp servers, so that removes a third vector. The problem with Windows programs is that you often have to obtain them from all over the place and many places are malware-laden. Providing you stick to properly run repositories and steer clear of cowboy operations, Linux is way ahead in this regard.
That leaves drive-by malware via my browser, mainly. I keep it up to date and run it from behind privoxy.
A lot of this stuff is common sense, imho. But yes, the use of divisions of privilege by way of ordinary user being quite separate from root or admin is a really key thing, imho. The problem is, no one’s yet found a really painless way to do this. Running sudo can become such second nature, that I suspect a cleverly done social exploit that had the user typing “sudo …” could get quite far on Linux simply because so many people use sudo all the time without thinking much about it. TBH, some distros almost encourage this.
I’m wary of the “Linux is more secure” stuff. It depends on the user… and if it ever came to fending off a malware avalanche on Linux, it would also depend on that little word “yet”. We know how secure Linux is today, but the truth is none of us has much idea about tomorrow.
Saywhatnow? It reduces that vector, it does not remove it.
UNIX/Linux systems just rely on plain-text startup and configuration files. Even if a virus infects such a system, the modification times show which files have been changed, so we can dig the virus out.
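That idea is easy to sketch in a few lines of shell. The demo directory below is just for illustration; on a real system you would point find at your actual dotfiles:

```shell
# Create a small demo so the example is self-contained.
demo=$(mktemp -d)
touch -d '10 days ago' "$demo/.bashrc"   # an old, untouched config file
echo 'payload' >> "$demo/.profile"       # a freshly modified config file

# List hidden files changed within the last 24 hours.
find "$demo" -name '.*' -type f -mtime -1   # prints only the .profile path

rm -rf "$demo"
```

This is only a rough heuristic: malware that resets timestamps with touch would slip past it, which is why real integrity checkers compare checksums rather than dates.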
Sure, *nix OSes tend not to be attacked by viruses as much as their Windows counterparts, be it through better default configurations, lower numbers of installs, or outright malware-writer disinterest… that doesn’t mean they aren’t actively targeted by other threats. A misconfigured *nix box can become a spam-shovelling, DDoS-launching zombie just as fast, if not faster, than a Windows machine, and I’d dare to venture an inexperienced user would have an even harder time noticing it had happened before it was too late.
Is there some piece of malware that relies upon the fact that, say, 95% of Ubuntu users still use ‘sudo’ OOTB?
Here is a great gaping security hole.
Personally, I think using ‘sudo’ without a password is plain crazy, and I actually go one step further on all my Linux boxes and disable it completely.
As distros like Ubuntu (and its other coats of many colours) grow in popularity, I think they will get the attention of the hackers and a new generation of threats will occur. The old adage of security through obscurity will no longer apply.
Clearly you are a little confused, and your post shows you have not used a Distro like Ubuntu.
Sudo always DEMANDS a password before it will allow a command to run, so I do not know where you got the idea it did not use one.
The old idea that it is secure because no-one is using Linux is also a load of balls; there are millions of internet servers running Linux. If I wanted to write a virus, I would write one that would take out the infrastructure of the internet, rather than hose up some basement-dwelling internet poker player’s/porn junkie’s PC.
The quote you gave, “The old adage of security through obscurity will no longer apply.” – I hope you are aware that the “security through obscurity” idea was put about by Microsoft. When people were looking for access to the Windows source code to try and make it as secure as Linux, Microsoft told them that because the source code is not out in the open, Joe Public could not search for vulnerabilities, so it was in essence security through obscurity.
Now, instead of spouting off crap, actually download and TRY a Linux distro. Until you do so, your opinions are not valid and your posts on Linux and Linux security are useless.
‘Dude’ I do use Kubuntu on a daily basis on several Servers. I use Xubuntu on my laptops. None have sudo enabled.
I have come upon many Ubuntu systems where the user demanded that it was ‘set up like Windows’ and the password requirement for sudo was removed.
I was also using an EEEPC earlier today for the first time. It also had no password requirement for using sudo. I don’t know if that was the default or not so I can’t comment on that.
If it is that easy to remove the requirement for a sudo password then I have to say that it is a security hole big enough to drive a Routemaster through.
I’m an ‘old school’ Linux user (Slackware since 1.1, Unix since 1984) who believes in passwords, and long ones at that, for all critical accounts.
But hey, FOSS is all about choice. You can run your system OOTB or (from my experience it is quite widespread) with sudo passwords disabled if you want to. All I’m saying is that it is all too easy to disable sudo passwords, and it could be a major security problem for targeted malware.
I don’t disagree with your take on this, but a small correction is in order. For every terminal session that you have active, you only have to give sudo your password once. Any sudo commands you run after that will not ask for your password again until you close your terminal session and open a new one.
Good point, but it’s actually time-based from your last use of sudo. You can decrease the time limit if worried.
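For anyone who wants to tighten it: the interval is sudo’s timestamp_timeout option, set in /etc/sudoers (always edit it with visudo, which checks the syntax before saving). A sketch:

```
# /etc/sudoers fragment (edit with visudo):
Defaults timestamp_timeout=1    # re-prompt after 1 minute of credential caching
# Defaults timestamp_timeout=0  # or prompt for every single sudo command
```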
Yeah, your Linux Boxes.
Do they say Starting Windows
when you turn them on ?
Your post shows complete ignorance of Linux and especially Sudo.
Sudo will ask for a password when you enter the first command. Then it will stay active, ONLY in that instance. If you want another Sudo instance, you need to type in the password.
Here is a great gaping security hole.
Personally, I think using ‘sudo’ without a password is plain crazy, and I actually go one step further on all my Linux boxes and disable it completely.
This is something I totally agree with. sudo without a password is essentially the same as running as root. Any virus/malware/hacker etc. can do anything they want on your *buntu installation as long as they can run sudo. It might be user-friendly… but it sure as hell ain’t secure.
When I was using Gentoo I configured sudo to require a password for everything except a few predefined commands, and I’m glad that Mandriva also requires a password when you try to use sudo.
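For reference, here is a /etc/sudoers sketch of that kind of setup; “alice” and the two commands are placeholders, and as always this should be edited with visudo. sudo applies the last matching entry, so everything except the listed commands still demands a password:

```
alice ALL=(ALL) ALL
alice ALL=(ALL) NOPASSWD: /sbin/shutdown, /usr/bin/eject
```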
Dude, Ubuntu’s sudo requires a password – the user’s password – before doing anything.
https://help.ubuntu.com/community/Sudoers
Have you ever used Ubuntu, as the posters before me would say?
# Uncomment to allow members of group sudo to not need a password
# %sudo ALL=NOPASSWD: ALL
As you can see, it’s commented out, so by default it DOES require a password.
It still requires a password, so it’s not gaping in my opinion…
Are you saying that 95% of Ubuntu users use sudo without a password???? What are you smoking? You’re showing your ignorance on this subject.
Edited 2008-07-21 13:26 UTC
I am really curious when people will get real and abandon the security misconceptions shared by the author of the article:
– first, the most serious misconception is that the “root” account is somehow more important for a desktop OS than the user account, and that a virus needs access to this root account. That is total nonsense. Reinstalling the OS on a desktop is simple; recovering deleted user data is usually impossible. And a virus does not need root to spread: all it needs is some form of internet connection. As long as the user can display pages and send emails, the virus can spread.
– second, the idea that malware cannot hide in source form is flawed as well. All it needs to do is put its scripts somewhere in ~/.gtk/desktop/myapps. Moreover, these scripts are platform independent: they will run on any Unix and any CPU, and they can be written in any of the dozen languages a Linux distro usually supports. Mutating the source to make it hard for antivirus software to detect might be even easier than mutating a binary.
I think that the only reason why malware is not so widespread on Linux is really that malware writers still do not care. If Linux ever gets more than 10% of the market share, it will get viruses too.
I think the one with the wrong assumptions is you. Have you used a Unix system recently?
– first, the most serious misconception is that the “root” account is somehow more important for a desktop OS than the user account, and that a virus needs access to this root account. That is total nonsense. Reinstalling the OS on a desktop is simple; recovering deleted user data is usually impossible. And a virus does not need root to spread: all it needs is some form of internet connection. As long as the user can display pages and send emails, the virus can spread.
That is totally wrong. In a properly configured system, an infected program running with the user’s privileges will not be able to modify any binary outside the user’s home directory – in any case, none that resides in /bin, /usr/bin, /usr/local/bin or any of the such (sure, those in /tmp may end up screwed, but then again, /tmp is expendable). Hell, it’s hard enough to even infect a binary in the first place. Run everything as root and you’re screwed – it gets write access to just about everywhere.
– second, the idea that malware cannot hide in source form is flawed as well. All it needs to do is put its scripts somewhere in ~/.gtk/desktop/myapps. Moreover, these scripts are platform independent: they will run on any Unix and any CPU, and they can be written in any of the dozen languages a Linux distro usually supports. Mutating the source to make it hard for antivirus software to detect might be even easier than mutating a binary.
…and, run as regular users, they will be totally harmless to the system :-). All they can do is probably some nasty stuff to the user’s home directory, which is easily solved with a regular batch of backups.
I think that the only reason why malware is not so widespread on Linux is really that malware writers still do not care. If Linux ever gets more than 10% of the market share, it will get viruses too.
Oh please…
Edit: AFAIK, some programs that can circumvent permissions by exploiting various security weaknesses do exist – but they are quite complex, and quite possibly too complex to be accessible to your average script kiddie.
Edited 2008-07-20 12:59 UTC
Like in Windows Vista by default
Like in Windows Vista by default
Edited 2008-07-20 13:10 UTC
Yes, and Vista users think they are safe and secure, even though they blindly click OK on all the prompts UAC throws up. Or worse, they disable UAC completely.
Also no: as a user called Dave, I can download format.com from DOS 5, open a command prompt, and type this:
“format c: /u /autotest”
This will run and format the drive without any prompting.
1. You can’t bypass the UAC prompt (if a process is trying to elevate itself, or if it’s trying to copy, delete, or modify a file in a protected location, a UAC prompt will appear).
2. You can’t bypass Vista’s code integrity check: system files have mandatory code integrity, so you can’t replace them!
Instead, in Linux, a single command can replace the whole kernel due to the lack of code integrity checks!
Edited 2008-07-21 14:42 UTC
Code integrity checking, you say? How can Vista check code that is not included by default?
Like I said, download the DOS5 format command and run it from the command line. Vista will not check this, it will run it blindly and it will hose your machine up, UAC or not.
Also, people are disabling UAC, not bypassing it!
The DOS5 format command can’t modify the MBR. It’s protected.
Reference? C’mon, you just pulled that out of your crack.
Nope, I pulled that one from EVERY SINGLE forum on the internet that deals with Vista annoyances; UAC is the first thing people tell you to disable.
It is true that disabling UAC is the most-given Vista tip on the Internet… I think it is the wrong tip to give, as Vista has been usable with UAC activated since the RC days…
“…and, run as regular users, they will be totally harmless to the system :-). All they can do is probably some nasty stuff to the user’s home directory, which is easily solved with a regular batch of backups.”
Sorry, but this is: LOL!
Regular users do backups, right? (WRONG!) The average user is more afraid of “user land” viruses than of “root land” viruses. The deadliest virus could be sent via social engineering and look as harmless as this:
#!/bin/sh
rm -rf /home/`whoami`
You’d only have to fool the user into making it executable (which isn’t necessarily hard to do).
Edited 2008-07-20 14:18 UTC
Sure, and you could just tell the user to type rm -rf /home/`whoami` on the console himself, or better yet tell him to pick a hammer and smash his box to pieces.
The point of linux security is not protecting the user from his ignorance, but protecting the system and all the other users from whatever that user might do.
You have every right to delete your /home directory, so the system won’t stop you when doing so, no matter if you do it yourself or someone tricks you to run some malicious script.
Regular users do backups, right? (WRONG!) The average user is more afraid of “user land” viruses than of “root land” viruses. The deadliest virus could be sent via social engineering and look as harmless as this:
#!/bin/sh
rm -rf /home/`whoami`
You’d only have to fool the user into making it executable (which isn’t necessarily hard to do).
Perfectly true — yet this applies to any operating system. Unfortunately, users need not pass an examination to use a computer, like they do with cars.
Edit — I wanted to say this in a separate post, but got carried away.
I think the likes of us have a certain… affinity towards not-exactly-essential points. From an engineering perspective, the exact reason and technical merits of why a solution is safer than another aren’t that relevant in the short-term.
Quite frankly, given the average life cycle of computers in a production environment, I wouldn’t need too many days to think about switching from Windows to OS X or Linux. Regardless of why *X is more secure, the reality is simply that, right now and in the foreseeable future, there are fewer viruses and the such.
Really now, the fact that Windows implements a complex and tested system that’s still not effective doesn’t really make the malware any less harmful.
Edited 2008-07-20 19:47 UTC
You still do not see the misconception:
Malware does NOT NEED access to /bin, /usr/bin or any other “root only” directory. It does not need to infect binaries either. Access to the home directory is enough for malware to spread and to have full access to the most important files on the computer.
But that is exactly the misconception. Who cares about the system? What is important is exactly that “nasty stuff in the user’s home directory”.
And yes, backups always solve the problem, but note that home-dir-based malware will easily get into the backup too…
The point is that I (say “UserA”) don’t have to worry about whether “UserB” is a moron and fills his ~ with malicious scripts. Both my own ~ and all the system stuff will remain safe.
And anyway, if you had such a disgusting user in your system you could just not let him execute anything on his ~. Chances are he doesn’t need to do that anyway.
Of course the point you’re nicely ignoring is that the vast majority of *nix PCs have only one user and that user is also the system owner and admin.
No, I’m just saying that means are provided to protect both the system and other users from you. If you are the only user and feel like screwing your ~ then that’s up to you.
BTW I’d dare to say that the vast majority of *nix computers are actually servers.
True, but unlike Windows, Un!x systems aren’t fundamentally designed with this assumption in mind.
Huh? Windows wasn’t “fundamentally designed” with the assumption that only one user will be running on a given box. WTF did you get that from?
AFAIK, this is not quite possible with current nixes – there are scripts in ~ by default.
you can easily run with /home as noexec.
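That’s a one-line /etc/fstab change; the device name and filesystem type below are placeholders for whatever your system actually uses:

```
# Mount /home without execute permission (plus nosuid/nodev for good measure).
/dev/sda3  /home  ext4  defaults,nodev,nosuid,noexec  0  2
```

Note that noexec only blocks direct execution of files under /home; running `sh ~/script.sh` through an interpreter still works, so it raises the bar rather than closing the hole completely.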
I don’t buy this. I do weekly backups, but mainly as a safeguard from harddrive failure, not as a safeguard against malware, because malware can be so subtle as to alter files without your knowing it, so you’d never consider restoring the files from the backup.
Sure, if malware trashes your whole home directory (or, at least trashed it enough so you’d notice), then you’d restore the files from the backup, but what if the malware just altered a few files? (For example, even just changing one value in a spreadsheet used by a small business to calculate payroll could lead to havoc that might not be noticed for weeks.) You’d not know it so you wouldn’t bother to restore the files, and eventually you’d backup the altered files themselves, resulting in a backup that lacked integrity.
Very good point. I would have modded you up, but I have already posted here, so have a virtual +1
And they have to try and set the execute bit on that script before it can run, unlike Windows, which will run anything as long as it’s got the correct file extension.
So what? Who cares? Not Joe User, when the MP3 collection in his home directory gets wiped out.
No it isn’t.
Again, who gives a shit? The system can be restored from installation media in a short time. Your corrupted data can’t and that doesn’t even take into account the damage from your stolen data.
Please show me where I can get a backup solution easy enough for Joe Average that will effortlessly back up hundreds of GBs of data.
Plus they can read all your data files and who knows what interesting secrets you have in those?
In case you haven’t kept up to date, malware isn’t about getting respect for rooting boxes anymore; it’s big-time crime that is often after your personal data.
Please show me where I can get a backup solution easy enough for Joe Average that will effortlessly back up hundreds of GBs of data.
I don’t think the issue is about having an easy backup solution. I think the issue is rather that they have nowhere to back up all that stuff to. It is a serious hassle to back up ~100GB of stuff to f.ex. DVDs; not even I would be willing to do that, so even less a Joe User. Then again, some users just back up their files with some backup application to another directory or hard drive partition and assume it’s just as secure… It ain’t. I’ve several times had to explain to people that as long as a virus can write and delete stuff on their computer, those backups are just as much in danger as any other file.
So what? Who cares? Not Joe User, when the MP3 collection in his home directory gets wiped out.
Very much true. Just do note that malware nowadays doesn’t usually try to delete any of your files, they instead try to f.ex. mess up your web browser so that no matter what you do you will always be redirected to a certain website. Or they can just be sitting in the background collecting information about your habits, your username and password and such. But it’s harder to hide and even make such malware function if they don’t have access to system files.
True, which reinforces the point that today’s malware doesn’t care jack about the system files.
Sorry, can’t resist. Time Machine (from Apple, integrated into the latest OS release) is the most user-friendly frontend to rsync(1) I’ve seen yet for the end user. Anyone on a corporate network should have expensive geniuses configuring seamless backups of their data.
Okay, back to your regularly scheduled theological discourse.
first, the most serious misconception is that the “root” account is somehow more important for a desktop OS than the user account, and that a virus needs access to this root account. That is total nonsense. Reinstalling the OS on a desktop is simple; recovering deleted user data is usually impossible. And a virus does not need root to spread: all it needs is some form of internet connection. As long as the user can display pages and send emails, the virus can spread.
I hear this argument all the time and it makes no sense. First of all, reinstalling an OS is no simple task for the ordinary user (never mind the plethora of third-party apps most likely installed). Second, data loss has little to do with viruses: most viruses don’t delete data, and a simple hard drive failure is much more likely. The point of having a separate root account is to eliminate the propagation of viruses and to keep system-level processes secure. User-level access doesn’t allow you to take over an entire system and turn it into a spambot.
second, the idea that malware cannot hide in source form is flawed as well. All it needs to do is put its scripts somewhere in ~/.gtk/desktop/myapps. Moreover, these scripts are platform independent: they will run on any Unix and any CPU, and they can be written in any of the dozen languages a Linux distro usually supports. Mutating the source to make it hard for antivirus software to detect might be even easier than mutating a binary.
This is unlikely on most distributions because most software is installed from a central repository that uses some kind of hashing to ensure package integrity. An average user doesn’t install from source. Source-level malware would actually be a lot easier to detect. Just use grep.
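“Just use grep” is glib, but the idea can be sketched. The pattern below is purely illustrative, and real detection is far harder than this (malware authors obfuscate, and harmless scripts can look suspicious):

```shell
# Plant a "malicious" and a benign script in a demo directory, then
# scan for an obviously destructive pattern as fixed text (-F).
demo=$(mktemp -d)
printf '#!/bin/sh\nrm -rf "$HOME"\n' > "$demo/evil.sh"
printf '#!/bin/sh\necho hello\n'     > "$demo/ok.sh"

grep -rlF 'rm -rf "$HOME"' "$demo"   # prints only the evil.sh path

rm -rf "$demo"
```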
Come on. In recent Linux distros it is a very simple task. My last Ubuntu install took about 20 minutes and zero intervention on my side.
How does separate root account eliminate the propagation of malware to another machine?
As long as user is able to use internet, malware can spread.
The problem you do not see is that you do not need to take over the entire system to turn it into a spambot. All you need is the ability to send emails, which is something usually allowed at the user level.
An average user can get a “security update by email from his distro vendor” and install it. This is how malware really works these days.
What are you going to grep?
BTW, I am speaking from my experience. The only computer I ever had infected was my Fedora-based home server. And it worked just like this…
Edited 2008-07-20 15:11 UTC
Come on. In recent Linux distros it is a very simple task. My last Ubuntu install took about 20 minutes and zero intervention on my side.
Personally I think it is pretty easy to set up a Linux install, certainly easier than Windows but the average user thinks otherwise and won’t even touch a Windows install.
How does separate root account eliminate the propagation of malware to another machine?
As long as user is able to use internet, malware can spread.
True, but then you aren’t really talking about viruses anymore. Without root privileges they can’t infect binaries and take over the system. A separate root account alone doesn’t eliminate all malware vectors, but saying it doesn’t help stop the spread of viruses is naive.
The problem you do not see is that you do not need to take over the entire system to turn it into a spambot. All you need is the ability to send emails, which is something usually allowed at the user level.
Tell me how you can change smtpd settings without being root? Tell me why you would have an smtpd server with ports open to the outside running on your desktop in the first place. Like I said before, privilege separation is one piece of security, but it doesn’t solve everything. Opening up your machine to the outside with unnecessary services is your own fault and cannot be mitigated with simple privilege separation.
An average user can get a “security update by email from his distro vendor” and install it. This is how malware really works these days.
Again, I admit this can happen, but there is nothing that can really mitigate this kind of attack – although, to really have any kind of effect, you would have to already have a spambot to send these emails. It’s a pretty tricky attack on Linux in the first place, though, considering all updates go through a central repository and packages are in different formats for different distributions.
What are you going to grep?
Antivirus software uses signatures to detect viruses. Virus writers use all kinds of techniques to circumvent detection. It’s a lot harder to circumvent detection when the exploit is in plain text format.
I don’t think it is difficult to see why Windows is so easily owned compared to other operating systems. The necessity of running as administrator because of backwards compatibility makes it low-hanging fruit for crackers. Just visiting a web page with a Windows XP machine can lead to the entire operating system being taken over, because a single flaw in the client is turned into root access without any privilege escalation necessary. Tricking users into installing software is one thing, but automated root access is something only Windows gives up so easily. A lot more user interaction is required on Linux to install and propagate malware. I will say that Vista has gotten a lot better in this regard, but the fact that Vista is a dud and many people and organizations are avoiding it altogether isn’t making Windows’ problems a thing of the past any time soon.
Opening up your machine to the outside with unnecessary services is your own fault and cannot be mitigated with simple privilege separation.
Does every user have to know?
Well, be it Linux or Windows the weakest link is the user. The user will happily click Allow in UAC or enter his password in Linux just to get his favorite smileys or his dancing desktop showgirl.
As the original poster said, not running as root doesn’t make that much of a difference if you are the only user of that machine. And this happens to be the case most of the time when talking about desktop machines. True, it would save you from reinstalling, which in Linux is a breeze anyway (20-30 min tops), but that’s pretty much it.
Your data is compromised; the malware can do whatever you, as a normal user, can do, and that’s quite a lot: internet access, access to network shares, access to your address book, sending emails via your favorite mail client, and access to all your important files (YOUR files, not the system files).
Infecting executable files is not the way modern malware prefers to work. The article speaks of a problem you might have had 5-10 years ago.
I’m not saying running as a non-root user is a problem under Linux – it’s actually painless, and a welcome approach – but running as a regular user under Windows, including Vista, is shooting yourself in the foot. And I don’t think it’s the biggest attack vector anyway; the clueless user is. Linux doesn’t have enough of those to make it a feasible target.
I don’t understand your [flawed] logic… Firstly, those “source files” need root access to hit the various parts of the system that are outside normal user access rights. Sure, your ~ will get bollocksed, but the rest of the system will be OK, unless some sort of privilege escalation attack takes place.
With Windows, you *have* to run it as root basically to get things even remotely working. And that means the normal user has absolute access to the rest of the system, including system files. Herein lies the problem.
Most Windows based software application developers should be shot for their pi$$ poor efforts. UAC is a small step in the right direction for Microsoft.
Dave
Here are three flaws that are made by both sides:
1) Linux has a smaller marketshare, therefore, it is less likely to be a target for malware writers. Malware writers want the best bang for the buck – they’re not going to worry about targeting people like me (MacOS X user) or any other *NIX user. It just isn’t worth their while. Don’t confuse security holes and malware. The focus should be on the security hole itself, not the results of that security hole (aka exploits being written).
2) People confuse malware with the security holes it is written for. Windows advocates on this very forum try to shift the blame, as if Windows were 100% secure and, if it weren’t for those ‘nasty malware writers’, would stay that way. As if, somehow, malware gets onto the machine, but it isn’t the result of a security hole.
As a result, the focus is deliberately moved onto the malware instead of the security hole itself which allowed the malware to get onto the machine.
3) If Microsoft had a fast turnaround because they were dealing with clean and efficient code (rather than the spaghetti mess they have today), then malware writers wouldn’t have the window of opportunity to launch a malware attack.
If their operating system had proper separation between the system, user and services, then we wouldn’t see vulnerabilities within one service resulting in a roll on effect to the rest of the system.
Edited 2008-07-20 06:50 UTC
Aside from the obvious issue that 99% of Linux users run as a normal user, while 99% of Windows users run as admin (and it’s a pain in the ass not to)…
There is the basic design of the OS to contend with too…
Linux has a package manager, and the package manager can detect if any files it installed have been modified, sure if you have root you can modify the database too but you would need your malware to support every package manager.
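What that verification amounts to can be sketched with a plain checksum manifest; tools like `debsums`, `dpkg --verify` and `rpm -Va` do essentially this against the real package database:

```shell
demo=$(mktemp -d)
echo 'original contents' > "$demo/binary"

# "Install time": record a checksum, as the package database does.
md5sum "$demo/binary" > "$demo/manifest"

# "Infection": malware modifies the installed file.
echo 'tampered contents' > "$demo/binary"

# "Audit": the stored checksum no longer matches, so md5sum reports
# the file as FAILED and exits non-zero.
md5sum --check "$demo/manifest" || echo 'modification detected'

rm -rf "$demo"
```

As the parent notes, a root-level compromise can rewrite the manifest too, which is why serious integrity checkers keep their databases offline or on read-only media.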
Linux is much simpler and far more modular; the places where malware and security issues can hide are well understood. Windows, on the other hand, is much larger and massively more complicated.
Linux has a logical filesystem layout with consistent permissions: binaries go in /bin, libraries go in /lib, etc. Windows dumps everything in system32, so the typical user has absolutely no idea what’s supposed to be there. A normal user has no need to write to the system dirs, and therefore can’t.
Linux is not a monoculture: there are many Linux distributions that use different methods to start programs at boot, store checksums of installed packages, etc. A piece of malware may run on any distro, but certain parts of it (like its ability to survive a reboot or cover its tracks) may not work – not to mention all the other OSes which are mostly source-level compatible with Linux (BSD, Solaris, etc.).
Linux has mount options – you can mount certain areas without execute privileges, and it makes sense to use this feature on /home and any temporary areas… Malware downloaded to such areas would not even run.
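For example, an /etc/fstab along these lines (the device name and tmpfs size are assumptions for illustration) mounts the user-writable areas with noexec, so a binary dropped into a home directory or /tmp cannot be launched directly. Note that noexec on /home can break legitimate user-installed programs, so it is a trade-off:

```
# /etc/fstab (illustrative entries)
/dev/sda3  /home  ext4   defaults,nosuid,nodev,noexec   0  2
tmpfs      /tmp   tmpfs  rw,nosuid,nodev,noexec,size=1G 0  0
```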
Windows was conceived for physically secure standalone single user desktops without network access. Unix-like systems (including linux) were designed for multiple users with network access via terminals. The terminals were assumed to be physically accessible to unauthorised persons. The Windows paradigm is totally broken for a modern networked environment.
Although I think you summarized some very valid points, I may add another reason that is simple, too.
It’s the users. Yes, I think it is that simple. If you compare typical statements regarding questions about security in general, viruses, malware, their effects on one user himself and on others, you’ll see an interesting difference in the users who use “Windows” and those who don’t.
I’ll explain this in a very generalized way.
First, the “Windows” users: they don’t care about others. They claim to have no virus although they haven’t checked, and finally, due to the closed nature of their “Windows”, they cannot be completely sure. There are firewall applications that consist of spyware; there are trojans that install themselves via a fake virus scanner. So it’s completely possible that such users run a spam-spreading machine, including illegal file sharing services. And it does not matter to them. Even if they feel something is wrong, they just reinstall their system (including their malware) and believe everything is fine. If I may say so: I think “Windows” security is about believing, not about knowing.
On the other hand, there are those who use Linux, UNIX, and Mac OS X. Most of them are interested in what their machine is running. They follow the approach: “If I need it, I will install or enable it.” Those who want to know exactly what’s going on “under the hood” install packet monitors and system diagnostic tools. They can even examine the source code of their OS and their applications and find out where a possible problem might originate. Those users usually care about others; for example, they pay attention not to run an authentication-less mail server or let everyone on the Internet access their system without permission. They tend to read what’s on the screen instead of just clicking through the buttons.
Of course, there are “Windows” admins out there trying to fix, day by day, the security problems that their users cause or that come from general problems with the software they run. They could be considered to belong to the group mentioned before because they show responsibility, but they usually don’t have such good tools and means to achieve their goals.
Regarding Linux (and UNIX) gaining more usage share among newbies and average users – NB that I’m not talking about oh joy oh market share here! – I hope there won’t be many changes to this situation. Because if you treat Linux in an irresponsible way, removing every barrier that is well intended just to increase a supposed feeling of comfort, you will end up with a messed-up Linux that can be compromised in no time.
Finally, ask yourself a question: why is more than 90% of the email transferred in the world today nothing but spam? Refer to the first category of users I mentioned above.
So, why care? 🙂
Social engineering, phishing scams, XSS attacks and zero-day browser vulnerabilities are not going away, and they thrive on the weakest link: the user.
True. However, that is also a very poor security assumption. Many security issues in a corporate environment arise because it is the internal users who try to circumvent corporate policy. Local users cannot be trusted either (and they are not trusted on Un!x). That’s why host-level administration operations should always require a password. User-level account configuration/preferences should not be tied into this. See the Windows design-flaw now?
I know nothing about Vista because the last time I saw windows was in 2004 and it was XP.
So I can only compare XP to Linux.
With Linux, you are fully productive if you run as a limited user. With XP, you are very handicapped if you run as a limited user. At least back in 2004, a lot of applications for Windows still weren’t ‘multi-user’ and thus required administrator privileges to function properly.
Even if more people ran Linux than windows, Linux users will still install only applications from trusted sources such as their distributions. Windows users will always download and install anything they can get their hands on from the Internet whether it is something they need or not.
How is the situation now with Vista? Any good news? Is it better?
One of the issues with virus infections was that I used to get a lot of infected documents from friends and co-workers, but McAfee used to detect those. Now with Linux, I still scan incoming document files with clamav.
It’s all about ABI and usage style for viruses. Trojans are a bit of a different beast but even those are less “able” on Linux.
For a virus to work properly on an OS it has to depend on some weakness. This is usually some sort of buffer overrun combined with known memory layout or more elaborate stuff. The key thing here is OS stability. Permissions and usage style aside, Linux is so damn unstable when it comes to ABI that anything which depends on some sort of set memory layout will simply work only on 1% of the targets (different settings, different lib version, different compile-time switch, etc. etc.). The only “stable” part of Linux OS is the kernel and even that can change a lot by just different settings.
So a cross-distro virus is almost as hard as cross-platform virus (even for one architecture).
Linux also limits the possibility of virus and trojan propagation because of the package-based distribution. Sure, one could infect a repository, but official ones are very unlikely to get infected, and 90% of users don’t have unofficial third-party repositories. Sure, there’s a program here and there installed directly from a deb/rpm/whatever, but it’s rare.
So it’s not about the number of users as much as it’s about the number of 3rd party programs available.
IMHO a virus won’t survive on Linux for long, if anything then because of its ever-changing nature.
I am not a Windows expert, but occasionally I deploy my Java software on Windows machines. Here is what I noticed.
1. Some active network services cannot or should not be shut down.
2. In fact, there is no complete information on what each service does.
3. There are no domain sockets. There are “local” named pipes, but many applications use sockets.
4. The majority of files are accessible by anyone in every way. If I create a file, it is not private to me by default; I have to make an additional effort to keep other users from accessing it. This is true even on Windows Server. I wonder if it can be configured differently and, if it can, why nobody does it.
Admins have to keep Windows machines in total lockdown. Even access from the local network, for administration tasks, is not allowed. There is no such thing as SSH built in, although there are SSH servers for Windows. What about access lists?
I wonder whether it is a cultural problem or a deficiency in OS design. I’d like to read some comments on this.
The problem with people making the same argument as the author is that they assume that if Linux were to ever gain a significant desktop marketshare (say 30-40%), the Linux environment would be just as it is now. However, there are several things that would change:
1. Instead of having a user base that is mostly technically savvy, you’re going to have several million new users, many of whom will be absolutely clueless about computer security. When some of them get their new Linux box up and running, the first thing they’ll want to do is get on whatever the Linux equivalent of LimeWire is and start downloading everything they can get their hands on. Is Linux really up to the task of handling anything you’re willing to run on it? Though security on Windows is bad (especially 2k/XP), the problem wouldn’t be nearly as bad if the majority of people had even a passing knowledge of security.
2. The author states that binaries in Linux are not very common. However, as more and more users start to switch, this is going to change. The more users that Linux gets, the more people writing proprietary software are going to make the jump from writing apps for Windows to writing apps for Linux. And many of these apps will not be available as source.
3. As an extension of #2 above, the concept of the ‘package manager’ won’t mean as much once Linux starts to go mainstream. In order for that to happen, somebody is going to have to write a universal installer that works across all distros. (Linux pundits may pound their chests and say that I am wrong on this point, but Linux right now is humming along at about 1% marketshare, and this is one of the reasons why.) Either that, or when Linux does go mainstream, it’s likely that the majority will be using one distro (I imagine ‘AOL Linux’ or something similar), so malware writers would only have to target their apps for that one distro and make their wares available wherever, probably as a drive-by install due to browser vulnerabilities. All I’m saying here is that Linux will have to stabilize a bit, and won’t be such a ‘constantly changing environment’.
4. More users means more hardware vendors will be writing more proprietary drivers. And if Windows is any indication, many of them will be very badly written, which I’m sure will introduce quite a few more security holes.
5. The most important point of all: the threat these days is not so much from viruses, but from malware. And malware doesn’t exactly have to gain root access to propagate itself. For example, if an attacker wanted to use a Linux box as a spam zombie, basically all the program needs to be able to do is run in the background and send outbound emails on port 25. I’m assuming that most users running under a non-root account are able to do that. Otherwise, how would they ever send mail?
Am I saying that Windows is more secure than Linux? Certainly not. Such a statement would be laughable. However, to assume that if several million Windows users were to switch to Linux, they’d all be happily going along with no significant security issues to speak of would be ludicrous. Would it be as big of an issue as it is on Windows? Probably not, but it would still be an issue!
While I somewhat agree with your package manager remarks, your market share statement doesn’t hold. Market share figures are irrelevant to the real Linux adoption percentage. I’m not saying there’s a huge amount, but more than 1% of desktops have at least a multi-boot Linux on them.
Also, package managers won’t go away, because they are by definition superior to what Windows has (if you disagree, google “shared libraries”, “PIC” and a few other things which lead to it). The only problem is making ONE package manager universal (so everyone packages for it) OR an abstraction above them with good enough shared points (same names for libs, etc.).
Imagine if .deb were the only package format for all distributions and there weren’t any useless per-distribution name changes for basic packages and libs. Then everyone could simply pack whatever they want into a .deb (closed source too) without much fuss, and it’d still work fine.
And if we consider that the majority of servers are Linux (again, market share aside, let’s be honest) then one must ask.. how come there’s no virus for Linux? I mean what’s more lucrative for malware writers than to infect a server (especially say a repository! Imagine the power!)
Well, I’m not going to get into a big pissing contest about package managers vs the Windows way, because that is off-topic. However, I will say this: the article argues that viruses and malware will have a harder time on Linux because the variety of distros and configurations makes it harder for malware to spread. Well, if these differences make it hard for malware writers, they also make it hard for legitimate developers who might want to develop commercial applications for Linux and release them only in binary form. That’s going to slow down the adoption of Linux on the desktop. Once you make it easier for developers to write one app whose binary runs on any/all distros, so too have you opened the door for malware writers.
There is a BIG difference between server and desktop. By that I mean that Linux servers aren’t generally administrated by morons who will run anything that promises them nude pics of Megan Fox.
1. Instead of having a user base that is mostly technically savvy, you’re going to have several million new users, many of whom will be absolutely clueless about computer security. When some of them get their new Linux box up and running, the first thing they’ll want to do is get on whatever the Linux equivalent of LimeWire is and start downloading everything they can get their hands on. Is Linux really up to the task of handling anything you’re willing to run on it? Though security on Windows is bad (especially 2k/XP), the problem wouldn’t be nearly as bad if the majority of people had even a passing knowledge of security.
Package repositories mitigate this to some extent because Linux users don’t need to grab a lot of random stuff off the web.
2. The author states that binaries in Linux are not very common. However, as more and more users start to switch, this is going to change. The more users that Linux gets, the more people writing proprietary software are going to make the jump from writing apps for Windows to writing apps for Linux. And many of these apps will not be available as source.
Most users, especially novice users, are going to stick with what is in the package repository, except maybe for large, well-known applications. Who is going to go out of their way to get a no-name DVD ripper when there are several in the repository?
3. As an extension of #2 above, the concept of the ‘package manager’ won’t mean as much once Linux starts to go mainstream. In order for that to happen, somebody is going to have to write a universal installer that works across all distros. (Linux pundits may pound their chests and say that I am wrong on this point, but Linux right now is humming along at about 1% marketshare, and this is one of the reasons why.) Either that, or when Linux does go mainstream, it’s likely that the majority will be using one distro (I imagine ‘AOL Linux’ or something similar), so malware writers would only have to target their apps for that one distro and make their wares available wherever, probably as a drive-by install due to browser vulnerabilities. All I’m saying here is that Linux will have to stabilize a bit, and won’t be such a ‘constantly changing environment’.
That’s an interesting theory but it is merely your opinion and no facts support your position.
4. More users means more hardware vendors will be writing more proprietary drivers. And if Windows is any indication, many of them will be very badly written, which I’m sure will introduce quite a few more security holes.
Linus doesn’t look too kindly on proprietary drivers, and most distros don’t package them in any way. No novice user is going to go out of their way to install proprietary drivers when free drivers are available. If a free driver isn’t available, a novice user probably won’t even know about it. My bet, though, is that just like today there won’t be many proprietary drivers in the future.
5. The most important point of all: the threat these days is not so much from viruses, but from malware. And malware doesn’t exactly have to gain root access to propagate itself. For example, if an attacker wanted to use a Linux box as a spam zombie, basically all the program needs to be able to do is run in the background and send outbound emails on port 25. I’m assuming that most users running under a non-root account are able to do that. Otherwise, how would they ever send mail?
I’m not sure you really understand how email works, because you are missing a very important detail… the SMTP server. You need an SMTP server zombie to send spam, which means you have to root at least one system first. If you try using the ISP’s SMTP server you will be cut off pretty quickly.
Unfortunately for you, he is right. In order to send spam you only need to be able to listen on some port and make outbound connections, and any user can open connections to servers on port 25 and listen on ports above 1024. Being forced to use the ISP relay does help, but that has nothing to do with whether the malware has root access.
You cannot open a connection on port 25 as a normal user. Your only options for sending email are either an ISP SMTP server, a webmail SMTP server, or your own SMTP server, which requires root access. The first two will drop your spam in very short order. In order to send with your own SMTP server you have to have one installed and configured, which isn’t common on a desktop distribution. Contrary to what you said, ordinary users cannot bind to ports lower than 1024; you must be root to do that on Linux.
Sure you can. How do you think you talk to the ISP’s SMTP server that is on port 25? Magic?
True but of no relevance.
Actually, that’s exactly what I said. That users can’t bind to ports lower than 1024. It’s irrelevant though.
You, on the other hand, do not seem to understand the difference between connecting to a remote port and binding to a local port, or how email works.
Spam malware does not need to bind to port 25; it only needs to bind to *any* TCP port. Heck, it could even use UDP if the malware author felt like it. The important thing for the malware is that it has an inbound communications channel (TCP or UDP on any port) and can make outbound connections to TCP port 25.
An inbound communications channel is actually not even necessary, but having control over the malware gives the operator greater flexibility.
Allowed inbound connections (e.g. via firewall policy) aren’t needed either. All that is needed is something that phones home to some rogue server somewhere on the net (e.g. with netcat), and you have established a connection.
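The distinction these last few posts are arguing about can be shown in a few lines of Python: any user can bind a listening socket to a high or ephemeral port (and could likewise make an outbound connection to a remote port 25), but on Linux, binding below 1024 fails without root or the CAP_NET_BIND_SERVICE capability. A minimal sketch:

```python
# Demonstrate the privileged-ports rule: binding below 1024 is reserved
# for privileged processes, while high/ephemeral ports are open to anyone.
import socket

def can_bind(port):
    """Try to bind a TCP socket on the loopback interface; True on success."""
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    try:
        s.bind(("127.0.0.1", port))
        return True
    except OSError:  # PermissionError below 1024 without root; EADDRINUSE; etc.
        return False
    finally:
        s.close()

if __name__ == "__main__":
    print("ephemeral port:", can_bind(0))  # True for any user
    print("port 25:", can_bind(25))        # False for an unprivileged user
```

Note that outbound connections (socket.connect to a remote port 25) are never restricted by this rule, which is the point being made above.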
Other than that, you could use ICMP, TCP, or UDP to transfer your data.
Depends on if the ‘no-name dvd ripper’ is better than what’s in the repository. For example, here’s what I use:
http://www.aimersoft.com/dvd-ripper.html
This program can decode and rip DVDs in one pass. Oh, and while we’re on the subject of DVDs, take a look at this:
http://www.slysoft.com/en/anydvdhd.html
Any open source Linux apps on the repositories that have the ability to crack the DRM on Blu Ray discs and allow me to play them on non-HDCP devices?
If not, people are going to go looking for alternatives, and those alternatives are probably going to cost money. If they’re honest like I am, they’ll pay for the product, and get clean versions. If not, they’re going to go hitting the P2P sites looking for a warez copy.
Like I said in a previous post: the article argues that viruses and malware will have a harder time on Linux because the variety of distros and configurations makes it harder for malware to spread. Well, if these differences make it hard for malware writers, they also make it hard for legitimate developers who might want to develop commercial applications for Linux and release them only in binary form. That’s going to slow down the adoption of Linux on the desktop. Once you make it easier for developers to write one app whose binary runs on any/all distros, so too have you opened the door for malware writers.
So, if a few million more people were to switch to Linux, do you really assume that hardware vendors won’t start including Linux drivers on the CD that comes with the product? And if they do, don’t you suppose that users will start installing them like the quick-start guide tells them to?
Depends on if the ‘no-name dvd ripper’ is better than what’s in the repository. For example, here’s what I use:
I thought we were talking about ordinary users? Just take a look at how Media Player and IE became so popular: they were included with the OS. Normal users don’t go looking for third-party apps when they already have an equivalent installed, unless it is a known application, which negates the possibility of it being a virus.
Any open source Linux apps on the repositories that have the ability to crack the DRM on Blu Ray discs and allow me to play them on non-HDCP devices?
If not, people are going to go looking for alternatives, and those alternatives are probably going to cost money. If they’re honest like I am, they’ll pay for the product, and get clean versions. If not, they’re going to go hitting the P2P sites looking for a warez copy.
That’s not something a normal user does. You must remember that you and everyone else who has an account on this site does not qualify as a normal user. With that in mind, I don’t see this as an issue. There is very little outside the repository that a normal user is going to need.
Once you make it easier for developers to write one app whose binary runs on any/all distros, so too have you opened the door for malware writers.
I guess I have to say it again. That is an assumption that has no basis in fact. It is your opinion that a universal format is needed and will arise.
So, if a few million more people were to switch to Linux, do you really assume that hardware vendors won’t start including Linux drivers on the CD that comes with the product? And if they do, don’t you suppose that users will start installing them like the quick-start guide tells them to?
Linux has made it this far with very few binary drivers and excellent hardware support. There is nothing that makes me think this is going to change. Users won’t need to install drivers off of the CD if their hardware works as soon as they plug it in because the drivers are included in the kernel.
My biggest grievance with your arguments is that they are based on assumptions that so far have not been true.
Let’s assume for the sake of argument that you’re right. But you still have to account for the millions who are on P2P right now (about 20% of people in Europe according to a recent survey) downloading movies, music, and porn. So let’s say there’s a popular video player on Linux that has a security hole in it… somebody uploads a video and says it’s a video of Megan Fox having sex. But the video has malicious code in it, and so when people download and play it, BOOM… instant malware. Again, this doesn’t happen currently because very few people using Linux would ever fall for such a thing.
So you’re saying that if I go out tonight and buy an iPhone, a Zune, or any number of electronic devices I can buy in a computer store, I can hook it up to any computer running Linux and it’s going to work right out of the box???
What I’m basically doing is making predictions about what would happen if Linux gained a much bigger marketshare than it currently has, and how that might negatively impact security. It’s just like living in a city… if you start off with a small population, and then it grows and hundreds of thousands of new people move there, crime IS going to go up. It’s unavoidable.
Let’s assume for the sake of argument that you’re right. But you still have to account for the millions who are on P2P right now (about 20% of people in Europe according to a recent survey) downloading movies, music, and porn. So let’s say there’s a popular video player on Linux that has a security hole in it… somebody uploads a video and says it’s a video of Megan Fox having sex. But the video has malicious code in it, and so when people download and play it, BOOM… instant malware. Again, this doesn’t happen currently because very few people using Linux would ever fall for such a thing.
That won’t work. Files are not executable by default. Either the file won’t play and will be deleted, or it is some kind of buffer overflow that won’t be able to do much damage because the video is playing as a user, not root. That is the point of this discussion, isn’t it?
So you’re saying that if I go out tonight and buy an iPhone, a Zune, or any number of electronic devices I can buy in a computer store, I can hook it up to any computer running Linux and it’s going to work right out of the box???
A lot of it will. Some of it won’t, but that’s not what we are talking about. You are suggesting that if Linux is ever a big player in the desktop operating systems market, it will be susceptible to viruses. I’m just arguing that I don’t think we’ll get there through binary drivers. Don’t forget that there are fewer drivers available for OS X than Linux, yet it is still considered a mainstream OS. I don’t think you have to support everything under the sun to be a mainstream OS, and you definitely don’t have to depend on binary drivers. Linux has already proven this.
What I’m basically doing is making predictions about what would happen if Linux gained a much bigger marketshare than it currently has, and how that might negatively impact security. It’s just like living in a city… if you start off with a small population, and then it grows and hundreds of thousands of new people move there, crime IS going to go up. It’s unavoidable.
That’s not the only thing you are doing. You are assuming that Linux is going to have to change directions from what it is currently doing to become mainstream and that those changes are going to cause Linux to become more susceptible to viruses. That’s a huge assumption, one that a lot of people, including me don’t believe in the slightest.
As I pointed out elsewhere though, viruses aren’t really the big threat on Windows these days. Though a virus would probably need root access to spread and do damage, somebody who just wants access to your box to install adware, use it in a botnet for denial-of-service attacks, or turn it into a spam zombie probably does not need to take over the entire box. For example, there was some debate in this thread about whether or not you would need root access to run an SMTP server and send outgoing mail. I’m not an expert in this matter, so I don’t know if it is possible; but if it is, somebody would find a way to do it if the target was big enough.
Yeah, that is exactly what I’m saying (though I am referring to malware more than viruses… there’s a big difference). Fact is, Linux IS changing. Every time I try it out (about once a year), it becomes easier to use and more idiot-proof. I personally think it still has a ways to go in this area, and I do think that as it heads down this path, it will become more vulnerable. You may not agree with me, in which case we’ll just have to agree to disagree.
This has been a nice discussion!
You may not agree with me, in which case we’ll just have to agree to disagree
I agree.
This has been a nice discussion!
I agree (again).
Actually, I think that you are wrong on this one. I’m no sysadmin so somebody will have to confirm this, but ports below 1024 are restricted to root in Linux. That means that an unprivileged user cannot open port 25 or port 80 or port 999 or whatever below 1024. Therefore, unless you can send spam without port 25 and without passing through your ISP’s SMTP server, I think you really can’t turn a Linux box into a spam zombie without being root.
Unfortunately, it’s still possible to listen on ports above 1024, and in order to send spam any listening port is good enough.
So you’re wrong; not being able to bind to a port below 1024 gives very little protection in this case.
Because the popularity of Windows makes it a juicier target, I’m not sure we’ll ever get an objective measure of whether Windows is more or less secure than some other OS. But Windows does make some things more difficult than they have to be.
Ever run the Autoruns utility from the Sysinternals (now part of Microsoft) site? My output numbers hundreds, if not over 1000, lines. One thousand automatic startups of programs on the system in question. How is a normal user supposed to wade through that and troubleshoot problems? Hell, the number of startup categories (tabs in the main Autoruns window) is something like 15 or 20. Compared to rc scripts, this is a nightmare, though I’ve gotten pretty good at visually scanning the output after manually disinfecting several friends’ and family’s machines.
Running as a non-admin user is still too difficult in Windows (XP and earlier at least). On a fairly static desktop it’s not too bad, but it was impossible on my laptop when having to deal with wireless networking problems. I have had better luck with creating a new limited user from scratch rather than downgrading a formerly administrative user, but still run into some problems. I’ll also admit that much of this is 3rd-party application vendors, but Microsoft was the only single entity that could have enforced least privilege with the app vendors, and they should have done so when NT came out. I found the nonadmin.com site, lots of good info there, and plan to try again as a limited user on this laptop when I do my biennial Windows reload to clean out the crap (another discussion thread).
[Vista UAC actually seems to be a pretty decent tool, at least the few times I’ve used it, but there again the account has admin privs to begin with, and now you have to trust that Microsoft securely removes them except when you click OK in the UAC dialog — seems that the Unix model of no privs to begin with and elevate them sudo-style would have been easier.]
The registry — Microsoft’s single worst idea ever. No friggin’ way you’ll ever know what all is in there. The Linux/Unix text file config model is far simpler and more effective from a security standpoint. I really like the trend now toward “portable” apps that are moving away from registry usage.
Someone pointed out that Windows’ single-user heritage is the cause of all this, and that’s probably true (that certainly seems to be at the root of other Windows issues I’ve encountered). Could be. In any case, Microsoft could make this easier than it has so far.
I would rather gnaw my fingers off than use Windows.
That being said, I’ve seen Windows machines that ran for years without getting a virus. These were ones that were never connected to the Internet, and an actual non-administrative account was used to log users on.
This article has to do with assumed knowledge. The claim that an operating system is more widely used and therefore more targeted by malware cannot be proven until Unix-based open source systems are as widely used as Microsoft Windows. I believe this to be somewhat true, but not the only factor in the explosion of malware on Windows.
First we must look at how a virus could be possible in an open source system like Ubuntu. It could be done with an interpreted scripting language like Perl or Python. Then the user would have to unwittingly right-click the file and click the “run in terminal” option, or type ./thismalware.pl in a terminal. Having done that, the script could then be emailed to other users, who may or may not be using a UNIX or Linux system, so it would not be able to run in that form on a Windows platform.
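That execute-permission hurdle is easy to demonstrate (the script name and contents below are made up for illustration): a freshly written file has no execute bit, so running it directly fails until someone deliberately performs the chmod +x step:

```python
# Show that a freshly created script cannot be executed directly until
# its execute bit is set, mimicking the "./thismalware.pl" scenario above.
import os
import stat
import subprocess
import sys
import tempfile

def write_script(path):
    """Create a tiny interpreter script without touching its mode bits."""
    with open(path, "w") as f:
        f.write("#!%s\nprint('ran')\n" % sys.executable)

def run_directly(path):
    """Try to execute the file itself (like typing ./script); None if refused."""
    try:
        return subprocess.run([path], capture_output=True, text=True).stdout.strip()
    except PermissionError:
        return None

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as d:
        script = os.path.join(d, "script.py")
        write_script(script)
        print(run_directly(script))  # None: no execute bit yet
        os.chmod(script, os.stat(script).st_mode | stat.S_IXUSR)
        print(run_directly(script))  # executes now that chmod has been done
```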
After all that the fact remains that permissions must be set, the version of the installed parser must be known, and the user must be easily duped. I believe it is just too much trouble to build a virus in the *Nixes.
So at worst, Linux is as insecure as Vista. I think I’ll take my chances with Linux.
The author seems to think that Linux is inherently more secure than Windows, and dismisses the idea that Windows has more viruses simply because it’s targeted more. However, all the arguments given really come down to the user being smarter, and a user can be smart on both Windows and Linux. He then proceeds to cite a number of instances where viruses for Linux have been created.
Nothing in this article shows that Linux is inherently more secure than Windows. I think that it is. It’s not completely safe, but I do think that it is at least safer. But if anything, the arguments in this article debunk the point that the author is trying to make rather than supporting it.
The same inherent problem remains with Vista: the ability for the user to run every application they install without admin access or rights. When I install Amarok on Fedora 9, for instance, I install the application with sudo yum install amarok, and then I can start the app from my regular user account without any special privileges. I believe Windows can be very secure only when coders are required to write their code correctly, making full use of the new UAC security included in Vista.
With a Linux distro, you can use sudo to perform system work, and the file system and OS are built from the ground up for a multi-user environment. The networking is more robust, and with RHEL or Fedora you can set ACLs with setfacl on directories and files to restrict access, and SELinux is enabled by default. I think Windows has a long road ahead until it forces application writers to code for the end user actually running the application.
Both operating systems have their weaknesses, but right now, in my opinion, the Linux distro community has a leg up on core security functionality combined with ease of use.
You SHOULD see more UNIX and Linux boxes affected by malware. The payoff is much higher for these systems since they prevail in server applications. Those servers are far more likely to be part of infrastructure, or serve files and information to a very large number of systems. Further, they are more likely to contain critical/private information.
Could you wreak more havoc or reap more profit by compromising your mother’s laptop and her address book, or by becoming root on Google’s main servers?
I think it’s a red herring to explain away the observed security advantages of UNIX and Linux-based systems as the result of a lower installed base. The fact of the matter is that despite a smaller base, these systems are many-fold more desirable targets. Were they equally simple to compromise, they’d most assuredly be the principal targets of ne’er-do-wells.
Well put.
A relevant analogy is that of robbing banks versus robbing individuals. Banks tend to have a lot more money in their vault than an individual will be carrying, so one might imagine that since robbing banks should be expected to have a higher payoff, everyone would be robbing banks.
That turns out not to be the case: while the payoff is high if you succeed, the mechanisms available for securing banks are good enough that banks make terrible targets. Bank robbers don’t get that much money, and tend to get caught pretty quickly.
The analogy should be pretty obvious :-).