Computer code that exploits a flaw in Apple’s Mac OS X was released over the weekend. The code takes advantage of a weakness in core parts of Mac OS X and could let a user gain additional privileges. Apple provided a fix for the error-handling mechanism of the kernel last week, but the exploit appears to have been authored before then. “It appears to have been written well before the vulnerability was fixed,” said Dino Dai Zovi, a researcher with Matasano Security who was credited by Apple with discovering the flaw when the patch was released. Obviously anything but spectacular (since it’s fixed), but it does raise the age-old question: will the growing popularity of both Linux and OS X lead to more of these exploits – possibly one that does get released ‘in time’?
It’s never been about popularity. BSD and Apache own the majority of the web server market, yet I don’t see such severe reports about them, while I still see heaps about IIS.
It’s a fact of the industry that things are going to be insecure and have bugs; what is good to see is which companies deal with it in a timely fashion, and whether that fix is a good fix or will be exploited the next week.
Looks like you’re going to need a person in front of the machine who doesn’t have many rights and is disgruntled.
Glad it’s fixed, but it’s not really a huge threat.
Because (as we well know) no system is 100% secure, flaws *will* eventually be discovered – it’s just a matter of time and code quality.
IMHO, popularity/user base may contribute to this, but only to a small extent.
Fortunately for us, not all OSes take ages to release security patches, and in general the quality is at least good enough to not need a [re-[re-[re]]] patch a few weeks later.
Well where are they? Those “heaps” of security flaws. Compare security records for the latest versions of Apache and IIS… You are up for a surprise.
I still can’t believe people say Apache is secure when mass exploits are now being run by bots exploiting Apache servers all over the internet.
Once again I ask… Where are those “heaps”?
Please talk about something you actually understand. Those “Apache” flaws you like to talk about are from insecure scripts like the crappy xml-rpc php scripts that bots such as the sasser worm exploit. That means they aren’t Apache flaws.
Take a look at the latest Stable versions of both Apache and IIS on a well respected vulnerability classification site (secunia). Notice that the numbers are exactly the same.
http://secunia.com/product/9633/?task=advisories Apache 2.2x
http://secunia.com/product/1438/?task=advisories IIS 6.x
Also note that these are the published vulnerabilities and there are many more unpublished ones circulating around the computer security circles. IIS 6 is more difficult to crack because it took a lesson from Apache and introduced features in loadable modules. Earlier versions of IIS were easy to exploit because everything was always running. Could you ever use a URL equivalent to this on Apache:
http://www.victimserver.com/scripts/..%c1%9c../winnt/system…
I used that to hack my middle school years ago when I was a script kiddie.
If you think I made that up, take a look here:
http://www.hackingspirits.com/eth-hac/papers/iis_uni.html
To my knowledge, Apache (even older versions) didn’t have lunacy that allowed it to be hacked that easily.
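For anyone wondering why a URL like that ever worked, here is a minimal Python sketch (my own illustration, not IIS code) of the underlying quirk: the server checked for “../” before percent-decoding, but %c1%9c is an overlong two-byte sequence that a lenient decoder maps to byte 0x5C (“\”), so “..%c1%9c..” only turns into “..\..” after the traversal check has already passed:

from urllib.parse import unquote_to_bytes

def lenient_decode_pair(b1, b2):
    # Decode a two-byte sequence the way a permissive (pre-patch) decoder might,
    # accepting overlong encodings instead of rejecting them.
    return chr(((b1 & 0x1F) << 6) | (b2 & 0x3F))    # 0xC1, 0x9C -> 0x5C

raw = b"..%c1%9c../winnt/system32/cmd.exe"
decoded = unquote_to_bytes(raw)                     # b'..\xc1\x9c../winnt/...'
print(lenient_decode_pair(decoded[2], decoded[3]))  # prints "\", i.e. "..\.." traversal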
Wow man, that is really disingenuous.
You’re comparing a product that’s been out for many years (IIS 6) to a minor version (Apache 2.2) that has been out for less than a year.
Popularity can have a lot to do with exploits in software. As GNU/Linux becomes popular among the general user population, a lot more stupid people will have access to it. Stupid people will always find a way to make software insecure.
People have been receiving malicious software in their email for 10 years, but people still run attachments from an unknown source.
Popularity can have a lot to do with exploits in software.
Partially true.
[…] a lot more stupid people will have access to it. Stupid people will always find a way to make software insecure.
Well, I would’ve never thought that those stupid people make that software insecure. I, humbly, thought the software makers produce insecure software. Those “stupid” people are bright enough to discover flaws with lightning speed and write exploits a long time before the software makers can produce any patch. And some software makers even produce patches that need patching to patch the flaw.
Unethical? Childish? Too much spare time? Jobless? Evil crackers? Maybe. Stupid? I doubt that.
“It’s never been about popularity, just because BSD and Apache own the majority of the webserver market,”
Apache 2.0 has 11 times as many security holes as IIS 6.
On top of that, as the recent GoDaddy experience shows, Apache’s lead in the webserver market is exclusively because large hosting companies park millions of unused domains on Apache.
When GoDaddy switched to W2K3, Apache’s lead dropped by a huge amount. And that was ONE hosting company.
Apache 2.0 has 11 times as many security holes as IIS 6.
Sources?
On top of that, as the recent GoDaddy experience shows, Apache’s lead in the webserver market is exclusively because large hosting companies park millions of unused domains on Apache.
When GoDaddy switched to W2K3, Apache’s lead dropped by a huge amount. And that was ONE hosting company.
5 percent is a huge amount? You really think Apache has DOUBLE the market share of anyone else (August 2006 Netcraft survey: http://news.netcraft.com/archives/web_server_survey.html ) because all of those are “parked domains”?
Secunia: http://secunia.com/product/73/
33 for 2.0. 3 for IIS6.
“In Netcraft’s June 2006 Web Server Survey, the ‘Net services company found that Microsoft’s Internet Information Services (IIS) continues to make gains on the Apache web server, garnering another 4.25 percent of the market share while Apache lost 3.25 percent, leaving Microsoft and Apache holding 29.7 percent and 61.25 percent total market share, respectively.
Compared to three months ago, Apache has lost 16.7 percent of the market. Netcraft attributes much of the movement from Apache to hosts such as Go Daddy switching over to IIS. Go Daddy’s move to a Microsoft-based environment cost Apache 1.6 million hostnames alone.”
http://arstechnica.com/journals/microsoft.ars/2006/6/8/4271
One hosting company switches its parked domains to IIS 6 and Apache loses 16.7% of the market.
Imagine if 5 more hosting companies switched.
” read this and say: wow… are those peoples seeing themselves what happens? Does this makes any sense to them? 1.6 million hostnames parked at Godaddy changed the stats dramatically like that… If so, then we do realize that in the 85,541,228 sites they are using for the web server survey there are much more domains parked… We have here included millions of other parked domains from other major domain parking providers like Sedo, Afternic, DomainSponsor, etc. that are much bigger than Godaddy.”
http://www.ducea.com/2006/06/09/netcraft-web-server-survey-or-shoul…
Apache’s lead over IIS is because of parked domains.
“Compared to three months ago, Apache has lost 16.7 percent of the market. Netcraft attributes much of the movement from Apache to hosts such as Go Daddy switching over to IIS. Go Daddy’s move to a Microsoft-based environment cost Apache 1.6 million hostnames alone.”
http://arstechnica.com/journals/microsoft.ars/2006/6/8/4271
One hosting company switches its parked domains to IIS 6 and Apache loses 16.7% of the market.
(Bold mine)
The article quoted does NOT say that “one hosting company switches…to IIS6 and [as a consequence] Apache loses 16.7%”. Not even your own quote says that! It says:
1. GoDaddy switched to IIS;
2. Over a three-month period Apache lost 16.7% of the market.
3. “Netcraft attributes much of the movement” to GoDaddy’s move from Apache to IIS. “Much” != “all”. In fact GoDaddy’s move (quoted as 1.6M sites) represents roughly 2 percent of the market.
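As a rough sanity check on the numbers being thrown around (using only figures quoted elsewhere in this thread – the ~85.5M hostnames in the Netcraft survey, GoDaddy’s 1.6M hostnames, and the 48.2/31.5-point lead figures – so treat this as a sketch, not gospel):

total_hostnames   = 85_541_228          # Netcraft survey total quoted later in the thread
godaddy_hostnames = 1_600_000           # GoDaddy hostnames that moved (June figure)
print(godaddy_hostnames / total_hostnames * 100)   # ~1.9% of all surveyed sites

lead_march, lead_june = 48.2, 31.5      # Apache's lead over IIS, in percentage points
print(lead_march - lead_june)           # 16.7 – a drop in the *lead*, not 16.7% of sites switching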
1. GoDaddy switched to IIS;
2. Over a three-month period Apache lost 16.7% of the market.
3. “Netcraft attributes much of the movement” to GoDaddy’s move from Apache to IIS. “Much” != “all”. In fact GoDaddy’s move (quoted as 1.6M sites) represents roughly 2 percent of the market.
GoDaddy moved the parked domains over a period of months.
In June it was 1.6M. The grand total over 3 months was 4.5M parked domains.
And GoDaddy isn’t the biggest provider of parked domains.
If Netcraft quit counting parked domains, Apache would drop behind IIS.
1. GoDaddy switched to IIS;
2. Over a three-month period Apache lost 16.7% of the market.
3. “Netcraft attributes much of the movement” to GoDaddy’s move from Apache to IIS. “Much” != “all”. In fact GoDaddy’s move (quoted as 1.6M sites) represents roughly 2 percent of the market.
GoDaddy moved the parked domains over a period of months.
In June it was 1.6M. The grand total over 3 months was 4.5M parked domains.
Or about 5 percent. Not nearly 16%, and (my personal opinion) not enough to qualify as “much”.
And GoDaddy isn’t the biggest provider of parked domains.
Which provider(s), being bigger than GoDaddy, have moved (singly or together) a larger number of *parked domains* to IIS than GoDaddy?
Note – Apache’s losses are smaller in each case than Microsoft’s gains.
If Netcraft quit counting parked domains, Apache would drop behind IIS.
Source?
Not nearly 16%
The lead changed by 16%.
“If you take a look at the last three months, though, Windows IIS (Internet Information Server) numbers have taken a tremendous jump. Apache’s lead over Microsoft, which stood at 48.2 percent in March, has been narrowed to 31.5 percent, a shift of 16.7 percent in just three months.”
And that was ONE domain parking company.
If 6 of equal size switched, Apache’s lead would evaporate.
Shrug. It’s probably not good for Apache to have such a big lead; it gives them no reason to improve. Plus, looking at the history of Web servers 1995 to 2006 on the same site, Microsoft has had several large increases in usage followed by large reductions; Apache’s jumps in both directions have been small. Also, as I said earlier, Apache’s lost market share has been smaller than the gains MS is making – and fully one percent of those in June were due to inaccessible sites from one company.
You still haven’t provided evidence for your claim that Apache has more parked domains than any other server, or for your inference that this presents a problem for Apache if they become active. So I’m not buying it. It may also be the case that IIS, for whatever reason, is better than Apache for parked domains, or that for parked domains Apache provides no significant improvement over it, but that it’s not as good as Apache for active domains. In which case, if those sites that have switched to IIS become active, it’s either big problems for up to 5 million sites or a switch back to Apache.
As long as Netcraft counts parked domains in its index, instead of functioning domains, Apache’s lead over IIS will be considered a joke.
Apache: The webserver of choice for doing nothing at all.
If you consider the Apache lead over IIS as a “joke”, there goes your last shred of credibility.
If you consider the Apache lead over IIS as a “joke”, there goes your last shred of credibility.
If you consider parked domains an important indicator of Apache’s “superiority” over IIS, you are a joke.
2% is still a significant amount.
If so, then Linux has “a significant share of the desktop OS market”, and MacOS is a runaway success.
Do you know what the word significant means?
If 6 million Americans suddenly moved to Canada, would you not call that significant or a large amount?
We aren’t talking about 6 million people moving to Canada, which would present logistical challenges far more significant than those involved in changing from Apache to IIS – especially since that need not necessarily include a change of OS. In the late ’80s and early ’90s, 2 million people used Amigas – I’d be willing to bet that was about 2% of the market then, at least – but still not “significant” enough to create a critical mass of developers bent on ensuring the success of the platform.
What are you talking about? It was an analogy to illustrate significance and numbers.
Significant is NOT always the same thing as majority/large.
Precisely, and 6 million Americans moving suddenly to Canada WOULD be significant; 2% of websites is NOT significant, particularly when those two percent are controlled by ONE company.
I guess we’ll just have to agree to disagree.
I believe that 1.6 million domains switching the servers they are hosted on is meaningful; you do not.
And you apparently believe that one company switching its systems is meaningful. I do not.
1)
Symantec’s OS X spyware prediction in flames
Symantec published its 10th Internet Threat Report this week and quietly admitted a few days later that its predictions of increasing Mac-targeted spyware threats have not been realised.
http://www.zdnet.com.au/blogs/securifythis/soa/Symantec_s_OS_X_spyw…
2)
Mac OS X: Viruses and Security
Let’s start with the hot-button issue of Mac OS X viruses. Simply put, at the time of writing this article, there are no file-infecting viruses that can infect Mac OS X.
http://www.symantec.com/enterprise/security_response/weblog/2006/07…
: : Mac OS X : : 5 years on the market : :
yea, it’s getting worse for Mac OS X all the time. Always worse, never better.
: : Microsoft Windows XP : : 5 years on the market : :
Well? You know the rest of that history, all too well.
Mac OS X – Getting worse – episode 2345
Why was this modded up? Comparing Mac OS X unfavourably to Windows by saying it is getting WORSE and Windows BETTER over the last five years (when in actual fact it’s more like there is ONE new vulnerability we didn’t know about in OS X after 5 years, and maybe 1,000 fewer (out of how many?) in Windows after 5 years) is just dishonest. You can reduce 2,000 bugs by 1,000, and maybe MS should be given credit for that, but you can’t reduce 1 by 1,000.
So is it a local exploit? Any info on it?
What makes me angry is that we have tons of articles that just say “a vulnerability has been reported…” and then 99% of the text is just babbling about popularity or rehashing old stories. I mean, who do they target anyway?
I believe it was a privilege escalation, requiring local access or an account.
http://www.securityfocus.com
Check out IIS vs Apache, 4 vs 2 for the major versions.
I wasn’t picking on IIS, or Microsoft, just trying to point out that being a majority doesn’t automatically make you insecure.
It really boils down to good design in the first place, plus how quickly these things are patched properly.
Nothing is going to be 100% secure, but it seems some things suffer from the same bug over and over.
“http://www.securityfocus.com
Check out IIS vs Apache, 4 vs 2 for the major versions.
I wasn’t picking on IIS, or Microsoft, just trying to point out that being a majority doesn’t automatically make you insecure.
————————————————-
Seems that you were picking on IIS and Microsoft, by focusing on an out-of-date version of IIS. IIS 4 is old; we may as well talk about Win98 if we’re going to talk about IIS 4. And you said “It’s never been about popularity. BSD and Apache own the majority of the web server market, yet I don’t see such severe reports about them, while I still see heaps about IIS”, as if you’re talking about the current version of IIS, which is IIS 6 (and has been for nearly four years now).
What you were doing was parroting the often-stated claim that popularity has nothing to do with the number of attacks, and then using Apache vs IIS as your proof. I see this all the time on Slashdot, and it’s stated as if it’s an axiom or an article of faith (neither of which requires proof to back it up).
Here’s the real stats on the latest versions of Apache vs IIS (Apache 2 vs IIS 6):
IIS 6 security record since 2003:
http://secunia.com/product/1438/?task=statistics
From 2003 to the present, IIS 6 has had three (that’s right, just three) security advisories, all of which are patched, and none of which were highly critical.
Apache 2 security record since 2003:
http://secunia.com/product/73/?task=statistics
33 security advisories, 3 of which are unpatched today, and 3% of which were rated “highly critical”.
Does the above disprove your point? No. But you’d do well not to use Apache vs IIS as your “proof”, as that “proof” is more myth than anything else.
will the growing popularity of both Linux and OS X lead to more of these exploits– possibly one that does get released ‘in time’?
That assumes there isn’t one we don’t know of (0day).
It’s not the flaws/exploits that are known that you need to worry about. It’s those that are only known to a select group (private) that you really really need to worry about.
If an exploit (on any system) is worthwhile enough to make lots of money with, it will be exploited first, before everyone is made aware.
Like someone said, this is not the first time malware for already-patched systems (for Mac, Mac OS X, Windows, Linux, etc.) has appeared – and it certainly won’t be the last.
In related news, a Mr. J. Sixpack of East Bumfsck, Ohio, solved last week’s NY Times crossword puzzle just days after The NY Times published the answers. Congrats!
However, remember that most security problems remain right between the keyboard and the chair, and some users might wait before applying the patch (IT validation or something like that, to check that it did not break current applications). So even if the flaw is patched, there are still vulnerable systems in the wild. (As far as I remember, the Code Red virus was based on a flaw in IIS that was already patched – or was it MS SQL?)
Fortunately, Mac OS X is rarely used for the kind of mission-critical applications Linux or Windows are used for, so most of the time users are willing to reboot for a patch, and within 48 hours almost all systems connected to the internet can be patched.
The article makes a big deal of the fact that “It appears to have been written well before the vulnerability was fixed,” but even if it had been written after the patch was released it would still be a “medium sized” deal. The overwhelming majority (I’d guess at least 80%) of malware for Windows is written after the patch for the flaw it exploits has been released. Indeed, most Windows malware is written by examining a newly released patch to determine the flaw that it fixes, then writing an exploit for that flaw, in order to attack unpatched systems. Another large proportion of Windows malware is simply tweaked versions of existing viruses (so you get versions 1, 2, 3, 4, 5, etc.), tweaked in order to get around an anti-virus package’s malware definitions list, but all exploiting the same previously patched OS flaw, so only unpatched systems are vulnerable.
The HUGE virus outbreaks for XP three years ago (Blaster, red-something, and whatchamacallit – I forget the names) all attacked flaws that had been patched months earlier.
If Mac use increases, there may be a greater attempt by baddies to examine Apple’s Security Updates (which are released fairly frequently, so there are indeed holes in the system), determine what flaws they address, and write code that exploits those flaws to get at systems that are unpatched. And Mac users may be more vulnerable, since they (we, since I fall into this category) have a sense of invulnerability: running with no anti-malware software (sorry, Symantec, I’m not putting your crap on my Mac), putting off applying Security Updates (if I see that a reboot is required, I procrastinate on Security Updates – something I would never do on my Windows computer, which I have set to do Automatic Updates), and being more likely to blissfully browse questionable sites.
Pointing to bug-report counts is a useless statistic. These reports don’t take into account severity, range, etc.
Making predictions that a virus/exploit would be more likely and more dangerous if the Mac had more market share is also pointless. Whether fair or not, market share and an exploit’s ability to propagate should be a consideration.
The bottom line is whether, under the default setup, an average end user is more secure or not.
Excuses for any OS saying that it can be made secure are useless. Any piece of software, including an OS – whether Linux, Windows, or Mac OS X – can always be locked down by a knowledgeable user.
Generally, exploits and viruses aren’t propagated by knowledgeable IT managers or experienced users; it’s the users who don’t know better. The ones that open attachments in email without any protection, for example.
IMHO, the fact that a huge percentage of the Windows XP user base is running day-to-day accounts with administration rights makes it inherently more insecure than the other OSes. Sure, this is a user problem, and Vista will help (assuming users switch), but this is a real-world security flaw.
Windows fanboys: yes, we know Windows should be locked down and administration rights should not be needed, but the fact is that Windows makes it too easy, and some software requires it.
Most Linux distros, in comparison, now default to not allowing the root user X server access. Sure, it can be bypassed, but for end users it’s a big stop sign preventing a foolish act.
No kernel/OS is 100% invulnerable, right? What’s the big deal if one flaw is discovered every now and then? I mean, isn’t that to be expected every once in a while?
I still put my money with Linux and other open source products. At least the hole can be fixed ASAP without having to rely on a company that does damage control for a month before they release a fix.
Wasn’t two weeks ago the “deny that Mac is more popular” week, where everyone said “look, usage stats for OS X aren’t up at all!”?
Which do you want? Less popular and less secure or more popular and less secure. Because only one of them protects the myth that popularity determines exploitation…
As in every crime, motive drives exploitation. For botnets (a very popular reason to exploit users), popularity and bandwidth are the driving factors. For adware, it’s probably popularity, and the derived average intelligence — Windows users are likely less acutely intelligent on average when it comes to consumer awareness, so they’re less likely to actively boycott an advertiser and more likely to be influenced by a picture ad on their PC.
Many Windows XP users have administrator privileges because they need to run some software that won’t run otherwise. This is particularly likely to be true if they run an older program that goes back to the Windows 98 days.
Example: my daughter was given a computer game for her birthday. We usually run Linux at home, but can dual-boot one machine to XP if we really need to. Sure enough, to get the game to work, I had to make my six-year-old an administrator.
It’s not user stupidity when you have to turn off security to get your work done. To be fair to Microsoft, they are handicapped by legacy: every boneheaded mistake they ever made that’s visible by some ABI has to be preserved forever.
[JoeBuck said]: Many Windows XP users have administrator privileges because they need to run some software that won’t run otherwise. This is particularly likely to be true if they run an older program that goes back to the Windows 98 days.
Example: my daughter was given a computer game for her birthday. We usually run Linux at home, but can dual-boot one machine to XP if we really need to. Sure enough, to get the game to work, I had to make my six-year-old an administrator.
It’s not user stupidity when you have to turn off security to get your work done.[endquote]
I strongly disagree. This can be the case with some older programs, and a few relatively new ones too, but this isn’t the real reason most users run with admin rights. It should be eliminated as an easy option.
(There are better options. Since Virtual PC is free, MS or any user could easily implement a virtual machine for these “special cases”.)
[JoeBuck says]: To be fair to Microsoft, they are handicapped by legacy: every boneheaded mistake they ever made that’s visible by some ABI has to be preserved forever. [End quote]
I 100% agree. MS is at a crossroads after Vista. It’s time they scrapped the legacy stuff and built a solid base for future development. They use the argument that they need to maintain compatibility for users, but each release adds problems anyway, and if Apple can do it, I’m sure Microsoft can too.
Apple has nowhere near the market share, particularly with businesses running critical applications, that MS does. If MS abruptly breaks compatibility, many customers will not move forward. Apple has lost much of the business market because of frequent compatibility breaks and lack of a roadmap.
You will see increased reliance on virtualization going forward, at the system and application levels. Those plans have already been laid out or hinted at publicly, but compatibility itself isn’t as big of a problem as many make it out to be. In most cases it isn’t the same old code running in Windows, but support for routing the old API calls through new APIs or virtualization layers.
Most of the problems with running games as standard user on current versions of Windows can be fixed by modifying permissions on certain directories (like the app directory in Program Files) rather than having to run full time as admin. You can also use RunAs for running specific applications as admin (unless it’s been updated, some games that use SafeDisc or similar require running as admin for instance).
To understand the timing of this, you could probably assume a few things about the creator of the exploit.
Firstly, either the individual has money problems, or just not much extra to spare on minor things, such as purchasing MacOS X updates.
So, we could assume that the MacOS X version on the perps machine was out of date, and probably still is.
Not only that, this should help explain why viruses (pl. “virii”? – what I was taught in school, but wrong, it seems… stupid American school system) are often behind the curve on “rapid-fire-patched” systems, such as Mac OS X, Linux, and (soon) Haiku.
This is not a bad thing, just the way the world works. Just like in the Windows XP world, where you could be running anything from Windows 2.1 Interface Manager to Windows Vista RC1+, and EVERYTHING in between.
There are likely viruses written to exploit flaws that may have only been present for fifteen minutes, on one system, because of an odd software combination (I have fought with these types myself on systems configured close enough to allow *harmful* infection).
Of course, I would guess most would just try to use Windows kernel calls… so, if Vista truly secures the kernel, that could make Windows finally usable and tolerable again… if you ignore whatever you may hate about it (I hate nearly everything, usability-wise, about it).
All of the above seems to be a good reason to stay up to date. Well, it depends on what you’re running. Keeping Windows up to date has been, historically, a pain in the arse. Not because the updates were hard to find, hard to install, or anything like that. No, it is the fact that at least a third of the Critical Patches released by Microsoft for XP were incompatible, in some form, with, well… XP! Meaning, to get one patch, one normally had to get many, many more first. That, in and of itself, is only a minor annoyance, thanks to the relative ease of use of the updating service.
The problem is, as usual, the registry. Nearly a third of the time, when I actually would use Windows for something and decided to go ahead and patch it up some – maybe to get USB 2 working, whatever – I had to reformat and start over after one of the fifteen or twenty or thirty or forty patches was installed, one after another, with no stability or compatibility checking during the process… no logical checks at all, it seemed… just copy this into file X at offset 0x33ff, blah blah…
This is not true on all machines. I have a couple of customers running Windows XP Pro, from day one, who have automatically had EVERY patch applied… at release time, pretty much. In this case, which is rare, the XP systems both went down at the same time, on the same day, in the same hour… I got the calls.
Problem? Virus? Spyware? Nope, none of that on either… just a bad update was installed. Once it was removed, fine and dandy; Windows XP re-installed the patch about 5 seconds after reboot on one (and it crashed instantly), and I disabled automatic updates in Safe Mode on the other (learned my lesson from the first, which I fixed also, of course).
So now, why do virus writers write viruses? Personal reasons in some cases, just for ‘fun’ in others, to push the envelope, to act out against society, to lash out at a particular software product – whatever the case may be. Popularity has only as much to do with it as there are people who understand how to use what they are using to such an extreme.
Slow adaptation to a new platform is common. We have all pretty much gone through the rigors of learning and adapting to a new software platform, even if just between one version of Windows and the next; you know the learning curve… even just for USING it. The learning curve for creating malware is always higher.
If you are a developer, then you can just imagine that if you make your API full of holes and gaps, you will lose one part of the virus-making audience. If you are not #1, you lose another part. If you are not Windows, or don’t act anything like Windows internally, you lose a large chunk of the virus-making potential.
If you run out and scream: I’M BULLETPROOF!!! Someone may just shoot you… to prove you right! (i.e. Firefox’s understandable ignorance – not idiocy).
If you run around and say, “Nothing on my system makes me secure, I just have no Windows compatibility, no Linux compatibility… whatever, I’m safe enough,” then you need to think about one very important item that is a security hole that IS exploitable… and exploited, on EVERY OS that supports it (which is very, very many): the TCP/IP stack, ftp/http servers (any of them, really), ANY system-critical application or server that can be killed, frozen, whatever.
Using BeOS, I can see that my possible points of vulnerability are enormous. The only thing I can not kill on my system, is the core thread of the kernel. And only, just that one thread!!! But, only one virus has ever been known for BeOS. Why? We haven’t attracted the virus making audience yet by being placed in a position to be hacked.
Virus writers are usually not worried about the home machines, except to breed user contempt for the OS. No, normally, a virus is targeted to some specific internet-centric feature.. such as IIS, Apache (or other server products), Eudora or Outlook, Internet Explorer, Firefox, Norton (worst thing to have on your system, IMHO, if you don’t want viruses)…
Windows screwed up in XP and left the Messenger service running – a service which is DESIGNED to allow network- (a.k.a. internet-) borne infections… err, messages… to be placed on your Windows box, with no exploits needed!
XP also has the Remote Procedure Call service (and others), which can safely be killed and restarted a thousand times a session without harming the system, set up to lock you out of your machine, with no cancel button, until it restarts… but at least it gives you a countdown…
Don’t run the services and the most-targeted software, even on Windows, and you are not likely to be targeted by a virus writer intentionally. Meaning: don’t use popular software… even if it just happens to be popular because of how secure it is. Bull – everything is flawed! It just comes down to the user knowing what to avoid to prevent infection… and on many systems the only way not to get infected (because they use all the biggest software products available – Dell is BAD about this, as is HP, et al.) is to unplug from the internet, never place a CD-ROM or disk in any drive without scanning it with a non-Windows (read: non-major) software package first, etc. It can become a real pain if you 100% cannot afford to have a single infection.
Regardless of OS.
–The loon
Of course root escalation is serious, but all the dimwits talking about doom and gloom and trying to make MickeySoft look “secure” are just talking out their collective asses.
“The vulnerability could be exploited by a local attacker or someone with privileges to remotely log-on to a machine. Macs that are used by multiple people as well as servers with remote access capabilities are most at risk, experts said. A user with limited privileges could exploit the flaw to possibly gain full system access.”
So the attacker needs to either be:
-sitting at the machine and logged-in as someone. I think you’ve already lost at that point, flaw or not.
-logged in via ssh or telnet remotely. Of course, neither of these services is on by default. And yeah, the attacker either needs to know or guess your password first.
Nothing like 0-day IE exploits that steal your computer by visiting a website. Not even the same league, you ignorant haters.
The vulnerability could be exploited by a local attacker or someone with privileges to remotely log-on to a machine.
I’ve set up all my OS X machines to not allow remote services. It works for me; I don’t have a need for remote access. Obviously not workable for everybody.
If you don’t need it, don’t turn it on is my philosophy.
Local exploits? Possibly. But it’s made that much harder because none of my machines boot/wake from sleep/stop screensaver without a password. I don’t even allow my machines to share files over the network. (Largely because I don’t want my computer inept husband to somehow stumble beyond the bounds of his laptop and use his amazing stuperpowers* to get everything completely frelled up.) At my house, if you’re on machine A and you need a file from machine B, you do without or you grab a thumbdrive and go get it.
Again, not practical for everybody, and I don’t think the world’s beating down my doors to get access to my home computers, but a short walk in my house isn’t going to kill me.
—
*Oh, the horror stories I can tell ….