Linked by Thom Holwerda on Tue 10th Nov 2009 09:31 UTC
Windows Last week, security vendor Sophos published a blog post in which it said that Windows 7 was vulnerable to 8 out of 10 of the most common viruses. Microsoft has responded to these test results, which are a classic case of "scare 'em and they'll fall in line".
Thread beginning with comment 393873
boldingd
Member since:
2009-02-19

There are a few problems with this approach, among them the performance hit that would come from virtualization (which might be small, but won't be zero), and the fact that a virtual machine doesn't expose the host's hardware well (in particular, so far as I know, there's no good, high-performance way to expose the host's GPU). There's also the problem that you've then got a lot of still-fundamentally-insecure apps running together in a virtual machine, on a guest OS that's less secure than the host. If any of those legacy apps handle sensitive information and the virtual machine gets compromised, you have a serious problem. Finally, many insecure, low-level APIs don't virtualize well.
Apple did something like this when they moved to OS X: if you had an OS 9 (or earlier) application, OS X would try to run it in what amounted to an emulated OS 9. It didn't work very well; most legacy apps either ran poorly or didn't run at all, and either way they didn't integrate with the rest of the system. I think most Mac users took the hint: they wrote off their Mac Classic applications, used OS X native equivalents where they existed, and did without where they didn't. I know that's what I did.
I'm a fan of virtualization, but it's not a panacea, and it's not really a good way to handle any legacy apps on which you're dependent. At least, not in a desktop environment.
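
For concreteness, here is roughly what the approach under discussion amounts to: booting an entire legacy guest just to keep old applications alive. This is a minimal sketch in Python, assuming QEMU is installed; legacy_os.img is a hypothetical disk image with the legacy OS and application preinstalled.

```python
import subprocess

def boot_legacy_guest(disk_image: str = "legacy_os.img") -> None:
    # Boot the whole legacy OS in an emulated machine. Note what the
    # guest does NOT get: no host GPU (only emulated VGA) and only
    # emulated devices -- exactly the integration gap described above.
    subprocess.run(
        [
            "qemu-system-i386",
            "-m", "512",          # modest RAM for an old guest OS
            "-hda", disk_image,   # the guest's disk image (hypothetical)
            "-vga", "std",        # plain emulated VGA, no GPU passthrough
        ],
        check=True,               # raise if QEMU fails to start
    )

if __name__ == "__main__":
    boot_legacy_guest()
```

Even in this best case the guest sees only emulated hardware, which is why GPU-hungry legacy apps fare worst under this scheme.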

My other concern is that legacy applications and backwards-compatibility really are good things. As someone else on this site has elegantly said before, you don't throw out a code-base with a 20-year track record just because the OS vendor says it's time to move on.

Reply Parent Score: 3

kaiwai Member since:
2005-07-06

There are a few problems with this approach, among them the performance hit that would come from virtualization (which might be small, but won't be zero), and the fact that a virtual machine doesn't expose the host's hardware well (in particular, so far as I know, there's no good, high-performance way to expose the host's GPU). There's also the problem that you've then got a lot of still-fundamentally-insecure apps running together in a virtual machine, on a guest OS that's less secure than the host. If any of those legacy apps handle sensitive information and the virtual machine gets compromised, you have a serious problem. Finally, many insecure, low-level APIs don't virtualize well.


Virtualisation isn't meant to be a long-term solution - it is only there for backwards compatibility until the customer can upgrade their software to a version that's compatible with the underlying operating system. It is a zimmer frame for applications - that is it. It's time for people to wake up and stop expecting software to be supported on their computer in perpetuity - that attitude is as stupid as the person who fills up their car once with petrol and is pissed off to find out they need to fill up the tank again.

You buy a car, you need to fill it up with petrol and maintain it. You keep living because you go out and buy groceries from the supermarket. You want to play Blu-ray discs? Get a Blu-ray drive. Life is continuous movement forward - stop clinging to the door frame like a child being told they need to go to the dentist.

Apple did something like this when they moved to OS X: if you had an OS 9 (or earlier) application, OS X would try to run it in what amounted to an emulated OS 9. It didn't work very well; most legacy apps either ran poorly or didn't run at all, and either way they didn't integrate with the rest of the system. I think most Mac users took the hint: they wrote off their Mac Classic applications, used OS X native equivalents where they existed, and did without where they didn't. I know that's what I did.
I'm a fan of virtualization, but it's not a panacea, and it's not really a good way to handle any legacy apps on which you're dependent. At least, not in a desktop environment.


And you know, here we are eight years later, after Apple bit the bullet, and they have a top-of-the-line operating system. They made the tough decision when they needed to - Microsoft, release after release, refuses to address the problem. They're like the obese person who tries every diet under the sun - the pickle diet, the orange diet, the prune diet - all hoping there's an easy way out instead of facing the reality that it's calories in, calories out. Microsoft is that obese person: avoiding what needs to be done by circling around the periphery.

My other concern is that legacy applications and backwards-compatibility really are good things. As someone else on this site has elegantly said before, you don't throw out a code-base with a 20-year track record just because the OS vendor says it's time to move on.


Who the hell said anything about throwing out old code for the sake of it? When the new code addresses all the flaws of the old code, and programmers have been given a period of time to migrate off the old API, you then need to remove it. More code staying in the code base means more attack surface for a hacker or cracker to aim at.

Yes, keep old code around for a period of five years to allow customers to migrate off it, and address the concerns if the new API lacks features developers require - but that isn't, and shouldn't be, an invitation to keep layering up APIs from 20 years' worth of development. You create an API; five years later you realise the assumptions made in that design aren't meeting the requirements, so you create a new API that replaces it. You deprecate the old one, then you remove the ability to compile against it, and eventually you remove support from the operating system.
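
That lifecycle can be made concrete in code. Here's a minimal sketch in Python; open_document and open_document_v2 are invented names for illustration. The replacement ships first, the old entry point survives the transition window but warns every caller, and a later release removes it entirely.

```python
import warnings

def open_document_v2(path: str, *, encoding: str = "utf-8") -> str:
    """The replacement API, shipped alongside the old one from day one."""
    with open(path, encoding=encoding) as f:
        return f.read()

def open_document(path: str) -> str:
    """Transition window: still works, but every call tells the caller
    to migrate; a later release deletes this function outright."""
    warnings.warn(
        "open_document() is deprecated; use open_document_v2()",
        DeprecationWarning,
        stacklevel=2,  # attribute the warning to the caller, not this shim
    )
    return open_document_v2(path)
```

Running callers with Python's -W error::DeprecationWarning flag turns the warning into a hard failure, which lets developers rehearse the eventual removal before it happens.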

Again, it is pathetic and childish to label what I posted as a mere knee-jerk reaction of throwing 20 years of ideas out the window because I feel like it. I've laid out reasons why you should - practical as well as economic. Instead of repeating the same things over and over again, address why what I've stated can't and won't work in reality.

Edited 2009-11-11 06:15 UTC

Reply Parent Score: 2

vaughancoveny Member since:
2007-12-26

Lateral and constructive thinking could be used to solve virtualization and legacy-application problems.

Lateral thinking is concerned with using random words to change concepts.

Constructive thinking places people's judgements down side by side, rather than following the old Western adversarial argument system.

It's time to move away from these age-old problems.

Reply Parent Score: 1