Yeah, I switched, but mostly because I'm an OS junkie. I just like running all of the fun *nix emulators with a good UI, for the most part. (You could flame this either way, but I'd just rather not go there.)
OS X has been nice enough for me, but there's no way I'd recommend it to everyone.
I don't think virtualization/emulation will ever cut it in the real world. You'd need to natively port the APIs, then you'd have something that people wouldn't have to argue about.
Virtualization has been cutting it in the real world for decades, going back to the first timesharing mainframes. What counts as virtualization and what counts as an OS have changed over the years, but the general theory is the same. One of IBM's key strategic initiatives is simply, "virtualize everything." The demand for virtualization solutions is strong and diverse, spanning virtually (no pun intended) all sectors of the computing market.
People want to run more tasks at the same time, and tasks are easier to program if they think they have unfettered access to the hardware. This can work to your advantage on many levels. Hardware can run multiple OSs, an OS can run other OSs, which in turn could run multiple instances of themselves, each of which can run multiple tasks, which finally can spawn threads. These levels of abstraction have compelling advantages, too many to list here.
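Just as a toy illustration of that stacking, not any real hypervisor's API: you can think of each level as a layer that "runs" the layers above it. The class and names below are purely hypothetical, made up for this sketch.

```python
# Toy model of nested virtualization layers:
# hardware -> hypervisor -> guest OS -> nested guest -> task -> thread.
# All names here are illustrative, not tied to any real product.

class Layer:
    def __init__(self, name):
        self.name = name
        self.children = []

    def run(self, child):
        """'Run' another layer on top of this one and return it."""
        self.children.append(child)
        return child

    def stack(self, indent=0):
        """Return the whole stack as indented lines, one per layer."""
        lines = [" " * indent + self.name]
        for child in self.children:
            lines.extend(child.stack(indent + 2))
        return lines

hardware = Layer("hardware")
hypervisor = hardware.run(Layer("hypervisor"))
guest = hypervisor.run(Layer("guest OS"))
nested = guest.run(Layer("nested guest OS"))
task = nested.run(Layer("task"))
task.run(Layer("thread"))

print("\n".join(hardware.stack()))
```

Each call to run() just nests one layer inside another, which is the whole point: every level is written as if it owned the machine, and the level below maintains that illusion.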
I wouldn't rule out the possibility that in 10 years, application software might be customarily delivered as a virtual machine.