The second beta of Vista will not come until next year. Microsoft did not give a time frame for the release of Beta 2 of the operating system, saying only that it would have more to say next year. Microsoft has not said when Beta 2 will arrive, but some had expected it in December or January. More here, and here is a review of build 5259, which was supposed to be the November CTP but actually wasn't (get it?). In related news, Microsoft has entered the beta period for Windows OneCare Live.
Since Vista is Microsoft's 'Copland', it would be worthwhile for OSNews, once Vista finally comes out, to do a "what they told us it would have" versus "what it actually has" comparison.
Other than that – no news here
Yeah… wow… how delayed is Vista now? I predict that by the time they get it released, I'll have saved up enough money and they'll have an Intel Apple laptop out, so I can ditch Windows here.
Um.. it’s not delayed this time. They’ve been saying 2nd half of 2006 for a while now, and that’s still the target release.
That wasn't Microsoft's original intended release date; IIRC they overshot that by a few years. Therefore I agree with the person you're replying to that Windows Vista was delayed.
YES, Vista WAS delayed. However, they said Vista was delayed AGAIN, when it was in fact not. Follow now?
I think I read somewhere that Apple will launch Intel systems as early as January, so you might actually be able to get your laptop before Vista beta 2 ships.
That means you could test Vista Beta 2 on an Intel Apple.
Gives us time to buy that extra gig of RAM and a 256 MB graphics card.
They might as well get it right rather than rush it out; there is a lot of work going into it.
Vista requirement sarcasm oh boy! >_<
Now now … why the GB of RAM? This isn’t GNOME.
I’m going to start making notches in my belt for every time I call you out for spouting garbage.
From http://www.microsoft.com/technet/windowsvista/evaluate/hardware/vis…
To take better advantage of Windows Vista functionality, you should have at least 512 MB of RAM, on your PC. This provides enough memory for both the operating system and a typical application workload. And while 512 MB is great for many scenarios, more advanced users will want 1 GB of memory or more. If your typical workload is heavy, you do a lot of image editing or development, or you run multiple applications all the time, then more memory is good. In general, an investment in additional memory is wise, and you should certainly make sure that the computer you buy has room to add additional memory later. [emphasis mine]
Amazing what you can learn when you go out and educate yourself instead of spouting inflammatory nonsense, isn’t it?
512 MB is the requirement for the full-blown Aero Glass experience. 1 GB is recommended for advanced users, which isn't much different from now. Most advanced users should have 1 GB of memory anyway when they have heavy workloads.
However, run with plain Aero or the classic look and you need even less memory. You know what that is called? Choice. Microsoft is offering more choice than before as far as this goes.
Also, your link lists the requirements for a graphics card, and it says 64 MB, not 256 MB as the grandparent poster stated.
For the full-blown GNOME experience you need 128 MB of RAM, with Gaim, Firefox, Thunderbird, OpenOffice, Firestarter, XMMS, Skype, several consoles and Eclipse running. (I expect you know what Eclipse is.)
Windows Advanced Users == Anyone Doing Something More Advanced Than Doubleclicking An Icon.
The system requirements for Vista are outright insane compared with other systems.
Aero Glass is bloat, it's a mistake, and it puts extra stress on the CPU – something Windows doesn't handle very well.
Windows (even 2K3) breaks down in terms of responsiveness when CPU idle goes below 90%.
Considering earlier Windows releases, Vista will probably not be able to open an icon without at least 2 GB of RAM.
Now, if MS could just code properly, throw out the garbage, and do it right this time.
Before I can consider Windows a professional system, it needs to lose a lot of weight in terms of resource usage. Look at alternative OSes (SkyOS, AmigaOS, BeOS, Haiku, Syllable) to see how this _can_ be done.
Vista is about a decade delayed. We now have the right to demand anything from Microsoft – and expect it to be fulfilled.
Uhm… 128 MB with Gaim, Firefox, Thunderbird, OpenOffice, Firestarter, XMMS, Skype and Eclipse all running? HAHAHA. That's a freakin' joke and you know it. Half of those apps are pretty big memory hogs (FF, OO, Eclipse). I've run Ubuntu with GNOME and unnecessary services, and I certainly could not run all those apps comfortably with 128 MB. You are stretching that so far it's scary. I find it hard to even take your post seriously after reading that.
And I like your arbitrary definition of an advanced user.
I would put the requirements and resource usage for Vista/Aero just a tad above OS X/Aqua (at least before Tiger).
Aero Glass is bloat? Extra stress on the CPU? Where are you getting this stuff? The point of WPF is that you can do stuff like Aero Glass and it puts LESS stress on the CPU than a GDI-based UI, because the rendering work is offloaded to the GPU.
As for Windows responsiveness, I have not experienced what you claim, but there is no objective way to argue this really, so why should we bother?
Considering earlier Windows releases, Vista will probably not be able to open an icon without at least 2 GB of RAM.
Seriously, why should I take this post seriously when you post drivel like this? You know that someone will probably read that and take it to heart because they respect you as a poster.
And yes, Vista has been delayed way too much, WE KNOW. The point is, it was NOT delayed this time. What don’t you people understand about this?
I agree. Gnome needs at least 256 MB of RAM to run happily, and I would prefer 512 MB with that number of applications. Be that as it may, I run Gnome very comfortably at home on a P3 1 GHz with 512 MB of PC133 RAM. I generally have Firefox, Evolution, XChat, Gedit, Liferea, XMMS, Firestarter, and GAIM all running simultaneously, and experience no problems opening OpenOffice Writer or AbiWord on top of that.
Reports of Gnome’s bloat have been greatly exaggerated.
For the full-blown GNOME experience you need 128 MB of RAM, with Gaim, Firefox, Thunderbird, OpenOffice, Firestarter, XMMS, Skype, several consoles and Eclipse running. (I expect you know what Eclipse is.)
And you would wade through syrup when switching between the programs. The stuff may run, but it won't be an exceptionally good experience unless you have around 512 MB of RAM to go with it.
Aero Glass is bloat, it's a mistake, and it puts extra stress on the CPU – something Windows doesn't handle very well.
It puts extra stress on the GPU, that otherwise would be mostly idle.
It's too bad that OSNews doesn't have a mod-down option for obviously misinformed opinions.
It puts extra stress on the GPU, that otherwise would be mostly idle.
Windows has a very poor scheduler. It doesn't cope well at all when CPU usage goes above 10% – unless you start modifying the priorities of running programs so they run at idle priority (except for system-critical processes).
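If anyone wants to try that priority trick themselves, here is a rough, untested sketch of my own (just an illustration, not anything official) that drops an already running process to idle priority using the standard Win32 calls – you pass it the target PID:

/* sketch: move an existing Windows process to idle priority.
 * usage: setidle <pid>  (error handling kept minimal) */
#include <windows.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char *argv[])
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s <pid>\n", argv[0]);
        return 1;
    }

    DWORD pid = (DWORD)strtoul(argv[1], NULL, 10);

    /* PROCESS_SET_INFORMATION rights are enough to change the priority class */
    HANDLE proc = OpenProcess(PROCESS_SET_INFORMATION, FALSE, pid);
    if (proc == NULL) {
        fprintf(stderr, "OpenProcess failed: %lu\n", GetLastError());
        return 1;
    }

    /* IDLE_PRIORITY_CLASS: the process only gets CPU time when nothing
     * else wants it, so it cannot starve interactive programs */
    if (!SetPriorityClass(proc, IDLE_PRIORITY_CLASS)) {
        fprintf(stderr, "SetPriorityClass failed: %lu\n", GetLastError());
        CloseHandle(proc);
        return 1;
    }

    printf("Process %lu is now running at idle priority.\n", pid);
    CloseHandle(proc);
    return 0;
}

Point it at the PID of whatever is chewing up the CPU and, at least in theory, the rest of the desktop should stay responsive even under full load.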
Personally, my CPU usually runs idle 0% of the time. On both systems, that is.
Speed and stability are everything. A good look is nothing.
You keep saying that Windows has a very poor scheduler, but I’ve never seen anyone else say that or anything to back it up.
But regardless, what you said has nothing to do with what he said, at least in context to what you quoted.
Are you ok tonight man? You seem way off.
It puts extra stress on the GPU, that otherwise would be mostly idle.
Windows has a very poor scheduler. It doesn't cope well at all when CPU usage goes above 10% – unless you start modifying the priorities of running programs so they run at idle priority (except for system-critical processes).
What does that have to do with GPU being mostly idle?
Ignore him. He’s had too much GNU/Linux cake tonight.
That's not true. Gnome runs fine on 128 MB of RAM with a few (4-ish) programs open (say, Firefox, Gaim, AbiWord, and your favorite smaller program).
Yes, you'll definitely swap, but you'll be swapping well under the amount you have in RAM, so you'll be OK.
If you want to run a modern DE on a machine with 128 MB of RAM, Gnome is probably your best choice, except maybe Xfce.
Ubuntu ran like ass with 128 MB. Opening a file browser window alone caused the hard drive to grind, and it would come up about 5 seconds later.
Actually, Ubuntu has Gnome working nicely even on systems that only have 128 MB of RAM. But then you like spewing out anti-Linux libel, don't you?
I ran a stock Ubuntu 5.04 on 192 MB of RAM, and it was as slow as an old dog. I won’t even count the 5-minute boot time.
I ran Ubuntu on 128 MB of (SD)RAM this week, and while you can use it, I wouldn't recommend it unless you're a patient man. Running Eclipse was just impossible, and browsing only worked half-decently with 1 to 3 tabs open… more would just kill the browser. I'd say a minimum of 384 MB of RAM is quite normal for Ubuntu (+ GNOME); 256 MB probably works as well… and now back on topic :p
Dude, you don't run Eclipse unless you have AT LEAST 512 MB of RAM. Everyone should have figured this out by now…
What are you talking about? dylansmrjones runs it in GNOME with Firefox, Thunderbird, OpenOffice.org and other apps with only 128 MB!!
Gnome doesn't require 1 GB of RAM. It runs smoothly with 128 MB of RAM on any system newer than a K6-3/PIII above 500 MHz (if set up properly – avoid Fedora and its like).
It requires a lot less memory than Mac OS X and Windows 2003 Server, and is a lot more responsive. Only during compilation do I see memory usage going up into the slightly illegal area (above 512 MB of RAM). And playback of video during compilation is perfect, without modifying niceness.
So don’t spread your lies L-I-P!
GNOME runs smoothly with 128 MB? Yeah, maybe GNOME and absolutely *nothing* else. If you want to get anywhere near smooth usability with GNOME and any non-200 KB application, you ought to have at least 256 MB, and if you'll be running Firefox/XMMS/Evolution/etc. all at once, the minimum is 512 MB for a smooth experience.
By the way … my installation of OS X 10.4.3 uses 50 MB of active memory and 50 MB of inactive memory after boot. A stock Ubuntu uses something like 110 MB active — and Ubuntu is supposed to be one of those “optimized” distros.
Don’t spread your lies, dylansmrjones!
Well, comparing apples with oranges always does the trick. We can beat each other over the head with memory figures as long as we don't use the same terms with the same meaning.
On an Athlon 600 with 128 MB of RAM you can easily run a lot more than just Gnome, if you're using a distro optimized for your computer. Of course, using a mainstream kitchen-sink distro (like FC3) won't do you any good.
However, running it on a Pentium MMX 200 with 128 MB of RAM is a different matter. That's outright impossible, unless you're willing to wait 5 minutes for icons to show and so on.
But on a reasonably modern system you don't need all that much to run it reasonably (just don't forget to split swap across several partitions on several hard drives – e.g. one swap partition per hard drive, no more, no less – roughly as in the fstab sketch below).
A standard kitchen-sink Linux distro is unusable on anything but major systems.
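For what it's worth, that swap-splitting trick looks roughly like this in /etc/fstab (the device names here are purely hypothetical examples); giving each disk's swap partition the same priority makes the kernel stripe swap I/O across the drives:

# one swap partition per disk, equal priorities so swap is striped
/dev/hda2   none   swap   sw,pri=1   0   0
/dev/hdb2   none   swap   sw,pri=1   0   0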
Ohhhh so it was okay for you to compare memory usage between GNOME and OS X, but when I call you on it and provide actual numbers, it’s an unfair comparison?
Damn, I should be more careful! Sorry!
What is an optimized distro? What can I run along with GNOME on 128 MB and retain smooth usability? Come on, specifics, please.
PS: 128 MB of RAM + P200 MMX = 5 minute wait for the desktop to load, but 128 MB of RAM + Athlon 600 = snappy? That is either a lie, or GNOME has such bad programming that it requires 600 MHz of Athlon power to process desktop items and icons. Which is it?
I cannot get OneCare to install and run. I keep getting errors during the installation. ‘Beta’
“I cannot get OneCare to install and run. I keep getting errors during the installation. ‘Beta'”
I had no problems at all getting it to install and run.
Rather impressed actually.
I installed it on a known-to-be-trashed system, and it did a pretty fair job of cleaning it up.
It’s slow when running a Full Service scan, but the results were positive.
It's certainly good enough to have the marketing folks at Symantec and other anti-malware providers pissing straight down both pant legs.
“Next year” is only a few weeks away … no big shock, here.
“Now now … why the GB of RAM? This isn’t GNOME.”
Because it’s Microsoft. The true kings of bloat.
Don’t you mean Netscape?
Yeah, try telling that to the typical 3 GB+ Linux installs that take 5 minutes to boot.
My Ubuntu desktop, unpatched and as default as it could be (I could use initng and write my own .xsession file, which gives ridiculously fast startup times), takes about 40 seconds to get to the login screen, and 20-30 more to be fully logged in. Your 5-minute logon is somewhat bogus, unless you're talking about GNOME/KDE on an AMD K6 300 MHz laptop with a 2,000 RPM hard drive.
Speaking of that, GNOME 2.14 will be faster. A lot faster. I hope Nat/Federico/Miguel/others' hacks will get in; I want my 4-seconds-flat logon just to make jaws drop.
Yeah, they say that about GNOME with every release — too bad it’s never true.
Boot time of Ubuntu on a PII-266 w/ 128 MB of RAM: ~5 minutes (literally)
Boot time of Server 2003 Standard on a PII-266 w/ 128 MB of RAM: 1 minute (literally)
I shit you not. Why the huge discrepancy?
Bloat.
Hi, I believe you and agree totally that on old machines a default distro is not as slim as Windows. I just thought I might drop you a big list of links, however, because I think "Yeah, they say that about GNOME with every release — too bad it's never true." is unfair, or, to say the least, seems to come from a misinformed person. I'm just willing to show some of the things that I am really happy to see being worked on for GNOME 2.14. I don't know about those "previous promises" they might have made; I've only been around since GNOME 2.6. From what I read everywhere though, the main focus of 2.14 is performance, not "omg we got some totally awesome, it's awesome, and it rocks and we like awesome new features!".
Know that there are a bunch of genius hackers currently working full time on squeezing performance out of GNOME and other applications. Here is a bunch of links to back up what I'm saying.
By the way, I agree that the current state of desktop Linux is sluggish. I am just saying that it will get there someday. And I don't believe we can take longer than Vista, can we?
http://initng.thinktux.net/index.php/Main_Page
http://mces.blogspot.com/2005/11/pango-extra-lean-1110-released.htm…
http://nat.org/2005/october/#Keep-It-Simple-Stupid
http://nat.org/2005/november/#OpenOffice-startup-time
http://primates.ximian.com/~federico/news-2005-11.html#moz-images
http://primates.ximian.com/~federico/news-2005-10.html#oocalc-perfo…
http://primates.ximian.com/~federico/news-2005-10.html#08
http://primates.ximian.com/~federico/news-2005-10.html#20
http://primates.ximian.com/~federico/news-2005-11.html#gtkfilechoos…
http://primates.ximian.com/~federico/news-2005-10.html#gtkfilechoos…
http://primates.ximian.com/~federico/news-2005-10.html#gtkfilechoos…
http://primates.ximian.com/~federico/news-2005-10.html#gtkfilechoos…
http://primates.ximian.com/~federico/news-2005-10.html#gtkfilechoos…
http://primates.ximian.com/~federico/news-2005-10.html#gtkfilechoos…
http://primates.ximian.com/~federico/news-2005-09.html#gtkfilechoos…
http://primates.ximian.com/~federico/news-2005-09.html#gtkfilechoos…
http://primates.ximian.com/~federico/news-2005-09.html#gtkfilechoos…
http://primates.ximian.com/~federico/news-2005-07.html#26
http://live.gnome.org/RoadMap
But that is, of course, only the knowledge I have about GNOME. Sorry, I don't know anything about what KDE is planning.
Oh, agreed. However, the problem is that there is fighting within the GNOME group itself:
One group is dead set on tweaking performance here and there and generally improving the user experience. The other group is bent on adding more and more features, and more and more abstraction.
The net result is glacial movement in the direction of improved performance.
Vista is slipping again, no news here…
This is just funny, that’s all. I work as a programmer, so I know about slip dates, don’t get me wrong, but this is getting a little funny I think.
Vista was going to be out in '04, then '05, and now '06 (maybe) – that's around 3 years! And not just 3 years, but 3 years with a product that's less than what was originally planned for '04. When I look at Vista, I'm struggling to see 4-5 years' worth of work (I'm assuming they started on Longhorn a little after the release of XP). I can see a couple of years' worth, but not 4-5.
Apple tried this with Copland (as someone has pointed out) and failed (if they had had MS's money and commitments, then maybe we would have Copland now). OS X does have some of the Copland tech, so it wasn't all wasted.
I think MS would have loved to throw in the towel a long time ago and do what Apple did with Darwin: put their application layer on top of a *nix kernel like Linux or some such, but they are committed now. I think Apple did the right thing, and I think MS needs to do the same but can't. They would also have to move some cool stuff like completion ports into the kernel, but then MS could concentrate on the app-related stuff like Apple does, and less on the underlying OS.
I'm sure we will all soon read about what is going on within the walls at MS; I've heard some interesting tales already… Anyway, MS can only grow and learn from all this, but it also means that everyone else now has a foot in the door.
As I already stated, the 2nd half of 2006 has been the target release date for a while now (since the reset last year). Nov/Dec 2005 was the projected release date of Beta 2, but it was NOT official, or at least not publicly official. The Beta 2 date has slipped, but the final release date still remains the same.
Also, while they have cut back a few things (WinFS likely won't ship with it, and Monad likely won't ship with it), they have also announced many other things since that were not announced before (new network stack, new audio stack, new color management system, etc.).
People read that feature Y won't end up in Vista and have knee-jerk reactions that things are going badly. Then the cool things MS is adding to Vista, or things they are redoing, don't seem to get publicized. Why? Well, bad news makes for more hits, I guess.
My point is simply that there are a lot more changes in Vista than people are led to believe by reading these articles.
That may well be the case; there must be a lot more to Vista than I have read about so far. I'm not a tester, so I haven't used it yet, but from what I've read, I can't see 4-5 years' worth of work in this one. Maybe a couple of years' worth.
Oh, and WinFS is a pretty big "cut back a few things" thing. It's not like they are cutting back some cosmetic feature or something like that. WinFS is a pretty big deal, and one of the main things MS pushed a few years back as a reason you'd want to switch; in fact, I'm pretty sure it was the main thing they pushed, that and Avalon.
Who knows, they may surprise us and pop it back in, or at least get it to us in the months following the launch, maybe in SP1.
My point was that MS has been slipping a lot lately, and XP is feeling very old. Imagine if OS X were still at 10.2; the Apple guys would still defend it, but to be honest, it needs to be further along now. Linux is catching up way more than MS would have liked (which I think is a good thing) and Apple is moving further ahead (at least as a desktop OS).
Yes, WinFS was a big thing. However, they didn't drop development; they just decided not to commit to shipping it with Vista, in case it's not ready.
There are a lot of other major things, though none as “revolutionary” as WinFS. I’ll give you that point. But WinFS was not the only big thing in Vista.
Some big things going into Vista:
– Windows Presentation Foundation (Avalon)
– Windows Communications Foundation (Indigo)
– WinFX APIs (encompassing the above two and more) to supersede Win32
– New network stack that gives more low-level control for such apps as firewalls, and better performance
– New audio stack which does such things as provide per-app volume control, and is moved into userland so an audio driver can’t take down the system
– A lot of bundled apps are getting overhauls (Sndrec32, Outlook, WMP, the picture viewer, Explorer)
– Virtual folders and fast global system searches
– New Color Management system
– New printer stack and architecture (XPS)
Some of that may not seem like much to you, or to an average user, but some of it is major for developers. After Vista is released, you'll start to see a lot of updated applications with improved functionality and integration because of the changes in the platform. Trust me on this one thing, if you trust me on anything. I'm a developer for Windows, so I'm looking forward to a lot of these changes. But don't get me wrong, I don't currently write commercial software for Windows, just freeware, so I have nothing to gain financially.
WinFX, the new network stack, the audio stack, etc. are nothing big. They're just very, very minor bugfixes. The fact is, Windows is 25 years behind. Even though Win2K3 is snappy, it's still a beautified corpse.
None of what's happening to Vista is good – not even for a developer. So far it just means more conflicts to solve.
Let’s see if it ever ships.
Are you serious?
Wow, what a joke.
I'm sorry, I'm done with you (in this article). Your skewed definitions are ridiculous.
Yeah… a whole new system-wide API is nothing big, just a minor bug fix. 25 years behind… I think Linux Is Poo got hold of your account and is playing devil's advocate.
Nah, that’s pure dylansmrjones, and only him. When I post my anti-Linux drivel, I usually retain at least some semblance of believability. 😉
Twenty-five years behind … hahaha that’s great. What does that make Linux? Forty years behind?
I’m a Windows developer too, work and home.
My point was: should the list you provided have taken 4-5 years to put together? (Maybe I've missed something; maybe it should have.) I remember Avalon first being shown back in '04 (almost 2 years ago now), so I know that one was pretty much done a while back.
Remember also that OSes and Office are the foundation products for MS, and MS has more than a few resources to put toward this, so there would have been priority on getting these done.
I agree that this is all good for developers if we wish to go with these new APIs and frameworks (however, due to the target audience, some of us may not be able to use these new APIs right away, as most of our install base may be stuck on older variants of Windows – but I take your point).
One example of a company that has been able to do what MS struggles with is Apple. Apple has been able to introduce new APIs with 10.3 and 10.4 (Core Image, Core Video, Core Data, Core Audio), all of which are pretty big (not WinFS big, I guess, but pretty big – in the case of Core Image, Avalon big), not to mention a host of smaller ones too, including Spotlight, updates to QuickTime and so on.
All of these have come out since Vista was originally slated to ship; how can a smaller company with fewer resources manage this? They have also come out with some interesting apps, like Pages, Aperture, Motion and so on.
I guess one of the things that Apple learned from Copland was not to reinvent the wheel too much. I think one of the reasons that Apple can bring out things like Core Image, Core Audio and so on is that they can leverage OpenGL, OpenAL and so on.
This isn’t a pro Apple speech, the same could be said for Linux too.
I remember the days when Apple was famous for their attitude that if it wasn't built in-house, it was no good. Now MS seems to have this attitude. Luckily, Linux distros out there don't have it (apart from the kernel, where there is no alternative, and a couple of exceptions such as KDE vs. Gnome and so on)…
I'll be very interested to read in a few years' time about what went wrong during these years. As I said before, MS will learn and become stronger through this, but they have also opened the door for companies like Apple and Linux distros to get a foot in the door. People are not as willing to stick with MS as they used to be. Maybe in a few years' time the market will be more evenly distributed; I hope so, for all of us.
No, it shouldn't have taken 4-5 years — and it didn't, for the most part.
Remember, Microsoft did a reset on the Vista code last summer and basically started fresh (though they kept some of the code, like WPF and whatnot). Microsoft f–ked up with Vista and is only now on the right path after doing the reset. Let me repeat: Microsoft f–ked up, and they admitted this, and did what they could to correct it and get on the right path.
What I listed is mostly 1+ year's work, with some components having a little more.
As far as Apple…
Their Core APIs are pretty damn cool; I'll give them that. Why were they able to introduce these new APIs in less time than MS is taking? One big reason: legacy code and compatibility. Apple started fresh with OS X (and it took them a while to get the original version out). They built upon a new foundation and didn't have to keep any legacy code around. They took a big risk by making such a huge transition, and it worked. They have a much smaller user base, so it was easier for them to pull off a bigger transition.
Microsoft has a goddamn huge user base, and they aren't willing to make such a huge transition by breaking so much compatibility. They aren't starting fresh with a new foundation. They are refining their current foundation and restructuring the building blocks on it. That's a lot harder than building on a fresh foundation.
I'm not trying to say it's OK that Microsoft has taken so long with Vista, but there are reasons why they are taking so long. They miscalculated how big a transition they could make with Vista and ended up f–king themselves and having to start anew. It's not like they are purposefully taking their time. It's not like their engineers are incompetent. The executives wanted too much too fast, and only after the engineers showed them that it was failing did they wise up and take a smarter approach.
One example of a company that has been able to do what MS struggles with is Apple. Apple has been able to introduce new APIs with 10.3 and 10.4 (Core Image, Core Video, Core Data, Core Audio), all of which are pretty big (not WinFS big, I guess, but pretty big – in the case of Core Image, Avalon big), not to mention a host of smaller ones too, including Spotlight, updates to QuickTime and so on.
All of these have come out since Vista was originally slated to ship; how can a smaller company with fewer resources manage this? They have also come out with some interesting apps, like Pages, Aperture, Motion and so on.
All of those things are much smaller than Avalon. Avalon is essentially equivalent to Cocoa in size. Actually, it's bigger than that, because Avalon even runs on multiple platforms (WPF/E). Contrary to what you may believe, Microsoft has not been sitting still. They released DirectX 9 (which is huge) and XNA, .NET 1 and 2, 64-bit versions of nearly everything, Windows Desktop Search, several Media Center updates and APIs, several Tablet PC updates and APIs, Windows Services for UNIX 3.5, Windows Server 2003, Windows Cluster Computing Edition, 64-bit XP Pro, XP Embedded, SQL Server 2005 and Visual Studio 2005, Visual Studio Express, IIS 6, Acrylic, Office 2003 (several service packs), several editions of Windows CE and Windows Mobile, SharePoint, Exchange, Live Messenger, Virtual Earth and APIs, MSN Search and APIs, ASP.NET, Halo 2, PGR, tons of other games, the Xbox 360, Xbox Live, and more.
You make it seem as if Apple has been putting out all these new APIs and technologies and Microsoft has not. That's not true at all. In fact, Microsoft has put out almost 6 operating systems in the past 3 years:
Windows Server 2003, Windows Server 2003 64-bit, Windows XP 64-bit, Windows Mobile 2005, Windows Mobile 2003 (and SE), and Windows Cluster Computing Edition (in Beta 2).
“Oh, and WinFS is a pretty big “cut back a few things”
Working test versions of WinFS backported to XP have already surfaced on the net. Anyone who wants WinFS will eventually be able to enjoy it on XP as well.
Being removed from the release version of Vista is far from being cut as a project. Perhaps it was removed from the Vista beta for other reasons entirely.
I’m far more interested in the features being backported to XP than I am in Vista itself.
My understanding is that that's because much of it was restarted last year, after they discovered that their modifications were so broken it would be easier to rewrite than to fix.
I'm not entirely sure of the validity of this, as it was reported on Slashdot…
I think the thing with MS is that their customers don't buy new copies of the OS, in general. Mac users, even non-techies, shell out $129 for the new version every year, or every other year if they skip one. So, to get people to upgrade, Microsoft has to offer more than incremental features.
This is partly because they have a history of making things worse on occasion. And partly because their customers aren’t nearly so gung-ho as Apple’s.
I was actually looking forward to the beta. This is the one where they reveal the final interface, right?
A Brazilian Microsoft employee said so at a Windows Vista Roadshow earlier this month. Let's wait and see if it's true.