“It’s been a long and arduous journey, but Microsoft continues to make progress in its plan to release Windows Vista. This week, at the Windows Hardware Engineering Conference in Seattle, Microsoft is unveiling Beta 2 of its next-generation operating system, marking a critical milestone on the release plan. Because Vista code, at this point, is essentially feature-complete, Beta 2 – also known as build 5384.4 – doesn’t contain a lot of readily visible changes from the builds we’ve covered previously. But in the months since the February CTP release, Microsoft has continued to improve Vista’s fit and finish, flesh out the capabilities of bundled programs, and clean up bugs (though there are still plenty).”
So the Home editions won’t have the full Aero experience, but the Business, Enterprise, and Ultimate editions will. But the business editions, while they get full Aero, won’t have Media Center and some other stuff. So can I download Media Center if I get a business edition? How come business needs a full Aero “experience”, but the consumer editions apparently don’t? How much is “Ultimate”?
All these editions seem kind of wacky to me. OS X is looking more attractive as my next purchase. Hopefully Apple will offer more choice by then, and later on I can dual-boot if I want.
Home Basic doesn’t have Aero, Home Premium does.
None of the previews (PC Mag, PC World, etc) add new info to what’s available. I realize reviewers are throwing something out and haven’t had any time to really dig in yet, but this is just page fodder.
The MS WinHEC site provides good information on Boot Configuration Data (Vista + EFI/ BIOS, bootloaders – very interesting read), XPS & Vista’s Printing, Kernel Mode Enhancements (new memory management and dll handling/ updating were interesting), Audio effects, Longhorn Server, Drivers, WinFX, etc.
Did anyone know they had a new graphics format to compete with TIFF and JPEG? I didn’t until I read some of the XPS info.
Basically meat is available, but the news sites are only touching on fluff. The only questions they answered concerned UAC:
Is it still annoying? Yes.
Does MS realize this and is it working on it? Yes.
Unless something changes with WMP or Photo Gallery, we’ve all seen them. There are a lot of changes in here, but we keep seeing the same stuff. We’ve had previews out the wazoo on the UI; how about something a bit deeper?
Did anyone know they had a new graphics format to compete with TIFF and JPEG? I didn’t until I read some of the XPS info.
Like we need another one. I understand they want to unify the user experience, but it doesn’t have to involve the creation of proprietary formats for everything. Just with Vista, I believe they’ve added XPS, the new format based on Sparkle (the “Flash-killer”), and that new graphics format. I’m probably missing one or two others.
Anyway, do they plan to open up the specs for this graphics format? From what I’ve heard, XPS is going to be open, although I am not exactly sure what that means (paying royalties to get the specs, having to sign an NDA, or specs fully available to the general public?).
XAML + Sparkle is the “flash-killer” (it’s really not though).
XPS is the “pdf killer”. PDF is way too established though. XPS will be a rather contained format I think.
In another review, the reviewer said that Vista is gobbling up 700 meg at boot time. Ouch. Now I’ve got 2 gig, but Firefox can easily chew up 700 meg if it’s been sitting around for a while, and with various programming environments running, that 2 gig starts to disappear quickly.
It looks like Vista is going to be what puts 64-bit into the mainstream.
I thought 640K was enough for anyone;)?
P.S. Gates denies ever saying that. http://www.wired.com/news/politics/0,1283,1484,00.html
Could be good to know that 700 MB for Firefox is mostly cache memory — memory that is not in active use, but cached and ready for the application. The same goes for Windows: it takes up a larger portion of memory to be ready if it is needed. That does not mean it uses all 700 MB at once.
If other applications need the memory, it will be allocated to the process that needs it. That is why Windows and all other OSes (Linux, Solaris, Windows, and so on) have memory management that controls the resources itself.
Could be good to know that 700 MB for Firefox is mostly cache memory — memory that is not in active use, but cached and ready for the application. The same goes for Windows: it takes up a larger portion of memory to be ready if it is needed. That does not mean it uses all 700 MB at once.
At the risk of sounding stupid, I’ll say the following: if an OS (or an application) “reserves” a certain amount of memory, that memory is not available to other applications. This is the point of reserving something — so it would not be available to others. Whether or not this memory is actually used by the OS is, well, irrelevant.
If other applications need the memory, it will be allocated to the process that needs it. That is why Windows and all other OSes (Linux, Solaris, Windows, and so on) have memory management that controls the resources itself.
This does not make any sense to me. What is the point of this “reservation” if the OS is prepared to give it up at first request? How is this different from simply allocating more memory as it is needed?
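The two posts are circling the distinction between reserving address space and committing physical pages. A rough sketch in Python using an anonymous `mmap` mapping — the lazy-commit behavior described in the comments is an assumption about a typical demand-paging OS, not something stated in the thread:

```python
import mmap

# Map 64 MB of anonymous memory. On typical demand-paged systems this
# reserves address space, but physical pages are generally not backed
# by real RAM until they are first touched.
SIZE = 64 * 1024 * 1024
buf = mmap.mmap(-1, SIZE)

# Touching a page forces the OS to commit it; untouched pages cost
# little physical memory and can effectively be used elsewhere.
buf[0:5] = b"hello"
print(buf[0:5])  # b'hello'

buf.close()
```

So a process can appear to “use” hundreds of megabytes in a task manager (address space) while consuming far less physical RAM (working set) — which is why the two posters talk past each other.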
700MB for Firefox? There’s got to be something wrong. I tried seeing how many tabs it would take to crash Firefox once, and got up to around 180 before getting bored. Even then, it only took ~550 MB. In normal usage, I leave Firefox open for days at a time, and it has never gone above 300 MB for me. And I use tons of extensions.
As someone who is trying a build from not 2 weeks ago, let me tell you that this is not like Whistler betas at all. With the CPU occasionally pegging at 100% with *no apps open*, and a crash about twice daily, I feel that I can’t recommend this for a daily use system.
To be honest, I was somewhat surprised at this, as the code was refactored to work from the Server 2003 codebase — something that is fast, stable, and a close second to Windows 2000 as the best thing to come out of Redmond. The crashes mainly seem to come from use of the network stack that is intensive for a desktop (well, 1–2 megabit). You can do the ping of death against it quite reliably — I can make it crash by ping-flooding a VMware image. I don’t know if it will make November, but I hope not. I’m not a Windows user by choice, but I thought that 2003 was approaching something I could recommend for some server tasks; it’s nicely locked down.
Speed-wise it is noticeably more demanding than XP, taking up about 30% of the CPU time of an XP 2500+ at the desktop, but peaking semi-randomly to 100% for 20 seconds or so.
The default user is still admin, but with the requirement to click on a confirmation before anything is done that would require admin privileges. I do, however, worry that there are so many of these (I got one for removing an icon from the desktop) that users will merely be trained to click yes all the time. IE has also been loosened in the default config to something that uses ActiveX etc. by default, unlike 2003. The auto-firewalling on install is good, as is the way it asks whether a network is public or private when you connect (public turns on the firewall).
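For context on those confirmation prompts: under Vista’s UAC model, an application declares the privilege level it wants in its manifest, and Windows decides whether to show the consent dialog. A minimal sketch of such a manifest (the assembly name here is a placeholder, not from the thread):

```xml
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <assemblyIdentity version="1.0.0.0" name="Example.App" type="win32"/>
  <trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
    <security>
      <requestedPrivileges>
        <!-- "asInvoker" runs without elevation; "requireAdministrator"
             triggers the consent prompt described above. -->
        <requestedExecutionLevel level="asInvoker" uiAccess="false"/>
      </requestedPrivileges>
    </security>
  </trustInfo>
</assembly>
```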
The one thing that somewhat confuses me is that I can do the ping of death even when the firewall is on, but I hope that this will be sorted by RTM.
It is basically far, far more unstable than Whistler was at this point. And yes, Linux works fine on the same hardware. Edit: just to note, so do 2000 and XP (as a web dev, I have test images for IE).
…either one more delay or a semi-useful OS that hogs the system badly.
If I were asked, I’d go for the delay. XP is fine, and an unreliable Vista certainly isn’t much of the replacement people expect from it. I’m prepared for 6 more months of delay; I mean, 6 months is nothing compared to 6 years of development.
At this point in time, I would take a nice solid service pack for XP over Vista, period.
Exactly, the balance of security confirmations is very important to get it right, and I strongly suspect that Microsoft will cock it up. Apple seem to have got it right with OS X thus far though.
I can’t help thinking though, on either company’s products, the whole security thing like this can be rather confusing. What I think should be done is in the confirmation prompt, offer to show the user a nice simple, concise video demonstration of the how and why.
I sympathize with your experience, but as someone who has run Vista 5381.1 for over a week, I couldn’t disagree more with you.
Sure it’s resource hungry in terms of memory; 700MB is average after bootup. But it’s actually storing data pertaining to the UI experience (Aero) to speed up requests. The result is a much more responsive and delay-free GUI.
If you shut off Aero, you’ll see your memory consumption decrease by at least 200MB.
In one week of usage, I’ve yet to have a single system crash, and my CPU cycles are never above 30%, with usage usually around 5%.
The same was true even after installing IIS 7.0 and MS SQL Server 2005.
There are definitely some bugs under the hood that I’ve found, but none around performance as you’ve stated.
If you shut off Aero, you’ll see your memory consumption decrease by at least 200MB.
Well, 500MB with Aero disabled is still nothing to brag about.
I used Windows 2000 on a Pentium 200 MMX machine with only 128MB of RAM. After disabling stuff like menu transitions and some background services it was quite usable. I even played the original Half-Life and Unreal Tournament on it :)
So, six years late, what is it that users will get in return for all those hundreds of megabytes of used up RAM?
I don’t care if the memory has been ‘consumed’ as long as it doesn’t affect system performance, and I don’t notice it at all.
Also, from what I’ve read some of this memory is just ‘reserved’ and is not used until the OS requires it.
So, if you’re content with Windows 2000, that’s great, but it is by no means anything like Vista. I won’t deconstruct the many new features and layers of Vista, it’s been done too much.
But I used Windows 2000 for 4 years and there’s just no comparison for me.
Currently i use XP, Vista, SUSE, and Mac OS X and i find something nice about all of them, and something lacking in all of them as well.
Please, times change. The amount of RAM a home user has grows. You can’t expect to get by with old hardware for long on an OS which changes with the times.
I find that for the amount of usability Vista offers, it’s a fair trade-off.
Now as for usability, the deal is this:
– Everything is easier to find and easier to use; look at the interfaces for a bunch of things. Microsoft has paid close attention to detail in this respect. Vista is certainly easier to use than XP.
– Everything just feels like it’s at your fingertips. On XP I’d be searching and opening countless folders to find what I need. If something’s not immediately in front of me, I can just instantly search for it.
Now my problem with Vista was how slow it ran on somewhat decent hardware.
With 768MB of RAM and a 9600 Pro, the thing ran like a turtle with Aero Glass on. I know I only had a 1.09GHz processor, but really, shouldn’t the stress be offloaded to the GPU?
I mean, for nearly a gig of RAM to run that slow is ridiculous.
I do, however, believe there is hope; things like this usually are just optimization issues and a lot of debugging features left on. I expect it to get better.
I felt a significant slowdown compared to XP but that’s again understandable.
Sure it’s resource hungry in terms of memory; 700MB is average after bootup. But it’s actually storing data pertaining to the UI experience (Aero) to speed up requests. The result is a much more responsive and delay-free GUI.
A more responsive and delay-free GUI? I find that on an old P3 with integrated graphics, the Windows 2K/XP UI is responsive and delay-free; even with the XP eye candy left on, I don’t find it slow. With things like the pointless menu animations turned off it’s pretty fast on a 400MHz Celeron, and just about any PC sold today will be an order of magnitude faster than that.
I find it hard to believe that using up 200MB of RAM is really necessary to improve UI responsiveness on a modern PC. If it is necessary, and the UI experiences noticeable delays without it, then IMO something’s badly wrong with it.
Even after my Windows XP system has been running for quite a while, it uses less than 100MB of RAM. Does Vista really have so many new features that 500MB+ of RAM usage is justified?
I suppose with RAM so cheap these days it isn’t such a big deal, I just find it a little hard to understand. It certainly doesn’t do much to change Microsoft’s reputation for unnecessarily bloated software.
Maybe it is just me, but with the gadgets and things, the whole feel of the desktop reminds me of OS X. Yes, I have run them side by side, with Vista on my desktop and OS X on my PowerBook. Should the bugs be worked out of Vista, it will have the one thing that OS X lacks IMO, and that is usability. It seems to me Microsoft may have copied the better aspects of OS X and is combining them with the best aspects of Windows. I will wait to make any final opinions, of course, until it gets closer to release. As with Windows XP, the betas ended up not being close to what the final product was like, and I am thinking this will be the same.
*Disclaimer* This is just my opinion, not a troll, so do not take it as such as everyone has different perceptions.
Familiar to OS X… how?
Usable… how?
Best aspects… which?
I’ve not got a copy, but from what I’ve read and seen, so far I have seen a nicer screen scraper, a few widgets, and a sidebar.
I suspect it looks familiar to OS X because it’s prettier, which to me means less functional. Usable means it’s very similar to 95/98/ME/XP etc. Best aspects means that it’s different, and I hate the word *best*.
To be fair though, I care less and less about the desktop analogy that is *universal* between OSes. I’m more excited by a clean icon set, or a non-intrusive widget, if it’s not resource-hungry.
It looks to me that I’ll just NEVER go to Vista for any reason. There seems to be NOTHING, repeat NOTHING compelling about it. My Windows XP systems have been working fine and A.H., I agree…I’ll just take a service pack down the road and stretch XP to the limit like I did with Win98.
For fun, I’ll keep playing with my Linux systems…and that is TRUE fun again.
Windows is still essential to have around for me for a couple of items… but Vista? I’ll never need it. Sorry, Microsoft: you have too much of my money already. No more.
Bye
It looks to me that I’ll just NEVER go to Vista for any reason.
Unless you’re building your next PC from scratch, you WILL be buying Vista. You just don’t know it yet. But don’t worry: Dell or whoever will be happy to put it on there, whether you want it or not. ;-p
Does anyone know how to betatest Vista without a soul-sucking registration or a money order sent to Microsoft?
I’d like to test it to see what it looks like!
Since this is still a beta, I would guess that the code still has all of the debugging stuff in it, which obviously would make the current builds of Vista run slower now (and use more memory) than when it goes to general release.

Also, since Vista is supposedly built on the Server 2003 base, doesn’t that mean that a lot of the GUI code that was in the kernel on W2K and XP is now run in user space? Does anyone know if any parts of the code base required for the Vista GUI take advantage of advanced GPU hardware capabilities, such as those on the latest nVidia and ATI graphics cards? I would guess that if it does, using an older, less capable GPU would negatively affect performance.

I am sometimes amused when I see posts discussing performance of a new, bigger OS on old hardware. I would say that this pretty much guarantees complaints about slowness. Now, I am not a big spender on the latest hardware myself. I currently run XP on a 512MB Sempron 3100+ as my main working system, using mostly office applications, mail, and Internet browsers. I do turn off most GUI effects, though. While it is not a screamer by any means, it performs satisfactorily.

However, I also realize that if I am to move up in the world regarding OSes (be it Vista/Aero or something else), I will need something more than this. Therefore, I expect to soon put together a new system based on an Athlon 64 X2 3800+, 1GB memory, and an nVidia 7300 or 7600 based graphics card. If this platform can’t support Vista with Aero, then I will be staying with XP and/or Linux. By the way, XGL/Compiz is looking pretty hot right now.
Also, since Vista is supposedly built on the Server 2003 base, doesn’t that mean that a lot of the GUI code that was in the kernel on W2K and XP is now run in user space? Does anyone know if any parts of the code base required for the Vista GUI take advantage of advanced GPU hardware capabilities, such as those on the latest nVidia and ATI graphics cards?
GDI was also in kernel mode on Server 2003. Vista (and “Longhorn” Server) move this into user mode. Display drivers run in user mode for the most part — there is a small kernel module. This allows you to switch drivers or recover from most driver faults without needing to reboot the system. Vista does take advantage of hardware acceleration, requiring a 64MB GPU for Aero (more memory for better performance if you run higher than 1280×1024), but only a normal VGA card if you just want to run Vista. Drawing and composition are accelerated via the GPU, along with video, text, and effects rendering using Shader Model 2.0+. Some may see performance issues due to the alpha/beta quality of drivers. The GPU vendors’ drivers don’t accelerate everything yet, and they aren’t at a stage where there are a lot of optimizations, as with XP drivers.