Get it while it’s hot (translation: before the Apple lawyers take the site down)! MacNytt has put online an exclusive video showing Quartz Extreme’s capabilities, the new 3D graphics acceleration technology used for the 2D desktop of Mac OS X 10.2. Our Take: Let’s hope that the brand new iBooks, which feature a 16 MB Radeon Mobility AGP graphics card, will be able to run Quartz Extreme at 1024×768 at 60 Hz. Depending on the resolution you run your desktop at, you will need either 16 MB or 32 MB and above of VRAM, plus AGP 2x (PCI won’t cut it because its bus is about four times slower). For anything above 1024×768, though, you should consider upgrading to an AGP card with lots of bandwidth and lots of memory.
This was posted on the MacNN (http://www.macnn.com/) forums a couple of days ago. I don’t think that Apple will shut down this site; people have been posting Jaguar pics, reviews, movies, etc. since WWDC. They are probably happy for the publicity – I know that as a Mac user, I am getting excited for Jag to be released.
LEAVE SPACES around URLs.
Let’s see… How many times will I have to say and write this? I am at the point where no messages will be approved in the future if they do not follow the URL guideline. >:(
Can’t you just make the script ignore trailing ) chars after URLs? I have yet to see any URL that ends with a ), so there would not be much damage.
No. I do not know regexps, nor do I have any time, or any interest *whatsoever*, to learn. I found a 2-line script on the web that parses the URLs. If anyone can write regexp/PHP code that takes a variable containing a block of text and parses the URLs while ignoring parentheses, I will use it (email me, do not post code here).
However, parentheses are not the only problem. People often put a comma right after their URLs, and there is no way to know whether the URL itself contains a comma or not (commas are valid in a URL; news.com and other Cold Fusion sites use them).
So, what really remains is for people to learn to read before they click “submit comment”. We can’t allow full HTML support anymore; there are people who do not respect our web site and take advantage of their freedom, so everyone else must pay for their stupidity too.
The “demo” at that URL is pretty lame compared to what Quartz Extreme can do. I mean, so what, they show the zooming? How about putting a 3D OpenGL object on top of the DVD movie, and then a translucent terminal window on top of that, and watching as the 3D object reflects the colors of the DVD movie playing underneath it and the terminal window on top of it.
THAT would have been an impressive movie. And QExtreme can do it.
I’m no good at regex either, but I bet someone else has already written a good filter. I think that there are only so many valid characters for a URL, so it’s just a matter of rejecting all illegal characters. Unfortunately, parentheses are legal, so you’d need to extend it to include improbable characters too. Is this for Perl?
BTW, here are the legal ones (RFC 1738):
0-9 a-z A-Z $ - _ . + ! * ' ( ) ,
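For what it’s worth, a filter along those lines is only a few lines of code. Here is a sketch in Python rather than the PHP/Perl discussed (the names and heuristics are mine): it matches RFC 1738’s character set, then trims trailing punctuation, keeping a `)` only when it balances a `(` inside the URL itself.

```python
import re

# Characters legal in a URL per RFC 1738: letters, digits, $-_.+!*'(),
# plus the reserved/escape characters seen in practice.
URL_RE = re.compile(r"https?://[0-9A-Za-z$\-_.+!*'(),/%?&=:;@#~]+")

def extract_urls(text):
    """Find URLs in a block of text, trimming trailing punctuation that is
    almost certainly prose (a comma or ')' closing a parenthetical) rather
    than part of the URL."""
    urls = []
    for match in URL_RE.finditer(text):
        url = match.group(0)
        while url and url[-1] in "),.'!;:":
            # Keep a trailing ')' only if it balances a '(' in the URL.
            if url[-1] == ")" and url.count("(") >= url.count(")"):
                break
            url = url[:-1]
        urls.append(url)
    return urls
```

This heuristic still can’t tell a genuinely trailing comma from prose, as noted above – it simply guesses that a URL-final comma is punctuation, which is right far more often than it is wrong.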
I suppose this movie, like the various snapshots et al., has a “gee whiz” appeal. It may suggest the reason why Uncle Stevo thinks everyone really needs a 64 MB graphics card, although 32 MB applicants are accepted. But if there’s some purpose in posting this movie and most other stuff in advance, I don’t have a clue what it is. So why did I download it? Dumb, I guess.
Geez. Can anyone say memory sucker? Instead of real improvements this seems to be more pointless eye candy that uses up resources like crazy.
I think one of the main purposes of QE is to take the load of Quartz off of the CPU and system memory and have the graphics card take care of it.
From Apple’s Website:
Jaguar dramatically improves the performance of Mac OS X with Quartz Extreme hardware-based graphics acceleration. Quartz Extreme takes advantage of the OpenGL 3D graphics engine to make the entire desktop a fully accelerated OpenGL scene. A supported* video card can then render the drawing of the desktop, just like it would a 3D game. The main CPU chip(s) can then focus on application-specific needs, making the whole system faster and more responsive.
Not a memory sucker. That is the reason it is HARDWARE accelerated: it offloads most of the UI onto the GPU of the graphics system… and the rest is barely a dent in the memory of the system.
I think the movie is a fake since the mouse cursor doesn’t have the shadow that I’ve seen in Jaguar.
Nothing there is pointless eye candy. The fact is that X is the ONLY OS to use a third-generation display model (one where images on the screen can be manipulated in new ways; Linux, Win, Be, Amiga, and everything else can’t do this. It’s like having the OS display engine have the capabilities of Photoshop – hence the neat minimization effects, variable-size icons, etc. Someone will point out that Win/Linux can use more than one size of icon, but those are actually more than one file: they don’t resize the icons, they merely use different-sized preconstructed ones – you can’t make them any size, only predefined ones. If you don’t think this is an advantage, I don’t care; it’s your opinion, that’s all.)
The fact is that because of this display model, a lot of the eye candy doesn’t use much more CPU than running the display model without the eye candy. Most if not all OSes will change to this eventually, but there will be a lot of naysayers, just like the original GUI had (“oh, it wastes CPU time so it will never catch on” – yet today most people use a GUI). Every other OS uses fixed-size raster images, vs. Mac OS X, which can manipulate its vector-based images the same way a graphics app can. Things like the graceful move into position when you drag an icon to the Dock, vs. the blunt snap of Windows/Linux/etc., are possible because of this.
Right now it might seem useless, but in the future it might be considered as crucial as the GUI. Who knows – I’m not an apologist, and I don’t care if you like it. Everyone said things like the GUI or web graphics wouldn’t catch on, but here we are with our GUIs and web graphics, so keep an open mind.
A last point: Quartz Extreme wouldn’t help MPEG2/DVD decoding, so that whole part of the video was useless unless he just wanted to show off the zooming effects. Too grainy for the rest of the stuff as well.
Ah! I misunderstood… thanks for clearing things up.
Corrections, corrections
Nothing there is pointless eye candy. The fact is that X is the ONLY OS to use a third-generation display model (one where images on the screen can be manipulated in new ways. Linux, Win, Be, Amiga, and everything else can’t do this.
>>>>>>
Umm, NeXT’s Display PostScript and Sun’s NeWS system were out a long time before X came out. Also, Linux *can* do accelerated 3rd-gen effects. See: the Render extension, the RandR extension, and E17.
It’s like having the OS display engine have the capabilities of Photoshop. Hence the neat minimization effects, variable-size icons, etc. Someone will point out that Win/Linux can use more than one size of icon, but they are actually more than one file and are not able to resize the icons, merely use different-sized preconstructed icons – you can’t make them any size, only predefined ones.
>>>>>>
Both KDE (in the form of a patch to 3.x) and GNOME (native in 2.0) have SVG scalable icon support.
Hmm, I think you stretch the CLI vs. GUI analogy a bit far. While there is no doubt that this is cool technology, and it might become quite useful someday, it’s doubtful it’s as much of a fundamental revolution as the GUI itself. More of a nice evolutionary step than anything else.
Actually, the video bit is pretty important. If QE allows integration of the video acceleration with the OpenGL acceleration, that would be seriously cool. Imagine Be’s MPEGs-mapped-on-a-cube routine accelerated in hardware!
If I had a product that lagged behind the industry by several years in something as basic as graphics acceleration, I’d hang my head in shame. Woe be to the fool who squandered money on a product that was so deficient.
What’s next, the sudden discovery of SCSI? *snicker*
Just for more info on E17 (Enlightenment, devel version), which Rayiner mentioned. The graphics of Enlightenment itself are based on a canvas (Evas), which allows quite a few of the things mentioned here (hardware acceleration, scaling of icons/other graphics, etc.). However, it is still limited by the fact that it runs on XFree86, which, although beginning to get support for “3rd gen effects” (through the Render extension), is still quite lacking. It is not possible to have (the nice, albeit pointless) effects such as shadows on windows, alpha-blended border edges, transparency, etc.
Also, Evas doesn’t have much support for vector graphics at the moment.
E17 also probably won’t be finished for a while (we developers are a bit lazy at times).
Now if only we had a full windowing system based on Evas…
I’m up for some 2D acceleration, but I don’t know about this.
People usually buy a GeForce4/Radeon 8500 (I’m an ATI fan) to make their games go faster, but what the hell, now we’re going to have people bragging about how many FPS they’re getting in their OS :shrug:
OS X isn’t hardware accelerated because the existing 2D acceleration on current graphics cards is way too primitive to accelerate OS X’s fully vector-oriented user interface. While other OSes have the ability to do some vector processing, OS X is vector processing throughout. This is why it IS a third-generation UI system, unlike the primitive 2D bitmapped windowing systems featured under X Windows and Microsoft Windows. As another poster noted, yeah, the NeXT did have Display PostScript, but OS X for the most part *IS* NeXT, and Quartz is a direct descendant of Display PostScript (Quartz is kind of like “Display PDF”, and PDF is a modified version of PostScript). NeWS was an excellent windowing system as well… too bad UNIX politics killed it in favor of the vastly inferior X Window System. The original poster was correct, though, in that OS X is the only OS with a third-generation UI system that is actually in widespread use.
That poster who commented that people should hang their head in shame for buying a next generation UI system needs to be beat senseless with a clue-by-four.
Just like hardware acceleration lagged a few years behind for second-generation 2D bitmapped systems, it is also lagging for third-generation vector-oriented UI systems. However, Quartz Extreme addresses this issue, so I’m not sure what people are complaining about.
The simple fact is that not only did Apple beat everyone else to a third-generation UI system in widespread use, it is also hardware accelerating it, while everyone else hasn’t even made it to the unaccelerated vector UI stage yet. Microsoft is scrambling like hell now that Quartz is out, and they hope to release a cheap knock-off in Longhorn, which is due in 2004-2005 (or 2006 if they slip their schedules). Pretty shameful that a monopoly with $40B in the bank is so far behind technologically compared to a much smaller company with just a few percentage points of market share.
It should handle http://www.slashdot.org, OSNews (http://osnews.com/) and http://somethingawful.com.
*j*
It’s nice to see someone is working on making GUIs better. (Or is it just cooler?)
Seems like Windows got even more boring.
BTW, yet another ‘feature’ that was ripped off from BeOS before it was implemented 🙁
The next version was supposed to include something called ‘PicassoGL’, doing exactly what this thing does. This is BS; I’m sick of seeing the behaviour of the whole industry… No respect anymore 🙁
No, PicassoGL has nothing to do with it. Your sources are all wrong.
PicassoGL was nothing more than an effort to unify the 2D drivers in a new, single format. Nothing that a user would ever “see” – it was just an under-the-hood development enhancement.
Just because Picasso was once called PicassoGL, people are drawing their own conclusions and imagining 3D objects flying around the desktop.
Get real.
I thought this was just the ability to have a hardware accelerated background, not much more?
First of all, why oh why does a comment get posted as soon as I press Enter? I tend to do that to get from the subject to the body area (7 years of Pine easily gives you habits). Not too nice to see posts from me with just a header…
Now, about the video. What is this thing about zooming? I can’t come up with a single reason why I would ever want to do that, nor why anyone couldn’t write a hack for XP/XFree86 that would do the exact same thing. My point is, if you want to show off a feature, show the usefulness of it. Otherwise, post an IOTD at http://www.flipcode.com (this is to the developers of QE, not anyone here).
Vectorized graphics are really neat, and we will most surely see them in all OSes fairly soon. Spiffy stuff is going on, and 3D hardware is finally getting useful and common. But I need to be convinced about all this zooming around, or even worse, moving around in 3D GUIs. After all, anything that will lower my productivity is bad (i.e. anything that takes time is bad, and floating around in 3D takes ages; in this respect, one second is an eternity).
This whole thing with DPI-aware GUIs is simply a pain and, in my opinion, useless. If I want to show something on a screen, speaking as a developer, I want to do it in pixels. DPI would only be useful if you really need something to be a specific real-world size, which only matters for DTP/graphics.
But then again, the world needs crazy people ^_^
That’s an interesting explanation, cluedealer. Thanks very much! When you put it that way it makes more sense. But what I don’t see is a good reason to commit so heavily (especially in a way that makes a shipping product painfully slow) to vector when the display and printing are raster. Isn’t this really just a very elaborate kludge to compensate for the 7400 family’s shortcomings? Or is it a way for Apple to spoil OS X for anything else but the current hardware platform? When I look at Motorola’s precarious state, I wonder if Apple’s policy of isolationist protectionism, and the incompatibilities that it causes, is a gamble that Apple may lose.
It is based on OpenGL, which supports – in contrast to conventional 2D hardware acceleration (mostly, I suppose) – alpha blending, rotating or any other transformation, so these are the things accelerated by QuartzExtreme.
OS X uses alpha blending for menus, window titles and borders/shadows and for image composition (e.g. icons and widget graphics with smooth borders).
Quartz gives you the ability to easily rotate graphics, but only a part of all programs can use that feature. (Finder, iTunes etc. can not – OmniGraffle, Illustrator etc. may use it for viewing rotated images or so)
Other tranformations used by the system are the genie-effect and – in Jaguar – zooming.
It will not accelerate vector graphics or font rendering. Though both may become faster through more intelligent algorithms in 10.x.
BTW: Vector graphics are currently not much used in OS X. The icons are scaled images and (almost?) every GUI-object is composed of vertical or horizontal lines, non-rotated rectangles and (the “new” visual effect in Aqua) alpha blended images.
Greetings, Lars
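For reference, the alpha blending Lars describes is the standard “source over destination” operation: per color channel, out = src·α + dst·(1−α). A toy sketch in Python (illustrative only – under QE this per-pixel work happens on the GPU’s blend units rather than the CPU):

```python
def blend_over(src, dst, alpha):
    """Composite a source pixel over a destination pixel.
    src, dst: (r, g, b) tuples in 0..255; alpha: source opacity, 0.0..1.0.
    This is the per-pixel math behind translucent menus and shadows."""
    return tuple(round(s * alpha + d * (1.0 - alpha)) for s, d in zip(src, dst))

# A 50%-transparent white menu over a pure blue desktop:
# blend_over((255, 255, 255), (0, 0, 255), 0.5) -> (128, 128, 255)
```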
Actually Eugenia, you’re wrong, and maybe JBQ is as well… Be was working on a completely hardware-3D-accelerated app_server – call it PicassoGL or whatever you want, but it existed and worked.
I can see it being useful in graphics applications like Photoshop where the user wants to get nitty gritty with pixel manipulation. Sure you can zoom in Photoshop, but it doesn’t follow your mouse cursor around like in the demo movie. And if you ever see a strange artifact in your images and want to know what the hell it is, you can just zoom in. I’m making a big effort to justify this zoom feature, and I myself think it’s no big “Wow.”
Well, I am sure this is a wonderful little video. Hopefully someone will convert it to MPEG or AVI or something so the rest of us can watch it.
From what I have read, the purpose of the zooming functionality is for the visually impaired, since, for them, small objects on screen can be difficult to distinguish. This is supposed to be an option setting under Universal Access.
> Someone will point out that Win/Linux can use more than one
> size of icon, but they are actually more than one file and are
> not able to resize the icons, merely use different-sized
> preconstructed icons – you can’t make them any size, only
> predefined ones. If you don’t think this is an advantage I don’t
> care. It’s your opinion, that’s all.
I don’t know what planet you come from, but on Earth, Windows is able to scale icons.
Windows has been capable of this since Win98, probably even longer.
With WinXP the scaling got smoother.
You can have a big icon image (e.g. 96×96) and let Windows scale it.
The advantage of different icon sizes is that you can optimize the look for each resolution. You can have totally different images for each size (e.g. the big icon shows a person’s face, while the lower-res icon shows just an eye).
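For what it’s worth, the “smoother” scaling XP added is ordinary bilinear filtering instead of nearest-neighbour. A toy sketch in Python over a grayscale icon stored as nested lists (illustrative only – not how Windows actually implements it):

```python
def scale_bilinear(icon, new_w, new_h):
    """Scale a grayscale icon (list of rows, values 0..255) with
    bilinear interpolation: each output pixel is a weighted average
    of the four nearest source pixels."""
    old_h, old_w = len(icon), len(icon[0])
    out = []
    for y in range(new_h):
        # Map the output pixel back into source coordinates.
        fy = y * (old_h - 1) / (new_h - 1) if new_h > 1 else 0
        y0, ty = int(fy), fy - int(fy)
        y1 = min(y0 + 1, old_h - 1)
        row = []
        for x in range(new_w):
            fx = x * (old_w - 1) / (new_w - 1) if new_w > 1 else 0
            x0, tx = int(fx), fx - int(fx)
            x1 = min(x0 + 1, old_w - 1)
            # Blend horizontally on the two source rows, then vertically.
            top = icon[y0][x0] * (1 - tx) + icon[y0][x1] * tx
            bot = icon[y1][x0] * (1 - tx) + icon[y1][x1] * tx
            row.append(round(top * (1 - ty) + bot * ty))
        out.append(row)
    return out
```

Upscaling a hard black-to-white edge this way produces intermediate gray values, which is exactly the “smoothing” visible in XP’s scaled icons.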
My school sucks; I skip class, see this headline, and what the hell can I do about it? NOTHING. We are on shitty P2s and still use Netscape 4.0!!!!
Eugenia, you keep telling us QE requires a very fast AGP port but haven’t said why. Is there something we don’t know?
AGP was originally invented because graphics memory was horribly expensive and games needed to get textures to the 3D chip quickly. Memory prices promptly dropped so the AGP port actually became unnecessary although games probably do need it these days.
However a 3D desktop is hardly going to be throwing vast amounts of textures to and from main memory. A desktop can pretty much all be drawn by the GPU so all you need to do is send lists of graphics operations and you could use a PCI bus for that.
The only thing likely to be really demanding is video, which doesn’t use the graphics chip, but you’re hardly likely to find much of that about (apart from BeOS…).
PCI can handle uncompressed TV resolution so why exactly is a super fast AGP port required?
—
Funny, it reminds me:
There have been other uses of 3D – one company decided to use the Quake engine to display CAD data so they could show people inside the buildings they were designing for them. It worked great, until people would walk into a lift and lose a life…
> Eugenia, you keep telling us QE requires a very fast AGP port but haven’t said why. Is there something we don’t know?
AGP 2x is 4 times faster than your average PCI bus/card. AGP offers a reasonably fast fallback to main memory when you run out of video memory.
It’s the exact same reason why you need AGP for 3D games, because you don’t want it to slow down too much when going from 6 to 7 windows (or whatever the limit would be). You don’t want your UI to be much slower just because you opened one more window that made you overflow your 32MB of memory. Just like you don’t want quake to downgrade from 100 to 20 FPS because a new monster showed up, you want it to go from 100 to 75fps.
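The factor-of-four figure checks out on paper: 32-bit/33 MHz PCI peaks at about 132 MB/s, while AGP’s 66 MHz base clock, doubled per mode, gives roughly 264/528/1056 MB/s for 1x/2x/4x. A quick sanity-check sketch (theoretical peaks only; real-world throughput is lower):

```python
# Theoretical peak bus bandwidths, in MB/s, assuming 32-bit transfers.
PCI_MBPS = 33e6 * 4 / 1e6   # 33 MHz x 4 bytes -> 132 MB/s
AGP_BASE = 66e6 * 4 / 1e6   # 66 MHz x 4 bytes -> 264 MB/s (AGP 1x)

def agp_mbps(mode):
    """Peak bandwidth of AGP 1x/2x/4x: each mode doubles transfers per clock."""
    return AGP_BASE * mode

print(f"PCI: {PCI_MBPS:.0f} MB/s, AGP 2x: {agp_mbps(2):.0f} MB/s "
      f"({agp_mbps(2) / PCI_MBPS:.0f}x PCI)")
```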
You’re not missing much, but I’ll see if I can convert it later…
Personally I think Apple is just saying that to get people to buy new graphics cards, or new systems. And it does kind of make sense. OS9 is discontinued, it’s time to start fresh with a new OS and new hardware.
Eugenia’s explanation is less paranoid than mine. But I don’t get what you mean by going from 6 to 7 windows. I have an iMac G4 700 MHz. It doesn’t matter how many windows I have opened, it just feels slow.
Window resizing can only be done from one point, the lower right corner of the window. Very annoying.
The green + button doesn’t maximize windows to the full extent of the screen. It just toggles back and forth between weird sizes. Very annoying.
The drop down menu items in the topmost Apple menu are transparent. Very annoying.
Accelerated UI sounds nice, but I’d rather see these problems fixed, or allow customization, in Jaguar.
> But I don’t get what you mean by going from 6 to 7 windows. I have an iMac G4 700 MHz. It doesn’t matter how many windows I have opened, it just feels slow.
Exactly why they want to use Quartz Extreme. Your iMac does not yet run QE, you are running in 2D mode, not in 3D acceleration mode.
The 2D acceleration you get now does not care whether you have 5 or 100 windows open to *display*. Only the CPU and memory get clogged by it – not the 2D acceleration, which has a flat 1024×768 resolution to render each time, regardless of the objects presented.
With 3D acceleration, things work the other way around. The CPU and memory are freed up, and the graphics card handles the effects and the rendering. Each window or widget is an object and is handled differently, not “flat”. However, you do not have unlimited memory on your graphics card. The way 3D works (especially if you don’t use Z-buffering) is that it will try to render all these windows, no matter whether they are visible or hidden behind another window. Therefore, if you have more than 5-6 windows open (or whatever the limit is), you will get a big slowdown with QE, because you will run out of graphics memory (hence the “at least 32 MB of memory” Jobs mentioned).
Exactly because of that scenario – which will be happening all the time, especially for people running big resolutions – you will need the fastest AGP bus you can get for the money, because QE will then swap the rest of the windows into main memory, outside the already filled-up graphics memory (hence the “at least AGP 2x” Jobs mentioned).
Is it any clearer now?
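The arithmetic behind the “one more window overflows your card” point can be sketched quickly (assuming uncompressed 32-bit RGBA backing stores and ignoring the card’s own frame/Z buffers):

```python
def window_vram_bytes(width, height, bytes_per_pixel=4):
    """VRAM needed to keep one window's backing store as an RGBA texture."""
    return width * height * bytes_per_pixel

# A full-screen 1024x768 window is about 3 MB...
full = window_vram_bytes(1024, 768)   # 3,145,728 bytes
# ...so a 32 MB card holds only about ten of them before textures
# start spilling over the AGP bus into main memory.
budget = 32 * 1024 * 1024
print(budget // full)  # -> 10
```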
To make it clearer, I would like to illustrate the importance of the speed of the graphics card on the new Mac OS X.
Scenario:
Let’s say that I have the money to buy a new PC and a new Mac (I don’t, btw).
For the PC, I would buy a pretty fast machine (around 2 GHz); a GeForce3 Ti 500 with 64 MB would do the job beautifully.
For a Mac, I would prefer to buy the “cheaper” G4, the 933 MHz, instead of the dual 1 GHz, and spend the rest of my money on upgrading the GeForce4 MX to the brand new GeForce4 Ti 4600 with 128 MB, AGP 4x.
Such is the importance of the graphics card in the new OS X if you want to use it as a desktop. Better not to buy the most expensive PowerMac model; instead, use that money to buy the BEST graphics card you can get if you want to do all that crazy stuff, plus have a pretty responsive desktop experience.
Exactly how much eye candy should I expect to see in 10.2? It took three releases of Windows to convince me that it was time to switch to a different OS, and the one thing holding me back from getting into the Mac and OS X sooner was the GUI. I really can live without it. I’m probably going to buy a GF4 or Radeon 8500 anyway for games, and I do appreciate the fact that Apple is putting this hardware to good use. Eugenia, your explanations are very clear; I wasn’t confused by the technical aspects. What baffled me was why Apple continues to over-glamorize a GUI that already has 10 pounds of makeup on it, to the point where there is a limit on the number of open windows?
I wasn’t confused by the technical aspects; what baffled me was why Apple continues to over-glamorize a GUI that already has 10 pounds of makeup on it, to the point where there is a limit on the number of open windows?
They didn’t; fundamentally, little has changed in the overall look since its release. Apple’s increased claims about how cool the OS looks merely demonstrate capabilities OS X had from the start, which, thanks to great new coding and resource shifting, are now more usable and groovy =]
My OS X now uses almost exactly the same look as it did when I first used it in March 2001; the only differences are small GUI novelties implemented now because they can be.
Prime example: Fire.app, the superb all-protocol IM client. Since its release it has had the capability, in development and implementation, to have cool transparent windows/buddy lists and stuff. However, they have only implemented it now, because OS X’s graphics handling has greatly increased in speed.
No more layers of make-up have been added; they were all in the ‘purse’ to begin with, but Apple and developers just never used them (damn, that has to be the worst metaphor I’ve ever used… *hangs head in shame*) =]
This was said best in a thread on the AppleInsider forums ( http://www.appleinsider.com/ ):
Well, I’ll tell you about the one that was briefly shown in the keynote (and explained in great detail in one the Quartz sessions).
But, before I describe the scene, I want to make it very clear what Quartz Extreme (QE) is. QE is not about graphics acceleration. It’s more accurate to think of graphics acceleration as a side effect of what QE does. Quartz Extreme is literally an implementation of the Quartz Compositor in OpenGL. And, not only that, but all I/O operations are carefully coded to use DMA only (versus programmed I/O), freeing the processor from both compositing the pixels, and pushing them out to the destination device (ie: the “side effect”).
And to further clarify, the term Quartz refers to two separate elements. One is Quartz 2D. This is the 2D API that is used to draw 2D elements. The other is the Quartz Compositor. This piece is responsible for integrating and compositing the 2D, 3D, and media elements generated by Quartz 2D, OpenGL, and QuickTime. Quartz Extreme deals with the Quartz Compositor, not Quartz 2D. So, to reiterate, Quartz Extreme does not accelerate Quartz 2D.
So, on to the demo in the keynote.
During the keynote, the primary demo used to show off the power of the Quartz Compositor involved the compositing of several layers of raw motion pictures from a movie, to create a scene (sorry, can’t remember which movie it’s from). Now, this doesn’t sound too terribly complex, does it? You’re probably thinking, “Final Cut Pro can do that now with its real-time effects!” Well, here’s what made this hard (read: impossible) to do before, in real-time, without additional add-on cards.
The scene consisted of 5 layers:
-one background plate with the “sky” and background scenery
-one background plate of a train station, with green screen where the sky and environment should show through
-two layers of actors, on stages, in front of a green screen
-one layer of a model train, on a stage, in front of a green screen
During the keynote, these five layers were composited, with garbage mattes, chroma keys and colour correction, all in real time! It doesn’t sound as impressive with me describing it here, but try this in After Effects and it would take hours to render! Remember: real-time!
However, during one of the Quartz sessions I attended, this demo was gone over in greater detail, with even more complexity added to the scene. So, what was added?
-the angles on the two plates of actors and on the train station were off. The actors were skewed, rotated, and scaled to match, in real time.
-three floating, rotating, transparent 3D objects were added
-realtime scrolling credits were added to the scene
You had to have seen it, to fully understand how incredible this was.
As for the DVD demo movie that was posted earlier today, a similar demo was shown at WWDC. However, it involved a DVD movie, 20 transparent terminal windows (which is actually 40 transparent layers in the Quartz Compositor), and the same 3 floating, rotating, transparent 3D objects floating above all that. For comparison, it’s not even possible to composite the volume indicator in front of a DVD in 10.1.4!
I wish I could tell you guys everything about QE. But, if I did that, I probably wouldn’t be able to get stuff from Apple anymore.
But, the point is: do not think of Quartz Extreme as graphics acceleration, think of Quartz Extreme as a graphics compositor on steroids^(squared). As I said, the “graphics acceleration” is more of a side effect, despite the exponential increase in complexity that Quartz Extreme allows. Another note: all the new OpenGL extensions required to make the QE version of the Quartz Compositor are also available to developers. So, it’s really up to the developers to come up with interesting (eg: 40 terminals over a DVD) and useful (eg: the film compositing demo) ways to use this new power.
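As a footnote to the DVD-plus-terminals demo described above, a back-of-the-envelope sketch shows why this compositing load wants a GPU: blending dozens of full-screen layers at 60 fps implies multi-GB/s pixel traffic (assuming 1024×768, 32-bit pixels, and every layer touching every pixel):

```python
def composite_gbps(layers, width=1024, height=768, fps=60, bpp=4):
    """Pixel-blend throughput (GB/s) needed to composite `layers`
    full-screen layers every frame; each layer contributes one full
    screen's worth of pixel reads into the blend."""
    return layers * width * height * bpp * fps / 1e9

# 40 transparent terminal layers plus 1 DVD layer, as in the WWDC demo:
print(f"{composite_gbps(41):.1f} GB/s")  # roughly 7.7 GB/s of blending traffic
```

No desktop CPU of the era could push anywhere near that through a software compositor, which is exactly the work QE hands to the video card.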
So, they are using windows like “textures” and if you are using quite a few of them on a high res display you’re going to need AGP…
Thanks for explaining that.
It sounds really cool but it’s going to use *massive* amounts of memory.
If they could somehow combine the old approach with the new approach, it would use less memory.
The old approach is to draw everything every time. This isn’t as slow as you think – it’s so fast, in fact, that you don’t see it – but if you have, say, a few big windows open in BeOS you can notice it.
Or you can draw once, but then you have to store the result in a bitmap, and this takes up loads of room. It would appear Apple needs to do this for the 3D engine to be any use.
The ideal solution would be to have a “programmable” graphics card, so it would redraw textures when it needs them (you just store a description of the graphics in that case). This is effectively putting a CPU on the graphics card – exactly what NeXT did once. I wonder, are Apple planning to do the same? It is an obvious next step (pun intended) for graphics cards anyway.
…and something the Amiga did in 1985!
Both ATI and NVidia and their related partners recently released video cards with 128 MB of video memory on board. The usual hardware review sites began to question “do people really need a video card with 128 megs of memory?” Even current and upcoming games won’t need that much memory. Well, here’s your answer. If there’s a Mac version of these video cards, QE will benefit from the massive on board memory.