This is not exactly OS news, but it is definitely interesting geek news. A lot has been said about the Matrox Parhelia’s inability to beat the GeForce4 Ti or the Radeon 8500 in pure FPS; however, much has also been said about how the point of the Parhelia is to offer advanced 3D features and great rendering quality. This great quality and these advanced features (like displacement mapping) can be seen in the first 12 screenshots on the Imperium Galactica III: Genesis web site. This is the first game to be released that has been optimized for the Parhelia. Enjoy the view.
Matrox Parhelia in Action
2002-07-04 3D 32 Comments
What really intrigues me is the anti-aliased glyphs that the Parhelia is capable of. Pretty graphics are good, but so much of the final image quality depends on the game designers that it is difficult for me to compare one good video card vs. another good video card.
On my XP box, ClearType is the one feature that I love. I am spoiled having sunk the money into two Geforce2MX cards with DVI output and two LCD monitors. Other than that, XP doesn’t offer me much.
If Matrox has decent drivers this time around, this card might be worth looking into if you’re one of the people with a display-quality fetish. As I don’t think Matrox has delivered quality drivers since the days of the Millennium, my hope is there but not my confidence. I’d wait for the drivers to get out there and then check the Matrox support boards before plunking down $300+.
As a long-term investment in display quality, at least with the Parhelia, I know I can upgrade my dual DVI 1280×1024 setup to two of those monstrously large DVI LCDs 😉 Just a small question of a few missing pesos…
Overall it’s good to see Matrox back on the board when it comes to features. Parhelia seems like a solid choice for the display quality sensitive buyer.
TomsHardware promised a review of the card, measuring 2D quality and speed. Visit their web site regularly to see when it will be posted.
Traditionally, Matrox has had the best 2D quality of any card, as one of their primary markets is boards for hospital equipment. I don’t know specifically about the Parhelia though.
It looks a bit better than other graphics cards. Yawn. These screenshots don’t even have any characters in them, and that would be more interesting to me than just some spaceships.
What would impress me more than just graphics looking a little better is HIGH QUALITY drivers and software for a graphics card. Even more impressive would be to provide the same for Linux.
I still have a G400 DualHead Matrox card, and I became a Matrox fan. I told myself that if Matrox came out with a new video card with 3D capabilities, I would buy it. Now that I have seen these specs and some details, and read some reviews, I am only more convinced that my next computer will have a Matrox video card. They really made a nice comeback.
Being a Matrox user myself, I must say that the best feature of any Matrox card is the display quality, even on the old Millennium 2 card, which I have found puts out a better image than most modern GeForce-based cards (even though they are getting better). This was the one factor which led me to get a Matrox card in the first place. So until someone can produce a card that has better display quality, Matrox is the one for me…
i owned the g400max during the opening of the geforce era, and i can’t say that i was disappointed
sure it wasn’t as fast as the geforce, but the feature set of that card totally owned
i’ve been trying to decide what direction i was going to go for my next video card, i think i might go matrox … i really like visual quality, and multihead gaming just looks like a blast
I have a Matrox G400 MAX somewhere, but I lost it since our three house moves last year…
Matrox sent it to me for free for the BeNews benchmarks I did 2 years ago, and I wanted to use this card on the MicroTel PC I was given to replace the crappy onboard SavagePRO, but your guess is as good as mine as to where that card is now.. I have turned the whole house upside down, and still haven’t found it.. :o(
Are there any physicists around here?
I have said this many times before: I don’t care much whether a graphics card can produce 50 FPS or 1000 FPS in a certain 3D scene if my human eyes can’t distinguish the difference! What would be the point then?
I think hardware reviews should be based far more on practical results instead of insignificant figures that don’t relate to human limitations. For example, TV in the US runs at 30 FPS; do you hear anyone cry about it? The old European PAL standard only produces 25 FPS but gives far better quality compared to the old NTSC standard (because of around 100 extra horizontal lines). And normal Hollywood films only use 24 FPS!
It’s like promoting CDs with ultrasonic (inaudible) content to people and comparing them with CDs without. Would you care?
Testing FPS with the games of today is something of an indicator, certainly not perfect, but still a valid measure of two things —
1) Does the graphics card have enough muscle to give the game player a smooth fun experience with the current raft of games? Can the card handle more graphics detail and still maintain good frame rates?
2) Does the graphics card have a decent performance horizon? Tomorrow’s games push the hardware envelope much more than today’s games. Witness all the discussion over the various graphics cards/chips vis-a-vis Doom 3.
As for CDs, I’ve listened to Sony’s SACD and it is amazingly better than CD. You do need a high-resolution audio system to hear much of the difference, granted. Same thing with DVD-Audio. When I had all six of my Von Schweikert speakers hooked up, DVD-Audio surround was spectacular. Of course not many people sink $25K into an audio system. And if I’d known the economy was going to crash, there’s no way I would have done that 😉
E, I’ve got one of these lying around doing nothing. If you want it, email me your address and I’ll send it over.
Mike, that is not exactly true. A TV set uses an interlaced display that paints “every second” line on each retrace, so it has a screen update rate of 50 times a second for PAL and 59.94 times a second for NTSC (don’t ask). It’s true that there are only half that many full pictures, but the fact that they use the interlaced scheme makes it all seem smoother than it actually is. A movie uses 24 full frames of data, but due to the nature of the format, i.e. screen projection, it does not use a screen refresh mechanism like that found in CRTs and LCDs, so you don’t see a blanking period and it therefore appears smoother.
So to answer your question yes I would care. I do want a frame rate in my games at least as high as that of the current refresh rate of my monitor (75).
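The field-rate versus frame-rate arithmetic from the comment above can be sketched in a few lines (a minimal illustration; the rates are the standard PAL/NTSC figures, while the dictionary layout and variable names are mine):

```python
# Interlaced TV: each retrace paints one field (every second scan line),
# so two fields make up one full frame and the full-frame rate is half
# the field (refresh) rate.
standards = {
    "PAL":  {"field_rate_hz": 50.0},
    "NTSC": {"field_rate_hz": 59.94},
}

for name, s in standards.items():
    fields = s["field_rate_hz"]
    full_frames = fields / 2  # two interlaced fields per full picture
    print(f"{name}: {fields} fields/s -> {full_frames} full frames/s")
```

This is why PAL can feel smoother than its “25 frames” suggests: the screen is still being refreshed 50 times a second.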
“So to answer your question yes I would care. I do want a frame rate in my games at least as high as that of the current
refresh rate of my monitor (75).”
Surely it should be exactly the same as the refresh rate? (Timed by an interrupt on the vertical sync.)
The question then is what quality of image can be displayed at that rate, and whether that quality can be kept up when there are more characters in the scene, etc.
FPS is about how smooth an animation is being played. 24 frames per second is perfectly smooth to the human eye.
Refresh rates relate to a stable (flickering/flicker-free) screen output (screen updates). 50 Hz is just fine for almost anybody; increasing the refresh rate further decreases flickering, reducing eye strain (the eye gets less tired when watching the screen), but almost nobody can notice a change above 60/70 Hz. That’s why Philips, for example, doesn’t produce televisions with a refresh rate over 100 Hz. What would be the point?
So 24 FPS at 60/70 Hz (at the same resolution) would look almost identical to the human eye compared to 1000 FPS at 1000 Hz. The point I’m trying to make is that most benchmarks by hardware websites are meaningless. Instead of blindly testing all kinds of figures, maybe they should hire an eye expert to explain to them the limitations of the human brain and senses. Just my opinion. 🙂
> FPS is about how smooth an animation is being played.
P.S. An animated film can consist of drawings, computer renderings, photos, or combinations of these in motion. So by animation I do not necessarily mean animated images/drawings of Mickey Mouse or Donald Duck. I thought I should clarify that, as some people use another definition when it comes to the word “animation”.
I view running the various games and measuring the FPS output much as I’d use a dynamometer to measure the horsepower of an engine.
One cannot simply look at the specs of a graphics chip, one must test it. Just as one cannot simply look at the specs of an engine without testing it.
It’s not a measure of whether or not you need 100HP, 200HP, 300HP, etc for your driving needs. That all depends on how and where you want to drive. And how much your car weighs! 😉
What we do know is that if we set our game to 1600×1200 and high-detail if our graphics card doesn’t spit out very many frames per second, we know that it might not do well on the games of tomorrow. It doesn’t measure very well on the “horsepower” test.
The people on the cutting edge of film really want to move past 24FPS — for quality reasons. An excerpt from an interview with Rick McCallum of ILM:
“It was the same bullshit arguments. It’s the same fear that paralyses the industry on so many levels and the issues for editors then were that you can’t cut unless you actually touch the film stock. The studios had the same problems – who’s going to pay for it; we don’t have any standards; it’s going to give the director too many choices.”
“We have a business that is incredibly conservative. We shouldn’t even be at 24P. We should be so far ahead of that with the technology that we have available to us. We should probably be shooting images at 100 frames per second and projecting them at 100 frames per second. It’s absurd that we are stuck where we are right now for all the resources that we have available to us,” he added.
We are getting closer and closer to rendering 100FPS movie-quality graphics in real time on a PC. With the advent of this level of graphics, we will have real-time Final Fantasy 😉
Mr McCallum is good at producing movies, but of course he is no scientist. Really useful information is gathered by doing real-life scientific tests.
1) Show a movie with 24 FPS (same res/Hz) to people.
2) Show a movie with 100 FPS (same res/Hz) to people.
Then ask for opinions with regard to the picture quality. Also use placebos, a term we use in the medical/scientific field for fake medicine/tests (a Latin word meaning “I shall please”). For example, the person thinks he is watching an ultra-high-quality animation while in fact he isn’t, or vice versa. Only comparing such scientific results within a large enough population is meaningful for determining the sensory abilities of human beings in general.
> One cannot simply look at the specs of a graphics chip,
> one must test it. Just as one cannot simply look at the
> specs of an engine without testing it.
Of course you must test the hardware in real life, but IMO your hardware testers need to keep things in perspective. What good would it do if your engine were able to deliver speeds of up to 1000 miles per hour while the car is not physically able to handle such speeds? (Let alone that anyone would be allowed to drive at such speeds.) The limitations in my previous statements do not relate to hardware limitations but to human limitations.
With regard to the performance of future games using Parhelia chips, this would depend upon well-written drivers and on hardware/feature optimisations and support by game developers. If a future game runs faster in standard mode on another chip, while in fact nobody with a Parhelia card will want to use standard mode, that fact is close to meaningless in real life.
Unfortunately I’m one of those people who can get motion sickness from playing 3D games. I can take a certain level of smoothness, but go higher and I just become queasy.
My friends used to play Duke Nukem on my PC but it was far too smooth for me (all in software on a P133!) and I remember feeling quite ill one day after one had been playing for about an hour.
So I set the quality to maximum and res as high as possible.
I don’t want it slow and jumpy, as that makes a game unplayable; I just don’t want it too smooth. Above 28 FPS is smooth.
I can however get used to playing 3D games with time.
Mike Bouma wrote:
>Refresh rates relate to a stable (flickering/flicker-free)
>screen output (screen updates). 50 Hz is just fine
>for almost anybody; increasing the refresh rate
>further decreases flickering, reducing eye strain (the
>eye gets less tired when watching the screen), but
>almost nobody can notice a change above 60/70 Hz.
Depends what you are used to. I have refresh set to 85Hz, if I was to lower it to 60Hz I would not only notice the flicker badly but I’d very rapidly get sore eyes.
I used to use 50Hz on an Amiga (ouch!) but 72Hz was pretty comfortable. I can set my monitor higher but 90Hz seems uncomfortable for some strange reason. I can sit in front of 85Hz for hours and hours.
If you want to see screen flicker, look about a foot away from the screen; you can’t see 85Hz flicker directly, but you begin to notice it as the refresh rate drops. Go into a TV shop and notice how all the TVs flicker like hell! The 100Hz ones don’t flicker at all, however.
One day I’ll get an LCD and it’ll all be a moot point.
If you think that 50hz is fine, you need to get your eyes checked.
I can’t stand 70hz, it still hurts too much (especially on white backgrounds). I actually run my monitor at a lower resolution than I’d like just to get an 85hz refresh, which is the lowest I can tolerate with a white background.
> I can take a level of smoothness but go up higher and I
> just become queazy.
Are you sure a higher level of smoothness relates to your motion sickness? Some small tips: some people with this kind of motion sickness have worse problems while watching others play the game, and fewer problems while playing from a third-person viewpoint (if possible).
In many simulated 3D environments, you receive lots of visual information through your eyes telling you that you are moving, while on the other hand you receive information from your vestibular system that you are stationary. Such conflicting signals can confuse your mind.
A high level of realism can have a negative effect as well. Blockier graphics, fewer lighting effects, or fewer other special effects could help. Also, less fast-paced, action-packed games give the brain more time to anticipate and process incoming (conflicting) information.
Also, a constantly varying FPS while playing a game can aggravate the problems, for instance when the game drops to 18 FPS, goes up to 24 FPS, down to 12 FPS, etc. A constant frame rate of at least 24 FPS would probably be desirable.
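The frame-pacing point above can be made concrete with a toy comparison (a hypothetical sketch; the sample FPS readings and function name are my own, chosen so both runs average out the same):

```python
def frame_times_ms(fps_samples):
    """Convert instantaneous FPS readings to per-frame times in milliseconds."""
    return [1000.0 / fps for fps in fps_samples]

# Two hypothetical runs with the same average FPS: one steady,
# one bouncing between 12 and 24 FPS as described in the comment.
steady      = [18, 18, 18, 18, 18, 18]
fluctuating = [12, 24, 18, 12, 24, 18]

for label, run in (("steady", steady), ("fluctuating", fluctuating)):
    times = frame_times_ms(run)
    avg = sum(run) / len(run)
    spread = max(times) - min(times)  # jitter between longest and shortest frame
    print(f"{label}: avg {avg:.0f} FPS, frame-time spread {spread:.1f} ms")
```

Both runs average 18 FPS, but the fluctuating one has frames that vary by over 40 ms in duration, which is the visible stutter the commenter is describing.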
> I used to use 50Hz on an Amiga (ouch!) but 72Hz was pretty comfortable.
Many Amiga users in the past did not even use computer monitors, but used 50/60Hz TVs instead. Generally not bad for playing games, but for more serious stuff like word processing or drawing, it will really take its toll over time. IMO a monitor with at least 70 Hz should be used for anything serious done for long periods of time. I couldn’t believe how flickery early PC monitors were; poor employees who had to sit staring at them all day…
> If you think that 50hz is fine, you need to get your eyes checked.
It depends upon what you are doing. 50 Hz for playing games or watching movies is just fine for most people. If you get headaches from watching movies, buying a 100 Hz TV could help. But if you need to concentrate on the screen for a long time (i.e. as a programmer, writer, or graphics artist), then IMO 50 Hz simply isn’t good enough. It can contribute to headaches and/or weariness.
> I can’t stand 70hz
Yes, you should use 85 Hz then; there are always exceptions. BTW, as I mentioned the placebo before: interestingly, there are people who, when they think they are running their screen at a higher refresh rate (i.e. above 70 Hz, while in fact they are not), are cured of their problems. That’s also a reason for doctors to sometimes give placebo medicine to their patients, as of course only the end result really counts.
there MUST be something seriously wrong with your eyes.. go see a doctor..
50hz is enough for everyone? that’s why philips only makes 100hz tv’s? ..what the fuck are you talking about?!? the eye can’t see more than xx frames per second?!?.. obviously you never played any quake.. there’s a god damn good reason people want a sustained 75, 85, 90fps (pick your favorite monitor refresh in hz)
let me ask you a question.. why do monitors allow more than 50hz when the human eye can’t tell the difference..?
i only read your first post.. then i skipped the rest.. because nothing you wrote made ANY sense what-so-ever..
Brunis, you are confused. You mix FPS with refresh rates and state 90 FPS while you probably mean 90 Hz.
Just some food for you to think about:
1) Most TVs used around the world have 50 or 60 Hz refresh rates. Only a small minority of viewers have problems (i.e. “the screen flickering is giving me a headache!”). BTW, watching TV can cause a headache regardless of the refresh rate.
2) Most movies use 24 FPS (relates to smooth animations instead of screen flicker). Do you hear many people complain about this after seeing a movie in a cinema or on television? (i.e. that movie was really jerky!)
> i only read your first post.. then i skipped the rest..
> because nothing you wrote made ANY sense what-so-ever..
Well, I explained it as well as I could. I understand physics and biology aren’t everyone’s strong subjects. During my university education I scored some of the highest grades in my class on such subjects. If you specify which part of my postings you don’t understand, I would be glad to write some more background information for you.
I don’t know what’s up with your eyes, but when, for example, I limit Half-Life to 24fps and then take the limit out so it runs at the screen refresh rate, which is 75hz, I actually _DO_ see that 75fps is WAY smoother. 24fps is smooth enough, my ass.
When making a film, you’ve got to be cognizant of your frame rate both when shooting the film and when you are doing your CG.
On the CG side of things, there is quite a bit of work done to make sure that animations and motion look smooth at 24FPS. This is no mean feat considering the vast resolution of film and the complexity of CG effects.
When shooting film, one has to be wary of what is called strobing when panning and tilting a stationary camera.
From film shooting lessons —
“Panning is used to follow a moving object or character, or to show more than can fit into a single frame, such as panning across a landscape. It is also used as a transition between one camera position and another.”
“Inexperienced operators may pan too fast and cause an effect known as strobing. This is also a problem in CG and is called tearing. This can cause motion sickness or cause the illusion of motion to be broken. For example, for an animation at 30 fps, the number of frames needed for a 45 degree pan would be about 22 frames for a quick turn or 66 frames for a casual turn.”
“One way to avoid strobing is to use scene motion blur when rendering. This blur is done by sharing information between frames. Note that this is a scene motion blur where a scene shares information from the prior and next scenes. This is not the same as object motion blur. ”
So if the entire film production chain were capable of 100FPS, you’d find even more amazing action shots, effects, etc. The film would look incredibly more real to the viewer. Our eyes can certainly process more than 24hz! The director and cinematographer could move the camera at higher speeds, tracking motion much more easily.
Just as all the comments we had from game players on frame rates indicated that motion-intensive games do indeed look much better (to most people) at higher frame rates. Many games would be unplayable at 24FPS as there is no post-processing engine that takes the realtime CG of a game and processes it to be smooth at 24FPS.
I hope this clears up some of the mystery regarding 24FPS and its limitations. There is a good reason that a celebrated and accomplished film producer says that 100FPS would be a wonderful thing for making movies. Just as 100FPS for playing Doom 3 is also a wonderful thing 😉
Pure refresh rate on a monitor is not the sole contributor to eyestrain.
The persistence of the phosphor plays a major part too. Televisions have a much higher-persistence phosphor than computer monitors. This basically means the image fades from view much more slowly, so the flickering you notice on a monitor at 60-70 hz is not visible on your TV at 50 hz, since the image hasn’t faded as much.
Computer monitors can’t use high persistence phosphors with modern GUI systems since you’d see a noticeable blur when moving your mouse around the screen.
Another factor is your angle of view on the refreshing display. The sides of your retina have a greater sensitivity to illumination but are less sensitive to colour. If you look out of the corner of your eye at a monitor (or, for some people, fluorescent tubes) you may detect a flicker; however, when you look directly at the source it disappears.
Fundamentally, TVs and monitors have different display properties, but then each person has different sensitivities. As I write this message I’m typing at the bottom of the screen, and I can detect flickering at the top of my vision (see previous paragraph); however, it doesn’t bother me. Other people might hate it and find it painful after only a few seconds.
I just tried it myself with Half-Life, switching from a 24 to a 72 FPS max and back again. I don’t see much difference (if any) at all. Even 15 FPS did not bother me that much, although I could notice the difference. Below 15 FPS it started to get really annoying.
I’m not alone: I asked someone to watch both my PCs carefully to see if he saw a difference between the two Half-Life installs running the intro scene. At first I did not say what to look for. After a while, when he didn’t see anything, I asked if he knew which PC had displayed the higher FPS. He chose wrongly, as he expected my faster machine to display the higher FPS! Interesting tests, IMO. Maybe you use larger displays, so that could be a factor? (I only have two 15-inch HP monitors.)
> The persistence of the phosphor plays a major part too.
You’re right, that is an important factor as well. But I think whether the display output is in motion or motionless is more important. If you need to focus on, for example, just still text, then 50/60 Hz is not enough over a longer period of time. That’s why I distinguished between games/movies and displays without much moving around the screen.
> you’d see a noticeable blur when moving your mouse
> around the screen.
Well, while using for example an Amiga in combination with a good SCART TV, this generally doesn’t cause much trouble in my personal experience. The only time you would sometimes notice it is when you move a highly colorful mouse pointer over an entirely black background.
Good/interesting points though.
>In many simulated 3D environments, you receive lots
>of visual information through your eyes telling you
>that you are moving, while on the other hand you
>receive information from your vestibular system that
>you are stationary. Such conflicting signals can
>confuse your mind.
Indeed, sort of like travel sickness but in reverse. If you are prone to travel sickness:
Make sure you can see out, i.e. sit by a window; that way you can see and feel the movement. Listening to music seems to help a bit, as does staying cool. If you can’t see out, close your eyes. It all works – I know!
>Also a big difference in FPS all of the time while
>playing a game can aggravate problems. For instance
>where the game drops to 18 FPS, goes up to 24 FPS and
>down to 12 FPS, etc. A constant FPS rate of at least
>24 FPS would probably be desirable.
I’ve found that it was the smoothness that seemed to have the greatest effect; once the game dropped below a certain level of smoothness, however, I was fine and didn’t get ill (or at least got ill a lot slower).
>> I used to use 50Hz on an Amiga (ouch!) but 72Hz
>>was pretty comfortable.
>Many Amiga users in the past did not even use computer
>monitors, but used 50/60Hz TVs instead. Generally not
>bad for playing games, but for more serious stuff
>like wordprocessing or drawing, it will over time
>really take its toll.
Been there done that, you can get used to it but even then the hours were limited. The quality of the TV made a difference as well, Philips TVs have the SCART fully wired up and gave a pretty good image – they in effect were RGB monitors, most TVs don’t wire up the SCART properly though.
However, the Amiga allowed double the resolution in interlaced mode (in effect 25Hz), in which it was difficult to watch anything other than photographs for more than a few seconds. However, if you put the brightness and contrast right down low, it could actually be used.
>IMO a monitor with at least
>70 Hz should be used for anything serious for long
>periods of time.
I agree, but 85Hz is better still. Like CPU guy, I always do this even if it means a lower res.
The problem is that on a CRT (monitor or TV) the picture is drawn pixel by pixel, but in a cinema the whole frame is projected at once.
I guess the cinema screen is not entirely flicker-free. Film is 24FPS, though each frame is projected twice (the shutter has two blades), making the refresh rate 48Hz. Maybe newer projectors work differently?
Regardless, I believe there’s even more eye strain to deal with in cinemas, especially when you sit closer to the screen, as your eyes have to move all the time to different parts of the screen. You would probably fall asleep or get a headache if you coded on such a screen for more than half an hour.
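The shutter arithmetic mentioned a couple of comments up is simple enough to write down (a minimal sketch; the 24 FPS and two-blade figures come from that comment, and the constant names are mine):

```python
# Film advances at 24 frames/s, but the rotating shutter interrupts the
# light once per blade per frame, so the perceived flicker rate is the
# frame rate multiplied by the blade count.
FILM_FPS = 24
SHUTTER_BLADES = 2            # the two-blade shutter described above

flicker_hz = FILM_FPS * SHUTTER_BLADES
print(flicker_hz)             # 48 -> the 48Hz figure from the comment
print(FILM_FPS * 3)           # a three-blade shutter would flash at 72Hz
```

So the projector raises the flicker rate above the 24 FPS content rate without showing any new frames, which is the same trick the double-field interlacing discussion described for TVs.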
A good and simple to understand article about how TV/monitor displays work can be found here: