Linked by Thom Holwerda on Fri 9th Oct 2009 11:47 UTC
Hardware, Embedded Systems The future of integrated graphics processors lies somewhere on the dies of future processors, that much is a certainty. However, this creates a big problem for NVIDIA, whose chipset business will be, well, out of business. Beating everybody to the punch, the company announced yesterday that it is ceasing all development on future chipsets, citing unfair business practices from Intel.
Apple
by biffuz on Fri 9th Oct 2009 12:45 UTC
biffuz
Member since:
2006-03-27

For Apple it's more convenient to move to AMD CPUs now, to continue using nVidia or ATI (AMD) GPUs, which are way more powerful than Intel's.

This nVidia move actually helps ATI more than it damages Intel.

Reply Score: 3

RE: Apple
by FreakyT on Sat 10th Oct 2009 15:10 UTC in reply to "Apple"
FreakyT Member since:
2005-07-17

For Apple it's more convenient to move to AMD CPUs now(...)


I dunno, I doubt they would. Apple likes standardization, and switching to AMD would entail at least some variation in the availability of instruction set extensions. Also, if I'm not mistaken, current-generation AMD chips run hotter than their Intel counterparts, something that could present a problem for Apple's ever-shrinking form factors.

Reply Score: 2

RE: Apple
by torbenm on Mon 12th Oct 2009 10:18 UTC in reply to "Apple"
torbenm Member since:
2007-04-23

For Apple it's more convenient to move to AMD CPUs now, to continue using nVidia or ATI (AMD) GPUs, which are way more powerful than Intel's.


I have long suspected Apple of planning a move to ARM for their desktop and laptop machines (so it would be ARM all the way from iPod to desktop). Apple stated compute power per watt as a major reason for changing from PowerPC to x86, so moving once again, this time to ARM, would fit this reasoning. At the time of the change from PowerPC, ARM did not have any processors with sufficient compute power to replace PowerPC, but now they do. And Apple is investing a lot of money in in-house hardware design, which most people believe to be focused on ARM SoCs.

In any case, if Apple does go the ARM way, it would not matter to them if NVidia won't make Intel-compatible chipsets anymore.

As for NVidia making an x86-compatible processor, they could just make a good JIT for translating x86 to ARM. Digital did this for running x86 on their Alpha, and Transmeta used a similar approach with their Crusoe processor (except that the translation was done on-chip). So it is eminently possible. It might be a bit slower than running native ARM code, but if all the heavy-duty processing (graphics etc.) is done natively, the overall slowdown wouldn't be great.
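
To make the idea concrete, here is a minimal sketch of the translate-and-cache loop at the heart of a dynamic binary translator like FX!32 or Transmeta's Code Morphing Software. Everything in it is hypothetical and for illustration only: the guest opcodes and the translate() helper are made up, and a real x86-to-ARM translator would emit native ARM basic blocks instead of the threaded-code stand-in used here. The point is only the mechanism: translate each block once, cache it, and reuse the cached result every time the guest jumps back to it.

```c
/*
 * Toy dynamic binary translation loop (hypothetical guest ISA, NOT real
 * x86 or ARM). "Translation" here produces threaded code (an array of
 * host function pointers) instead of native machine code, but the
 * translate-once / cache / re-execute structure is the same.
 */
#include <stdio.h>
#include <stdint.h>

enum guest_op { G_ADDI, G_JMP, G_HALT };            /* made-up guest opcodes */
struct guest_insn { enum guest_op op; int arg; };

/* A tiny guest program: add 5, jump over the first halt, add 1, halt. */
static const struct guest_insn guest_mem[] = {
    { G_ADDI, 5 }, { G_JMP, 3 }, { G_HALT, 0 }, { G_ADDI, 1 }, { G_HALT, 0 },
};
#define NGUEST (sizeof guest_mem / sizeof guest_mem[0])

struct cpu { int64_t acc; size_t pc; int halted; }; /* emulated guest state */

/* "Translated" form of one guest instruction: a host callback plus operand. */
typedef void (*host_fn)(struct cpu *, int);
struct host_insn { host_fn fn; int arg; };

static void h_addi(struct cpu *c, int a) { c->acc += a; c->pc += 1; }
static void h_jmp (struct cpu *c, int a) { c->pc = (size_t)a; }
static void h_halt(struct cpu *c, int a) { (void)a; c->halted = 1; }

/* Translation cache: each guest instruction is translated at most once. */
static struct host_insn cache[NGUEST];
static int cached[NGUEST];

static const struct host_insn *translate(size_t pc)
{
    if (!cached[pc]) {
        switch (guest_mem[pc].op) {
        case G_ADDI: cache[pc] = (struct host_insn){ h_addi, guest_mem[pc].arg }; break;
        case G_JMP:  cache[pc] = (struct host_insn){ h_jmp,  guest_mem[pc].arg }; break;
        case G_HALT: cache[pc] = (struct host_insn){ h_halt, 0 };                 break;
        }
        cached[pc] = 1;
    }
    return &cache[pc];
}

int main(void)
{
    struct cpu c = { 0, 0, 0 };
    while (!c.halted) {                 /* dispatch: look up, translate if new, run */
        const struct host_insn *h = translate(c.pc);
        h->fn(&c, h->arg);
    }
    printf("guest finished, acc = %lld\n", (long long)c.acc);
    return 0;
}
```

Because the cache is hit on every execution after the first, the translation cost amortises away for hot code, which is why the overall slowdown can stay modest as long as the heavy lifting (graphics, codecs) runs natively.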

Reply Score: 2

License?
by Narishma on Fri 9th Oct 2009 13:12 UTC
Narishma
Member since:
2005-07-06

nVidia doesn't have a license to make x86 CPUs, does it? If not, I doubt Intel would be willing to sell them one, and I don't know if AMD can or would want to.

Reply Score: 2

RE: License?
by Thom_Holwerda on Fri 9th Oct 2009 13:13 UTC in reply to "License?"
Thom_Holwerda Member since:
2005-06-29

They could always buy VIA, or sue Intel for antitrust stuff, I guess.

Reply Score: 2

RE[2]: License?
by fithisux on Fri 9th Oct 2009 14:09 UTC in reply to "RE: License?"
fithisux Member since:
2006-01-22

VIA is a viable option with the Nano. The same holds for the new dual-core ARM. Finally, there is plenty of opportunity with the Godson processor (it aims to be dual-core).

Reply Score: 3

RE[2]: License?
by Bill Shooter of Bul on Fri 9th Oct 2009 14:46 UTC in reply to "RE: License?"
Bill Shooter of Bul Member since:
2006-07-14

Via chips aren't suffering due to lack of name recognition; they are suffering due to poor performance. My parents bought a computer with a Cyrix (the precursor to VIA's) processor; it was cheaper and almost as powerful as the cheapest Intel. I thought VIA would be able to boost performance when they bought Cyrix, and be able to develop some really competitive offerings thanks to their experience building some of the best chipsets, but that proved to be wrong. Sure, VIA has a niche in the low-power/embedded x86 market, but it's not good enough for general consumers.

Reply Score: 2

RE[3]: License?
by kaiwai on Sat 10th Oct 2009 11:05 UTC in reply to "RE[2]: License?"
kaiwai Member since:
2005-07-06

Via chips aren't suffering due to lack of name recognition; they are suffering due to poor performance. My parents bought a computer with a Cyrix (the precursor to VIA's) processor; it was cheaper and almost as powerful as the cheapest Intel. I thought VIA would be able to boost performance when they bought Cyrix, and be able to develop some really competitive offerings thanks to their experience building some of the best chipsets, but that proved to be wrong. Sure, VIA has a niche in the low-power/embedded x86 market, but it's not good enough for general consumers.


And thus you have proven why your whole post is wrong - the last time you used a VIA CPU was when it was called Cyrix. Right now VIA suffers because the chipsets that support its CPUs are crap: buggy and unreliable whether running Windows or Linux. It's the same sort of bugginess one saw when the only chipsets available for AMD years ago were from VIA. What VIA needs is for its low-power cores to be optimised for performance, multi-core and hyperthreading; couple that with a good chipset from Nvidia and there would be a viable alternative.

Nvidia also needs to focus on getting their GPUs to do more work when it comes to audio and video compression, and on improving gaming performance. Nvidia + VIA could work, but I am not a fan of Nvidia, simply because they've demonstrated that they have no interest in fixing the manufacturing flaws which have caused massive recalls of 8400 GPUs and faulty chipsets.

There is a reason why I'll be avoiding all Nvidia products in the future - and if that means I can't upgrade to new Apple products, then that is a choice I'm willing to make. Apple needs to realise that Nvidia's product quality is crap; they should stick to Intel CPUs and chipsets, with graphics supplied by AMD/ATI. More people are taking my position after seeing the fallout of Nvidia's poor quality control, and we're happy we bought our MacBooks when the Intel X3100 was used.

Reply Score: 2

SOC madness
by bnolsen on Fri 9th Oct 2009 13:24 UTC
bnolsen
Member since:
2006-01-06

Well, not madness, but the new Core i7 is pushing more and more into SoC territory. Putting everything on one die is faster and ends up being cheaper for whole systems. NVidia doesn't have a place in this new world.

It's about time x86 and Wintel had more competition. Right now Intel rules (at the high end), but it's just a matter of time before they start making P4/Rambus-type mistakes again.

Reply Score: 4

AMD dominates nothing
by kragil on Fri 9th Oct 2009 13:31 UTC
kragil
Member since:
2006-01-04

I think that is sad and unhealthy and should be investigated, but it is the truth.

Intel dominates .. AMD is struggling to stay alive.

Maybe Apple might help now. A Mac with a powerful AMD/ATI interior would be nice... I don't think Steve drinks the Larrabee (or whatever it is called) Kool-Aid.

Reply Score: 2

RE: AMD dominates nothing
by Verunks on Fri 9th Oct 2009 14:21 UTC in reply to "AMD dominates nothing"
Verunks Member since:
2007-04-02

I don't know if it's still true, but AMD processors usually run hotter than Intel's, and considering that one of the selling points of a Mac is low fan noise, they wouldn't fit very well.

Reply Score: 2

RE[2]: AMD dominates nothing
by Hiev on Fri 9th Oct 2009 14:38 UTC in reply to "RE: AMD dominates nothing"
Hiev Member since:
2005-09-27

So true, so true.

Reply Score: 3

Bill Shooter of Bul Member since:
2006-07-14

Low fan noise is a Mac selling point? Must be one of those little things I never pay attention to. Dells are pretty quiet as well. Come to think of it, I haven't run across any loud computers (non-server-grade equipment) in a while. My AMD quad-core beast is whisper-quiet with just the stock fan.

Maybe the thermal properties aren't appropriate for the more compact iMac form factors, but noise shouldn't be a problem in the Mac Pros or even the Mac minis.

Reply Score: 2

RE[2]: AMD dominates nothing
by StephenBeDoper on Fri 9th Oct 2009 22:40 UTC in reply to "RE: AMD dominates nothing"
StephenBeDoper Member since:
2005-07-06

I don't know if it's still true, but AMD processors usually run hotter than Intel's, and considering that one of the selling points of a Mac is low fan noise, they wouldn't fit very well.


The Athlon XP ran quite hot, especially at higher clock speeds. But AMD's 64-bit CPUs seem to run relatively cool - it probably also helps that the heatsinks and heatsink fans are larger (a larger fan can move the same volume of air while spinning at a lower speed, making it quieter).

Reply Score: 3

RE[2]: AMD dominates nothing
by agnosticnixie on Sat 10th Oct 2009 02:37 UTC in reply to "RE: AMD dominates nothing"
agnosticnixie Member since:
2009-08-20

AMD also seems to be lacking in the below-35W segment, last I checked; it might have gotten better, but one of the problems with AMD has often been how power-hungry their chips are. Sure, the CPU is not everything in power consumption, but a 10W jump will be felt hard, and let's face it, the best-selling Macs are probably the MacBooks.

Reply Score: 1

RE: AMD dominates nothing
by JAlexoid on Fri 9th Oct 2009 20:32 UTC in reply to "AMD dominates nothing"
JAlexoid Member since:
2009-05-19

I think that is sad and unhealthy and should be investigated, but it is the truth.

Intel dominates .. AMD is struggling to stay alive.

Maybe Apple might help now. A Mac with a powerful AMD/ATI interior would be nice... I don't think Steve drinks the Larrabee (or whatever it is called) Kool-Aid.


What do you think Apple will do to Intel? Really...
Apple's computer business is not a big enough deal for Intel, considering Intel might actually take the GPU market and convert it into an SoC market, all for themselves.
Then everyone will be standing and watching the "Umbrella" corporation of microprocessors.
Hell, they aren't even taking their chances with the Wintel duopoly (I'm referring to Moblin).

Reply Score: 2

this patents suck
by puenktchen on Fri 9th Oct 2009 13:37 UTC
puenktchen
Member since:
2007-07-27

nvidia shouldn't need a license either for using the bus architecture or for building x86 cpus. patenting these kinds of protocols and isas shouldn't be possible in the first place. if it really is, intel should be obliged to license them because of its dominant market position.

Reply Score: 10

RE: this patents suck
by Yamin on Fri 9th Oct 2009 16:59 UTC in reply to "this patents suck"
Yamin Member since:
2006-01-10

ssssssshhhhhh.

hardware patents are good. software patents are bad.

//sorry, couldn't resist. One of my pet peeves ;)

Intel restricting the x86 instruction set... might as well allow Microsoft to patent C# and .NET so no one can create a compatible compiler.

Reply Score: 2

RE: this patents suck
by Kalessin on Fri 9th Oct 2009 22:33 UTC in reply to "this patents suck"
Kalessin Member since:
2007-01-18

It certainly comes across as majorly anti-competitive. I mean, when you have to pay a company to be able to become their competitor, there's definitely a problem.

I don't know enough to say whether hardware patents are always bad, but this seems like a clear-cut case where they're causing major problems with competition. All the x86 patents and their ilk are doing is helping Intel become a monopoly. They aren't an outright monopoly, but they're close enough to cause problems. However, much as we'd like the patent situation to change, I doubt that it will anytime soon.

Reply Score: 1

Apple
by FunkyELF on Fri 9th Oct 2009 14:33 UTC
FunkyELF
Member since:
2006-07-26

Apple is in an interesting position here. The Cupertino company has more or less standardised on using Intel processors with NVIDIA chipsets (the 9400M), so it will be interesting to see what Apple will use for future Macs.


You don't need good graphics to run iTunes.

I think current Intel graphics should be good enough to run the 3 games that work in OSX since all 3 are about 5 years old.

Apple using NVidia was wishful thinking on Apple's part trying to get developers to use their platform. Fail.

Reply Score: 0

RE: Apple
by wargum on Fri 9th Oct 2009 15:58 UTC in reply to "Apple"
wargum Member since:
2006-12-15

Now with the introduction of OpenCL in Snow Leopard, the power of the GPU will become more important. Not yet, but in a year or two there should be some apps that really take advantage of the GPU via OpenCL. Intel has nothing to offer in this area right now, only promises for the future, which Intel is well known for not delivering on.

It would be foolish of Apple not to look into offering AMD solutions, now. Even if it's just for the iMacs and the mini. 'Cos they like to have options ;-)
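
As an aside, here is a minimal sketch of what "apps that really take advantage of the GPU via OpenCL" look like from the programmer's side: a vector-add kernel handed to whatever OpenCL device the platform reports. It is a generic, hypothetical example (the kernel, buffer names and sizes are invented, and error checking is omitted), not code from any shipping application; on Mac OS X 10.6 something like it should build with `cc vadd.c -framework OpenCL`.

```c
/* Hypothetical minimal OpenCL host program: add two vectors on the GPU.
 * Error checking is omitted for brevity. */
#include <stdio.h>
#include <OpenCL/opencl.h>   /* on non-Apple platforms: #include <CL/cl.h> */

static const char *src =
    "__kernel void vadd(__global const float *a,"
    "                   __global const float *b,"
    "                   __global float *c) {"
    "    size_t i = get_global_id(0);"
    "    c[i] = a[i] + b[i];"
    "}";

int main(void)
{
    enum { N = 1024 };
    float a[N], b[N], c[N];
    for (int i = 0; i < N; i++) { a[i] = (float)i; b[i] = 2.0f * i; }

    /* Pick the first platform/device the runtime offers. */
    cl_platform_id plat; cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_DEFAULT, 1, &dev, NULL);

    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    /* Build the kernel from source at runtime, as OpenCL applications do. */
    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "vadd", NULL);

    /* Copy inputs to device buffers, run one work-item per element, read back. */
    cl_mem da = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof a, a, NULL);
    cl_mem db = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR, sizeof b, b, NULL);
    cl_mem dc = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, sizeof c, NULL, NULL);
    clSetKernelArg(k, 0, sizeof da, &da);
    clSetKernelArg(k, 1, sizeof db, &db);
    clSetKernelArg(k, 2, sizeof dc, &dc);

    size_t global = N;
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, dc, CL_TRUE, 0, sizeof c, c, 0, NULL, NULL);

    printf("c[10] = %.1f (expected 30.0)\n", c[10]);
    return 0;
}
```

The same kernel source runs unchanged on any device with an OpenCL driver, which is exactly why the quality of the GPU and of its OpenCL support starts to matter to ordinary applications.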

Reply Score: 1

RE[2]: Apple
by JAlexoid on Fri 9th Oct 2009 20:35 UTC in reply to "RE: Apple"
JAlexoid Member since:
2009-05-19

Using "Apple" and "options" in one sentence is an oxymoron.

Reply Score: 4

RE[3]: Apple
by wargum on Sat 10th Oct 2009 13:43 UTC in reply to "RE[2]: Apple"
wargum Member since:
2006-12-15

I was referring to the sentence "We like to have options" that occurs a few times in Apple's history, that's all ;-)

Reply Score: 1

RE: Apple
by ple_mono on Fri 9th Oct 2009 16:01 UTC in reply to "Apple"
ple_mono Member since:
2005-07-26

I think current Intel graphics should be good enough to run the 3 games that work in OSX since all 3 are about 5 years old.

Do you really think anyone doing heavy movie editing would agree with you? 3D modelling? Don't OpenCL and CUDA count as reasons to have a beefy GPU even if you don't play games?

I for one am happy I can continue playing BF2142 in Boot Camp. That wouldn't have been possible using Intel graphics.

Reply Score: 5

hey EU...
by poundsmack on Fri 9th Oct 2009 15:29 UTC
poundsmack
Member since:
2005-07-13

Want to do something really useful for a change? Anyone else think that Intel might be abusing a monopoly here just a little? I used to love the nvidia chipsets, good times indeed. Sad to see them going. Looking forward to ION2 for Intel's low-power chips and VIA's Nano, one more month now...

Reply Score: 3

RE: hey EU...
by helf on Fri 9th Oct 2009 18:43 UTC in reply to "hey EU..."
helf Member since:
2005-07-06

If Intel is integrating almost all the functions the northbridge used to provide... then what exactly is the point of nvidia "chipsets"? The southbridge is just for IO, and there isn't a lot you can do to improve that at the moment. So, what is the EU going to do? Force Intel to keep two lines going forever - one with everything integrated, and one with an external support chip - just so nvidia can continue making chips?

Reply Score: 3

RE[2]: hey EU...
by JAlexoid on Fri 9th Oct 2009 20:38 UTC in reply to "RE: hey EU..."
JAlexoid Member since:
2009-05-19

Making a product better does not count as abuse of a dominant position. Otherwise we would not even have combine harvesters, because farm workers would have risen up against them.

Reply Score: 2

RE[2]: hey EU...
by Soulbender on Sun 11th Oct 2009 21:13 UTC in reply to "RE: hey EU..."
Soulbender Member since:
2005-08-18

Of course. Don't you know? When your business is failing, it's never because you make crappy products or bad management decisions. It's always someone else's fault: the legislation, the competition being better... err... unfair, the moon's gravitational pull, not enough virgins to sacrifice, etc.
Important point: it's never your fault.

Reply Score: 2

Comment by robojerk
by robojerk on Fri 9th Oct 2009 16:35 UTC
robojerk
Member since:
2006-01-10

Nvidia should at least consider buying VIA. That way they could still keep chipsets like ION around, competing in the netbook/nettop/embedded market.

After a few years, they'd be able to get into high-end CPUs again.

Edited 2009-10-09 16:53 UTC

Reply Score: 1

RE: Comment by robojerk
by fithisux on Fri 9th Oct 2009 17:11 UTC in reply to "Comment by robojerk"
fithisux Member since:
2006-01-22

VIA has crypto acceleration. Can you imagine OpenSolaris on a Nano+9400M ?

Reply Score: 3

RE: Comment by robojerk
by dpanov on Fri 9th Oct 2009 17:26 UTC in reply to "Comment by robojerk"
dpanov Member since:
2009-01-12

Wasn't there some kind of an agreement between Intel and VIA that if someone buys VIA the licenses are not transferred to the buyer?

Well maybe it's not formulated exactly like that, but I remember reading something like that here on OSNews.

Reply Score: 1

what about open source?
by minusf on Fri 9th Oct 2009 17:29 UTC
RE: what about open source?
by helf on Fri 9th Oct 2009 18:44 UTC in reply to "what about open source?"
helf Member since:
2005-07-06

What the hell does this have to do with the article?

Reply Score: 3

RE[2]: what about open source?
by minusf on Sat 10th Oct 2009 00:09 UTC in reply to "RE: what about open source?"
minusf Member since:
2009-06-07

i merely reacted to the author's hope that nvidia will (re)enter the chips/chipsets market once again with integrated technology.

every piece of hardware needs a driver. how will you write it if you don't have documentation?

all i said was that nvidia never was, and so far isn't, open source friendly.

a fat lot of good it will do us to have competition if you can't write drivers for it.

Reply Score: 3

RE[3]: what about open source?
by DigitalAxis on Sat 10th Oct 2009 04:25 UTC in reply to "RE[2]: what about open source?"
DigitalAxis Member since:
2005-08-28

Well, nVidia might not be good at opening up their source code or letting third parties write drivers, but they DO make good drivers themselves. I never had problems with nVidia drivers when I had a dedicated nVidia chip; at work we're having problems with ATI graphics cards that apparently don't have free drivers, and the proprietary ATI drivers (better than they were in terms of installation, etc.) are causing everything to briefly and randomly lock up. THAT took a while to trace back to the ATI drivers...

nVidia seems to have 64-bit drivers, while AMD apparently wants you to install 32-bit drivers with 32-bit libraries in place to use them. And AMD is releasing new drivers about once a month that mostly, repeatedly, try to fix dual-monitor issues and problems with their own control console.

Reply Score: 4

RE[4]: what about open source?
by blitze on Sat 10th Oct 2009 05:49 UTC in reply to "RE[3]: what about open source?"
blitze Member since:
2006-09-15

When it comes to Windows, AMD/ATI drivers are great. No longer am I beholden to constant leaked beta drivers and hard OS locks. AMD releases drivers once a month, and I have yet to have the system lock up on me requiring the obligatory one-fingered salute.

Reply Score: 2

jabbotts Member since:
2007-09-06

The problem is that Nvidia provides a good driver even though it is closed. Install through the Mandriva install wizard, install from the Debian repository, install from the Nvidia binary download; they all work cleanly and provide stable performance. Further still, Nvidia's own developers also contribute to the community driver project.

As an end user, I'd happily move to the community driver if it shows the same or better stability and function coverage.

By contrast, the ATI binary was horrid when last I tried it (fps in the 20-frames kind of horrid). The community driver didn't support all the functions of the graphics card, but I could get a reasonable FPS rate out of it. I don't know how the current drivers are doing now that AMD has provided information, though.

The point is that in the case of my ATI a few years back, it was the community driver providing stability and performance, whereas now, with Nvidia, the community driver has not caught up to the closed binary.

I wouldn't say Nvidia is leaving anyone out in the cold as long as that binary installer remains available. They choose to go it alone with driver development, though, and that's fine provided they can offer equal or better support than a community driver would. This also assumes that the community inherits the required information if Nvidia should decide to stop development of older hardware drivers or *nix drivers entirely.

A better company to look at would be Broadcom and others who have truly left *nix platforms out in the cold, with absolutely no chance of support without reverse-engineering the interface specs.

Reply Score: 3

minusf Member since:
2009-06-07

i am sure the binary nvidia driver is wonderful. however i don't use linux and/or any of the distros you mentioned. nvidia does not provide drivers for a lot of systems, an important issue esp. on osnews, the site that is not concerned only with the penguin.

yes, broadcom is just as bad, and there are more companies like that. i try to stay away from them as much as possible.

i don't care how much fps the nvidia driver has; if i want to play games, i'll use a different OS. but i want my hard drive to be usable through the crappy nvidia ahci chipset long after nvidia's gone out of business or transcends to a higher dimension or whatever happens to them.

blobs are bad for your health.

Reply Score: 6

robojerk Member since:
2006-01-10

Mind telling us what open source OS you use that you expect excellent drivers for?

Reply Score: 1

minusf Member since:
2009-06-07

i expect excellent drivers for any and all OSes :]

i use openbsd.

Reply Score: 1

jabbotts Member since:
2007-09-06

It was hardware that kept me from going to OpenBSD or FreeBSD. A fully open driver for hardware would be the ideal, I think, as well. In the absence of an open driver, one has to go with the driver that works. Unfortunately, being selective about hardware purchases is still a requirement with less mainstream platforms: get your list of wanted hardware, then compare it with your lists of supported hardware. This process sucks, but it will remain as long as hardware vendors continue to treat non-Windows users like lepers.

Reply Score: 2

strcpy Member since:
2009-05-20

Yes.

But increasingly the "four freedoms" are moot. NDAs here and NDAs there.

On the subject, I'd like to know: how many engineers not employed by Intel work on Intel's X drivers?

Edited 2009-10-11 12:43 UTC

Reply Score: 3

gustl Member since:
2006-01-19

But increasingly the "four freedoms" are moot. NDAs here and NDAs there.


Our only hope of gaining the 4 freedoms with respect to graphics is the Open Graphics Project.

I would happily fork out EUR 200,- for a not-particularly-fast graphics card, as long as "free" is part of the package.

Reply Score: 3

jabbotts Member since:
2007-09-06

Frames per second is not just about gaming. Granted, the gaming size queens use the metric most, but I mean with a basic X display. The ATI closed driver couldn't push X without tweaking, whereas the community driver could kick out enough frames per second to display a smooth desktop. (It must have improved since the AMD buyout.)

Reply Score: 2

_txf_ Member since:
2008-03-17

Further still, Nvidia's own developers also contribute to the community driver project.


hahahahahhahahaha

the nv driver is a pile of crap. Further, it is completely obfuscated so as to prevent other people from picking up the code and improving on it. The nouveau devs are a brave lot to even try to understand it.

Edited 2009-10-10 08:25 UTC

Reply Score: 3

jabbotts Member since:
2007-09-06

That's kind of my point: the best-performing driver happens to be the closed blob from the hardware vendor in this case. The ideal would be open specs and a FOSS driver that could be ported to any platform. At this time, though, there isn't a better pick for performance and driver support than Nvidia, is there? AMD is not there yet, and Intel isn't even in the same game.

Reply Score: 2

gfx1 Member since:
2006-01-20

Yes, the nvidia driver in Linux behaves a lot better than the ATI one.
I switched motherboards from an ASRock A780GM-LE (ATI HD 3200) to a comparable (same price) ASRock K10N78FullHD (nvidia 8200), and trying out a new kernel (Ubuntu beta, etc.) doesn't break the graphics anymore.
With ATI you had to wait until they released something new; in the meantime you were stuck with VESA.

Reply Score: 1

Apple and Intel
by JacobMunoz on Fri 9th Oct 2009 19:02 UTC
JacobMunoz
Member since:
2006-03-17

...don't seem to be splitting ways anytime soon. Considering Apple's push for the Light Peak interconnect, they probably made some other agreements to tie their lines together in other areas as well. I'm sure Apple has known about NVidia's issues for a while - and was probably asked (by NVidia) to push Intel into allowing the bus license. I can see Apple and Intel having a long discussion over what to do about NVidia (and each other) until the decision was made to go all-Intel. That would match the way Apple does hardware: as few vendors as possible. Get NVidia out of the way and you have a platform with centralized and integrated hardware (the 'Apple' way). In exchange for the 'whole enchilada' was Light Peak - a distinctly 'Apple' piece of... something that Intel probably didn't think was such a great idea with USB3 coming so soon.

...at least that's how it looks.

Intel doesn't develop new hardware for nothing; Apple owes them big. And as much as I like AMD, I just don't see Apple going there.

Reply Score: 4

subside?
by wannabe geek on Fri 9th Oct 2009 19:30 UTC
wannabe geek
Member since:
2006-09-27

"I do hope that these tensions between Intel and NVIDIA subside, because I would really welcome NVIDIA entering the x86 market with their own processors with integrated graphics chips."

Did you mean "subsist" instead of "subside"? Otherwise, it looks like a contradictory statement. It is because of these tensions that Nvidia wants to build its own x86 processors, instead of working closely with Intel on new chipsets, integrated graphics or whatever.

BTW, in my admittedly controversial opinion, the problem is not that Intel is abusing an alleged "monopoly position"; Intel is simply using the outright, legal monopoly it has in a bus design for the purpose of protecting its future integrated graphics from a potential competitor. That's what patents do; they are the problem.

Reply Score: 2

RE: subside?
by Thom_Holwerda on Fri 9th Oct 2009 19:41 UTC in reply to "subside?"
Thom_Holwerda Member since:
2005-06-29

Did you mean "subsist" instead of "subside"? Otherwise, it looks like a contradictory statement.


Eh, no. The tensions between Intel and NVIDIA should subside, so that the latter can start making x86 processors - which they will have to do anyway because the chipset market is going down the drain thanks to CPU/GPU combos.

Edited 2009-10-09 19:42 UTC

Reply Score: 2

RE[2]: subside?
by wannabe geek on Fri 9th Oct 2009 21:54 UTC in reply to "RE: subside?"
wannabe geek Member since:
2006-09-27

Ah, sorry. I assumed they could just buy Via or make their own x86-compatible design, even if they have no agreement with Intel. Well, they can always turn to ARM chips for Linux systems :p

Reply Score: 2

ARM to the rescue
by ozonehole on Fri 9th Oct 2009 23:37 UTC
ozonehole
Member since:
2006-01-07

In my opinion, the ARM processor offers the only hope for breaking the Intel monopoly. I was going to buy a new computer this year, but I've put off any purchase in the hope that the ARM-based machines will be coming out in 2010.

Reply Score: 4

RE: ARM to the rescue
by koen.lefever on Sat 10th Oct 2009 16:37 UTC in reply to "ARM to the rescue"
koen.lefever Member since:
2007-07-05

In my opinion, the ARM processor offers the only hope for breaking the Intel monopoly.


Well, that was my very reasoning when I bought an Archimedes in 1987. Sadly, I'm now replying from a PC compatible machine.

Reply Score: 2

PA Semi anyone?
by fraterf93 on Sat 10th Oct 2009 03:37 UTC
fraterf93
Member since:
2009-04-23

A lot of people are wondering what Apple is going to do. Maybe, since they've acquired PA Semi, they have other plans.

Otherwise, the new technologies in Snow Leopard will have been developed for nothing if Intel plans to make its own SoC with a graphics accelerator. It's been stated, and I think we can agree, that Apple knew about 1) Intel's plans for an SoC, 2) Nvidia's troubles, and 3) AMD's lesser offerings in the graphics dept.

In any event, I believe Apple has a plan in place, and I'm not really worried about what is going to happen to them if Nvidia and/or AMD stop making graphics chips.

Reply Score: 1

RE: PA Semi anyone?
by DigitalAxis on Sat 10th Oct 2009 04:31 UTC in reply to "PA Semi anyone?"
DigitalAxis Member since:
2005-08-28

I suppose that would also help with the Psystar situation, if they started making their OWN graphics chips. Still, that seems like an awful lot of time and effort when entire companies are centered around making such products.

Reply Score: 2

Comment by vermaden
by vermaden on Sat 10th Oct 2009 15:11 UTC
vermaden
Member since:
2006-11-18

A long time ago nVidia created many good chipsets - the nForce2 family, the nForce4 family - but starting with nForce5, quality/features/technical advancement stalled; they worked, but not as well as in the old days.

In the meantime, Intel created a very nice family of chipsets with the 965P/965G/945P/945G, along with newer versions: Q35/Q45/G35/G45/G33/G31/G43/G41/... Same for AMD: the 780G was quite a small revolution, with such great integrated graphics (40 shaders) and such low power consumption, and the same goes for its successors like the 790GX and 785G.

That was also nVidia's problem: their chipsets were much more power hungry, and because of that they also created more heat.

Of their last chipsets, the one with the integrated GeForce 9400M was nice, and ION also brings something interesting.

Their current REBRANDING policy, where the 8800GS becomes the 9600GSO, which becomes the GT130, is sickening; the same goes for many other graphics cards from them.

IMHO it's VERY GOOD that they are stopping; maybe some very custom solutions (like ION) would be nice, but looking at their recent Core 2 chipsets, nothing changed in the newer versions (just like the rebranding of gfx cards).

Also, I really did not like the problems they created for SLI support. AMD gave away all the specifications manufacturers needed, so Intel, when creating chipsets, could add CrossFire support without any problems. For nVidia's SLI, you had to buy special N200 chips to add SLI support, and you had to buy TWO of those to get quad-SLI support... instead of just implementing that in the main chipset core logic.

Reply Score: 2

Cry me a river...not!
by clei on Sat 10th Oct 2009 18:55 UTC
clei
Member since:
2008-10-04

The future of integrated graphics processors lies somewhere on the dies of future processors, that much is a certainty. However, this creates a big problem for NVIDIA, whose chipset business will be, well, out of business. Beating everybody to the punch, the company announced yesterday that it is ceasing all development on future chipsets, citing unfair business practices from Intel.

And good riddance to bad rubbish.

At least one won't have to worry about what kind of defective crap is in an nvidia-based laptop...

Reply Score: 1

RE: Cry me a river...not!
by sgtarky on Sat 10th Oct 2009 22:34 UTC in reply to "Cry me a river...not!"
sgtarky Member since:
2006-01-02

And good riddance to bad rubbish.

At least one won't have to worry about what kind of defective crap is in an nvidia-based laptop...

+1000. I would love to stick my HP tx1210 up an HP or nvidia engineer's ass. The laptop I really want to buy right now is the Asus G51; unfortunately it has the nvidia chipset (at least it is replaceable), and everyone says it runs hot as hell.

Reply Score: 1

RE: Cry me a river...not!
by bnolsen on Sun 11th Oct 2009 18:55 UTC in reply to "Cry me a river...not!"
bnolsen Member since:
2006-01-06

Pissing off large parts of your customer base doesn't help a company.

Locking away driver docs, generating lots of heat/pulling lots of power, rebranding the same crap several times, attempting to lock out competition with physics processing.

Why should anyone feel sorry for them?

Reply Score: 2

RE[2]: Cry me a river...not!
by strcpy on Sun 11th Oct 2009 19:11 UTC in reply to "RE: Cry me a river...not!"
strcpy Member since:
2009-05-20

Why should anyone feel sorry for them?


I don't feel particularly sorry for nVidia, but I do feel sorry for the chip market, because the biggest villain is still in town.

Reply Score: 2

Intel should buy nVidia
by ruel24 on Mon 12th Oct 2009 00:44 UTC
ruel24
Member since:
2006-03-21

Let's face it, Intel's graphics hardware sucks, big time! AMD made the smart move to buy ATi, and Intel should make the smart move to buy nVidia. AMD is supposedly working on combined CPU/GPU processors, and Intel will be left with no competing product unless it either joins forces with nVidia or just buys them. It's the smart move...

Reply Score: 1

RE: Intel should buy nVidia
by bnolsen on Mon 12th Oct 2009 03:12 UTC in reply to "Intel should buy nVidia"
bnolsen Member since:
2006-01-06

Speculating here, but I bet nvidia overvalued themselves.

Intel has Larrabee coming. That is what is supposed to compete with ATI/AMD.

Considering Larrabee is just another x86-based architecture, I suspect it may be quite open source friendly. It could very well be a serious game changer (if it doesn't suck).

Reply Score: 2

Viva Competition!
by rgathright on Wed 14th Oct 2009 19:15 UTC
rgathright
Member since:
2009-09-24

All I know is that the Intel graphics accelerator in my ASUS 1005HA leaves a lot to be desired! http://bit.ly/44CHFm

If NVIDIA can make a better solution, I will buy that netbook instantly!

I ran some benchmarks and give more detail on the ASUS 1005HA's graphics abilities in this review: http://bit.ly/44CHFm

Reply Score: 1

RE: Viva Competition!
by WereCatf on Wed 14th Oct 2009 19:27 UTC in reply to "Viva Competition!"
WereCatf Member since:
2006-02-15

What the heck is it with you and advertising that Asus in every single comment of yours?

Reply Score: 2