AnandTech puts Intel’s new Ivy Bridge through its paces. “While it’s not enough to tempt existing Sandy Bridge owners, if you missed the upgrade last year then Ivy Bridge is solid ground to walk on. It’s still the best performing client x86 architecture on the planet and a little to a lot better than its predecessor depending on how much you use the on-die GPU.”
Still comfortable with my Core 2 Quad Q9550. Not worth spending money just to get maybe 30% better performance. I’ll wait and see if Haswell is any good, and hopefully they’ll release some 8-cores.
And it’s really not worth it for me. I am still comfortable with a Pentium III, and I really don’t understand why some people need two- (or three- or four-) core processors when they don’t play games, edit movies or do 3D rendering.
Compilation scales nearly linearly with core count, and so do many other computational workloads. Time is money in some cases. Making these powerhouses more efficient per watt is even more of a win.
Edited 2012-04-24 12:14 UTC
I guess I’m in the same boat. My trusty K7 @ 900 MHz (w/ 256 MB of RAM) never feels short of power, unless I intend to transcode some videos.
Heck, coupled with a GeForce MX 4000 I’ve even played HL2!
Music/movies/TV/programming/games: it does it all. I’ll surely shed a few tears when (if!) this baby goes down.
I have the same processor as you; is there somewhere you could point me to for benchmarks between Ivy and ours?
thanks dude…
This is something I’ve argued with the gearheads: for the vast majority, PCs have been “good enough” since they went dual-core. I haven’t seen any improvement on the desktop when I went from a quad- to a hexacore, and even with both boys loving to game (one on the hexacore, the other on my quad, handed down), frankly the games rarely stress more than 2 cores. Heck, I found that for my mobile needs a dual-core Bobcat is more than plenty for watching HD videos, surfing, webmail, etc.
That problem is the reason we see Microsoft killing themselves to get onto cell phones and tablets: PCs simply haven’t had a “killer app” that can take advantage of the frankly insane power both Intel and AMD have given us. I will give Intel credit, though. After AMD’s previous CEO backed the company into a corner (killing Thuban, killing the next version of Brazos, codenamed Krishna) and bet the farm on Bulldozer, which turned out to be not even as good as Thuban, Intel could have just sat back and reaped the profits. The fact that they are still sticking to their tick-tock strategy, even though it will cost a ton of money, shows they aren’t willing to rest on their laurels.
I just hope AMD can come up with something better because as we have seen a monopoly is NEVER a good thing. I have to wonder if Intel would help out AMD to keep from ending up in that situation, as Microsoft did with Apple in the 90s? After all they can afford to give up the low end market (which is tight margins anyway) to AMD a lot easier than they can afford a bunch of governments watching them like hawks.
I bought one of the last Core 2 Duo laptops, a Sony TT11M/N (with the ULV version of the Core 2 Duo). The performance/battery-life ratio of that perfect little machine is a testament to how Intel has been screwing us these last 4 years by kneecapping its own products. (If you don’t believe me: this chip is (almost) the one that Apple chose for the first version of the MacBook Air – to the surprise of many pundits, I clearly remember, who wondered why Apple didn’t choose a more recent chipset.)
Since then – and a certain antitrust lawsuit – I have been willing to sacrifice a little bit of convenience by buying AMD-only hardware when it comes to x86. I think it is worth it, long-term.
Same. I only buy AMD for my CPUs because I don’t like what Intel did to them in the 386/486 years, and I also disapprove of Intel’s Microsoft-style “a SKU for every feature combination, and woe betide you if you don’t anticipate every needed feature” approach to pricing.
(In case you weren’t aware, Intel underestimated demand for 386 chips and had to cut a deal with AMD to use their fabs in exchange for sharing the 486 market. When the time came, they said “We’ve changed our mind… oh, and if you sue us for breach of contract, our lawyers are better and we’ll just bleed you dry in court.”)
You’re holding a grudge for something a company did more than twenty years ago? Some of their current employees probably weren’t even born then…
Intel’s business practices have been suspect for the last two decades, what makes you think they’ve changed?
Oh, sure – blame them for what they’re doing *now*, under their current management team. But boycotting a company for something it did decades ago just doesn’t make sense – the people responsible for their current behaviour aren’t responsible for that.
No, you have to keep it in mind as part of their history of douchebaggery.
Only a few more weeks till the Trinity series comes out, and Intel’s HD Graphics 4000 can’t even keep up with last gen’s HD 6550D. The HD 7660D on the A10-series APUs is gonna crush these things where it actually matters for the majority of the market: the general consumer. They may not notice when there’s a few more seconds shaved off their MP3 rips, but they do notice when the GPU sucks, even though they don’t know to attribute the skipping, the lag and the low video quality to the GPU.
If you don’t think it matters, then why is there so much push for WebGL? Why are game companies making non-Flash, full-3D games that launch via the browser?
GPUs that don’t suck, guaranteed to be installed across the low-end consumer market, create a whole new industry.
The Intel antitrust case is seven years old. I don’t remember it leading to a shuffle in the executive ranks.
Their strategy of kneecapping their products, planned obsolescence, “trusted computing” (system-wide DRM), UEFI lockdown, and “unlockable (*for a fee*) cores” is happening right now.
Actually, they got busted for bribing the OEMs to take Netburst just a few years ago (and paid AMD $1.25 billion to drop their lawsuit), and the compiler rigging, last I checked, has been ongoing. You can take a Via CPU (the only chip that lets you change the CPUID) from “CentaurHauls” to “GenuineIntel” and magically the chip will gain as much as 30% in the benches! Why? Because most of the benchmark software is compiled with the Intel compiler.
If you want to see how obvious the rigging is, look at Atom vs. Brazos benches: you’ll see the in-order Atom magically beat an out-of-order Brazos in many of them, yet we know in-order CPUs are generally easier to stall and slower than out-of-order CPUs.
But sadly, while I as a system builder have been supporting AMD for the past 3 years since all that came out, once the Socket AM3 chips run out, if they haven’t replaced the Bulldozer arch I’ll have no choice but to go Intel. Bulldozer is AMD’s Netburst: it’s a bad design, and I seriously doubt ANY updates via Piledriver or Excavator are gonna help. Whoever decided you could call a quad with hardware-assisted hyperthreading an octocore was an idiot, and because of how much it costs to implement their boneheaded design (two integer cores forced to share a single FP unit), they have NO choice but to charge as if the virtual cores were real; even Intel hasn’t the guts to pull that.
Frankly, for us system builders, once the AM3 stocks run out they really have no compelling offerings. I suppose I’ll use Bobcats for small-office boxes and Llano for HTPCs, but that’s about it; they really don’t have a successor for Thuban. Heck, you can’t even attempt to unlock cores on BD/PD, and in fact if you disable half the cores (killing the HT and leaving each core with its own FP unit) you get improved performance! If they sold the BD/PD chips as what they are, duals, triples, and quads with HT, they might be more compelling, but their pricing right now is in i5 territory, and that chip curb-stomps it. If they don’t change direction, most of the guys I’ve talked to will go Intel, simply because they don’t want BD/PD; it’s just a bad arch.
Bassbeast, I found your analysis of current AMD/Intel CPUs interesting, informative, and probably in line with my own opinion, more or less.
But will somebody please take the urban myth that “Microsoft saved Apple in the 1990’s” out back behind the tool shed and put a bullet in its head?
Microsoft didn’t help out Apple, they helped themselves! Apple, even at their lowest point (remember their shares at $10 per? I do.), never, EVER had a market cap much below $1 billion (1 with 9 zeros) that I recall. How could $150 million (only 6 zeros there) “save” such a company? Especially given its often-criticized high profit margins? What investor would have turned them down for a loan?
By investing, Microsoft was buying good PR, convincing the US DOJ and the EU that they still had viable competitors, convincing mixed Mac/Windows shops they had confidence in Apple to survive (at least in the short term), and buying the installation of IE as a default browser on the only other large-scale commercial OS in the marketplace. Not to mention putting the Apple “look and feel” lawsuits to bed forever with money, the way they never could in court.
Those two things (MS helped Apple, and MS helped themselves) aren’t exclusive.
You yourself almost write how it restored long-term confidence in Apple, a company visibly straying from its path at the time (or even failing as that company – what came of it wasn’t strictly the same Apple, but also a corporate coup of sorts by NeXT).
A company on shaky ground because of inferior and more expensive offerings – but with somewhat crazy followers readily buying them to “help save Apple” ( http://www.forbes.com/1997/08/08/column.html )
And the suit was put to bed in court ( http://en.wikipedia.org/wiki/Apple_Computer,_Inc._v._Microsoft_Corp… ); there were just some lingering squabbles.
Edited 2012-04-30 23:53 UTC
Do you have links on this? Sorry, but I am not going to trust a post on the internet, without more proof…
Perhaps he got it a bit backwards, but quickly checking the most straightforward place and doing a ctrl+f gives:
http://en.wikipedia.org/wiki/Intel#Slowing_demand_and_challenges_to…
Perhaps he got it a bit backwards… Intel specifically worked to stop 2nd sources by then, and through litigation blocked AMD386 for many years: http://en.wikipedia.org/wiki/Am386
(or: it was a bit like he says, but 286 & 386, not 386 & 486)
Yeah, many might scoff at it: “oh, over 2 decades ago”… but all this shaped the present landscape. Plus Intel didn’t play nice less than a decade ago, which quite possibly impacted the resources AMD could direct towards R&D and fab development.
http://en.wikipedia.org/wiki/AMD_v._Intel
You don’t just pay such an amount if your hands are clean. Then there’s the $250 million to Transmeta, and:
Intel is producing some amazing tech with the 35 W models. Sandy Bridge procs were excellent performers at 35 W, and Ivy delivers even more bang per watt.