Intel’s highest-end graphics card lineup is approaching its retail launch, and that means we’re getting more answers to crucial market questions of prices, launch dates, performance, and availability. Today, Intel answered more of those A700-series GPU questions, and they’re paired with claims that every card in the Arc A700 series punches back at Nvidia’s 18-month-old RTX 3060.
After announcing a $329 price for its A770 GPU earlier this week, Intel clarified it would launch three A700 series products on October 12: the aforementioned Arc A770 for $329, which sports 8GB of GDDR6 memory; an additional Arc A770 Limited Edition for $349, which jumps up to 16GB of GDDR6 at slightly higher memory bandwidth but otherwise identical specs; and the slightly weaker A750 Limited Edition for $289.
These are excellent prices, and assuming Intel can deliver enough supply to meet demand, I think I may have found my next GPU. If history is anything to go by, these will have excellent Linux support, but of course, we would be wise to let the enthusiasts iron out the bugs and issues. Six to twelve months after launch, these could be amazing all-rounders for a very good price.
Everyone who has tested the Arc series online has said the drivers are seriously half-baked and buggy, and considering how much money Intel is going to lose on its dedicated graphics division this year? Honestly, I'd be afraid to buy one of these even at half price, as there is a serious possibility Intel may just pull the plug. Say what you will about AMD and Nvidia, they will support their products for years.
bassbeast
I’d be inclined to skip the early generations also, but assuming the products are well received and the reviews stay positive, they might be a bargain for gamers looking for cheap cards. I think the industry could use new competition and for this reason I hope they stick around.
But the catch is that with games you HAVE to have regularly optimized drivers for the latest releases, and if Intel takes a bath on these and pulls the plug? You have a $300+ paperweight.
So while I hope they stay in the game I’m not willing to bet that much cash on it, not when the market is flooded with used GPUs that I know will easily be supported past 2025.
Intel’s datacenter strategy depends on Xe core development. So it’s unlikely they’re pulling the plug.
Just because they continue making data center chips does NOT mean they will be making desktop GPUs with gaming drivers. Those are two completely different markets, and the kind of optimizations that benefit AI research don't give you a driver you can use for the latest first-person shooter.
This is why AMD and Nvidia have different drivers for workstation and gaming. Intel could easily just focus on the more lucrative AI acceleration market and not bother with gaming drivers. In fact, the YouTuber Moore's Law Is Dead, who has had the inside skinny on Intel releases for quite a few years, says the executives at Intel are leaning this way, and canceling Battlemage to focus strictly on server with Xe is a real possibility.
Intel can’t afford to develop a GPU just for the datacenter. So they need the revenue from both consumer and datacenter.
Intel will likely have a bit of a hard time, as crypto mining and crypto in general are collapsing. Because of that, there isn't a shortage of graphics cards on the market anymore, and on top of that it's reasonable to expect the flood of used, powerful graphics cards to bring prices down further. On the other hand, companies such as Nvidia still believe they can get away with selling extremely pricey graphics cards, so Intel has some room to make a dent with reasonably priced ones.

If Intel manages to stay relevant over the next three to five years and keeps improving the software, this will be a welcome addition. And if by any chance the Intel cards end up with superb GNU/Linux support, then on that count alone they will gain a lot of supporters. But I guess the main problem for them is that people in general don't trust them yet, and if everyone waits for some future generation, Intel might drop this division altogether for burning too much money. Let's be both realistic and optimistic and hope for the best. All in all, it would benefit us if there were three strong players in this segment.
I wouldn’t use an ex-miner GPU if it was given to me. That is just asking for issues.
Maybe you would not. But when the price tag is low enough, the rest of the world will.
helf,
It depends. A lot of mining setups are purpose-built with much better cooling than consumer gaming rigs, where cards sit in tight, hot cases. Moreover, on/off cycles create more thermal stress on components than running them 24×7 does. It's the same reason why some incandescent light bulbs last decades or even centuries.
It would be very interesting to see the actual defect rates on a large scale. If you know of any, please link them!
In my experience (at least pre-Covid era), buying ex-mining cards is not that bad, considering a miner who knows what she’s doing is undervolting the card for maximum power efficiency per hash. The biggest issue to look out for is whether they flashed a custom mining firmware to the card and neglected to flash it back to factory before selling it, but it’s easy enough for most folks to flash it back themselves after purchase. I did exactly that with a R9 290 I picked up for a fraction of the retail price a few years after it came out, but still relevant as a midrange gaming card at the time. It got glitchy in a few games right after I installed it, so I flashed the factory firmware and it worked like new.
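For what it's worth, the flash-back itself is just a couple of commands with AMD's atiflash tool, if memory serves. Treat this as a rough sketch: the exact tool and flags vary by card generation, the adapter index is an assumption, and the ROM file names are placeholders.

# Back up whatever is currently on the card first (adapter index 0 assumed):
atiflash -s 0 mining_backup.rom
# Then write the stock ROM for your exact card model (placeholder file name):
atiflash -p 0 stock_factory.rom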
The only other issue I had with an ex-miner was a dead fan on a R9 280, which was a $15 fix.
I haven’t bought any ex-mining cards since Covid started, as coincidentally I picked up my current RX 5700 right before the prices went sky-high as retail supply dried up.
It depends. A lot of mining outfits kind of knew what they were doing and ran their boards in proper environments.
The main issue would be the wear and tear on the mechanical components of the cooling system. So a repaste of the heatsink and some WD40 on the bearings of the fan would do wonders for the board's longevity.
In any case, as GPU crypto continues to collapse the flood of cheap GPUs would make it a very attractive proposition. And a timing nightmare for Intel.
> and some WD40 on the bearings of the fan
I mean, maybe, as long as you mean WD-40 the brand and not the main WD-40 product.
Not sure which product would be best, but out of the two I have here I would use silicone lubricant over plain WD-40. Still, a busted fan is better replaced anyway; on most cards you can't get the lubricant to where it is needed without breaking them.
Still, I would not buy such a card, though. Maybe at 20% of current retail, otherwise no.
At the moment, they don’t even have great Windows support! But I share your optimism, and expect that eventually, these (or maybe the B or C series) will be excellent alternatives to AMD on Linux (nVidia is a dumpster fire on Linux).
NVIDIA runs just fine on linux. It’s one of the few options for production work honestly.
I’d be interested in finding out how these boards run on linux and the state of OneAPI as an alternative to CUDA would be welcome.
javiercero1,
They generally run well for me too, however it depends on how the drivers are installed. When installing the drivers directly from nvidia, which is important for cuda, I've experienced numerous breakages after updating the kernel. It's been problematic enough for me that I do not trust updates as much as I normally would if I'm using nvidia drivers.
But there’s good news because not long ago they announced an open source ABI shim into the kernel. This should provide a stable ABI for nvidia’s proprietary drivers (ugh) to use in a roundabout way, so I’m hoping the update problems become a thing of the past.
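Once that's in place, a quick way to check which flavor of the kernel module you're actually running is something like this (just a sketch; the exact license strings depend on the driver series):

modinfo nvidia | grep -iE 'license|version'
# The proprietary module reports an "NVIDIA" license, while the open kernel
# modules report a dual MIT/GPL license (my assumption based on recent releases).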
I agree. OpenCL seems to be in an evolutionary dead end and it’s not good to be so dependent on proprietary cuda for GPGPU. I welcome an alternative, but it’s not really clear whether the industry will embrace an alternative. I for one would much rather support an open API assuming it can deliver similar features & results.
If you’re doing production work, NVIDIA on an LTS platform + official repository works flawlessly. Linux is actually the reference platform for CUDA.
But yeah, OneAPI should hopefully put some pressure on NVIDIA in terms of SW. Since AMD is basically a no show in the GPGPU arena as their SW stack is just hopelessly behind.
javiercero1,
I do use LTS. It sucks that the drivers you download from nvidia break. It's not just that the new drivers don't work; those broken drivers inhibit VESA support and the screen is completely disabled. Repository updates can fail on occasion even when it's due to unrelated conditions (server outage, network outage, inadequate disk space, etc.), and in that case you generally end up with only some of the packages being updated on your system. While you can chalk that up to bad luck, I just find Nvidia's drivers a bit too fragile – they ought to be more robust even in the case of unexpected failures. For what it's worth, I've never had this problem with intel GPU drivers. But as I said before, things should be getting better with the new open source kernel module providing a stable ABI for nvidia's drivers. Less tight coupling is good.
Now if only we can get nvidia to provide complete thermal monitoring under linux, which is something many linux cuda devs have been calling for.
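To be concrete, this is roughly the extent of what nvidia-smi will give you today: core temperature and the basics are there, but as far as I can tell the memory-junction and hotspot sensors are not exposed.

# Core temperature and basic telemetry only:
nvidia-smi --query-gpu=name,temperature.gpu,power.draw,utilization.gpu --format=csv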
@Alfman
We run CUDA on lots of desktops and a server farm. We've never had an issue across four different LTS releases in both Ubuntu and RH/Fedora. Unless you're in an unsupported configuration, the whole process is bulletproof.
javiercero1,
I'm glad it works for you; most of the time it works here too. To be "bulletproof", though, it would have to be completely transactional, which it isn't, and that leaves open the possibility of failure. Most enterprise environments I've worked in prohibit auto updates. Everything has to be carefully staged and only deployed after testing. Chances are your team does this too.
@Alfman
Yeah, I work in a professional environment for a top semi organization, so we have top talent who actually know what they are doing.
I've also run NVIDIA products on linux for years and years without any significant issue. Turns out that RTFM and checking what supports what goes a long way.
javiercero1,
That’s great, but I’ll note it doesn’t refute anything I said.
Note that such a shim (expectations-wise) already exists. On a distribution such as Ubuntu, people are expected to use the driver installer provided by Ubuntu, not to manually download the Nvidia driver and install it. If you go with that preferred option, the issues you mentioned don't exist. Still, if you decide for whatever reason to manually install the driver downloaded directly from Nvidia, you have two options. One is to download the driver, run the installer, and expect it all to work fine; here a future kernel update can indeed cause issues, the Nvidia driver stops working and you need to install it again to make it work with the new kernel. The other option is to make sure DKMS and the Linux kernel headers are installed on your system. If I remember correctly, the Nvidia installer will detect that and ask whether you want to register the kernel module with DKMS. If you do it like that, then on a kernel update the module gets rebuilt automatically and the Nvidia driver continues to work just fine with the new kernel (see the sketch at the end of this comment).

As this is Intel news: if you don't want to do all that, and you don't want to use the manual installer at all, say you bought a computer for a person who doesn't know much about computers, then just add an entry-level Intel Arc graphics card to it. For the next ten years or more, such a computer will always be able to run the latest Ubuntu or Debian just fine, without the need to think about graphics card drivers at all. Microsoft with Windows doesn't come close to that. Apple with their desktop option is similar, but with Apple you are severely limited by the hardware selection. Google with Android should RTFM; then suddenly the lifespan of Android phones would increase to a decade or more, with the latest Android running just fine on decade-old hardware. That is, if Google didn't constantly increase the hardware consumption of stock Android, like now, where each year you need around a GB of RAM more to run the same launcher with a tray area and a couple of icons in it. All in all, I read that Intel should have great GNU/Linux support with this hardware, hence we need Intel to improve the situation on the market, especially in regards to Nvidia; AMD still has some room left for improvement too.
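Here is the sketch mentioned above, assuming a Debian/Ubuntu system; package names differ on other distributions.

sudo apt install dkms linux-headers-$(uname -r)
# Run the NVIDIA .run installer and let it register the module with DKMS.
# After any kernel update, verify the module was rebuilt for the new kernel:
dkms status
# Expected output is something along the lines of:
# nvidia/<driver version>, <kernel version>, x86_64: installed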
Geck,
Yes, but the drivers from nvidia themselves often contain new features. Like when I wanted to use the RT cores on my new card, the drivers in ubuntu were not new enough. As a cuda developer you'll want the latest drivers.
Yes, I use that; in my experience it isn't always reliable though.
Given intel’s iGPU history, I’m pretty optimistic that should be the case.
Let’s not degrade this into a thread about OS superiority.
@Alfman
No, as a CUDA developer you want the CUDA drivers, which are not necessarily the latest. In fact the latest are usually not certified for CUDA.
It's OK, from time to time, to mention the superiority of some approach, especially as you get plenty of that in PR statements when it comes to, let's say, iOS or Windows. I just finished watching this, on how one could watch HD videos on a Pentium 3 computer combined with the latest graphics card, using GNU/Linux. Maybe in the future I'll try the same with an Arc series card.
https://www.youtube.com/watch?v=HQ7AdXPaPxc
javiercero1,
I am a cuda developer. Obviously you need the cuda runtime drivers as well, but for the record I always install the newest versions and have never had a problem. Heck, even nvidia's own instructions have you install the latest standard nvidia drivers. Cuda software is both forwards and backwards compatible within reason, so typically you want the latest drivers available.
https://docs.nvidia.com/cuda/cuda-installation-guide-linux/index.html
sudo yum install nvidia-driver-latest-dkms <– these drivers are not cuda specific
sudo yum install cuda
sudo yum install cuda-drivers
On debian, the nvidia-cuda-toolkit package links to the standard nvidia driver with a version requirement of greater than or equal to the version required by the cuda toolkit.
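Either way, it's easy to sanity-check what you ended up with; the driver version nvidia-smi reports just has to meet the toolkit's minimum.

# Installed driver version (the nvidia-smi header also shows the highest CUDA
# version this driver supports):
nvidia-smi --query-gpu=driver_version --format=csv,noheader
# Installed toolkit version:
nvcc --version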
So if you want to convince me that you are not supposed to use the latest nvidia drivers for your card in order to use cuda, then kindly provide a source for your claim. Thanks.
@Alfman
I’m not interested in wasting energy in another useless debate with you, where I have to condense years of experience in the matter vs you throwing half assed google searches and see what sticks.
Any CUDA developer understands that for production work you use the driver/toolkit from NVIDIA's repository for the specific supported distros. Manual installs are for cases where you want to try out new features and can't wait for the validated repo version for your distro, so if you're making money with CUDA you don't do that.
I don't even know what the heck we are discussing here, other than your pathological need to bring up issues.
javiercero1,
In other words you are unable to cite a source for your claim, got it.
That’s what I said. You like to pretend it’s ridiculous but when we read about new features some of us want to be able to run them and sometimes that requires the latest drivers, which can have fewer bugs too.
@javiercero1
Alfman isn’t alone in his experience that Linux support from Nvidia has been less than amazing. It’s great that everything has been fine for you, but clearly the same is not true for a lot of other users. If the countless posts scattered all over the net about it, and Nvidia’s own Linux devs who’ve previously admitted on their own official support forum that they’re grossly understaffed aren’t evidence enough, I personally have had a rocky relationship with Nvidia on Linux. In fact, that’s why Nvidia and I have parted ways on Linux. Apparently, after a long wait, Nvidia has focused on making their Linux drivers more reliable. It’s great that they’re finally trying to ease the pain of an obvious problem. Why Nvidia can acknowledge this but you can’t is beyond me. Insisting that since your experience is great, everyone else’s should be too if they had any clue what they’re doing is pretty …. stupid, especially from someone claiming years under their belt to know better.
I'm planning on getting one to play with. Intel's drivers typically iron out and run well. Even the "omg bugs!11" that have been plastered all over the internet aren't ones /most/ people will even notice. The frame rate issues are still well within the range most people don't give a crap about. It's kinda funny how much hand-wringing there has been over it.
I had serious issues with Intel's GPU on kernel 3.1. The system would simply go BSOD during modesetting on this GPU. I had to use it with nomodeset to see anything. This and a host of other bugs.
I reported it, supplied all the debug logs, and helped as much as I could.
It got _accidentally_ fixed around kernel 3.5. Just look at a calendar to figure out how long that was.
That made me avoid intel gpus like the plague since.
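For anyone who hits something similar today, the nomodeset workaround is just a kernel parameter; this assumes a GRUB-based Debian/Ubuntu-style setup (other distros use grub2-mkconfig instead of update-grub).

# In /etc/default/grub, add nomodeset to the default kernel command line:
GRUB_CMDLINE_LINUX_DEFAULT="quiet splash nomodeset"
# Then regenerate the GRUB config and reboot:
sudo update-grub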
The announced prices for Intel's A-7** series cards are tempting, maybe something to upgrade the GPU in my Minecrafting/Robloxing son's PC. But with more technically oriented reviewers like Gamers Nexus on YouTube showing some serious problems with Intel's drivers and the cards' ability to play non-DirectX12 games, I'm hesitant. I also have bitter memories of buying a Dell Inspiron when Intel first released their Iris Xe Max GPU that was flat out broken for almost a year. Dell support was, well, typical Dell support (problem? what problem??), and then, months and months later, Intel quietly released a fix that Dell never (as far as I saw) pushed to users via Dell Update (I installed it on my system thanks to a post by a /Dell subreddit user).
samcrumugeon,
IMHO, if you can wait 6-12 months for the software & hardware kinks to get ironed out, you'll probably have a better experience. I may get one down the line, but unless I had an urgent reason to get one now, I'd let other early adopters be the guinea pigs! Also, we'll all be in a better position to see how well Intel is addressing consumer needs with their new GPU products.