So, AMD, Intel, and Nvidia all decided to announce their latest products on the same day yesterday. Let’s start with Intel, who announced the laptop version of their latest generation of processors, and if the performance claims hold up, they’re some damn good chips – but as always, we’ll have to await proper benchmarks.
These laptop chips use Intel’s new hybrid processor architecture, which combines larger, faster performance cores with smaller, more efficient cores (P-cores and E-cores, respectively). How many P-cores and E-cores you get depends on the processor you’re buying, and you’ll need an operating system that supports Intel’s “Thread Director” technology to get the most performance out of the chips. Windows 11 supports it now, Linux support is in the works, and Windows 10 doesn’t have it and won’t be getting it.
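For the curious, software can ask the CPU which kind of core it is currently running on. Below is a minimal sketch in C, assuming GCC or Clang’s cpuid.h on an x86-64 hybrid part; per Intel’s documentation, CPUID leaf 0x1A reports the current core’s type (0x40 for Core/P-cores, 0x20 for Atom/E-cores).

    /* Minimal sketch: report whether the core this code runs on is a P-core or an
     * E-core.  Assumes GCC/Clang on x86-64; in real code you would pin the thread
     * to a specific CPU (e.g. with sched_setaffinity) before asking. */
    #include <stdio.h>
    #include <cpuid.h>

    int main(void)
    {
        unsigned int eax = 0, ebx = 0, ecx = 0, edx = 0;

        /* CPUID leaf 0x07, subleaf 0: EDX bit 15 is the "hybrid part" flag. */
        if (!__get_cpuid_count(0x07, 0, &eax, &ebx, &ecx, &edx) || !(edx & (1u << 15))) {
            printf("not a hybrid CPU (or leaf unavailable)\n");
            return 0;
        }

        /* CPUID leaf 0x1A: EAX bits 31:24 give the core type of the current core. */
        __get_cpuid_count(0x1A, 0, &eax, &ebx, &ecx, &edx);
        unsigned int core_type = eax >> 24;

        if (core_type == 0x40)
            printf("running on a P-core (Intel Core)\n");
        else if (core_type == 0x20)
            printf("running on an E-core (Intel Atom)\n");
        else
            printf("unknown core type: 0x%x\n", core_type);
        return 0;
    }

This only tells you what kind of core you landed on; actually steering threads to the right cores is what Thread Director and the operating system scheduler are for.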
AMD, not wanting to be outdone, introduced its Ryzen 6000 series of mobile processors, which finally moves its integrated graphics to RDNA 2 and is the first to include Microsoft’s Pluton security chip.
Yesterday AMD disclosed that it would be launching the new Ryzen 6000 Mobile series today – updated cores, better graphics, more features, all in a single monolithic package a little over 200 mm². There will be 10 new processors, ranging from the traditional portable 15 W and 28 W hardware up to 35 W and 45 W-plus for high-end gaming machines. AMD is expecting 200+ premium systems in the market with Ryzen Mobile in 2022.
Finally, we have NVIDIA, with the smallest announcement of the day: new high-end and low-end mobile GPUs.
Asus Zephyrus Duo 16
https://www.youtube.com/watch?v=urtv2OZoXw8
“security chip” from microsoft. yeah, no thanks.
NaGERST,
This quote from the article itself says it succinctly:
Winter is coming! Er… I mean walled gardens.
It’s a slow transition, but I think both Apple and Microsoft could ultimately succeed in coercing ordinary home users into walled gardens controlled by them, with sideloading limited to pro/enterprise versions. Neither of them wants to do it alone since it would make them look bad, but they both want to do it. They’re both implementing the technology to lock out owner control; it’s just going to be a matter of turning the key one day and saying it’s necessary to keep owners safe even from themselves.
I tried to submit this as a separate guest story. I believe it deserves more attention.
Not only AMD, but future Intel and Qualcomm chips will also carry the same technology. It will be a cloud-updated black-box security add-on for future “Windows” PCs.
Another data point: Microsoft is disabling “dev mode” on personal Xbox consoles. Previously you could pay a one-time $20 fee and unlock a semi-open mode. This enabled emulators, among other things, to work on the device.
Not anymore: https://www.digitaltrends.com/gaming/xbox-disabling-dev-mode/
I was hoping the Xbox would become more open like Windows, and now Windows is becoming more closed like the Xbox.
What is worse? This is pretty much coming to all future PCs, even Linux ones. Does anybody expect Intel to develop a separate chip for open source operating systems?
sukru,
🙁
One would hope that any integrated security features are vendor neutral, because the last thing we need is more vendor-specific keys in hardware granting vendors more privileges over our hardware. I don’t oppose security features as long as the owners are genuinely in control in practice, which is crucial! Unfortunately, though, it’s hard to have faith in their motives. It wouldn’t take much to harm competing platforms in the name of “security”.
Alfman,
I am not too worried about today, but about the future.
Case in point: they reversed the Xbox dev mode decision: https://mobile.twitter.com/tomwarren/status/1478858431314509829 . Apparently some unrelated team decided to do a cleanup, and the Xbox folks did not know about it.
That sounds good… However…
Let’s say 5 or 10 years from now the Xbox team does not have the same clout. Or management changes and the focus shifts to “extracting value”.
Or… what if Netflix and other providers say “you need to have the default keys and signed Windows drivers to watch our content”? It could spread to other services, like online banking, healthcare, or even workspace applications.
Samsung already does that on my phone. If I were to “root” it, I would lose all access to advanced features, media capabilities, and even my own fitness applications.
Trust me, I get it. It’s all too easy for “security” to become a euphemism for DRM intentionally designed to lock out the owner. It’s why I’m so adamant that owners stay in control, but I think many of us already doubt that things will play out this way. PR will put a positive spin on it like always, and the masses won’t pay attention to the control they’re losing until it’s too late and owner restrictions become the norm.
Yes, we’ve already seen that in some places where root access makes you a second-class citizen. The threat of mission creep on this front is very real. It stops being optional when you are forced to give up owner rights. Corporations can and will use their power unilaterally to force everyone to give up their rights, even when owners don’t agree.
It reminds me of the IRS and other government agencies literally forcing users to “consent” to id.me, a biometric data company, starting this year in order to get access to the services and benefits the public is entitled to.
https://www.wraltechwire.com/2021/07/25/id-me-more-states-requiring-facial-recognition-for-unemployment-benefits/
Until the aforementioned companies start offering affordable discrete GPUs again, I am not all that interested in their PR or product line announcements.
Buy an NV 1050 for about £80 on Amazon and use Xbox Game Pass. The problem is their five-year-old models are plenty good enough to stream to at a quality that would otherwise cost £500+.
Exactly. All this fancy PR and all these product announcements, and we’re supposed to go to some flea market and buy a pricey old used graphics card. Until that changes, I for sure won’t take Xbox Game Pass. I will just take a pass. Once the aforementioned companies are again relying on us to buy their products, they will likely pay the price for their current short-sighted behavior.
For used graphics cards (especially at the high end), I’d be tempted to assume the card was used for cryptocurrency mining, had the living daylights pounded out of it 24 hours a day, and is half a hair’s width away from hardware failure.
Brendan,
That’s true; I’d also be very wary of buying such an expensive product unless the discount was really substantial. But from a theoretical point of view, cards used in large-scale hashing operations are often run on open-air racks, which keeps them cooler than in normal gaming rigs, since many gaming cases have poor thermals. Also, miners rarely power cycle, and power cycling causes far more thermal stress on electronics than staying on 24/7. I’ve also heard that undervolting GPUs increases cryptomining profits more than overclocking does, which should extend their life as well.
I found some articles talking about this…
https://www.howtogeek.com/342079/is-it-safe-to-buy-used-gpus-from-cryptocurrency-miners/
https://www.pcworld.com/article/395149/should-you-buy-a-used-mining-gpu.html
But I have not found any data with large sample sizes and rigorous testing methodology, so it’s just anecdotal. The other problem is that even if this were all true, buyers generally have to take the sellers at their word. There’s no way to prove how a GPU has been used and under what conditions.
Bingo, we have a winnar! I used to follow tech religiously, but after the RX 580 8GB that I lucked into getting brand new for just $150 pre-batshittery suddenly shot up to $600 for one that had been mined to death, and anything newer I looked at cost more than a fricking used car in my area, I simply gave up. Talking to many friends on Discord around the planet, they all have the same attitude: hang on tight to what you have and ignore any new products, because it’s pointless.
I mean, seriously, what good would buying a Ryzen 6000 or Alder Lake CPU do when I can’t get a single GPU newer than my RX 580 for less than four figures? All I would be doing is giving myself a massive bottleneck, which would make the new CPU totally pointless when my R5 3600 pairs just fine with the RX 580. And with DDR5 being nearly as batshit as the GPUs, I see no point in giving up the 32 GB of DDR4 I paid only $100 USD for when the same amount of DDR5 would cost me several times that, if I could even find any.
If anything, I’ve started watching tech shows that focus on what the cards miners don’t want can still do decently at gaming, so at least I can throw together something useful for the grandkids. Trying to build a new PC in 2022, unless you just have piles and piles of money to piss down the drain paying scalpers? Just an exercise in frustration.
I’m pretty excited about the new AMD chips. RDNA integrated graphics + support for USB4 are the two features I’ve been waiting for, that have been in the pipeline for ages. On the other hand the new Intel chips are nothing to sneeze at either. I’ve held out on upgrading from my 2014 MacBook Air for what feels like forever – whenever Framework announces its next upgrade (hopefully with AMD and touchscreen options) I’ll be very tempted to bite.
Will AMD support Thunderbolt at some point in the future?
Only if they pay Intel and/or Apple to license it. Can’t see it happening anytime soon…
Thunderbolt is royalty-free now, I think.
As I understand it, the Thunderbolt spec has been subsumed into USB4. However, older Thunderbolt <= 3 devices are not guaranteed to work unless the host is also explicitly certified. New devices should theoretically be able to reach the same throughput levels, though, assuming that USB4 actually means something and isn't just a set of optionally implementable features, as is the case with HDMI 2.1.
I find it really interesting how the GPU is still such a hot space for development. I appreciate it’s mostly part of a crypto-mining bubble, but the reality of modern computing is that services like Xbox Game Pass mean the GPU is becoming superfluous in a cloud-connected world. There will always be a tiny niche that wants to spend thousands on a top-end GPU, but the majority of gamers could ironically downgrade their hardware and play at a higher framerate via the cloud.
Adurbe,
I’m confused, don’t you still need local hardware for Xbox Game Pass?
https://www.howtogeek.com/317745/what-is-xbox-game-pass-and-is-it-worth-it/
Streaming games have a latency problem. It’s less important for casual games, but with RTS and FPS games it’s not ideal. I haven’t been following this closely, but Stadia was criticized for not being able to provide a consistently high-end gaming experience.
In this review of PlayStation Now, the experience still involves compromise.
https://www.howtogeek.com/311217/what-is-playstation-now-and-is-it-worth-it/
I see there’s a beta streaming service that I wasn’t aware of.
https://www.xbox.com/en-US/xbox-game-pass/cloud-gaming
Obviously I haven’t seen it. My guess is that it probably has the same issues as other streaming services, but for some people I imagine it could be good enough. I’m not a fan of monthly charges, but obviously that comes with the territory.
Yes, it’s £10 a month, but if it’s the difference between a £100 GPU and a £350 one, you have a couple of years before the cost of one exceeds the other.
Alfman,
There are techniques to combat latency on streaming services. Competitive online games already do some “predictive input”, e.g. they assume you’ll keep moving in one direction, or continue holding the gas pedal trigger.
Those are not new, and there is a lot of research on the subject.
https://en.wikipedia.org/wiki/Client-side_prediction
https://www.diva-portal.org/smash/get/diva2:1560069/FULLTEXT01.pdf
Even Stadia has had that since launch: https://www.pcmag.com/news/google-stadia-will-use-input-prediction-to-reduce-game-lag
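To make the idea concrete, here is a toy sketch of client-side prediction with server reconciliation, in the spirit of the Wikipedia article above. It is not any particular engine’s code; the tick rate, structure names, and numbers are invented for illustration.

    /* Toy sketch of client-side prediction and server reconciliation.
     * Names, the tick rate, and the packet layout are invented; this is not
     * any particular engine's implementation. */
    #include <stdio.h>

    #define MAX_PENDING 128
    #define TICK_DT     (1.0f / 60.0f)           /* assume a 60 Hz simulation tick */

    typedef struct { float x, y; } Vec2;
    typedef struct { unsigned seq; Vec2 move; } Input;           /* one tick of player input */
    typedef struct { unsigned last_seq; Vec2 pos; } ServerState; /* authoritative snapshot   */

    static Input pending[MAX_PENDING];  /* inputs sent to the server, not yet acknowledged */
    static int   pending_count = 0;
    static Vec2  predicted_pos;         /* the position the player actually sees           */

    static void apply_input(Vec2 *pos, const Input *in)
    {
        pos->x += in->move.x * TICK_DT;
        pos->y += in->move.y * TICK_DT;
    }

    /* Every local tick: apply the input immediately (no waiting for the server)
     * and remember it so it can be replayed after the next authoritative update. */
    static void on_local_input(Input in)
    {
        apply_input(&predicted_pos, &in);
        if (pending_count < MAX_PENDING)
            pending[pending_count++] = in;
        /* ...the input would also be sent to the server here... */
    }

    /* When a server snapshot arrives: rewind to the authoritative position, then
     * replay every input the server has not processed yet. */
    static void on_server_state(ServerState s)
    {
        predicted_pos = s.pos;
        int kept = 0;
        for (int i = 0; i < pending_count; i++) {
            if (pending[i].seq > s.last_seq) {
                apply_input(&predicted_pos, &pending[i]);
                pending[kept++] = pending[i];
            }
        }
        pending_count = kept;
    }

    int main(void)
    {
        /* Move right for three ticks; the player sees the motion immediately. */
        for (unsigned seq = 1; seq <= 3; seq++)
            on_local_input((Input){ seq, { 300.0f, 0.0f } });
        printf("predicted before ack: x=%.2f\n", predicted_pos.x);   /* 15.00 */

        /* The server has only processed input #1; inputs 2 and 3 get replayed. */
        on_server_state((ServerState){ 1, { 5.0f, 0.0f } });
        printf("after reconciliation: x=%.2f\n", predicted_pos.x);   /* 15.00 again */
        return 0;
    }

The key point is that the player’s own actions are shown instantly and the server’s answer only corrects the picture later, which is exactly what a streaming service cannot do for a game it merely renders.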
There are other problems though, like jitter, wifi interference, or overall connection reliability. But those too are active research topics.
I don’t know which individual services will eventually stay, though. Time will tell.
sukru,
Yes, these tricks have long been deployed to compensate for lag, going as far back as QuakeWorld, where both your position and object positions would be estimated to counteract the effects of multiplayer latency. This makes gameplay look much smoother visually, but it also means the positions and motions you see on your screen aren’t necessarily the real positions and motions at the server. For example, a “headshot” on your screen can be a miss on the server or on another player’s screen.
In any case, though, the predictive solutions available to Stadia are inherently more limited for generic games that are not written for Stadia. Game clients designed to run locally will be oblivious to Stadia’s end-to-end rendering and input delays. Games typically render screens with the assumption they will be displayed in the next frame or two, but with Stadia that’s not the case. For example, if you pan left or right, you’d see it “immediately” in local gameplay, but with Stadia there’s an additional round-trip delay. Also, even objects whose motion can be predicted over a multiplayer server connection will experience more delay on Stadia, because the frame was predicted for a time that has already come and gone. So while I agree with you that there are methods to compensate for server lag, Stadia can’t really apply a generic fix without reworking the games themselves. Even in the best case, visual feedback lag will always remain a shortcoming of remote rendering.
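For anyone curious what that “estimating positions” actually looks like, here is the basic shape of the extrapolation (often called dead reckoning). It is a toy sketch, not QuakeWorld’s actual code; the struct and field names are invented.

    /* Toy sketch of dead reckoning: the client keeps drawing a remote object
     * along its last known velocity until a newer server update arrives.
     * Struct and field names are invented for illustration. */
    #include <stdio.h>

    typedef struct { float x, y; } Vec2;
    typedef struct {
        Vec2  pos;      /* last authoritative position from the server */
        Vec2  vel;      /* last authoritative velocity from the server */
        float stamp;    /* server time of that update, in seconds      */
    } RemoteEntity;

    /* Where to draw the entity at local time `now`, assuming constant velocity.
     * If the object actually changed course after `stamp`, this guess is wrong:
     * the "headshot on my screen, miss on the server" problem. */
    static Vec2 extrapolate(const RemoteEntity *e, float now)
    {
        float dt = now - e->stamp;                 /* how stale the last update is */
        return (Vec2){ e->pos.x + e->vel.x * dt,
                       e->pos.y + e->vel.y * dt };
    }

    int main(void)
    {
        RemoteEntity enemy = { { 10.0f, 0.0f }, { 2.0f, 0.0f }, 100.00f };
        Vec2 guess = extrapolate(&enemy, 100.05f); /* 50 ms after the last update */
        printf("guessed position: (%.2f, %.2f)\n", guess.x, guess.y); /* (10.10, 0.00) */
        return 0;
    }

Notice this only hides lag for things whose motion is predictable; it can do nothing about the round-trip delay on your own camera and inputs, which is the part remote rendering adds.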
It will probably be more a matter of being “good enough” than matching the experience of a high end local system. For better or worse I suspect my current ISP solution wouldn’t be able to handle it well, but what else is new, haha.
I can play Halo quite happily on Microsoft’s cloud service. I’m certainly no pro, but I don’t feel like it’s latency or similar that is limiting my skills. This may be thanks to UK infrastructure, but I think it’s just that the service(s) have improved beyond where their old “label” was 5 or 6 years ago. It’s certainly good enough that Microsoft sells an entire Xbox model built around the concept (which is ace, BTW!)
That would be true if most in the US had access to reliable low latency broadband, but they don’t.
Do any of the private US ISPs even lay new fiber these days, except for specific businesses? There is a city near me that has amazing municipal broadband with 1000 Mbps up/down, but there’s nothing close for most Americans. I’m very jealous of them. Some states in the South were looking at banning similar municipal broadband last I heard, so I don’t see that changing any time soon. The private market is not creating similar infrastructure for consumers, despite what we heard about the benefits of removing net neutrality.
kepta,
Even with the best broadband, you’d still get additional latency with compression/internet routing/wifi broadcasting/decompression/etc that you don’t incur with local rendering, but I agree that internet quality is likely the weakest link.
Same.
Haha. This was/is so unsurprising. It’s sad that, with all the public support we had for net neutrality, the government was willing and able to throw it away so quickly 🙁
Internet quality is the weakest link with local hardware too (assuming you’re playing against online opponents). Online play has many of the same limitations regardless: even if I have a top-end GPU, the Internet is still the bottleneck. The same is true for most people since the death of LAN parties.
Adurbe,
Well, there are the predictive solutions that sukru brought up in another post. Consider that in-game objects often have momentum, which makes it possible to predict an object’s path to mask latency. But here’s the thing: local games are designed to mask network latency in the networking code, but generally are not designed to mask latency in local peripherals. Also, like I mentioned earlier, round-trip interaction latency cannot be masked at all, and that can get annoying. It’s weird because you can watch a screen and not notice latency at all, but as soon as you control it with the mouse you can feel it right away. Consider that a big reason games and VR systems benefit from high framerates is not merely that the eye perceives smaller changes between frames, but the perception of responsiveness. VR headsets with too much latency can even give people headaches, because the feedback delay is unnatural.
When I played multiplayer games over dialup, 50 ms pings were typical and still fun. Nowadays it’s realistic to get <20 ms pings, so I don’t think this is bad, although that obviously depends on routing and datacenter locations; rural areas may not be so fortunate. Another thing to keep in mind is that a local game only needs to send object coordinates in a few UDP packets, versus an HD frame 60 times per second. Pings and jitter can get worse under load, especially during prime time when everyone is using high-bandwidth applications at the same time.
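To put rough numbers on that comparison (every figure here is an assumption chosen for illustration, not a measurement):

    /* Back-of-the-envelope bandwidth comparison; every input is an assumption. */
    #include <stdio.h>

    int main(void)
    {
        /* Traditional multiplayer: small state-update packets. */
        double bytes_per_entity = 64.0;   /* assumed payload per entity   */
        double entities         = 30.0;   /* assumed entities per update  */
        double tick_rate        = 60.0;   /* updates per second           */
        double state_mbps = bytes_per_entity * entities * tick_rate * 8.0 / 1e6;

        /* Game streaming: a compressed 1080p60 feed, commonly quoted at roughly
         * 10-20 Mbit/s for H.264-class codecs; take a mid-range figure. */
        double video_mbps = 15.0;

        printf("state updates: %.2f Mbit/s\n", state_mbps);               /* ~0.92 */
        printf("video stream : %.2f Mbit/s\n", video_mbps);
        printf("ratio        : roughly %.0fx\n", video_mbps / state_mbps); /* ~16x  */
        return 0;
    }

Under those assumptions the streamed video needs on the order of fifteen times the bandwidth of plain state updates, and it is far more sensitive to jitter because every frame has to arrive on time.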
Anyways, my point isn't that streaming can't be compelling for some, only that it cannot match the responsiveness of a high-end local system, and not everyone is going to be near a suitable data center. Some people may still find it good enough for them, though, and that's fine.
Latency is easily fixable with fixed routing, player catchment areas, and local placement of servers at the network level. Required latency upper and lower bounds, or required quality of service, also vary depending on the level of formal competitiveness, player expertise, and game type.
Basically, it’s a solved problem and has been for years. Nobody talks about it because the media are lazy, vendors don’t want to give away what they perceive as a competitive advantage, and marketers like to sell the idea that gamers can have everything they want all the time, even when constrained by reality. Vendors are also constrained by money. It’s only the likes of Microsoft et al. (or governments) who can afford the necessary bribes to receive this level of provision. Pretty much everyone else, including major game developers, won’t get a look-in.