Linked by Thom Holwerda on Wed 13th Sep 2017 16:40 UTC

With the iPhone X revealed, we really have to start talking about its processor and SoC - the A11 Bionic. It's a six-core chip with two high-power cores, four low-power cores, and this year, for the first time, includes an Apple-designed custom GPU. It also has what Apple calls a Neural Engine, designed to speed up tasks such as face recognition.

Apple already had a sizeable single-core performance lead over competing chips from Qualcomm (which most Android phones use), and the A11 blasts past them in multicore performance as well. Moreover, the A11 also outperforms quite a number of recent desktop Intel chips from the Core i5 and i7 range, which is a big deal.

For quite a few people it's really hard to grasp just how powerful these chips are - and to a certain extent, it feels like much of that power is wasted in an iPhone, which mostly performs relatively mundane tasks anyway. Now that Apple is also building its own GPUs, it's not a stretch to imagine a number of mobile GPU makers feeling a bit... uneasy.

At some point, these Apple Ax chips will find their way to something more sizable than phones and tablets.


I am 100% aware of the popularity of embedded applications and systems. I am also aware that Linux has completely failed to make a dent in desktop market share. This idea that people will magically drop Photoshop to take up Krita and GIMP is false. And, in the interest of full disclosure, I only use Linux systems at home and, where possible, at work. I've been using Linux for 16 years now. I can compile my own kernels, and I write my own desktop software using GTK. I also live with an artist who would burn the house down before giving up Photoshop and the Autodesk suite of programs. This idea that ARM will magically take over everything is a fantasy from people who hate Intel. Legacy applications are a massive driver of IT purchasing decisions. Sure, ARM is popular on mobile/tablet devices, where content is mostly consumed and barely created; raw capture is one thing, deep editing is another. ARM isn't going to suddenly storm into workstations and servers just because people want it to.

Edited 2017-09-14 04:24 UTC
