Linked by Thom Holwerda on Wed 13th Sep 2017 16:40 UTC
Apple

With the iPhone X revealed, we really have to start talking about its processor and SoC - the A11 Bionic. It's a six-core chip with two high-power cores, four low-power cores, and this year, for the first time, includes an Apple-designed custom GPU. It also has what Apple calls a Neural Engine, designed to speed up tasks such as face recognition.

Apple already had a sizeable single-core performance lead over competing chips from Qualcomm (what Android phones use), and the A11 blasts past those in multi-core performance as well. Moreover, the A11 also outperforms quite a number of recent desktop Intel chips from the Core i5 and i7 range, which is a big deal.

For quite a few people it's really hard to grasp just how powerful these chips are - and to a certain extent, it feels like much of that power is wasted in an iPhone, which is mostly doing relatively mundane tasks anyway. Now that Apple is also building its own GPUs, it's not a stretch to imagine a number of mobile GPU makers feeling a bit... uneasy.

At some point, these Apple Ax chips will find their way to something more sizable than phones and tablets.

Thread beginning with comment 648833
Darkmage Member since:
2006-10-20

I am 100% aware of the popularity of embedded applications and systems. I am also aware that Linux has completely failed to make a dent in desktop market share. This idea that people will magically drop Photoshop to take up Krita and GIMP is false. And in the interest of full disclosure, I only use Linux systems at home and, where possible, at work. I've been using Linux for 16 years now. I can compile my own kernels, and I write my own desktop software using GTK. I also live with an artist who would burn the house down before giving up Photoshop and the Autodesk suite of programs. This idea that ARM will magically take over everything is a fantasy from people who hate Intel. Legacy applications are a massive driver of IT purchasing decisions. Sure, ARM is popular on mobile/tablet devices, where content is mostly consumed and barely created - but raw capture is one thing, deep editing is another. ARM isn't going to suddenly storm into workstations and servers just because people want it to.

Edited 2017-09-14 04:24 UTC

Reply Parent Score: 4

woegjiub Member since:
2008-11-25

Soon enough, it's not going to matter much what CPU it's running.

You may not have noticed, but there are proof-of-concept in-browser versions of the heavy hitters like CAD packages and Photoshop.

With either wasm or electron, it's trivial to use the full extent of a platform's power, and do you really think Adobe is going to keep letting their software get pirated like it does?

Google's office suite and MS Office Online are only the start; it's all going completely subscription-based - and what better way than by requiring everyone to use an OTA-updated, universally compatible app?

Hell, look at desktop and mobile already - a huge percentage of modern software is just web apps in wrappers.
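To make that concrete, here's a minimal sketch of what one of those wrappers looks like. This uses Electron's real API, but the window size and the URL of the hosted app are just placeholder assumptions:

    // main.ts - a bare-bones Electron shell around a hosted web app
    import { app, BrowserWindow } from 'electron';

    app.whenReady().then(() => {
      const win = new BrowserWindow({ width: 1280, height: 800 });
      // The "desktop app" is really just a Chromium window pointed at the web app.
      win.loadURL('https://app.example.com');   // placeholder URL
    });

    app.on('window-all-closed', () => app.quit());

Ship that with an auto-updater and the vendor controls the whole experience over the air.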


Even if you don't see the writing on the wall for desktop apps, there are large numbers of "professionals" for whom an iPad Pro is already sufficient. Stick a keyboard on it and resurrect the "iBook" branding or something - do you really think the Adobes of the world are going to stand around while competitors like Sketch eat their breakfast?

Edited 2017-09-14 05:00 UTC

Reply Parent Score: 2

shotsman Member since:
2005-07-22

Ah, the mythical Web based thingy.

Naturally it relies upon an always-on internet connection that charges by the bit for the data going over it.

So there I am on a shoot, taking a whole bunch of images with my new Nikon D850 (45.7 MP) - say around 24 GB for a decent day in the field.
1) How long to copy that lot up to the cloud for the cloud version of Lightroom to work on it?
2) How much will that cost me from the middle of the Amazon rainforest? Do you want an extra arm with that?
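
On point 1, a back-of-envelope sketch (the uplink speed is an assumption - pick your own number):

    // Rough upload time for a day's shoot; all figures assumed, not measured
    const shootGB = 24;          // size of the day's raw files
    const uplinkMbps = 10;       // optimistic mobile/satellite uplink speed
    const totalMegabits = shootGB * 8 * 1000;         // GB -> megabits (decimal units)
    const hours = totalMegabits / uplinkMbps / 3600;  // seconds -> hours
    console.log(hours.toFixed(1) + ' hours');         // ~5.3 hours, before overhead and retries

And that assumes a stable connection you often won't have in the field anyway.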

Sure, for a lot of people the cloud/web versions will work. But for a huge percentage of photographers out of the studio? Forget it.
Have laptop, will travel and process images.

Reply Parent Score: 4

ahferroin7 Member since:
2015-10-30

"With either wasm or electron, it's trivial to use the full extent of a platform's power"

Haha, no.

You can't even use the full extent of a platform's power in Java (at least, not if you plan on compiling to JAR files instead of native executables); claiming you can do it in Electron is a complete joke, and anyone trying to say otherwise is either being paid to do so or has no idea what that many levels of indirection do to performance. Electron is why VS Code and Discord's desktop app get such crap performance compared to natively compiled tools. The same has conventionally applied to things built on Adobe AIR and Mozilla's XULRunner.

WebAssembly makes things better, but it's still limited in performance because of the translation overhead.

Portability is the enemy of performance. Portable builds of software written in Java, a CIL language, or even something like WebAssembly ship as bytecode that has to be interpreted or JIT-compiled at run time, not as native machine code, and that hurts performance. In fact, the only case I've ever seen where such code can perform better is Lua, and that only applies in very specific situations where the algorithm in question happens to be implemented more efficiently by Lua's runtime than by whatever native runtime you're comparing it to.
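
To show where part of that overhead lives, here's roughly what calling into a Wasm module from the browser looks like (the module name "filters.wasm" and its "sharpen" export are hypothetical):

    // Load a Wasm module and call one of its exports from TypeScript
    async function runFilter(): Promise<number> {
      const { instance } = await WebAssembly.instantiateStreaming(
        fetch('filters.wasm'),   // hypothetical compiled module
        {}                       // imports the module expects, if any
      );
      // Every call below crosses the JS <-> Wasm boundary, with arguments and
      // results marshalled between the two worlds - that crossing isn't free.
      const sharpen = instance.exports.sharpen as (width: number, height: number) => number;
      return sharpen(640, 480);
    }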

By the same token, iOS is so efficient because it targets a single platform, and macOS is as well. Conversely, Android supports at least three CPU architectures (x86, MIPS, and ARM), not accounting for varying bit widths, and it runs on a kernel that supports a lot more (SPARC, POWER, SH, Alpha, HPPA, M68k, S/390, RISC-V, and more than a dozen architectures most people have never heard of).

Note that I'm not saying that portability is bad, just that it's not always the best answer (especially if it's not a client node you're running on).

Reply Parent Score: 4

Kochise Member since:
2006-03-03

Then why are there ARM-based servers, or at least attempts at them? What is so particular about x86 that ARM cannot do? ARM plays video and games, displays pictures and the web, handles office use; even lightweight clients run on ARM.

Virtual machines (.NET, Java) and JIT compilation make it so easy to port - or even just run - almost any kind of software on ARM that it is baffling you believe it won't work just because desktop Linux failed. I'm pretty sure Adobe could port Photoshop and whatever else suits them if they found an economic interest in doing so.
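
A trivial illustration of what I mean - the exact same script, no recompile, because the runtime does the architecture-specific work underneath:

    // Runs unchanged on x86 or ARM; only the Node.js runtime is per-architecture
    import * as os from 'os';
    console.log(`arch: ${os.arch()}, platform: ${os.platform()}`);
    // e.g. "arch: x64" on a desktop, "arch: arm64" on an ARM board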

I was a 68k fanboy but turned agnostic, because the flaws in x86 were slowly removed and because the CPU implementation isn't really important provided it counts reliably and accurately. I'm pretty sure ARM will slowly take over the world, sooner or later. I'm not even sorry for the nearly 40-year-old x86.

Reply Parent Score: 2

Sidux Member since:
2015-03-10

RISC/ARM has been available for decades. Even Apple didn't use Intel x86 CPUs for a long time; the main reason it switched was compatibility and attracting lots of developers.
The main problem with Unix these days is that Linux does pretty much everything at a fraction of the cost.
IBM, Oracle, HP, and I guess even Apple are fully aware they have to let it go at some point.

Reply Parent Score: 2