Linked by Thom Holwerda on Wed 13th Sep 2017 16:40 UTC
Apple

With the iPhone X revealed, we really have to start talking about its processor and SoC - the A11 Bionic. It's a six-core chip with two high-power cores, four low-power cores, and this year, for the first time, includes an Apple-designed custom GPU. It also has what Apple calls a Neural Engine, designed to speed up tasks such as face recognition.

Apple already had a sizeable performance lead over competing chips from Qualcomm (what Android phones use) in single-core performance, and the A11 blasts past those in multicore performance, as well. Moreover, the A11 also performs better than quite a number of recent desktop Intel chips from the Core i5 and i7 range, which is a big deal.

For quite a few people it's really hard to grasp just how powerful these chips are - and to a certain extent, it feels like much of that power is wasted in an iPhone, which is mostly doing relatively mundane tasks anyway. Now that Apple is also building its own GPUs, it's not a stretch to imagine a number of mobile GPU makers feeling a bit... uneasy.

At some point, these Apple Ax chips will find their way to something more sizable than phones and tablets.

Thread beginning with comment 648835
woegjiub Member since:
2008-11-25

It's not going to matter much what CPU it's running soon enough.

You may not have noticed, but there are already proof-of-concept in-browser versions of heavy hitters like CAD packages and Photoshop.

With either wasm or electron, it's trivial to use the full extent of a platform's power, and do you really think Adobe is going to keep letting their software get pirated like it does?

Google's office suite and MS Office Online are only the start; it's all going completely subscription-based - and what better way than by requiring everyone to use an OTA-updated, universally compatible app?

Hell, look at desktop and mobile already - huge percentages of modern software are webapps in wrappers.


Even if you don't see the writing on the wall for desktop apps, there are large numbers of "professionals" for whom an iPad Pro is already sufficient. Stick a keyboard on it, resurrect the "iBook" branding or something, and do you really think the Adobes of the world are going to stand around while competitors like Sketch eat their breakfast?

Edited 2017-09-14 05:00 UTC

Reply Parent Score: 2

shotsman Member since:
2005-07-22

Ah, the mythical Web based thingy.

Naturally it relies upon an always-on internet connection that charges by the bit for data going over it.

So there I am on a shoot, taking a whole bunch of images with my new Nikon D850 (47MP) - say around 24 GB for a decent day in the field.
1) How long does it take to copy that lot up to the cloud for the cloud version of Lightroom to work on it?
2) How much will that cost me from the middle of the Amazon rainforest? Do you want an extra arm with that?

Sure, for a lot of people the cloud/web versions will work. But for a huge percentage of photographers out of the studio? Forget it.
Have laptop, will travel and process images.

Reply Parent Score: 4

Alfman Member since:
2011-01-28

shotsman,

Ah, the mythical Web based thingy.

Naturally it relies upon an always-on internet connection that charges by the bit for data going over it.

So there I am on a shoot, taking a whole bunch of images with my new Nikon D850 (47MP) - say around 24 GB for a decent day in the field.
1) How long does it take to copy that lot up to the cloud for the cloud version of Lightroom to work on it?
2) How much will that cost me from the middle of the Amazon rainforest? Do you want an extra arm with that?

Sure, for a lot of people the cloud/web versions will work. But for a huge percentage of photographers out of the studio? Forget it.
Have laptop, will travel and process images.


Yeah, I was hearing on the news about people who got their power back after the hurricanes but still had no internet or cell service. Internet access may be extremely limited for some until the infrastructure is fixed. Obviously this is a major failure mode for "cloud apps" that would otherwise work fine as local apps (things like GPS could be extremely useful, but you're screwed if you rely on an online service like Google Maps).

Of course these are drastic circumstances, but cloud services can and do fail under normal circumstances too - Amazon outages, Google outages, Microsoft outages, ISP outages... One of the very few games I reluctantly bought off Steam was a multiplayer party game from Jackbox Games. I thought it would be fun to use during a holiday party. Lo and behold, the Jackbox cloud service was connecting and disconnecting all night. This failure mode would not have been an issue with a local version. And despite the fact that I own a perpetual license, it will stop working whenever they decide to take down the service. With local software, you can run it on your terms, but with remote "cloud" software, you become completely dependent.


Technology has swung between local and "cloud" computing since the earliest mainframe days. The main difference is that back then the trends were driven by cost and technological factors. These days the decision to use remote services is often made for advertising, snooping, and marketing reasons, even when it conflicts with robust engineering.

Edited 2017-09-14 13:42 UTC

Reply Parent Score: 3

woegjiub Member since:
2008-11-25

Service Workers.

There's no reason those exact same apps won't work without an internet connection, as long as you've loaded the page even once.
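For illustration, here's a minimal cache-first service worker sketch of the kind that makes this possible (the `sw.js` filename, cache name, and app-shell paths are all hypothetical placeholders):

```javascript
// Minimal cache-first service worker sketch (hypothetical sw.js).
const CACHE = 'app-shell-v1';                  // hypothetical cache name
const SHELL = ['/', '/index.html', '/app.js']; // hypothetical app-shell paths

// Cache-first strategy: use the cached response if we have one,
// otherwise fall back to the network.
function cacheFirst(cached, fetchFromNetwork) {
  return cached || fetchFromNetwork();
}

// Only register handlers when running in a real service worker context.
if (typeof self !== 'undefined' && typeof caches !== 'undefined') {
  self.addEventListener('install', (event) => {
    // Precache the app shell so it is available offline.
    event.waitUntil(caches.open(CACHE).then((c) => c.addAll(SHELL)));
  });
  self.addEventListener('fetch', (event) => {
    event.respondWith(
      caches.match(event.request).then((cached) =>
        cacheFirst(cached, () => fetch(event.request))
      )
    );
  });
}
```

The page would register this once with `navigator.serviceWorker.register('/sw.js')`; after that first load, fetches for the app shell are answered from the cache even with no connection.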

Reply Parent Score: 2

tylerdurden Member since:
2009-03-17

Jesus, the more things change... the more they stay the same.

I remember reading similar posts where people used to film cameras couldn't envision the feasibility of digital imaging and processing photos in the field.

When the automobile was introduced, people used to horses wondered what happens when you run out of hay.

Reply Parent Score: 2

ahferroin7 Member since:
2015-10-30

With either wasm or electron, it's trivial to use the full extent of a platform's power


Haha, no.

You can't even use the full extent of a platform's power in Java (at least if you plan on compiling to JAR files instead of native executables); using it in Electron is a complete joke, and anyone saying otherwise is either being paid to do so or has no idea what that many levels of indirection do to performance. Electron is why VS Code and Discord's desktop app get such poor performance compared to natively compiled tools. The same has conventionally applied to things built on Adobe AIR and Mozilla's XULRunner.

WebAssembly makes things better, but it's still limited in performance because of the translation overhead.

Portability is the enemy of performance. Portable builds of software written in Java, a CIL language, or even WebAssembly all ship as bytecode that has to be interpreted or JIT-compiled, not as machine code, and that hurts performance. In fact, the only case I've ever seen where this can perform better is Lua, and that only applies in very specific situations where the algorithm being used happens to be more efficiently implementable in Lua's interpreter runtime than in whatever native runtime you're comparing it to.
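To make the WebAssembly case concrete, here's a complete hand-assembled wasm module (just an `add` function) instantiated from JavaScript; the engine compiles these portable bytes into machine code at load time, which is where the translation overhead lives. The byte layout follows the standard wasm binary format:

```javascript
// A complete WebAssembly binary for:
//   (func (export "add") (param i32 i32) (result i32))
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,       // "\0asm" magic + version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // one function of that type
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export it as "add"
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section, one body, no locals
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0, local.get 1, i32.add, end
]);

// Synchronous compile + instantiate (works in browsers and Node.js).
const instance = new WebAssembly.Instance(new WebAssembly.Module(bytes));
console.log(instance.exports.add(2, 3)); // 5
```

In practice you'd produce the binary with a toolchain (e.g. Emscripten) rather than by hand, but the host-side loading step looks the same.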

By the same virtue, iOS is so efficient because it's single platform. macOS is as well. Conversely, Android supports at least 3 CPU architectures, not accounting for varying bit width (x86, MIPS, and ARM), and it runs on a kernel that supports a lot more (SPARC, POWER, SH, Alpha, HPPA, M68k, S/390, RISC-V, and more than a dozen architectures most people have never heard of).

Note that I'm not saying that portability is bad, just that it's not always the best answer (especially if it's not a client node you're running on).

Reply Parent Score: 4