Linked by Thom Holwerda on Sat 27th May 2017 09:26 UTC
Apple

Apple is working on a processor devoted specifically to AI-related tasks, according to a person familiar with the matter. The chip, known internally as the Apple Neural Engine, would improve the way the company's devices handle tasks that would otherwise require human intelligence - such as facial recognition and speech recognition, said the person, who requested anonymity discussing a product that hasn't been made public. Apple declined to comment.

It's interesting - and unsurprising - that while Google is investing in server-side AI by developing its own custom AI hardware, Apple is apparently investing in keeping AI local. It fits right into the two companies' differing approaches to privacy.

As a sidenote - isn't it interesting how, when a new technology comes around, we first offload it to a dedicated chip, only to bring it back into the main processor later on?

no hope
by unclefester on Sat 27th May 2017 09:36 UTC
unclefester
Member since:
2007-01-13

Apple's culture of secrecy prevents them from doing any bleeding-edge research. The top neuroscientists all have an academic background. They aren't going to put up with rules that prevent them from publishing in journals or appearing at academic conferences.

Reply Score: 5

RE: no hope
by cranfordio on Sun 28th May 2017 04:59 UTC in reply to "no hope"
cranfordio Member since:
2005-11-10
RE[2]: no hope
by kwan_e on Sun 28th May 2017 08:22 UTC in reply to "RE: no hope"
kwan_e Member since:
2007-02-18



A slide is no substitute for actual papers. Until then...

Reply Score: 2

RE[3]: no hope
by cranfordio on Sun 28th May 2017 15:41 UTC in reply to "RE[2]: no hope"
cranfordio Member since:
2005-11-10
Math Coprocessor
by Isolationist on Sat 27th May 2017 10:00 UTC
Isolationist
Member since:
2006-05-28

Reminds me of the days when a math coprocessor was an optional add-on for the 8086 etc.

Reply Score: 2

RE: Math Coprocessor
by JLF65 on Sat 27th May 2017 14:16 UTC in reply to "Math Coprocessor"
JLF65 Member since:
2005-07-06

It's more like the early MP3 decoders for computers. They were a separate chip, since no processor (that you could afford) had the power to decode MP3s in real time.

Reply Score: 2

RE[2]: Math Coprocessor
by Drumhellar on Sat 27th May 2017 19:08 UTC in reply to "RE: Math Coprocessor"
Drumhellar Member since:
2005-07-12

Um... what? The MP3 format was created the same year the Pentium was released, and the Pentium was fast enough to play MP3s. By the time MP3 saw any reasonable level of adoption, CPUs that were more than adequate to play MP3s were already quite cheap.


You must be thinking of MP3 encoders - Creative Labs had a model of their SB Live card with an on-board encoder that could encode faster than real time. (IIRC, even when the first Athlons were released, MP3 encoding time was still often used as a benchmark.)

Reply Score: 4

RE[3]: Math Coprocessor
by JLF65 on Sat 27th May 2017 23:09 UTC in reply to "RE[2]: Math Coprocessor"
JLF65 Member since:
2005-07-06

No, I meant decoders. I did say "YOU COULD AFFORD". ;) I don't know about you, but I wasn't about to spend more on my computer than my car at the time, so I didn't get a Pentium, and neither did many others until the price dropped. Most folks were still using 386 and 486 processors when mp3 came out. Hell, I knew folks using the 386SX and even a few 286s. The Pentium was hideously expensive until the P2 came out and drove the price down.

Edited 2017-05-27 23:10 UTC

Reply Score: 1

RE[4]: Math Coprocessor
by Drumhellar on Sat 27th May 2017 23:30 UTC in reply to "RE[3]: Math Coprocessor"
Drumhellar Member since:
2005-07-12

Except MP3 didn't receive widespread adoption until several years AFTER the format was created. At that point, even the lowly Pentium 75MHz was dirt cheap (Hell, it was pretty cheap when my parents bought one in '95), and it was powerful enough to play MP3s.

For comparison, the Diamond Rio PMP300 didn't come out until late '98, and Napster wasn't released until '99. Before that, there wasn't much interest in MP3, and certainly not enough for dedicated decoder cards.

Are you sure you're not thinking of MPEG cards for video? Those did MPEG audio, too, but were mostly for video.

I mean, I'm trying to find an example of a card with MP3 decoding; besides MPEG video cards, the only one I can find is one released in 2001 - well past the point where MP3 decoding was any sort of burden.

Do you have any examples of such hardware?

Reply Score: 3

RE[5]: Math Coprocessor
by unclefester on Sun 28th May 2017 02:03 UTC in reply to "RE[4]: Math Coprocessor"
unclefester Member since:
2007-01-13


I mean, I'm trying to find an example of a card with MP3 decoding; besides MPEG video cards, the only one I can find is one released in 2001 - well past the point where MP3 decoding was any sort of burden.

Do you have any examples of such hardware?


I recall separate MPEG decoders. I can't remember MP3 cards.

Reply Score: 3

RE[5]: Math Coprocessor
by JLF65 on Mon 29th May 2017 15:16 UTC in reply to "RE[4]: Math Coprocessor"
JLF65 Member since:
2005-07-06

Here's one you can still get, but stuff like this was out for most low-end computers at the time. Your only other choice (if you had at least a 386 or 68030) was MP2, which almost beat out MP3 at the time because of its lower computational requirements.

http://amigakit.leamancomputing.com/catalog/product_info.php?produc...

Reply Score: 2

RE[6]: Math Coprocessor
by Drumhellar on Mon 29th May 2017 18:24 UTC in reply to "RE[5]: Math Coprocessor"
Drumhellar Member since:
2005-07-12

That was first released in 2012. I'm still waiting for an example of something people would have used for their 486s or earlier, as you claim existed.

Edited 2017-05-29 18:36 UTC

Reply Score: 2

RE[7]: Math Coprocessor
by JLF65 on Tue 30th May 2017 13:45 UTC in reply to "RE[6]: Math Coprocessor"
JLF65 Member since:
2005-07-06

I said LIKE THAT, not that exactly. They've been making chips specifically to decode MP3 audio in hardware since the format came out, and nearly every line of low-end computer, from the C64 to PCs, had at least one or two of these things you could plug in to listen to MP3s.

Reply Score: 2

RE[3]: Math Coprocessor
by Carewolf on Mon 29th May 2017 14:32 UTC in reply to "RE[2]: Math Coprocessor"
Carewolf Member since:
2005-09-08

Even a 486 could play MP3s no problem. I remember doing that, though encoding just a few songs kept the computer working for more than 12 hours, and the resulting 40 MB of MP3s was 25% of my disk space and wouldn't compress with Stacker (runtime disk compression for DOS).

Reply Score: 2

RE[4]: Math Coprocessor
by JLF65 on Mon 29th May 2017 15:18 UTC in reply to "RE[3]: Math Coprocessor"
JLF65 Member since:
2005-07-06

Yeah, but that was ALL you could do. You needed a Pentium or a 68060 to do anything else while playing an MP3.

Reply Score: 2

RE[2]: Math Coprocessor
by Isolationist on Sat 27th May 2017 21:52 UTC in reply to "RE: Math Coprocessor"
Isolationist Member since:
2006-05-28

That one must have passed me by.

Reply Score: 3

RE[3]: Math Coprocessor
by dionicio on Mon 29th May 2017 16:11 UTC in reply to "RE[2]: Math Coprocessor"
dionicio Member since:
2006-07-12

Isolation has its consequences ;)

Reply Score: 2

RE: Math Coprocessor
by Lennie on Tue 30th May 2017 07:34 UTC in reply to "Math Coprocessor"
Lennie Member since:
2007-09-22

This is where it is going, because single chips are not getting much faster.

Reply Score: 2

CPU as resources manager?
by BlueofRainbow on Sat 27th May 2017 12:17 UTC
BlueofRainbow
Member since:
2009-01-06

The general-purpose nature of the CPU tends toward using it more as a resource manager, with dedicated chips for specific functions.

For now, having the AI-derived methodologies/tools reside on a separate chip will ease their implementation. It allows the AI processing units to be optimized for the relevant data types and instruction structures. Later, one could easily see integration onto the CPU die, similarly to what occurred with the math coprocessor and the graphics coprocessor.
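To illustrate what "optimized for the relevant data types" can mean in practice, here's a minimal NumPy sketch (purely illustrative - not Apple's design): neural-net inference tolerates low precision, so a dedicated unit can do its multiply-accumulates with narrow integer types instead of float32.

```python
import numpy as np

# Hypothetical float32 weights and input, as a CPU/GPU would store them.
w = np.array([0.12, -0.5, 0.33, 0.9], dtype=np.float32)
x = np.array([1.0, 0.25, -0.75, 0.5], dtype=np.float32)

# Quantize both to int8 with a per-array scale factor.
scale_w = np.abs(w).max() / 127.0
scale_x = np.abs(x).max() / 127.0
wq = np.round(w / scale_w).astype(np.int8)
xq = np.round(x / scale_x).astype(np.int8)

# Integer multiply-accumulate (what a dedicated MAC array would do),
# rescaled back to real units once at the end.
approx = int(wq.astype(np.int32) @ xq.astype(np.int32)) * scale_w * scale_x
exact = float(w @ x)
print(exact, approx)  # nearly identical, despite 8-bit storage
```

The integer path loses almost nothing in accuracy here, which is why narrow fixed-point arithmetic is such an attractive target for a dedicated unit.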

Apple and Google are taking different implementation paths according to their business models. For Apple, it's a dedicated local chip for future devices to sell. For Google, it's a dedicated "cloud" server to attract traffic and sell more advertisements.

Reply Score: 4

Future CPUs can be reprogrammed.
by Alfman on Sat 27th May 2017 16:12 UTC
Alfman
Member since:
2011-01-28

I've said it before, but I'm certain future CPUs will include large areas dedicated to FPGA fabric, so that they can be reprogrammed to run specialized routines extremely quickly. Fixed-function coprocessors quickly become obsolete, but with an FPGA, developers will keep finding clever ways to reprogram them for years, even adding functionality the manufacturers never anticipated!

Reply Score: 3

dionicio Member since:
2006-07-12

I hope you're right. Local "Apple style" AI will bring that much closer. Once it "perceives" a benefit, it will move code to the FPGA array.

Reply Score: 2

dionicio Member since:
2006-07-12

Easy things - like the paths you use on each journey, the camera settings you prefer for different scenarios, etc. - could be handled directly by a small AI ;)

Note: drawing, photo-taking, and photo recall can benefit a lot from small AI :O

Reply Score: 2

dionicio Member since:
2006-07-12

On-device agents starting to "seriously" guess your future wishes and needs are going to bring a new evolutionary phase to UI :/

Reply Score: 2

Lennie Member since:
2007-09-22

I think this is an idea many would like to see, but so far FPGAs are still pretty slow relative to their cost.

Reply Score: 2

Alfman Member since:
2011-01-28

Lennie,

I think this is an idea many would like to see, but so far FPGAs are still pretty slow relative to their cost.


I know - I find myself very intrigued by the possibilities of FPGAs, yet I can't justify the hardware and licensing costs of acquiring one for myself.

Perhaps we're in the "chicken vs. egg" stage, but I believe economies of scale will rapidly drive prices down, as has happened with other integrated circuits in consumer electronics. It just makes sense for tech to evolve in this direction.

Reply Score: 2

Lennie Member since:
2007-09-22

Seems kind of affordable:

"The $99 Arty Evaluation Kit enables a quick and easy jump start for embedded applications ranging from compute-intensive Linux based systems to light-weight microcontroller applications. Designed around the industry’s best low-end performance per-watt Artix®-7 35T FPGA from Xilinx."

https://www.xilinx.com/products/boards-and-kits.html

https://www.xilinx.com/products/boards-and-kits/arty.html

But maybe you still need software ?

Which needs a license.

Edited 2017-05-30 19:08 UTC

Reply Score: 2

Alfman Member since:
2011-01-28

Lennie,

Seems kind of affordable:

"The $99 Arty Evaluation Kit enables a quick and easy jump start for embedded applications ranging from compute-intensive Linux based systems to light-weight microcontroller applications. Designed around the industry’s best low-end performance per-watt Artix®-7 35T FPGA from Xilinx."


Nice find.

It interests me a great deal, but these skills are nowhere close to the PHP & SQL & business app clients I typically get. I don't know if there'd be any demand for FPGA-accelerated hosting services, haha, but I'd be happy to give it a shot if they'll pay for it!

Edit: Obviously dedicated FPGA processors could be put to better use in other domains, but that's the awesome thing about an FPGA, it can be re-purposed for all kinds of tasks.

Edited 2017-05-31 01:57 UTC

Reply Score: 2

Lennie Member since:
2007-09-22

It truly is nothing like the others, because the ones you mentioned are software development. Programming an FPGA is hardware design, which means understanding electronics and chip design. The languages used are the same ones used for generating hardware designs: Verilog and VHDL. An example: https://github.com/sergeykhbr/riscv_vhdl/tree/master/rocket_soc/rock...

Reply Score: 2

Alfman Member since:
2011-01-28

Lennie,

It truly is nothing like the others, because the ones you mentioned are software development. Programming an FPGA is hardware design, which means understanding electronics and chip design. The languages used are the same ones used for generating hardware designs: Verilog and VHDL.



I've read that they're making progress with C and OpenCL compilers for FPGAs.

https://www.altera.com/products/design-software/embedded-software-de...

I believe this will eventually replace VHDL for most "commodity" FPGA applications.

Reply Score: 2

Lennie Member since:
2007-09-22

Yes, I know, but I would expect you still need to understand hardware design to do something really useful with it?

Reply Score: 2

tomchr
Member since:
2009-02-01

I would speculate that Apple is interested in more vendor lock-in/control, to increase the difficulty of hackintoshing and of jailbreaking iOS and macOS. They will provide a secure platform for Apple Pay, hardware locks, and possibly APFS locks as well.

This AI chip is NOT good news to me.

Reply Score: 3

MysterMask Member since:
2005-07-12

Is the lock-in of the Google / Facebook / Amazon / Microsoft / <insert any cloud service you depend on> services any better?

If the Apple way can keep the data local instead of spilling it into the cloud, I'd rather go with that. So yes: it might be good news to get an alternative to whatever those companies riding the cloud money machine provide today - even if you end up paying more for Apple hardware.

Reply Score: 2

tomchr Member since:
2009-02-01

Proprietary hardware - with the ability to uniquely identify you, your traits, and your transactions, and the ability to shut down hardware - is not a good thing. It is naive to believe that it is just an "AI" processor to enhance battery life and usability. This is not a question of whether Apple will be using cloud vs. local services - they are using both.

Reply Score: 1

Pattern
by przemo_li on Sun 28th May 2017 13:33 UTC
przemo_li
Member since:
2010-06-01

The pattern of first developing a separate chip, then integrating that technology into the main CPU, comes from the physics of electronics.

You've maxed out the number of transistors in Your chip doing useful stuff.
Then You have this nice idea about something extra, so You can only add that something extra as a separate chip.
Then the total practical number of transistors in a chip rises, and You can integrate Your chip. In fact that may be a very good idea, since the total number of transistors rose but not all of them can work at the same time (the power budget is too small).
So adding that extra functionality to the main chip ends up being a very good idea.

What is amazing here is that the industry was able to pace itself in a way that this very pattern recurred for decades.

Reply Score: 2

why are custom chips bad?
by derstef on Sun 28th May 2017 17:26 UTC
derstef
Member since:
2012-07-27

The Amiga had some custom chips which gave it light-years of advantage over all other computers available in those days. Why does everybody think it's bad to go that way? Look at sound, for example: the onboard codec stuff is still total crap, while dedicated sound cards are much, much better.

Reply Score: 1

Pavlovian reflexes...
by dionicio on Mon 29th May 2017 14:45 UTC
dionicio
Member since:
2006-07-12

Local was always needed. Your hope is also mine. Nothing excludes Apple from working on AI at both extremes.

On the positive side: Apple is on the right path, this time.

Reply Score: 2

not silicon, binary tech.
by dionicio on Mon 29th May 2017 14:53 UTC
dionicio
Member since:
2006-07-12

"...only to then bring it back into the main processor later on?"

The math coprocessor. Also remembering the physics and security companion chips.

The few science headlines I have seen about low-power neural chips suggest they are not silicon, binary tech.

Reply Score: 2

Local and AI ?
by Lennie on Tue 30th May 2017 07:40 UTC
Lennie
Member since:
2007-09-22

So far, modern AI research shows you need a lot of data to train these AIs properly. You can't do that on just a small device.

So my prediction is that the type of features/products Google will deliver will be superior to what Apple will deliver for its customers.

Reply Score: 2

RE: Local and AI ?
by Alfman on Tue 30th May 2017 18:54 UTC in reply to "Local and AI ?"
Alfman Member since:
2011-01-28

Lennie,

So far, modern AI research shows you need a lot of data to train these AIs properly. You can't do that on just a small device.

So my prediction is that the type of features/products Google will deliver will be superior to what Apple will deliver for its customers.


A neural net needs to be trained with a lot of data, but once trained, it's trivial to deploy the pre-computed neural net to a consumer device. For example, a self-driving car's neural net could be trained with terabytes' worth of image/sensory data, but the net itself might only be a million neurons.


Personally, I much prefer this approach of running things locally over "the cloud". A neural net running on the phone can handle extremely high bandwidth with very little latency. Algorithms running in the cloud require some inherent compromises: reduced data fidelity/compression, lag, and less reliability. Let's not forget the incident when Steve Jobs didn't have enough bandwidth to demo his own iPhone; bandwidth-intensive functionality isn't ideal for the cloud, but is ideal for local neural nets.

So I think local neural nets will be beneficial for real-time interaction. That said, who knows whether Apple will actually do a good job. ;)
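A toy sketch of that train-big/deploy-small split, in Python with NumPy (the random weights here are stand-ins for a real pre-trained network; the point is that on-device inference reduces to a couple of matrix multiplies):

```python
import numpy as np

# Stand-in for weights learned elsewhere on a huge dataset; on the device,
# inference needs only these (comparatively tiny) arrays.
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((64, 16)), np.zeros(16)  # hidden layer
W2, b2 = rng.standard_normal((16, 4)), np.zeros(4)    # output layer, 4 classes

def infer(x):
    """Classify one input vector entirely on-device - no network round trip."""
    h = np.maximum(x @ W1 + b1, 0.0)   # ReLU hidden activations
    return int(np.argmax(h @ W2 + b2))  # predicted class index

x = rng.standard_normal(64)  # e.g. a feature vector from a camera frame
print(infer(x))              # a class index in 0..3
```

The terabytes of training data never leave the data center; only the weight arrays ship with the device, which is what makes the local approach practical.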

Reply Score: 2