Apple Inc. is designing a new chip for future Mac laptops that would take on more of the functionality currently handled by Intel Corp. processors, according to people familiar with the matter.
The chip, which went into development last year, is similar to one already used in the latest MacBook Pro to power the keyboard’s Touch Bar feature, the people said. The updated part, internally codenamed T310, would handle some of the computer’s low-power mode functionality, they said. The people asked not to be identified talking about private product development. It’s built using ARM Holdings Plc. technology and will work alongside an Intel processor.
And before you know it, you have a MacBook ARM.
And it’ll probably cost an ARM and a leg.
They’ll probably rename the Intel unit to Leg. The pricing will probably be lopsided, with the Leg costing more than the ARM.
God, I hope not. ARM will offer no benefit over an Intel Mac.
A MacBook already has all-day battery life, and ARM chips simply aren’t as fast as Intel chips at the performance level you’d want in a laptop.
And there’s no way in hell they’ll match Intel on the desktop in the next decade. Current top-end iMacs are 4x faster than the iPhone 7 (the fastest of the iDevices).
Hell, the previous-gen MacBook Pros are more than three times as fast, and that’s with 10 hours of battery life.
Hopefully, this is more along the lines of that laptop somebody made (I forgot who, and my Google-fu is failing me) that had a small ARM computer and external display that functioned as a Windows SideShow device (I think) to check email and play music…
Yeah, I hear ya… But maybe it’s not a MacBook Pro replacement (even though the Pro tag has been a bit tarnished these days).
If it could be something along the lines of the regular MacBook, I guess there is a market for that. My mom, for one.
Which means they could overcharge for a “Chromebook”-specced, all-day-battery, beautifully crafted, locked-in machine.
Not for me, but it seems they are really popular, and if the previous clusterphück proves anything, it’s that the masses are ready to sell their least favorite child to get an under-specced machine.
Then they’d be asking developers to support two architectures for macOS software, which, given the small market share of Apple laptops/desktops, seems like a terrible idea.
Same here: twice the cores, ten times (at least) the TDP. Doesn’t look impossible to me.
That said, I don’t see the point either. As long as ARM doesn’t stomp x86 the way x86 stomped PPC, switching makes no sense. Compatibility and flexibility are too important, and Apple should know that from historical experience.
But there was one thing the old PPC PowerBooks did much better than their Intel MacBook successors: waking from sleep. Things have improved a lot but still aren’t perfect. Maybe Apple can do something about that once they take back control of some of the chipset.
It is marginally faster than the lowest-end MacBook Air, and the performance delta between single and dual core is smaller on the A10 than on the Intel chips in the MacBooks, so adding cores will help far less than you think. And you can’t just “increase the clock speed” of a chip that is likely already hitting the thermal limits it was designed for.
You can’t take a chip designed for a 3W TDP and just crank up the clock speed to match the performance of a chip designed for a 65W TDP, just as you can’t take a 65W-TDP design, drop it down to 3W, and expect the same performance as a chip designed for 3W from the start.
(The Broadwell chip in the MacBook Air, of course, runs at 15W, but Broadwell can ramp up quite a bit higher)
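To put rough numbers on that: first-order dynamic CPU power goes as P ≈ C·V²·f, and higher clocks generally need higher voltage, so power grows much faster than performance. A toy sketch in Python; every figure here is invented for illustration, not a real chip spec:

    # First-order dynamic CPU power: P ~ C * V^2 * f.
    # Raising the clock usually means raising the voltage too, so power
    # grows much faster than performance. All numbers below are invented.
    def dynamic_power(c, v, f):
        """Switched capacitance * voltage squared * frequency."""
        return c * v**2 * f

    base = dynamic_power(c=1.0, v=0.8, f=1.0)     # stand-in for a 3W-class design point
    boosted = dynamic_power(c=1.0, v=1.0, f=2.0)  # 2x the clock, +25% voltage

    print(boosted / base)  # ~3.1x the power for 2x the clock, before leakage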
This actually looks like they’re trying to do what they already do with the iPhone, though: two very small, very low-power cores to run lightweight tasks when the phone is in a pocket, because the A10 itself can’t cut power far enough to do those tasks efficiently, and neither can Broadwell. Since Intel has never had a design for that kind of ultra-low-power usage, and Apple can’t make their own x86 chips, of course they went with an ARM design.
And, using an ARM chip for this purpose doesn’t indicate an eventual move to ARM – there are already close to half a dozen ARM cores floating around various spots on your motherboard, in the form of tiny controllers for various chips.
Your flash storage controller? ARM. Your network controller? There’s likely an ARM chip embedded somewhere. Two or three, depending on whether you have WiFi or multiple Ethernet ports.
Drumhellar,
Many, including Thom, are implying this processor is intended to threaten/overtake the Intel CPU, but I don’t see any evidence this is anything more than another in a myriad of co-processors. I fault the article for the same lack of detail as the last time this topic was covered.
Many who are running the latest generations of Intel processors already have an extra vPro processor on the same die as the x86 cores, which runs independently of the main cores. For example, my newest computer uses this core to run Intel AMT, which hosts a web server I can connect to in order to view logs, disks, CPU, memory, remote-control power/keyboard/mouse, etc., and it’s still there when the main computer is asleep.
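For anyone curious, the AMT web UI listens on the management engine’s own ports (16992 for HTTP, 16993 for HTTPS), which is why it answers even when the host OS is asleep. A minimal sketch using Python’s requests library; the host address and credentials are placeholders:

    # Poke the Intel AMT web UI on its standard HTTP port (16992).
    # Host and credentials below are placeholders, not real values.
    import requests
    from requests.auth import HTTPDigestAuth

    resp = requests.get("http://192.168.1.50:16992/",
                        auth=HTTPDigestAuth("admin", "example-password"),
                        timeout=5)
    print(resp.status_code)  # 200 means the management engine answered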
It sounds like Apple’s co-processor is going to work along the same lines and isn’t intended to replace the main x86 cores to become the “MacBook ARM”.
I don’t think it’s so black and white. The Nvidia Shield TV is more powerful than many entry to midrange laptops on the market; it can stream a 1080p60 video source to four devices at once, transcoding two for viewing over the Internet and direct streaming to two on the local network, via the built in Plex server software. It doesn’t even get warm doing this. By contrast, my MacBook Air can barely play the same 1080p60 video locally without occasional stuttering.
I’d dearly love to be able to convert my Shield into a Linux desktop PC. I think it would blow away any comparably priced mini-desktop and would even give the current Intel NUC line a run for its money. I also think the same technology in an entry-level MacBook would actually speed up macOS compared to how it currently runs on Intel, granted of course that they optimize the OS for ARM as they have with iOS devices.
Your Shield TV is optimised for streaming; that’s hardly a fair comparison. And decoding and transcoding depend on the available hardware acceleration for the chosen codec, not on the general speed of the CPU.
But generally I agree, ARM CPUs can be as fast as x86 CPUs nowadays. Big iPads and small MacBooks already play in the same league.
puenktchen,
I agree, it doesn’t make sense to compare the technology without considering what’s being offloaded to vector-processing GPUs. And when talking about the MacBook Air specifically, we’re not talking about high-end x86 kit to begin with.
I agree with Drumhellar that in terms of absolute CPU performance, ARM has lagged behind, although I haven’t read much about it recently. Theoretically, a huge injection of resources from a wealthy company like Apple could change these dynamics in the future.
Coming from Linux, I’m fairly indifferent to the underlying architecture, so long as it’s open and the device doesn’t put up a fight to keep me from installing my own OS on it. The Windows software ecosystem is generally stuck with x86; look at how ARM-based Windows RT went down horribly for all the reasons we’d expect. While I can’t speak authoritatively for macOS users, I imagine they are somewhere in between.
Since we’re just speculating, I’ll add this: I don’t think Apple has much interest in investing in an ARM laptop to continue where x86 laptops left off. If Apple were to shift away from x86 to ARM, I’d predict the end result would look more like iOS than macOS.
The difference on macOS is that Apple has experience successfully transitioning from one platform to another (just to reiterate: the original article is not about a transition to ARM anyway).
If Apple could scale an ARM chip up to match Intel in some mix of performance characteristics (speed and power efficiency), they could certainly make a go of it, and provide an emulation layer for x86 if they really need it. But they’d really need something compelling in an ARM package, and that looks a long way off.
Assuming Apple gets as much benefit from each dollar of R&D as Intel, and it would take as much R&D spending to design a chip of equal performance (Not great assumptions, but not terrible, either), Apple would have to support an equal R&D budget to Intel on 5% of the market, versus Intel’s 85% of the market.
To put it another way, Intel sells roughly 400 million processors each year, and I think they spend about $3 billion per year on x86 R&D. That works out to roughly $7.50 per processor sold.
Apple sells about 16 million Macs each year. To spend as much on R&D, each Mac sold would have to carry nearly $190 of R&D.
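A quick back-of-the-envelope in Python with those same figures (all of them rough estimates, not audited numbers):

    # R&D-per-chip math; the unit and budget figures are the estimates above.
    intel_rnd_budget = 3_000_000_000   # ~$3B/year on x86 R&D (estimate)
    intel_units = 400_000_000          # ~400M processors sold per year (estimate)
    mac_units = 16_000_000             # ~16M Macs sold per year (estimate)

    print(intel_rnd_budget / intel_units)  # ~$7.50 of R&D per Intel chip
    print(intel_rnd_budget / mac_units)    # ~$187.50 per Mac to fund the same budget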
I get that, but my point was more that, for most uses one would have for a mini desktop, the Shield would be up to the task and would also provide a stellar GPU. My Surface RT with the much older Tegra 3 (precursor to the Shield’s SoC) outperforms my Dell Venue 11 Pro as a laptop-style device. The Venue, in turn, makes the MacBook Air feel like a slouch. Granted, part of that may be OS-related, but in that case Apple should come out on top given their tight control over their devices.
You can do what I did: Get a Jetson TX1 and put it in a SuperMicro desktop chassis. It’s the same hardware as the Shield, but runs regular Linux.
mack,
Like Morgan, I’m always impressed at how much punch you can get out of cheap consumer electronics. But I frequently find it takes way too much time/effort to hack anything onto them, so I was very intrigued when you said this! Just now I looked it up, and it’s three times the price of the consumer Shield; is that right?
http://www.nvidia.com/object/jetson-tx1-dev-kit.html
I admit it seems like a nice piece of kit, but at $600 without even a chassis and only a small 16GB disk, it would be expensive for me personally.
I had trouble finding performance benchmarks for the Cortex-A57 to give a baseline of its performance compared to other desktop processors, so I don’t really know how well it would fare against a $600 x86 desktop system.
I’ve considered it, and if not for the cost involved I would do it. But, I also like the size and layout of the Shield case and using a larger case would negate a lot of the reasoning behind it.
No, it isn’t. The slowest 6th-gen (7th-gen are out now) mobile Intel i3 is faster than the Shield TV, by about 20%.
The graphics are better, sure, but we’re comparing CPU cores here, not graphics.
I don’t recall limiting the discussion to just CPUs. I was talking about the Shield as a mini-desktop PC doing desktop PC tasks, not synthetic CPU benchmarks. In real world tasks the Shield, if allowed to be a Linux PC, would run circles around laptops costing twice as much. I’m basing that on reviews of the X1 dev board which is the closest we’ll get to having something like the Shield as a desktop or laptop machine.
Performance-per-dollar, the Shield as a mythical complete system beats pretty much anything else on the market, and it’s going on three years old now. Unfortunately the cost of the X1 board based on the same SoC negates that performance/price advantage.
Well, considering we’re talking about Apple replacing x86 with ARM chips in Macbooks, we should be limiting the discussion to CPUs.
It doesn’t make sense not to do so.
Thanks for telling me what to say, m’lord. May I go back to scrubbing your floors now, master?
My point stands that the comment you replied to was about the Shield as a potential desktop PC, compared to existing laptops. Again, if Apple were to replace x86 with ARM in their entry level hardware (MacBook/MBA) I predict that the performance would be just fine or perhaps even better, especially given how good Apple is at optimizing for a specific platform.
I’m well aware that ARM CPU performance can be both better than and worse than x86 depending on the specific task, and you can’t simply say “x86 is superior to ARM full stop”. There, I discussed CPU performance, you can do your happy dance now.
Morgan,
Do you have any data at all comparing the TX1 CPU to other general-purpose x86 CPUs? I specifically looked for this data for my last post but could not find it.
In my experience the ARM cores in my mobile devices are slower than the x86 cores in my PCs, but I concede I don’t have any high end ARM devices that I can run benchmarks on.
It outperforms several low to midrange laptop and Ultrabook CPUs in this batch of tests by Phoronix:
http://www.phoronix.com/scan.php?page=article&item=nvidia-tegra-x1&…
Morgan,
Thanks for the info. My exact CPU is actually in their benchmarks: Core i7-3770, how handy is that! It’s quite close to what I would have guessed, actually. My desktop is about 2-5x faster for most of those benchmarks.
I wonder if OpenSSL’s code was hand-optimized for x86, because the Tegra totally tanked, with over an order of magnitude of difference, on the OpenSSL RSA benchmark on page 4:
Tegra X1 Shield: 44.70
Core i7 3770K: 552.17
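If I remember right, the Phoronix OpenSSL numbers are RSA 4096-bit signs per second, which maps to OpenSSL’s built-in speed benchmark; a rough way to reproduce it locally (assumes the openssl command-line tool is installed):

    # Approximate the Phoronix OpenSSL RSA test with OpenSSL's built-in
    # benchmark. Assumes the openssl CLI is on the PATH.
    import subprocess

    result = subprocess.run(["openssl", "speed", "rsa4096"],
                            capture_output=True, text=True)
    print(result.stdout)  # the sign/s column is the figure reported above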
It seems closer to my laptop’s performance, which to be honest I do find slow at times with large projects. Nevertheless, it probably beats the pants off my computers in accelerated gaming and video streaming, because I’ve neglected to upgrade those over the years.
I would be interested in having a general-purpose ARM PC, not merely a repurposed set-top box/mobile device, but something designed to push the performance boundaries as Intel has done.
That’s exactly the point I was making: that an X1-powered laptop would be no worse than a low-to-midrange ultrabook or laptop, specifically a MacBook (which uses the Core M CPU) or an MBA. It would in fact be much faster than an otherwise equivalent x86 laptop in areas such as gaming (admittedly not a macOS priority) and 3D modeling.
There is more to a system than its raw CPU speed, though; I’ve built systems with a Pentium G-series CPU that easily outperformed a same-generation Core i5 system on certain tasks. Raw CPU cycles are one thing, but for a general-purpose machine I think the X1 would more than deliver.
It’s looking more like management. That’s how IBM does it in their servers: there’s a little ARM chip running its own OS that does all the hardware monitoring and sideband stuff.
That’s how Intel does it with their vPro-enabled chips, too.
Well, they use a third-party ARC core rather than ARM, but still.
Apple does not want to work on macOS; that gets clearer every day. They obviously do not want to spend the effort to make it touch-responsive, so they roll an iPod Touch into a long strip and laminate it above the keyboard.
You can buy Windows 10 laptops and tablets with fully touch-responsive displays for the premium Apple asks for a narrow touch pad with a small display. But Apple believes (quite rightfully) that their customers will surrender their wallets for the stupidest thing if it glitters and blings the Apple way. So why do a lot of work making macOS touch-compliant when you can extort $300 for a touchpad?
FWIW, there are still a lot of people who don’t want touchscreen systems. I’m not buying a laptop with a touchscreen until I have no other choice in the matter to get the other specs I want (which looks like it will be a long time from now; I usually go for mobile workstations, and they usually have touchscreens only as an option). Yes, it’s nice sometimes to be able to use a touchscreen, but people still haven’t figured out how to make a matte capacitive touchscreen (which should not be that hard to do, actually), so you get fingerprints and glare all over the place.
You are not *forced* to use a touchscreen with Windows, even if you have it. But I also hate shiny screens with a vengeance: is it not possible to have a matte touch screen, or is it just that they only sell them glossy, for the same unfathomable reason most laptop screens are glossy?
It’s very hard to make a reliable capacitive touchscreen that has a matte finish, and the only practical type of touchscreen currently for a laptop is a capacitive one (resistive has poor image quality and imposes durability constraints on the enclosure for the screen, optical and infrared are too bulky and have poor accuracy, acoustic is impacted by the cleanliness of the screen, APR is impractically expensive, and piezoelectric can’t detect motionless fingers reliably).
As far as the shiny-screen thing, there are two parts to it:
1. It’s eye-catching. Glare, as much as it’s horrendous for doing actual work, really catches people’s attention. It’s the same reason all kinds of cheap electronics have high-gloss enclosures these days. In other words, it’s a marketing gimmick used on people who don’t think about the impact it has on usability (which is sadly a lot more people than you might think).
2. Durability. This sounds odd at first, but it’s actually a factor. More and more laptop screens are using some type of high durability glass instead of the transparent plastic they traditionally have used, and it’s insanely expensive to make glass that has a matte finish and doesn’t impact the color saturation of the image. This is even more important for a touchscreen, because people are actually touching the screen regularly.
… and you will see:
“Intel dipped on the news and ended the day down 0.8 percent at $36.52. Apple rose 6.1 percent to $128.75 in New York.”
Someone made a lot of money on this news.
ARM macOS is not what this article is about (only certain subsystems would run on the specialized chip – and that’s just smart engineering), but I’d be pretty excited to see a full-on ARM macOS (or even something hybrid, like what’s possible with AMD’s x86 chips).
Things have been so boring under the Chipzilla monopoly of the past few years…
I’d even love to see them scale up their PowerVR GPU for desktop use – that’d be a bigger story IMHO.
I can’t see this taking off, mainly because there would be a shitload fewer programs running on OS X ARM vs. OS X x86. Good luck porting code to a niche platform with zero benefits. Why bother?
I can see this driving more devs/users back to Windows as a bunch of companies just drop Apple support rather than spend money on porting. Not to mention they’d have to get Adobe on board, or else the whole platform would just get dumped. Maybe it’s time Apple just bought out Autodesk and Adobe. It’d cost $75 billion, but it’d make Apple the one-stop shop for graphic design forever.