Regardless, the Fifth Age of the Macintosh is at hand. We just don’t know what form it’ll take. The first age began with the original 1984 Mac. The second age was marked by maturity and stability of the environment that came with Mac System Software 6 in 1988. 2001’s OS X did nothing less than save the entire platform. And when Apple finally figured out notebooks – around 2006-2008, with the introductions of the MacBook Pro and the MacBook Air – the company brought the sexy back to the Mac.
The next major step could be a revolutionary spin on the Mac that goes way beyond merely keeping pace with modern computing and makes the Mac into an influential platform once more. We can even dare to hope that by building its own CPUs, consolidating the Mac’s hardware design further, and incorporating iPad manufacturing methods, Apple can finally produce a great Mac that sells for way under $900.
Or, it could be equally significant as The Last Version Of MacOS That Apple Ever Ships.
I have a distinct feeling – and I’ve had that feeling for years now – that something big is about to happen to the Mac. I do not believe that the Mac as we know it today will be around for much longer; what form it will take, exactly, is up for debate, but I wouldn’t be surprised to see the platform slowly but surely move towards ARM, probably from the bottom (MacBook Air) to the top (Mac Pro). MacOS and iOS aren’t going to become unified in the sense they’re the same on an iPhone and a Mac, but they will run the exact same applications, just with different UIs depending on the input method (and screen size) used.
The upcoming Mac Pro might very well be the last traditional x86 Apple workstation.
What you’re really saying is that Apple will copy Microsoft and its Universal App initiative. It hasn’t worked well for Microsoft, so I can’t see it working well for Apple either. The reality is that the software marketplace is over-saturated, so I don’t believe that will be the answer. Personally, I do not know what they will do with the Mac – maybe even kill the product line off?
Like when Microsoft’s efforts at tablets failed over 15 years ago and all us geeks (me included) pointed and laughed at the oversized iPhone that was the first iPad?
When people talk about Apple innovating, they often inadvertently give credit in the true sense of the word: taking something already invented, making it palatable to the masses, and very profitable for themselves.
Not “all us geeks”. While my track record for predicting failures and successes is not perfect, I think a lot of things have been obvious.
The original Windows tablets had to fail; they thoroughly sucked. To this day I do not understand why this was not totally obvious to everyone at the time.
The original iPod had to succeed. I am a geek, but I want my accessories to just work; those who, like the famous Slashdot comment, focused on silly specs and capacity really had no clue how normal people use tech.
The original iPhone instantly made all other phones seem outdated. At that time I had tried different “smart” phones, and the software utterly sucked; they were a pain to use. Again, I am a geek, but my tools have to just work unless I specifically want to tinker with them. (And yes, I know some people liked their Windows Mobile and their Nokia stuff; for me, all of them were painful.)
The original iPad was a device with obvious use cases from the start. Companies today are trying hard to push tablets into content creation instead of consumption, but so far with limited success. For me, a tablet today is also mainly a consumption device. I got the Surface Pro 4 in the hope that it would be a nice hybrid device, but it turned out to be a great laptop and a really, really bad tablet.
Now, back in the mid-’90s I thought desktop Linux stood a chance, but of course it didn’t. Today it is fine as a desktop for many people, though they don’t need it, as whatever OS they have is fine. Same with modern smartphones: mostly geeks care about new OS versions. I would be fine with Android 4.2, though I prefer the look of Android 5+.
Troels,
The original iPhone wasn’t much more than a feature phone with a web browser, though. The second-generation iPhone was more compelling. One thing that made the iPhone special was that it had an unlimited data plan while users of other devices were paying egregious prices for capped data. I think Apple deserves a lot of credit for disrupting carrier business models that discouraged data use on smartphones.
Personally, I hated the original iPhone; it was too limiting in what it would do for me.
But as a computer tech who had to explain over and over again how to perform even simple functions on CLIs and later GUIs, I quickly realized that for people like my mom, my aunts, and everyone else who kept asking the same questions, the iPhone was a godsend.
Apple was the first to hint at universal apps when they launched the iPhone; one of the points they emphasized was that the iPhone used the same OS core, development tools, and APIs as the desktop. Apple backpedaled after Windows 8, since a lot of users complained about it and its tablet-like UI “mess” (which I kinda like). Microsoft clearly rushed the product in order to outsmart Apple.
You don’t do a transition by moving from the bottom of the platform up.
You do it by moving from the top down.
Apple, of all companies, is good at transitions. Be they OS, or CPU arch, they do them frequently, and they’ve done them well.
Name one other major outfit that’s pulled this kind of thing off and is still a player. I’ll wait.
Now that that’s out of the way. Apple has _never_ done a transition successfully where they started with the base model. It’s always been with the power-user machines first, or the entire product line at once. That strategy works. Because the content creators, developers, and makers are the ones that jump to the new stuff first, and then you have additional market forces helping thrust the new tech toward the bottom.
At least, that’s how it’s seemed to work in the past.
When they tried other methods (anyone remember the ebook?), or introduced other products at the bottom of a category’s price range, things just didn’t go well.
Migration from PowerPC to Intel started with the MacBook and Mac mini and finished with the Mac Pro…
What you said makes sense, but Apple did exactly the opposite.
You might want to check your history books on that one.
https://en.wikipedia.org/wiki/Apple%27s_transition_to_Intel_proc…
Transition started with the laptops. MacBook Pro to replace the PowerBook. Then the MacBook to replace the iBook.
Then came the desktops. x86 iMac to replace the PowerPC iMac. Education iMac to replace the eMac.
Finally came the workstations. Mac Pro and Xserve to replace the Power Mac G5.
IOW, what you are arguing against is actually the way things worked (from the bottom to the top).
And, if you go back to the 68000–>PowerPC days, they did it the same way.
The Performa desktops were released first. Then the PowerMac desktops. Then the Workgroup Servers.
First the low-end PowerBook laptops. Then the high-end PowerBook Duos.
IOW, every CPU transition that Apple has done started at the bottom and worked its way up the product lines.
It’s just a personal opinion you hope will be vindicated at some point, nothing more.
The article misses one big thing: using ARM on both phones and desktops will allow Apple to turn every iPhone into a Mac. With USB-C they won’t even need a docking station. Connect a hub with a keyboard, mouse, and screen, and the apps and OS will instantly switch to desktop mode. And suddenly, the Mac becomes the biggest desktop platform. This is where Google is heading too, with the merger of Android and ChromeOS.
Or, looked at another way: it will turn every MacBook into an iPhone, complete with a socket for a phone SIM, possibly 5G, for ubiquitous connectivity.
Think about eSIM
I can see Apple moving their laptops to ARM and iOS. I can also see them keeping the prices up; the devices would be stylish and very thin, with beautiful displays and huge battery life. But they would be about as useful as Chromebooks. Which may work for some of their users, even if from the outside it may look ridiculous.
They should not be any less powerful (relatively speaking) than any other machine on the market right now.
Their SOCs are quite impressive, and equal current low to mid-end desktop processors in some benchmarks. And that’s operating within the very limited thermal and power envelopes in phones.
There is no doubt in my mind that the performance of these machines would be quite good. And that’s not even considering a dedicated SOC line for desktop/laptop applications.
Performance shouldn’t really be a problem.
OTOH …
Apple just _loves_ to tighten down their hardware.
While these machines would perform quite well and be perfectly adequate (if not downright impressive) performance-wise, I doubt they would actually be very versatile, or even useful at all outside of the Apple ecosystem.
I doubt you could just go out and install an armhf, arm64 or aarch64 distro on these.
It is very likely that they will come with a locked bootloader, much like an iphone, ipad or whatever other ithing they will come up with.
And that, my friends, would be very, very sad.
Add to that their push for running signed binaries only (through the app store), and … well … seems we might end up with some sort of ipad pro pro computing appliance.
I do hope I’m wrong, though.
From what I see, desktop ARM is comparable with low-end Intel, and it wins when you talk about performance per watt or the ability to add multiple cores.
From the Mac users I know (the sample is small; Macs are not popular around here), most go for high-end tasks, like photo/video editing, where the gap is large.
How many ARM-based desktops have used chips functionally equivalent to x86 ones, though? I have yet to see an ARM-based ‘desktop’ style system running a Snapdragon 845 or an Exynos 9 (and even those aren’t quite equivalent to a Core i7, let alone a Ryzen 7). I’ve seen some with Tegra SoCs, but they invariably use low-end ones. Similar with Allwinner’s A-series SoCs. I’ve also seen a number based on i.MX or OMAP SoCs, but those are nowhere near x86 chips in terms of raw processing power.
However, there’s not really anything actually preventing the use of a really nice ARM chip in such a system. If someone were to, for example, make such a device with 16 ARMv8.3-A cores running at more than 3 GHz, with good PCIe support and the ability to use commodity GPUs and NICs, and then made sure it wasn’t restricted to running Windows, it would sell amazingly well and very likely match a nearly equivalently clocked Ryzen 7 for performance.
Something like this : https://www.amazon.com/HiKey-Single-Board-Computer-Octa/dp/B073P8BW9…
Some, if not most, of that heavy processing can be offloaded to a GPU, though, which makes the raw computational power of the main CPU less relevant.
Apple could start by working together with AMD to create a Mac Pro with both x86 and Apple ARM cores. That would give Apple and developers the chance to learn how to make desktop apps on ARM. They are also working on a GPU, so maybe use the Mac Pro as a place to try huge GPUs and 65-watt ARM SoCs.
For most developers it shouldn’t be more than a recompile away…
My assumption is that ARM cores behave differently from x86 cores. So if you want to go to 35-watt ARM cores on the desktop, you may want to start using them early to optimize your compiler, but also to look at optimizations in how the OS and programs work.
That assumption might not be entirely accurate… once the underlying infrastructure is there for multi-architecture support, it’s fairly straightforward from the point of view of application devs – during the Mac PPC->x86 transition they didn’t have to care about it much; and devs of most Linux apps hardly even think about all the archs that, say, Debian supports.
Sure, Apple may move away from x86, but I don’t think macOS is going to fade away. I expect you’ll see low-end Macs running on ARM, with higher-end workstation Macs running ARM with an x86 co-processor. You will probably also see a software-based compatibility layer, much like Microsoft has implemented on ARM Windows, and Apple has implemented twice before, during the 68k>PPC and then the PPC>x86 transitions.
Apple are betting on ARM because they don’t want to be caught out with a poorly performing and power-hungry architecture, as they were in the PPC era. Intel came and blew PPC out of the water when it came to performance per watt, and now ARM is doing the same to Intel. Apple doesn’t want to be stuck with a platform that’s horribly power-hungry and produces enough heat to warm a small house, so they’re going over to ARM.
This doesn’t mean the end of the mac. No, it’s just the beginning.
Why would they bother with a translation layer if all they need to do is use the already present fat binary (multiarch) support, as was done before during the PPC->x86 transition?
App Store apps can be downloaded for the OS’s target arch; non-App Store apps would ship with fat binary executables.
So that Apple actually sells a reasonable amount of the first generation of non-Intel systems? It’s not a case like Linux where you can just rebuild the world for whatever new architecture you are using. Existing software won’t run right away on the new systems without a compatibility layer, and a lot of people aren’t going to switch until their preferred applications run on the new system.
Think for example of why Intel had to do an x86 compatibility layer in their Itanium chips (yes, I know it was a crap compatibility layer, and that Itanium was a flop, but that’s beside the point).
Fat binaries are primarily there to support older hardware during the transition. An x86 emulator would be implemented on ARM machines for running older non-fat binaries, whereas x86 machines would require a fat binary to work with modern software. They’re two sides of the same coin, but definitely not interchangeable.
ARM is very much low power, low processing. It does not compare to x86 in terms of raw processing power. Switching to ARM would solidify Apple products as expensive boutique toys instead of work machines.
It’s hilarious that anyone would think Apple would release cheaper devices. Apple’s essentially a fashion boutique now… why would they charge less when people are more than willing to pay more?
The current slew of MacBooks is already as sealed as the iPads. Not upgradable nor really repairable.
I agree Apple is more likely than most to do something different and big again. However it will primarily mean something to the user. Any impact on the motherboard, or bootloader will be a means to an end. I am thinking cloud, voice control, machine learning, FPGA, ultra secure identity, eternal data, or whatever combo blows the world away. Chances are lower without Jobs, but still higher than for any of the others IMO. It will be existing innovations combined in a crazy insane way, hopefully.
Moving from x86 to ARM would kill Boot Camp. Are Mac users ready for that?
ultrabill,
They could always emulate x86 on ARM, which would not be any worse than what Microsoft decided to do with its own ARM derivative. It wouldn’t be the most performant thing, but it just has to be “good enough”. They’ve gone the emulation route before, and modern CPUs are often over-provisioned for the tasks users put them to anyway. Supporting “Boot Camp” through emulation could fill that checkbox.
I think this whole thing is getting over-hyped by people who want to get excited, but there isn’t much to actually get excited over. Supporting ARM is fine – operating systems and software really should be portable (proprietary software holds that back) – but from a user perspective, switching CPU architectures doesn’t add much wow factor. It all looks the same on the surface. All processors, including x86 and ARM, have been capable of delivering more than most users need for quite some time now.
My prediction is that macOS will continue to get iOS-ified. Maybe there will be some flashy bling that comes with it, but the platform is likely to become more restrictive and controlled, because why not… people will buy it anyway. A company with so much money has a lot of marketing options; some kind of exclusive content deals wouldn’t surprise me.
It would be worse: the OS would also run on an emulator… (with Windows on ARM, at least the OS is always native)
zima,
True, but in that case Boot Camp on ARM really ought to support the native (ARM) version of Windows.
Also, for the kinds of low-intensity software that are good candidates for emulation, full OS emulation is probably OK too, as long as it’s not too bloated.
I doubt Apple would care enough, though, especially since it would likely be harder than Boot Camp on x86 Macs – ARM devices probably wouldn’t be based on the same standards as x86 Macs and PCs. And who knows if/when MS will even offer boxed Windows for ARM…
I’ll just install Windows 10 for ARM. Problem solved.
Assuming it will even be available stand-alone… And don’t forget about hardware differences likely much larger than between ordinary PCs and Macs now.
So many developers use macOS, and you can’t program an iOS app on iOS itself, so… fix that first, and then think of all the other developers using things like Android Studio (Java required), etc. They will drive developers away unless they allow x86 apps to run on a non-macOS OS. It’s seriously a popular platform for developers nowadays. It runs every tool needed, from heavy IDEs to Bash CLI tools, built in or via Homebrew or similar. It would be serious. Linux is my favorite development platform, but macOS is great on the go for its battery life, etc.
Anyway, first Xcode must run on their new architecture – or must developers run Windows to develop iOS apps?
Well, I’m sure Apple is acutely aware of this, but it’ll be hard. No big IDE will want to develop its own version just for a new non-x86 platform.
Baby steps maybe. Like macOS still supports X11.
Full Bash CLI, freedom to install third-party software (the macOS App Store is trash), and full file system control, at a minimum.
I don’t get it, really. Intel works great for Apple; they manage 15-hour battery life, etc. Everywhere I look, at the university or among developers, there are mostly MacBooks. It seems it’s only enterprise keeping the userbase percentages where they are – something I don’t really understand. Maybe it’s because I’m from a rich country: 80-90% of my non-tech friends have MacBooks as well.
And you see them in all series and movies. I know it’s because 1) Hollywood has a lot of them lying around, so it’s easier unless MS pays them, and 2) Apple probably pays, but really I can only remember Dr. House showing Apple in the credits.
Like in old episodes of The Big Bang Theory: Sheldon uses a case (or, when seen from behind, some Alienware-like machine), but when they zoom in it’s clearly a MacBook inside a case. The funny thing is that he’s using Windows XP, or some custom UI they often put in series that looks like XP, which is ridiculous, as Sheldon has said several times that he’s a Linux fan. He’s even seen using MSIE several times. They have science people making sure things are more or less accurate, but they fail in those aspects.
My dream would be a change from Darwin to Linux, which already has great Intel support – with Apple improving that support for longer battery life, plus AMD drivers, so everyone, including Linux users, could benefit from Apple’s contributions. Basically, the only FOSS project they have left after leaving CUPS behind is WebKit, since it was taken from KHTML and therefore keeps its license. That’s why we have gtkwebkit, qtwebkit, etc., though more are moving to CEF (Chromium Embedded Framework), like Spotify did.
olejon,
The Simpsons writers are known for their scientific backgrounds, which comes through in the series. However, most of Hollywood is total garbage when it comes to accuracy. When actors are raking in millions of dollars per episode, they could afford to pay a crack team of good science writers for a whole year with that kind of money, but it would mostly be lost on their audiences, so they just don’t bother.
Personally, I enjoy “The IT Crowd” more than “The Big Bang Theory”. While both shows use gimmicks, the dynamics in The IT Crowd seem more relatable to actual tech workers.
Norway. University is free, so there isn’t that “ah, dude’s in university so dude can afford more” factor. I was studying medicine, and of the programs that are hard to get into (top grades required), we were among those who cared less about clothing, for example – far away from economics and law – but it was general, also in the humanities.
But it’s the same on Norwegian TV shows – talk shows and fun/silly programs with guests and such. News programs use Windows PCs, easy to see, but on any entertainment program, if a PC is on the table or pulled out, it’s a MacBook – a personal one, or a guest’s, for instance. And like many other news channels, including international ones I watch, several use iPads + AirPlay when they show something on a bigger screen. They don’t hide it: the UI is there, the AirPlay icon, and they even say “let’s just connect the iPad”. It’s weird, because at least sometimes they mess up and go to the home screen or activate some gesture – though iOS is simple in the sense that they know the “rescue button”: just press Home and find Safari, for instance. They don’t seem to know macOS can do the same; we’re talking mirroring here.
What they need is a dead-easy technology to send JUST the selected content to a bigger screen – anything from a browser tab to a selected rectangle, easily changed to a new rectangle when ready – with no battery indicators or chrome or glaring white Safari in my face. Like a super-easy “wireless HDMI”: a plug in the PC and a plug in the big screen, all open source with an open protocol, plus pairing. (Sounds like Chromecast, but Chromecast can’t be used like that – for instance, mirroring from a PC from something other than Chrome without much lag – and it’s still proprietary.) I know there is Miracast and such, but it doesn’t seem to make any difference; I know nobody who uses it. I guess the boss has just seen his iPad and Apple TV work fine at home. And not to mention the iPad is still shown so many times when they mean tablets in general. The state TV news doesn’t do that; they have rules to follow, which the news anchors seem to have actually learned.
Well, if proper PCs become just for developers and those needing high-end software – people who simply need more power and a proper keyboard and mouse – prices for a proper PC will skyrocket as demand declines, and us hobby developers will disappear and become Angry (Birds) idiot consumers only. Only with a super idea and an investor would buying one be possible, or with rich parents or whatever. Like going back to the ’80s and early ’90s. So bye-bye hobby developers, and gamers who use a Mac for gaming, I suppose (not that many, but for some it’s what they’ve got). I doubt big games – as opposed to mobile games made for touch and moving the device – will be ported to run on both architectures.
Also at the official government IT agency: among those using Windows (most) – developers, sysadmins, tech support, etc., with some 2-5 monitors each – it’s pretty common, when there’s a break or nothing to do, to pop out the MacBook, if it isn’t already there, and connect to the guest network (still 300 Mbps per client). Pretty insane with all those big monitors available, but probably a way to disconnect from work. At first I thought, why not an iPad? They’re just browsing, mostly, but with your feet on the desk no iPad is comfortable to use, as it sits on your lap; even with a good/big iPad and a good keyboard/trackpad accessory, the touchscreen would be unusable, since reaching for the screen would be ergonomically very weird. I used Linux on my workstation and brought my Mac mostly for testing.
If they replace Intel, I suppose they will go one-OS-only – not like Ubuntu’s convergence effort, for instance, where the device turns into a desktop OS when connected to a keyboard (MS Surface style), but just iOS made more keyboard- and mouse-friendly. But everything made for macOS rather than iOS would need a drastic design change, as I’m pretty sure Apple won’t allow apps that suck with touch into the Store – like MS and proper Office when Windows 8 came out. And speaking of the Store, Apple would probably “finally” get – or let’s say “take” – control over the entire ecosystem: Store-only apps, a 30% cut of all app and in-app purchases, which would be the model for a lot of power software (unlocking features, unlocking when the trial runs out…)
And we’re talking several years down the road. People won’t exactly jump ship from a perfectly working Mac, and if software starts to demand the new arch, people will be pissed.
Who knows what Intel will have achieved in a few years? Maybe they’ll have surpassed ARM by far in performance while still being as good as or better than ARM regarding battery, or new battery tech will make thinking and tweaking all day long for battery life obsolete.
Then Apple will just do it out of pure self-interest: full ecosystem control, everything going through the Store, etc.
And if Apple suddenly becomes the only one using ARM on the desktop, what then? Back to the PowerPC situation. Intel would probably go all in to make their chips superb for whatever Google’s OS or OSes will be in the future, and thus have an almost-monopoly on both desktop and mobile – hopefully with AMD following, so there’s competition.
You also fall under selection bias – because, for example, StatCounter shows ( http://gs.statcounter.com/os-market-share/desktop/norway ) that, while macOS does have a somewhat higher usage share in Norway than the world average, Windows is by far the dominant OS in Norway…
That’s the weird thing. I know, I know. It’s kind of like when you look around you and it seems like everyone uses iPhones, even in poor countries (in war documentaries, etc.), while Android has like 90% market share worldwide.
Would like to see stats for “people not at the office or working via VPN to the office (enterprise), aged 15-35”. Basically, young consumers. I can barely remember a house party, pre-party, or after-party with a Windows laptop connected for Spotify or something. Not to bash Windows.
I know tech guys who buy Windows laptops because they work from home for the office and it requires Windows, or they have their own company and use some Windows-only software in telecom, etc. And many regular users just don’t see why they’d spend all that money on the half-eaten apple, as they don’t care about the logo or fashion or whatnot, and find a decent PC in a physical store – there are many cheap, decent ones now, so for many a good choice. And for pro users wanting Windows, nothing wrong with that. I’m not a regular user; I build my desktop and use Linux. But when I started university and needed a new laptop, nothing came close to the MacBook Air’s 12-15 hour battery life. I could go all day, as the programs I used aren’t exactly video-rendering programs. I never brought the charger – great, because there aren’t enough outlets in many auditoriums, and no charging in the hall during breaks.
As for the users above, I usually end up cleaning their machines of OEM bloatware and programs installed by mistake (“that yes button must be the right one to click”). Just a few weeks ago I had a run in my new building – rumours had spread that I knew stuff, as usual. Elderly people – not to generalize, but to say they know little about tech – needed my help with something, all with Windows 10 PCs, generally not shabby ones either: stylish ASUS machines and such. Just some printer that didn’t work, or something. Lots of bloatware, unfortunately. I had hoped MS would ban OEMs from taking money to include unnecessary third-party programs if they shipped with Windows 10; OEMs wouldn’t have had much choice then but to refuse money from bloatware makers and make the product somewhat more expensive, but with happier users and everything snappier.
But it was pretty obvious these folks chose Windows because they were used to it: they used MSIE, and almost all of them still used the homepage almost all Norwegians used back around 1996-2002 (like homepage.no). I hadn’t seen it in more than a decade; it hasn’t changed much except for tons more ads taking advantage of these poor folks – probably why it was so slow to load, with zero control over added plugins and stuff. AND they still had Firefox and Chrome installed and on the taskbar. I asked them if they used them. Nope. So I just uninstalled those, as these folks would hesitate a lot over a browser change. The PC wasn’t shabby, and after the cleanup MSIE ran better, though still very slow compared to Edge, for instance – so after a cleanup and the usual tweaks that tend to help depending on hardware specs, not Windows’ fault.
Not all of us fall under such bias. I can probably count on the fingers of one hand the number of times I’ve seen an iPod in my country (not counting my own iPod). Similar with iPhones – I see mostly Androids.
Also, BTW, it’s not just Norway. Have you looked at hacker conventions, etc.? Some attendees run Linux on them, yes – they usually don’t film the screens, of course, but sometimes it’s possible to see – and mostly it’s macOS, as far as I’ve seen, with the other PCs running mostly Linux. A strange number of Macs. Even at Linux conventions, with Macs running Linux with all hardware supported (mine just lacks webcam support, and Wi-Fi needs a little hack since it’s proprietary, IIRC, but it’s doable; it boots and works perfectly if you just use Ethernet and no webcam). The latest models usually don’t work. The Touch Bar probably never will – or who knows, there may be very dedicated Linux hackers doing everything to make it usable, more usable than Apple has made it. As various post-2011/2012 MacBooks usually require at least one hack to make all internal hardware work, there are some very dedicated folks out there already, it seems, with wiki pages for the different models. Maybe for the credit, or because they like Apple’s laptops, of which there are many great ones.
olejon,
I do see many Macs at Linux user groups, but nowhere near the “80-90%” figure.
I feel like Apple’s been ignoring my demographic for years. I don’t mind having a little weight for a more professional machine with more expandability, fewer dongles, a good user-replaceable battery, a good keyboard, solid construction, etc. These are all things Apple has moved away from in order to cater to the ultra-thin/light/small device category, but they’ve made too many sacrifices, especially when dongles are needed for things I still use. I can’t stand the new keyboards and non-tactile controls; it gives the machines a low-quality feel, IMHO. It’s funny, because I used to point to Apple as an example of how all laptop manufacturers should make their keyboards. Well, no longer. It’s fine that Apple offers laptops in the ultra-light category, but IMHO they have lowered the bar for power users.
My 80-90% figure was in no way about Linux conventions, if you read that post. I just said there’s also a surprising number at such conventions, and at hacker conventions – meaning those who haven’t installed Linux and use macOS trust the OS not to talk too much to the “mothership”, and I doubt they’ve read all 40 ALL-CAPS pages of the macOS Terms and Conditions. That goes for MS too, of course. But they’re more careful – or used to be, before Windows 10 – about what they send back, as they’re so into enterprise that no company would use Windows if it sent anything remotely personal. With 10 there’s basically a ton of privacy settings to check out, not all in one place. Enterprise tools for group policies are much more powerful with the latest Windows Server and Windows 10, and the privacy-invading parts are probably disabled by default when joining a domain anyway.
I agree, I don’t like Apple’s new keyboards and lack of ports either, though I’ve only tested them in stores. I don’t mind a slightly thicker laptop if you give me at least three USB ports, at least one of them legacy, or USB 3. DisplayPort works directly over Thunderbolt – they’re the same, physically compatible – so that’s OK, at least with Thunderbolt 1.0, which my top-spec MacBook Air 13″ 2013 has. And I’m so glad I have that keyboard and not the newer one; I don’t miss anything about the newer models’ exterior. I have just as good or better battery life, and MORE PORTS. But yeah, if I want HDMI I need a dongle for the Thunderbolt port, and the dongle – sold by Apple but made by Belkin – kills HDMI-CEC, so all my HDMI equipment that talks together (like the amp turning on when the TV turns on) must be re-scanned after disconnecting the dongle. And yes, it’s the dongle (which obviously has chips in it, as it’s somewhat big just to convert DisplayPort to HDMI – probably some Apple-certified chip shit), not the MacBook or macOS: even with the MacBook disconnected and just the dongle connected to the TV by HDMI, HDMI-CEC is still broken. I haven’t really checked for firmware updates for it, but I doubt there are any. I have double-checked that it breaks both Sony’s and Samsung’s HDMI-CEC. If there were a firmware update, macOS should detect and apply it, since the dongle is after all connected to a Thunderbolt port… unless the dongle only uses the DisplayPort part of the port.
All in all I agree. I don't know what the last good MacBook Air or MacBook Pro was, in terms of keyboard, trackpad, and ports (and no Touch Bar), but I remember the MacBook Air 2013 was praised (the generation before it was a revolution, but the noise was horrible and it was pretty slow), which is why I bought it, with top specs except storage space, since most of my data is in the cloud or on an external drive.
But at my parents' house, for instance, where I use it as a desktop with an external 27″ monitor, a Logitech MX mouse, and a Razer keyboard, it uses all of its USB ports; the two it has are really only enough when it's used as a laptop. A Thunderbolt dock is something I'm probably going to buy. I just need to check whether Thunderbolt 3 is compatible with 1.0, the physical port at least, because if not, the available docks/dongles may all be USB-C only…
I formatted my work MacBook Pro and put Linux on it. I work for a managed services provider and I'm basically given free rein to do whatever I want with my computers; management has no issue with it, and I can get all my work done, so I have that freedom. The build quality of the old MacBook Pros is why I use them; the 13″ 2011/2013 models are great, and all the hardware works in Linux, with a few tweaks to get rid of some annoyances. Switching away from x86 is going to be a big deal-breaker for a lot of people: ARM Linux is not the same as x86 Linux, all the Steam apps will go away, and big packages like Maya and Photoshop won't exist for ages, if at all.
Just out of curiosity, which MacBook exactly is it, and did you have to configure anything special for some hardware to work? As I said, on mine (a top-spec MacBook Air 2013 13″) you need a hack for Wi-Fi, and last I checked there was no solution for the webcam.
I'm also given total freedom in that sense, working for the state IT agency. We manage all public services, so it's big. They have to physically connect me to a different network for my Mac to get any connection, because both the Ethernet and Wi-Fi networks use an AD certificate that is installed on first login for access. That only works on Windows, though I'm pretty sure it would be possible on another OS if they gave me the certificates as files, since NetworkManager supports that, but they don't do that.
So they just connect me to a test/lab network that bypasses this. The switch is only available in one office, which is where I work when I'm there anyway; it's probably what consultants get if they absolutely must use their own equipment, though that's rare, I've never seen it. Otherwise I'd be stuck on the guest Wi-Fi, which can't access anything (servers etc.), and as a developer and systems administrator that access is essential. Glad they always give me that freedom. They know very well I work best under Linux, and since all the servers except AD and file are Linux, it just works better. We're required to use at least 25% FOSS, so they're always happy when I do my assignment and set up a system they want on a stable FOSS stack, or develop it on my own, usually as a web solution, so it's open. They even have a GitHub. It's mostly internal systems; on the external front it's all Linux, OAuth etc., so no problem with the percentage.
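For what it's worth, NetworkManager really can do certificate-based 802.1X from the command line. A minimal sketch, assuming EAP-TLS and entirely hypothetical names: the interface, SSID, identity, and certificate paths below are placeholders, not anything from the actual setup:

```
# Hypothetical EAP-TLS Wi-Fi profile; all names and paths are placeholders.
nmcli connection add type wifi ifname wlan0 con-name corp-wifi ssid CorpNet \
    wifi-sec.key-mgmt wpa-eap \
    802-1x.eap tls \
    802-1x.identity "user@example.org" \
    802-1x.ca-cert /etc/certs/corp-ca.pem \
    802-1x.client-cert /etc/certs/user.pem \
    802-1x.private-key /etc/certs/user.key

nmcli connection up corp-wifi
```

The same 802-1x.* properties work on a wired profile (type ethernet). Of course, none of this helps unless the admins actually hand over the certificate and private key as files.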
Hm, when I stopped watching The Big Bang Theory about four years ago (gotta start again), I don't think Linux had even been mentioned, and Sheldon definitely uses Windows on some occasions. I remember one episode where he says something like "my new laptop came with Windows 7… Windows 7 is much more user-friendly than Windows Vista… I don't like that." Anyway, it would be cooler if Sheldon used something more obscure, like Haiku.
BTW, I remember an indie movie with BeOS prominently visible; I think it might even have been reported here…
Yeah, one of Sheldon's better comments regarding IT. Also, when he had finally freed up enough space to install Linux: something like "come to papa, my dear Ubuntu".
At the very least it should be the distro of physicists, Scientific Linux, not Ubuntu…
zima,
Indeed. I haven't watched the show in several years, but what's used on the show doesn't mesh with Sheldon's "outlier" character. BeOS or Haiku would be more fitting! I'd even take a VAX for ridiculous comedic effect. It's the difference between a real computer geek and someone who just plays one on TV, haha. It's what you get when Hollywood targets general audiences and things get dumbed down.
It's a shame, but it could push current technology far ahead. It is possible to create a full computer the size of a stamp, but only if you own the IP for all the parts of a computer, and the factories, of course. This could mean much better battery life and better performance, since all the components are closer to each other. Apple should start with the Apple Watch and then move up, but initially they are going to mix their own parts with other manufacturers'.
The next major step could be a revolutionary spin on the Mac that goes way beyond merely keeping pace with modern computing and makes the Mac into an influential platform once more.
Since the second, third, and fourth "steps" didn't make the Mac an influential platform, chances are the fifth won't either.
Desktop computing for me died once an external GPU became available. A laptop with an upgradeable graphics card/permanent dock for keyboard/mouse/drives/other devices completely eliminated any need for me to maintain a gaming PC.
I think for most people there’s no separate “desktop” and “laptop” – just “PC”…
They have positioned themselves as an expensive platform for wealthy fashion victims. I don’t see revolution coming from the heights.
I really don't see it as a viable solution for specialized/professional/heavy/graphics use. I can't imagine rendering, heavy 3D modeling, CAD applications (full ones, not the basic versions), and other professional tools from other areas running on ARM, not in a timeframe of 5 to 7 years. And if they do, I don't think these machines will be competitive with x86-64.
The article defines an “age” in a very weird fashion. As a dev who’s worked on nearly every type of Mac, I’d define it this way:
1st age: all-in-one 68K Macs
2nd age: NuBus based 68K Macs
3rd age: NuBus based PPC Macs
4th age: PCI based PPC Macs
5th age: x86 based Macs
6th age: x86_64 based Macs
So will ARM based Macs be the seventh age? We’ll see…
OS X 10.4 was the last release for me, and I still run it on PPC.
I wanted a UNIX-dev-friendly workstation and laptop, preferably PPC-based. 10.4 was the last release I still recognised, and where porting between Solaris/HP-UX/AIX and the then-ailing IRIX wasn't such a time-suck.
Bye bye, Apple. Your focus is media-consumption devices to earn profits, not the UNIX and workstation markets. Good luck, wish you well, bye.
IMO it is far more likely that Apple will drop desktops and macOS completely in a few years. The replacements will be locked-down, ARM-powered portable devices.