Linked by Thom Holwerda on Sat 3rd Feb 2018 14:15 UTC, submitted by Drumhellar
Mac OS X

When users attempt to launch a 32-bit app in 10.13.4, it will still launch, but it will do so with a warning message notifying the user that the app will eventually not be compatible with the operating system unless it is updated. This follows the same approach that Apple took with iOS, which completed its sunset of 32-bit app support with iOS 11 last fall.

This is good. I would prefer that other companies, too, took a more aggressive approach to deprecating outdated technology in consumer products.

Thread beginning with comment 653509
RE[2]: Typical Apple
by Kochise on Mon 5th Feb 2018 06:11 UTC in reply to "RE: Typical Apple"
Kochise
Member since:
2006-03-03

Probably because not every piece of software needs more than 4GB of memory. Things worked pretty well until recently; 32 bits is enough for Word and stuff. If an app really needs to be 64-bit, there are questions worth asking. Not everyone runs a server farm in their garage. At least 32-bit apps are more easily sandboxed on a 64-bit system, which is better for security.

Reply Parent Score: 2

RE[3]: Typical Apple
by bert64 on Mon 5th Feb 2018 08:18 in reply to "RE[2]: Typical Apple"
bert64 Member since:
2007-04-23

You don't always need 64-bit, but the frequency with which you do need it is increasing every day. Consumer laptops now frequently come with >4GB of RAM, and a browser with many tabs open can easily consume more than 4GB. And remember the 4GB limit is address space, not total RAM usage of a process.

Having both 64-bit and 32-bit support requires support in the kernel, two sets of userland libraries, etc., and the 32-bit libraries will contain support for more legacy features (i.e. anything that was deprecated before 64-bit was introduced likely won't have been compiled into the 64-bit builds).

So yes, individual 64-bit apps may consume more resources than 32-bit ones, but having a mix of 32/64 and all the legacy baggage associated with 32-bit libs going back 20+ years could actually result in higher resource usage than a pure 64-bit system.

Then there are the quirks of amd64, where 64-bit mode adds a lot more registers, for example... The lack of registers in 32-bit mode can be a performance bottleneck, which is eliminated by running in 64-bit mode. Many programs run faster even if they don't take advantage of any other 64-bit features.

By supporting only 64-bit you also raise the lowest common denominator: there are more CPU features you can take for granted and use without having to maintain multiple code paths for older processors.

There are many benefits to moving towards pure 64-bit... The stupid thing for Apple is that they never should have supported 32-bit x86 at all... Microsoft has a long legacy of 32-bit x86 support, but Apple moved from PowerPC to x86 *after* amd64 was already established. They could very easily have made OS X 64-bit-only right from the very first non-PowerPC version.

Reply Parent Score: 2

RE[4]: Typical Apple
by Kochise on Mon 5th Feb 2018 10:56 in reply to "RE[3]: Typical Apple"
Kochise Member since:
2006-03-03

I understand your concerns about 64-bit performance and avoiding 32-bit support bloat. However, I have to wonder why browsers need so much memory nowadays; web pages don't feature 4K pictures. Coders should be more frugal about memory consumption.

Apple chose Intel because of a deal, because of better overall performance and power efficiency in 2006 compared to AMD's offering, and also because of the integrated Wi-Fi AMD was lacking (the whole Centrino stuff).

Apple made the transition in early 2006, when the Core 2 Duo would only become available later that year, so the first Intel Macs shipped with 32-bit CPUs.

Reply Parent Score: 1