
The jury is in: monolithic OS design is flawed

The security benefits of keeping a system's trusted computing base (TCB) small have long been accepted as a truism, as has the use of internal protection boundaries for limiting the damage caused by exploits. Applied to the operating system, this argues for a small microkernel as the core of the TCB, with OS services separated into mutually-protected components (servers) - in contrast to "monolithic" designs such as Linux, Windows or MacOS. While intuitive, the benefits of the small TCB have not been quantified to date. We address this by a study of critical Linux CVEs, where we examine whether they would be prevented or mitigated by a microkernel-based design. We find that almost all exploits are at least mitigated to less than critical severity, and 40% completely eliminated by an OS design based on a verified microkernel, such as seL4.

Apple and Google are heading in the same direction

But I think the reason that this year's WWDC felt a little Googley is that both companies are trying to articulate a vision of computing that mixes AI, mobile apps, and the desktop. They're clearly heading in the same general direction.

It's becoming ever harder to distinguish the two companies. They are clearly trying to work towards the same future, but they're coming at it from different directions. It's fascinating to watch.

Apple: Macintosh a better platform for 3D animation than Amiga

This is quite a find by Cabel Sasser. Apparently, Apple is still hosting an article dedicated to arguing the Macintosh is a better platform for computer-generated video content than the Amiga (part 1 and part 2). It does so by explaining how easy it supposedly was to create Pencil Test, a short 3D animated video made on the Macintosh II.

Some have seen non-Apple solutions that include a single, Amiga-based package with automated, three-dimensional, frame-by-frame generation of NTSC video sequences. The package also handles the problems of hiding window borders/title bars, genlocking, and so on.

Most have seen the "Pencil Test" video and feel that the quality of this video is acceptable, but they were told by one of the other vendors that Apple invested incredible resources into creating "Pencil Test" and that the process used for "Pencil Test" was very time-consuming and inefficient.

What was the exact process for the creation of "Pencil Test"? How many people worked for how long to produce the video?

The publish date at the bottom of the currently published version of the two-part article is 2012, but this is clearly just the result of some automated migration process from an old database to a new one. The actual publishing date of the article is probably from around when Pencil Test was released - so somewhere between 1988 and 1990.

The Amiga had carved out a decent niche for itself as a 3D animation and special effects platform in the late '80s and early '90s. Famously, the science fiction TV series Babylon 5 used Amiga Video Toasters for its special effects in its first few seasons, making it one of the first TV series to move to digital special effects over the use of models. Apple clearly wanted in on this market, and the support article is part of that effort.

And the article is bizarre. In it, Apple argues the merits of the open, modular system, the Macintosh, and condemns the integrated, hardware-and-software-designed-together approach of the Amiga.

There are advantages and disadvantages both to the totally integrated systems and the open modular systems. Totally integrated systems' advantages include having hardware and software tied directly together and having one place to get support. Disadvantages include being locked into the one company's point of view about how to do things, working only with their tools, and, often, being locked into that company's software. An integrated solution on non-Macintosh systems is most likely pieced together from a variety of third-party products.

I can't assess the merits of all the technical claims being made about the capabilities of the Macintosh and its software at the time compared to those of the Amiga, since that's way beyond my area of expertise. Still, this article is a deeply fascinating relic from a bygone era, and I can't believe Apple is still hosting it.

A year of Google and Apple maps

Shortly after I published my Cartography Comparison last June, I noticed Google updating some of the areas we had focused on.

Coincidence or not, it was interesting. And it made me wonder what else would change, if we kept watching. Would Google keep adding detail? And would Apple, like Google, also start making changes?

So I wrote a script that takes monthly screenshots of Google and Apple Maps. And thirteen months later, we now have a year's worth of images.
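The author doesn't describe the script itself, but the setup could look something like this minimal sketch: request the same locations from each provider every month, and name the saved images so they sort chronologically. The locations, endpoint URL, and parameter names here are hypothetical, not the ones the author used.

```python
# Hypothetical sketch of a monthly map-screenshot script. The endpoint and
# query parameters are illustrative stand-ins for a static-map API; the
# actual script and its locations are not described in the article.
from datetime import date

# A fixed set of places to re-capture every month (lat, lon).
LOCATIONS = {
    "san_francisco": (37.7749, -122.4194),
    "london": (51.5074, -0.1278),
}

def static_map_url(base, lat, lon, zoom=15, size="640x640"):
    """Compose a static-map request URL for one location and zoom level."""
    return f"{base}?center={lat},{lon}&zoom={zoom}&size={size}"

def monthly_filenames(provider, when=None):
    """Name each screenshot by provider, location, and year-month,
    so a year of captures sorts into a clean time series."""
    when = when or date.today()
    stamp = when.strftime("%Y-%m")
    return {name: f"{provider}_{name}_{stamp}.png" for name in LOCATIONS}
```

Fetching each URL and saving the response under the generated filename, once a month from a cron job, would yield exactly the kind of image series the article compares.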

This is a fascinating article. Google is changing the look of the actual maps in Google Maps a lot, and improving its data all the time - whereas Apple seems to lag behind, with far fewer places of interest, stores, and so on.

The search for the killer bot

Enter the message bots. As 2016 dawns, there's a sense in Silicon Valley that the decades-old fantasy of a true digital assistant is due to roar back into the mainstream. If the trend in past years has been assistants powered by voice - Siri, Alexa, Cortana - in 2016 the focus is shifting to text. And if the bots come, as industry insiders are betting they will, there will be casualties: with artificial intelligence doing the searching for us, Google may see fewer queries. Our AI-powered assistants will manage more and more of our digital activities, eventually diminishing the importance of individual, siloed apps, and the app stores that sell them. Many websites could come to feel as outdated as GeoCities pages - and some companies might ditch them entirely. Nearly all of the information they provide can be fed into a bot and delivered via messaging apps.

This seems a bit... overblown. Bots are going to revolutionise a lot over the coming decades, but messaging bots replacing the point-and-click interface we've been using ever since Xerox invented it?

Much like the death of the PC or of Apple, the end of our current GUI metaphor has been predicted more times than I can remember - I don't see how this one is any different.

Why Linux Is More Practical Than OS X

How can we pass up a title like that? The article takes an interesting approach to practicality. Linux's pros: it runs on so many kinds of hardware, installing software is easy, and there's a variety of file managers and desktop environments. The Mac is popular because it has "strong software titles" and good support. The kicker: "If Linux distributions had the same level of consumer tech support available that Windows and OS X does, we'd see adoption numbers exploding." To be blunt, I find this essay unpersuasive. However, if you look at the examples where Linux has been successful in the market, such as embedded systems like set-top boxes and heavily customized OS variants with their own software ecosystem like Android, it's precisely Linux's esoteric strengths that made those platforms' developers choose it. And what did those platforms have that made them successful? Strong software running on top of the OS along with a worry-free onboarding and maintenance process, usually with professional support for end-users. What do you know?

On Google, Apple, data, privacy, rhetoric

Over the past few weeks - following an important-but-barbed talk from Apple CEO Tim Cook - the rhetoric has turned to privacy and security and data and how only products you pay for are good and any sort of free services are inherently bad and basically whore out what's left of your post-Snowden soul.

It's an important discussion to have. And one we'll continue to have. But it's not one-sided. It's not binary.

And, actually, it's interesting to see how the rhetoric has changed recently.

Ouch.

Privacy vs. user experience

The real issue that Apple is trying to address is not really privacy, but rather security. Though Google has all of my data, it is still private. Google does not sell access to my data; it sells access to my attention. Advertisers do not get my information from Google. So as long as I trust Google's employees, the only two potential breaches of my privacy are from the government or from a hacker. If we accept this as a fact, the fundamental privacy question changes from, "Do you respect my privacy?" to "Is the user experience improvement worth the security risk to my private information?"

Dustin Curtis hits the nail on the head so hard the nail's on its way to Fiji.

Apple, Microsoft’s visions for the future are delightfully different

This is a nice article overall, but this part stood out to me.

The history of Apple and Microsoft’s relationship has often been one of direct confrontation. Whether it’s Surface vs. iPad, Zune vs. iPod, or the classic PC vs. Mac, the two American giants have often competed for the same clientele, trying to sate the same needs.

This is a common misconception. While the two companies certainly had their tussles (the look and feel lawsuit being a major one), most of it was nothing but marketing - riling up their own fanbases. During most of their history, these two companies have had close ties, working together very closely on many projects. The supposedly great rivalry between these two companies existed mostly between its fans, not between the companies themselves. They've always needed each other, and continue to need each other to this day.

In fact, in fighting Google, these two companies have been working together more closely than ever before. If you think the sudden onslaught of patent abuse against Android and its OEMs from Microsoft and Apple (and Oracle, another company with close ties to Apple and Microsoft) was a coincidence, I have a bridge to sell you.

I always find it fascinating that the idea that Apple and Microsoft are bitter rivals has survived to this day.

Apple Watch vs. Moto 360

A detailed, complete, and fair (and not overly long) comparison between the Apple Watch and the Moto 360. While it's unlikely you're deciding between the two - unless you have both a recent iPhone and an Android phone - it may still be useful if you're up for a new phone and want to take the Wear/Apple Watch accessory into account for your purchase.

Personally, I wouldn't buy either of these two devices at this very moment. It's still early days, and they're not exactly cheap, either - especially taking into account that a new Moto 360 is probably around the corner already, and you'll see a new Apple Watch within around 12 months, too.

‘To beat the iPhone, you have to beat the iPhone’s camera’

For a show overrun with various visions of smart drones and smarter homes for the future, the present of CES was remarkably uniform. I saw more iPhones in the hands of CES attendees than I did Android phones across the countless exhibitor booths. From the biggest keynote event to the smallest stall on the show floor, everything was being documented with Apple's latest smartphone, and it all looked so irritatingly easy. I don't want an iPhone, but dammit, I want the effortlessness of the iPhone's camera.

I really don't give a rat's bum about my phone's camera (does it take pictures? Yes? Okay I'm good), so I'm about as interested in this as watching grass grow, but it's a consistent iPhone strong point according to iOS and Android users alike. Since I like science: are there any proper tests concerning this?

Apple CarPlay vs. Google Android Auto – comparison

Interesting video comparing Android Auto with Apple's CarPlay (via Daring Fireball).

The takeaway for me is clear - CarPlay looks like a mess, with iOS 6 stuff intermingled with vague iOS 7+ designs, but without any clear vision tying it all together. In short, it's ugly as sin. Android Auto looks fantastic and coherent - but it seems far too distracting to be safe to use while driving. It looks too good to be in a car in which it is very easy to either kill yourself or someone else - or both.

Interesting, though, that car makers are simply putting both systems in their cars.

Copycats and crapware in application stores

Two related stories.

Microsoft's Windows Store is a mess. It's full of apps that exist only to scam people and take their money. Why doesn't Microsoft care that their flagship app store is such a cesspool?

It's now been more than two years since Windows 8 was released, and this has been a problem the entire time, and it is getting worse. If Microsoft was trying to offer a safe app store to Windows users, they've failed.

And:

Flappy Bird wasn't the first game to spawn an entire ecosystem of me-too clones, nor will it be the last. And now that the developer of the insanely difficult but addicting game has released the even more insanely difficult and even more addicting (is that even possible?) Swing Copters, well, we're seeing it again.

This applies to all application stores. They are filled to the brim with crapware nobody wants, making the experience of using them pretty unappealing. Since Apple, Google, and Microsoft care about quantity instead of quality, I don't think this will change any time soon.

Google wants to reinvent transportation, Apple sells headphones

There were two striking pieces of business news this week from America's leading technology brands. On the one hand, Google unveiled a prototype of an autonomous car that, if it can be made to work at scale, promises to end mass automobile ownership while drastically reducing car wreck fatalities and auto-related pollution. Meanwhile, Apple bought a company that makes high-end headphones.

Which is to say that Apple's playing checkers while Google plays chess.

For better or worse, this is exactly why many people seem to hold Google in higher regard than they do Apple. Both Apple and Google are rich and wealthy beyond average-person-measure. Now, which company will be liked more: the one that uses said wealth to develop crazy may-or-may-not-work technologies that can change the world at a massive scale, or the one that stuffs $150 billion in shady bank accounts to avoid having to pay taxes?

The more wealth you hoard, the less sympathetic people will be towards you. Unless, of course, you use that wealth in a very public way.

‘The great works of software’

So I set myself the task of picking five great works of software. The criteria were simple: How long had it been around? Did people directly interact with it every day? Did people use it to do something meaningful? I came up with the office suite Microsoft Office, the image editor Photoshop, the videogame Pac-Man, the operating system Unix, and the text editor Emacs.

Each person has his or her own criteria for these sorts of things, but in my view, this list is woefully inadequate. If it were up to me, I would pick these, in no particular order:

  • A-0 System: the first ever compiler, written by Grace Hopper in 1951 and 1952, for the UNIVAC I.
  • UNIX: This one's a given.
  • WorldWideWeb/CERN HTTPd: the first web browser and the first web server, both written by Tim Berners-Lee. Also a given.
  • Xerox Star: this one is actually a tie between the Star, its research predecessor the Alto, and Douglas Engelbart's NLS. These three combined still define the way we do computing today - whether you look at a desktop, a smartphone, or a tablet. I decided to go with the Star because it was the only one of the three that was commercially available, and because it's so incredibly similar to what we still use today.
  • Windows: you cannot have a list of the greatest software of all time without Windows. You may not like it, you may even hate it, but the impact Windows has had on the computing world - and far, far beyond that - is immense. Not including it is a huge disservice to the operating system that put a computer on every desk, in every home.

This leaves a whole bunch of others out, such as Lotus 1-2-3, DOS, the Mac OS, Linux, and god knows what else - but such is the nature of lists like this.

‘Android is for poor people’

I'm using the URL slug headline for this one (check the link).

This map showing the locations of 280 million individual posts on Twitter shows a depressing divide in America: Tweets coming from Manhattan tend to come from iPhones. Tweets coming from Newark, N.J., tend to come from Android phones.

If you live in the New York metro area, you don't need to be told that Manhattan is where the region's rich people live, and the poor live in Newark. Manhattan's median income is $67,000 a year. Newark's is $17,000, according to U.S. Census data.

This fascinates me, as it seems to be a very American thing. In The Netherlands, Android has an 80% market share, and we have far lower poverty rates than the US (that Newark median income is crazy low by Dutch standards). I'm pretty sure the situation is similar for many other Western European nations.

This raises an interesting question: is it 'Android is for poor people' - or is it 'Android is for poor people in America'?

‘More evidence that Apple won the application wars’

Stuff such as United's new offering generally arrives on Android sooner or later, and there are whole categories of apps - such as alternative keyboards - that are Android-only.

Much of the time, I'm an Android user myself, so I'm happy when something is available for Google's operating system and sorry when it isn't. But despite the fact that iOS's market share is much smaller than that of Android, and has been for years, Apple devices are still nearly always first in line when a major company or hot startup has to decide where to allocate its development resources. That's a dynamic that pundits keep telling us makes no sense - but it's happening, and it's an enormous competitive advantage for Apple. 'Sounds like a victory to me.

iOS has won the application wars.

Sure, you have to disregard those gazillion Android applications iOS could never support (keyboards, launchers, SMS applications, browsers, task switchers, lock screens, etc., and so on, and so forth), but if you do that, then yes, iOS has won.

The tortoise is faster than the hare. Sure, you have to cut off the hare's legs first, but then, sure, yeah, the tortoise is faster.

Samsung Galaxy Gear ad copies original iPhone ad

Samsung's Galaxy Gear television advertisement bears a resemblance to the original iPhone advertisement.

This made me smile, though:

There is just no shame - or original ideas - in this company at all.

Yeah! Except for display technology. Oh, and except for microprocessor design. And, of course, they are a driving force in memory chip design. Well, yeah, except for all those things from which virtually every computer product today benefits - Apple or otherwise - Samsung has absolutely no innovative ideas at all. What have the Romans done for us, indeed.

I don't care about Samsung any more than I care about other companies, but shoving the company's contributions to technology aside just because you lack the capacity to grasp the kind of bare-metal innovation it does makes you look like an idiot.

Inside the YouTube battle between Microsoft, Google

In the past two months, Microsoft and Google have been bickering over one central issue: HTML5. The Verge has learned that Google is forcing Microsoft to build its YouTube Windows Phone app in HTML5, despite its own Android and iOS versions using superior native code. Although Microsoft has offered to build ad support along with making other tweaks as Google has requested, a full HTML5 app isn't currently possible on the platform.

The difficult thing here is that Google actually has a very good case: it's their API, their service, their rules. On top of that, YouTube publishers - big and small - need to earn money from advertisements too, and incorrect implementations make that harder. Microsoft's mafia practices regarding patents - extorting companies to pay for Android use even though Microsoft has contributed zero code to Android - play a role too. Lastly, Windows Phone is essentially irrelevant at 3% market share - and it's not as if Microsoft ever concerned itself with minority platforms.

Still, all this does is hurt consumers, no matter how few Windows Phone users there are. Just work this out, please, you bunch of children.