How far does 20MHz of Macintosh IIsi power go today?

Years later, I had that story on my mind when I was browsing a local online classifieds site and stumbled across a gem: a Macintosh IIsi. Even better, the old computer was for sale along with the elusive but much-desired Portrait Display, a must-have for the desktop publishing industry of its time. I bought it the very next day.

It took me several days just to get the machine to boot at all, but I kept thinking back to that article. Could I do any better? With much less? Am I that arrogant? Am I a masochist?

Cupertino retro-curiosity ultimately won out: I decided to enroll the Macintosh IIsi as my main computing system for a while. A 1990 bit of gear would now go through the 2018 paces. Just how far can 20MHz of raw processing power take you in the 21st century?

The Macintosh IIsi is such an elegant machine, a fitting home for the equally elegant System 7.x.

Design case history: the Commodore 64

We've been on a bit of a history trip lately with old computer articles and books, and this one from 1985 certainly fits right in.

In January 1981, a handful of semiconductor engineers at MOS Technology in West Chester, Pa., a subsidiary of Commodore International Ltd., began designing a graphics chip and sound chip to sell to whoever wanted to make "the world's best video game". In January 1982, a home computer incorporating those chips was introduced at the Winter Consumer Electronics Show in Las Vegas, Nev. By using in-house integrated-circuit-fabrication facilities for prototyping, the engineers had cut design time for each chip to less than nine months, and they had designed and built five prototype computers for the show in less than five weeks. What surprised the rest of the home-computer industry the most, however, was the introductory price of the Commodore 64: $595 for a unit incorporating a keyboard, a central processor, the graphics and sound chips, and 64 kilobytes of memory instead of the 16 or 32 that were considered the norm.

A fully decked-out Commodore 64 with all the crucial peripherals - tape drive, disk drive, printer, joysticks, official monitor - is still very high on my wish list.

Economists: we aren’t prepared for the fallout from automation

Are we focusing too much on analyzing exactly how many jobs could be destroyed by the coming wave of automation, and not enough on how to actually fix the problem? That's one conclusion in a new paper on the potential effects of robotics and AI on global labor markets from the US think tank the Center for Global Development (CGD).

The paper's authors, Lukas Schlogl and Andy Sumner, say it's impossible to know exactly how many jobs will be destroyed or disrupted by new technology. But, they add, it's fairly certain there are going to be significant effects - especially in developing economies, where the labor market is skewed towards work that requires the sort of routine, manual labor that's so susceptible to automation. Think unskilled jobs in factories or agriculture.

As earlier studies have also suggested, Schlogl and Sumner think the effects of automation on these and other nations are not likely to be mass unemployment, but the stagnation of wages and the polarization of the labor market. In other words, there will still be work for most people, but it'll be increasingly low-paid and unstable, without benefits such as paid vacation, health insurance, or pensions. On the other end of the employment spectrum, meanwhile, there will continue to be a small number of rich and super-rich individuals who reap the benefits of the increases in productivity created by technology.

Whether masses of people become unemployable or are forced to accept ever crappier and lower-paying jobs, while a rich few get ever richer, the end result will be massive social upheaval. We're already seeing the consequences of mass inequality in many countries in the world, and it isn't pretty. Expect things to get worse.

Much worse.

Performance of the 8088 on PC, PCjr, and Tandy 1000

It's well-known that you should measure the performance of your code, and not rely only on documented opcode "cycle counts".

But how fast is an IBM PC 5150 compared to a PCjr? Or to a Tandy 1000? Or how fast is the Tandy 1000 HX in fast mode (7.16MHz) compared to the slow mode (4.77MHz)? Or how fast is a nop compared to a cwd?

I created a test (perf.asm) that measures the performance of different opcodes and ran it on different Intel 8088 machines. I ran the test multiple times just to make sure the results were stable enough. All interrupts were disabled, except the timer (of course), and on the PCjr the NMI was disabled as well.

There's no point in any of these benchmarks, but that doesn't make them any less interesting.
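The original test is 8088 assembly driven by the timer, but the methodology it describes - repeat an operation many times, keep the most stable measurement, and compare ratios rather than absolute counts - translates to any language. Here's a loose Python sketch of that approach; the function and operations are my own invention, purely for illustration:

```python
import time

def measure(op, reps=100_000, runs=5):
    # Time `reps` back-to-back executions of `op` and keep the best
    # (minimum) duration over `runs` runs; the minimum is the least
    # noisy estimate, much as repeating perf.asm until the results
    # are stable enough.
    best = float("inf")
    for _ in range(runs):
        start = time.perf_counter()
        for _ in range(reps):
            op()
        best = min(best, time.perf_counter() - start)
    return best

# Report a ratio rather than an absolute time: absolute numbers differ
# per machine, just as they do between a PC, a PCjr, and a Tandy 1000.
baseline = measure(lambda: None)              # loop overhead only
division = measure(lambda: divmod(10**9, 7))  # a costlier operation
print(f"divmod costs roughly {division / baseline:.1f}x the empty call")
```

Taking the minimum over several runs, instead of the mean, filters out interference from the rest of the system - the modern equivalent of disabling interrupts before sampling the timer.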

The life and death of teletext, and what happened next

That, so the story goes, was the remit given to BBC engineers in the late 1960s: find a way to transmit a printable page of text so that the corporation’s transmitters weren't simply left to idle overnight. Their efforts would eventually give rise to an iconic medium that would span five decades, become the basis for a global standard and - perhaps most importantly - let you check the lottery numbers on Sunday morning. (Well, you never knew.)

As is so often the case when a revolutionary technology's lingering just over the horizon, it's difficult to know precisely where the tale of teletext truly begins. Engineers at several different corporations were already experimenting with ways of transmitting text remotely, each with different goals in mind. The Post Office, who at that time were responsible for the telephone system, naturally wanted to use their infrastructure to boost the number of phone owners across the country. Boffins back at the BBC, meanwhile, had begun investigating ways to provide subtitled television programmes for the deaf.

Teletext (or Teletekst in Dutch) is still active here, and lots of people have the smartphone app for Teletekst installed as well. Fascinating technology that I used all the time when I was younger.

“Google is planning a game platform” to rival Playstation, Xbox

Over the past few months, the wildest rumors in video game industry circles haven't involved the PlayStation 5 or Xbox Two. The most interesting chatter has centered on a tech company that's been quietly making moves to tackle video games in a big way: Google, the conglomerate that operates our email, our internet browsers, and much more.

We haven't heard many specifics about Google's video game plans, but what we have heard is that it's a three-pronged approach: 1) Some sort of streaming platform, 2) some sort of hardware, and 3) an attempt to bring game developers under the Google umbrella, whether through aggressive recruiting or even major acquisitions. That's the word from five people who have either been briefed on Google's plans or heard about them secondhand.

Cracking the gaming market is hard. Over the past few decades, only two companies have succeeded in entering the gaming market: first Sony, then Microsoft. Virtually all other attempts either flopped hard, or started lukewarm only to quickly peter out. Hence, I have a lot of reservations about Google's supposed plans here, especially since they seem to involve streaming. Even streaming on my local LAN using PS4 Remote Play, while passable, is clearly not even remotely as good as the "real thing".

We definitely need more concrete information.

Linux Mint 19 released

Linux Mint 19 is a long term support release which will be supported until 2023. It comes with updated software and brings refinements and many new features to make your desktop experience more comfortable.

In Linux Mint 19, the star of the show is Timeshift. Although it was introduced in Linux Mint 18.3 and backported to all Linux Mint releases, it is now at the center of Linux Mint's update strategy and communication.

Thanks to Timeshift you can go back in time and restore your computer to the last functional system snapshot. If anything breaks, you can go back to the previous snapshot and it's as if the problem never happened.

This new release is jampacked with new features and improvements, and I must say this looks mightily intriguing.

Microsoft details its ‘pocketable’ Surface device in leaked email

Microsoft has been working on a new mysterious Surface device for at least two years. Codenamed Andromeda, the device has appeared in patents, reports, and in operating system references multiple times and will include a dual-display design. According to a Microsoft internal document obtained by The Verge, it's also going to be a pocketable Surface device.

I find this a very exciting device, and I can't wait to see its final incarnation. It's supposed to be released this year, so expect it to appear somewhere this Autumn.

Google invests $22 million in KaiOS

KaiOS Technologies Inc., developer of the emerging operating system for smart feature phones, KaiOS, today announced a $22M Series A investment from Google to help bring the internet to the next generation of users.

In addition to the investment, Google and KaiOS have also agreed to work together to make the Google Assistant, Google Maps, YouTube, and Google Search available to KaiOS users. These apps have been developed specifically for the KaiOS platform, which is entirely web-based, using open standards such as HTML5, JavaScript, and CSS.

You're probably not aware of KaiOS, since it only runs on feature phones most of us don't use. KaiOS is a fork of FirefoxOS, and is actually quite popular - it's already on 40 million feature phones, including various Nokia phones. Google's investment makes sense here, and ensures its services are available on these devices. I'm still contemplating buying the 8110 (in yellow, of course) to get acquainted with KaiOS.
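Since KaiOS inherits Firefox OS's packaged web app model, an "app" is essentially an HTML page plus a manifest describing it. As a rough sketch - the field values below are invented for illustration, following the manifest.webapp format KaiOS inherited from Firefox OS - a minimal manifest looks something like this:

```json
{
  "name": "Hello KaiOS",
  "description": "A minimal web app packaged for KaiOS",
  "launch_path": "/index.html",
  "icons": {
    "56": "/icons/icon56.png",
    "112": "/icons/icon112.png"
  },
  "developer": {
    "name": "Example Developer"
  }
}
```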

Microsoft pulls “tabbed windows” feature from next release

Peter Bright at Ars Technica:

Sets - a new Windows interface feature that was first previewed in November 2017 and will make every window into a tabbed window - has been removed from the latest Insider Preview build of Windows 10. Moreover, the Verge is reporting that the feature won't be coming back in this year's next major update, due in October.

This marks the second time that Sets have been included in a preview release only to be removed at a later stage prior to the release of an update. When first announcing Sets, Microsoft was careful to note that it wasn't promising Sets for any particular release - or possibly even ever, given the complexities of application compatibility and uncertainty about how people will actually use the feature.

This is a feature I'm really looking forward to, and it sucks to see it pulled like this, for the second time. I understand the complexities of a feature like this - especially with the vast library of software Windows supports - but it does raise the question of whether Microsoft's openness regarding Windows development was a bit too much for this particular feature.

Apple engineers its own downfall with the MBP keyboard

A titan of tech and industrial innovation has been laid low by a mere speck of dust. Last week, Apple quietly announced that they were extending the warranty on their flagship laptop's keyboard by four years. As it turns out, the initial run of these keyboards, described by Jony Ive as thin, precise, and "sturdy", has been magnificently prone to failure.

When you see it all spelled out like this, it makes Apple look either incredibly incompetent, or astonishingly arrogant.

A small look into the GameCube’s copy filter

A while back I was going through Dolphin's issues page out of sheer boredom.

I don’t know anything about coding to fix any of this stuff, but I do like to test really old issues sometimes to see if the hundreds of changes made over the years have produced any change or potentially even fixed some of the issues. After a few pages, I eventually came across issue 726 - Gamma setting has no effect. Out of curiosity, I clicked.

This is a great story about a very obscure and technical bug in the Dolphin GameCube emulator.

Talking to Duplex: Google’s phone AI feels revolutionary

At Google I/O, Google demonstrated Google Duplex, an AI-generated voice assistant that can make phone calls for you to perform tasks like making a restaurant reservation or booking a hair salon appointment. After the event, a whole Google Duplex truther movement sprang up, made up of people who simply couldn't believe technology could do anything even remotely like this, and who accused Google and its CEO Sundar Pichai of lying on stage.

Today, a whole slew of media outlets have published articles about how they were invited to an event at a real restaurant, where the journalists themselves got to talk to Google Duplex. The journalists took on the role of restaurant workers taking reservations requested by Google Duplex. The results? It works exactly as advertised - better, even. Here's Ars Technica's Ron Amadeo:

Duplex patiently waited for me to awkwardly stumble through my first ever table reservation while I sloppily wrote down the time and fumbled through a basic back and forth about Google's reservation for four people at 7pm on Thursday. Today's Google Assistant requires authoritative, direct, perfect speech in order to process a command. But Duplex handled my clumsy, distracted communication with the casual disinterest of a real person. It waited for me to write down its reservation requirements, and when I asked Duplex to repeat things I didn't catch the first time ("A reservation at what time?"), it did so without incident. When I told this robocaller the initial time it wanted wasn't available, it started negotiating times; it offered an acceptable time range and asked for a reservation somewhere in that time slot. I offered seven o'clock and Google accepted.

From the human end, Duplex's voice is absolutely stunning over the phone. It sounds real most of the time, nailing most of the prosodic features of human speech during normal talking. The bot "ums" and "uhs" when it has to recall something a human might have to think about for a minute. It gives affirmative "mmhmms" if you tell it to hold on a minute. Everything flows together smoothly, making it sound like something a generation better than the current Google Assistant voice.

One of the strangest (and most impressive) parts of Duplex is that there isn't a single "Duplex voice." For every call, Duplex would put on a new, distinct personality. Sometimes Duplex came across as male; sometimes female. Some voices were higher and younger sounding; some were nasally, and some even sounded cute.

And The Verge's Dieter Bohn:

Duplex conveyed politeness in the demos we saw. It paused with a little "mmhmm" when the called human asked it to wait, a pragmatic tactic Huffman called "conversational acknowledgement". It showed that Duplex was still on the line and listening, but would wait for the human to continue speaking.

It handled a bunch of interruptions, out of order questions, and even weird discursive statements pretty well. When a human sounded confused or flustered, Duplex took a tone that was almost apologetic. It really seems to be designed to be a super considerate and non-confrontational customer on the phone.

All calls started with Duplex identifying itself as an automated service that would also record the calls, giving the person on the receiving end of the line the opportunity to object. Such objections are handled gracefully, with the call being handed over to a human operator at Google on an unrecorded line. The human fallback is a crucial element of the system, according to Google, because regardless of permission, not every call will go smoothly.

Google Duplex will roll out in limited testing over the coming weeks and months.

A legend reborn: Microsoft brings back the Classic IntelliMouse

Inspired by the Microsoft IntelliMouse Explorer 3.0 from 2003, Microsoft has recently released the new Microsoft Classic IntelliMouse. Offering the same classic ergonomic look and feel, the new Microsoft Classic IntelliMouse offers improved performance and additional features made possible by technology today.

In remembering the classic mouse, we sat down with Simon Dearsley, Devices Design Director at Microsoft, to discuss the legacy of the Microsoft IntelliMouse and what you can expect from the newest version of the iconic IntelliMouse range.

The IntelliMouse is iconic - I don't know anyone who hasn't used one at some point in their lives. I used them at the various schools and universities I attended, at my DIY store job, at friends' places - this thing was everywhere.

Qualcomm announces Snapdragon 632, 439, and 429

A month ago we saw Qualcomm release a new "upper mid-range" SoC with the announcement of the Snapdragon 710 - the emphasis was on the fact that this was a new market tier aiming slightly below the top-tier flagship chipsets. Today, we're seeing Qualcomm expand the traditional mid-tier and also what can be considered the low-end for smartphone devices. The Snapdragon 439 and 429 follow in the footsteps of the 435 and 425 and bring FinFET to the low-end; the Snapdragon 632 is more akin to the Snapdragon 652 as it's now the first time we see big cores brought down to the lower mid-tier successor to the Snapdragon 630.

Sure, the high-end Apple and Qualcomm Snapdragon SoCs get all the attention, but it's these mid-to-low-end SoCs that are the real workhorses of the mobile revolution.

Marzipan as a path to ARM-based Macs

Apple has dropped legacy frameworks quite easily in the past, though. But how exactly did that happen?

CPU changes. Once when MacOS went from PPC to Intel, and then once when MacOS went from 32 bit to 64 bit. Each time that transition happened Apple was able to say "OK, this legacy stuff just isn't going to be there on the new architecture". And since you had to recompile apps anyway to make them run on the new architecture, developers kind of shrugged and said "Well, yea. That's what I would have done too". It made sense.

So are we about to see 128 bit Intel processors anytime soon, to facilitate this change? I doubt it.

OK then, what about a new architecture?

Oh. Hello 64 bit ARM.

The Macintosh platform is going to transition to Apple's own ARM64 architecture over the coming years. The most succinct explanation as to why comes from Steven Troughton-Smith:

Opening ARM-based Macs to the iOS ecosystem to make one unified Apple platform, knowing what we know about Marzipan, makes so much sense that it becomes difficult to imagine it any other way. Apple finds itself completely unable to build the computers it wants to build with Intel.

Windows has already made the move to ARM, and macOS will be joining it over the coming years. There is a major architectural shift happening in desktop computing, and there are quite a few companies who have to worry about their long-term bottom line: Intel, AMD, and NVIDIA.

Tenox.net’s archive of computer books

Yesterday, we linked to a 1997 book about the Windows 95 file system, which is a great read. Don't let the fun end there, though - the site hosting said book, Tenox.net by Antoni Sawicki, is a true treasure trove of in-depth books that, while outdated today, are still amazingly detailed reads. I honestly have no idea which to pick to quote here as an example, so out of my own personal interest, I couldn't really pass up "Configuring CDE: The Common Desktop Environment" by Charles Fernandez.

If you spend the major portion of your work day in front of a workstation chasing bits through the electronic networks of cyberspace, aka the information highway, so that your users can be more productive, this book is for you.

If you spend your days (or, thanks to some corporate edict, are about to spend your days) living in the Common Desktop Environment, so that your users can focus on their work and not the mechanics of getting to their work, this book shows you what you can do to make that environment their home.

There are countless other great reads in the list, so peruse them and find your own favourites.

The gospel of Elon Musk, according to his flock

Bijan Stephen, writing about Elon Musk's adoring, unquestioning fans:

Gomez isn't alone. She's one member of a vast, global community of people who revere the 46-year-old entrepreneur with a passion better suited to a megachurch pastor than a tech mogul. With followers like her, Elon Musk - the South African-born multibillionaire known for high-profile, risky investments such as Tesla (electric cars), SpaceX (private space travel), the Boring Company (underground travel), and Neuralink (neurotechnology) - has reaped the benefits of a culture in which fandom dominates nearly everything. While his detractors see him as another out-of-touch, inexpert rich guy who either can't or won't acknowledge the damage he and his companies are doing, to his fans, Musk is a visionary out to save humanity from itself. They gravitate toward his charisma and his intoxicating brew of extreme wealth, a grand vision for society - articulated through his companies, which he has an odd habit of launching with tweets - and an internet-friendly playfulness that sets him apart from the stodgier members of his economic class. Among his more than 22 million followers, all of this inspires a level of righteous devotion rarely glimpsed outside of the replies to a Taylor Swift tweet.

The most vocal of those fans have an impact: they're an army of irregulars waiting to be marshaled via a tweet and sent on the digital warpath against anything Musk decides he doesn't like, the iron fist in Musk's velvet glove. They've become known for haranguing people they believe have crossed him, journalists especially, with relentless fervor. The attacks are standard social media-era fare: free-for-all bombardment across social platforms by people who are not always vitriolic but who nevertheless barrage the perceived enemy with bad-faith questions.

Just to reiterate: this article is about Elon Musk - not somebody else.