Hardware Archive

The engineering miracle of the Sinclair ZX Spectrum

The Spectrum was not the first Sinclair computer to make it big. It was, however, the first to go truly massive. In the months prior to launch, 'The Computer Programme' had aired on the BBC, legitimising the home microcomputer as the must-have educational item of the 1980s. For Sinclair and the ZX Spectrum the time was right: parents were keen, and the kids were excited. Games would soon be everywhere, thanks to all the kids programming their brand new Spectrums.

A major success factor, and the one that gave the Spectrum its name, is the computer's capacity to generate a spectrum of colours. The micro is capable of generating 16 colours: 8 low-intensity colours and 8 matching bright variants. It's hard to imagine now, but in 1982 these 16 colours were enough to start a home computer revolution. Richard Altwasser, the engineer employed by Sinclair to develop the Spectrum's graphics system, was setting a new benchmark with some very innovative ideas.
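
Those 16 colours (strictly speaking 15, since bright black is still black) all come from a single attribute byte per 8×8 pixel cell. Here's a minimal C sketch of the decoding, purely illustrative - the real machine was, of course, programmed in Z80 assembly:

```c
#include <stdio.h>

/* One attribute byte per 8x8 cell on the Spectrum:
 * bit 7: FLASH, bit 6: BRIGHT, bits 5-3: PAPER (background),
 * bits 2-0: INK (foreground). BRIGHT selects the high-intensity
 * variant of the same eight base colours for the whole cell. */
static const char *base_colours[8] = {
    "black", "blue", "red", "magenta", "green", "cyan", "yellow", "white"
};

void describe_attribute(unsigned char attr)
{
    unsigned ink    = attr & 0x07;
    unsigned paper  = (attr >> 3) & 0x07;
    unsigned bright = (attr >> 6) & 0x01;
    unsigned flash  = (attr >> 7) & 0x01;

    printf("ink=%s%s, paper=%s%s, flash=%u\n",
           bright ? "bright " : "", base_colours[ink],
           bright ? "bright " : "", base_colours[paper],
           flash);
}

int main(void)
{
    describe_attribute(0x47);  /* bright white ink on black paper */
    return 0;
}
```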

I missed the entire 8-bit home micro revolution - I was simply too young, or not even born yet. It must've been such an exciting time.

Our USB-C dongle hell is almost over

It's almost the end of 2018, but I'm finally able to say that almost all of my day-to-day devices have been replaced with a USB-C option, or can be replaced in the near future.

I bought a fully specced out Dell XPS 13, and it's the first laptop I've ever had that charges over USB-C. Cool and all, but I quickly realized that only the 27W charger it came with actually charges it; other USB-C chargers simply don't work because they're not powerful enough.

I'm not quite sure USB-C is there, yet.

Why the future of data storage is (still) magnetic tape

Studies show that the amount of data being recorded is increasing at 30 to 40 percent per year. At the same time, the capacity of modern hard drives, which are used to store most of this information, is increasing at less than half that rate. Fortunately, much of this information doesn't need to be accessed instantly. And for such things, magnetic tape is the perfect solution.

Seriously? Tape? The very idea may evoke images of reels rotating fitfully next to a bulky mainframe in an old movie like Desk Set or Dr. Strangelove. So, a quick reality check: tape has never gone away!
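
To see why that rate gap matters, compound it over a decade. A quick back-of-the-envelope sketch in C, using 35% and 15% as assumed illustrative rates picked from the ranges the article cites:

```c
#include <stdio.h>

/* Data volume vs. drive capacity, both normalised to 1.0 today.
 * The growth rates are assumptions within the article's ranges
 * (30-40% for data, "less than half that rate" for drives). */
int main(void)
{
    double data = 1.0, capacity = 1.0;
    for (int year = 1; year <= 10; year++) {
        data     *= 1.35;  /* assumed 35% annual data growth */
        capacity *= 1.15;  /* assumed 15% annual capacity growth */
        printf("year %2d: data %5.1fx, capacity %4.1fx, ratio %4.1f\n",
               year, data, capacity, data / capacity);
    }
    return 0;
}
```

After ten years the data has grown roughly twentyfold against a fourfold gain in drive capacity - a factor-of-five gap that cheaper, denser media like tape have to absorb.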

Open source RISC-V implemented from scratch in one night

Developed in a single magic night, on 19 August 2018 between 2am and 8am, darkriscv is a very experimental implementation of the open-source RISC-V instruction set. Now, after a week of exciting sleepless nights of work (which explains the many typos you will find ahead), darkriscv has reached a good-quality result, to the point that a "hello world" compiled by the standard riscv-elf-gcc works fine!

I feel incompetent.

Acer wants to sell a dual Xeon Predator X system: please no

Acer's leading gaming brand, Predator, is all about maximizing performance, particularly around gaming. In the modern era, that now extends into content creation, streaming, video editing, and all the sorts of things that drive the need for high performance. As we've seen several times over the years, just throwing more cores at the problem isn't the solution: bottlenecks appear elsewhere in the system. Despite this, Acer is preparing a mind-boggling system.

The Acer Predator X is a new dual-Xeon workstation with ECC memory and multiple graphics cards, announced today at IFA 2018. The system is aimed at the multi-taskers that do everything: gaming, content creation, streaming, the lot. With this being one of Acer's flagship products, we expect it to be geared to the hilt: maximum cores, maximum capacity. Therein lies the first rub: if Acer is going all out, this is going to cost something crazy.

This clearly makes zero sense, but at the same time, it's kind of awesome that Acer is doing this. Dual-processor workstations are a bit of an obsession for me, but with dual-processor machines entirely relegated to Xeon systems, they've become quite unaffordable. Senseless as it may be, I would love for regular Intel Core and AMD Zen processors to support dual-processor setups.

The MRISC32: a vector-first CPU design

In essence, it's a 32-bit RISC ISA designed from a holistic view of integer, floating-point, scalar, and vector operations. In addition, there is a hardware implementation of a single-issue, in-order, pipelined CPU. The hardware implementation mostly serves as an aid in the design of the ISA (at the time of writing, the CPU is still incomplete).

As happens with some articles I post here, this one's definitely a bit over my head.
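
Still, the central idea of "vector-first" is graspable. As a conceptual sketch in plain C - explicitly not MRISC32 code - this is the element-wise operation that a scalar ISA executes one add at a time, and that a vector ISA can express as a single instruction applied across many elements:

```c
#include <stdio.h>
#include <stddef.h>

/* Conceptual only: a scalar ISA retires one add per loop iteration;
 * a vector-first ISA encodes the whole element-wise operation in one
 * instruction, with the hardware deciding how many elements it
 * processes per cycle. */
void vec_add(float *dst, const float *a, const float *b, size_t n)
{
    for (size_t i = 0; i < n; i++)
        dst[i] = a[i] + b[i];
}

int main(void)
{
    float a[4] = {1, 2, 3, 4}, b[4] = {10, 20, 30, 40}, d[4];
    vec_add(d, a, b, 4);
    printf("%g %g %g %g\n", d[0], d[1], d[2], d[3]);
    return 0;
}
```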

Why use an FPGA instead of a CPU or GPU?

Recently, Intel bought Altera, one of the largest producers of FPGAs. Intel paid a whopping $16.7 billion, making it their largest acquisition ever. In other news, Microsoft is using FPGAs in its data centers, and Amazon is offering them on their cloud services. Previously, these FPGAs were mainly used in electronics engineering, but not so much in software engineering. Are FPGAs about to take off and become serious alternatives to CPUs and GPUs?

FPGAs are used extensively by e.g. the Amiga community to recreate older chipsets.

How Michael Dell saved his company from the brink

So CEO Michael Dell presented shareholders with a $25 billion buyout that would take the company private, giving it space away from the public limelight (and pressure from investors) to rethink and reposition the struggling computer company for the future.

Fast-forward to 2018, and Dell's prospects seem far better. Dell is now worth an estimated $70 billion - nearly triple what the buyout valued it at five years ago - and it has announced a bid to return to the public markets in a $22 billion buyout. It's an astounding transformation. Dell and his investment partners at Silver Lake turned the company from a struggling consumer electronics firm into an enterprise powerhouse.

It's indeed a pretty amazing turnaround. A few years ago, I would've never seriously considered a Dell. These days, though, their XPS 13 and 15 laptops are some of the best laptops you can get, with Linux editions available as well.

How the shared family computer protected us from our worst selves

Long before phone addiction panic gripped the masses and before screen time became a facet of our wellness and digital detoxes, there was one good and wise piece of technology that served our families. Maybe it was in the family room or in the kitchen. It could have been a Mac or PC. Chances are it had a totally mesmerizing screensaver. It was the shared family desktop.

I can still see the Dell I grew up using as clear as day, like I just connected to NetZero yesterday. It sat in my eldest sister’s room, which was just off the kitchen. Depending on when you peeked into the room, you might have found my dad playing Solitaire, my sister downloading songs from Napster, or me playing Wheel of Fortune or writing my name in Microsoft Paint. The rules for using the family desktop were pretty simple: homework trumped games; Dad trumped all. Like the other shared equipment in our house, its usefulness was focused and direct: it was a tool that the whole family used, and it was our portal to the wild, weird, wonderful internet. As such, we adored it.

This describes my childhood home perfectly, except that our first computer came way before the Napster days - we got it in 1990 or 1991 - and that my brothers and I were way more adept at using the computer than my parents were. Still, this brings back some very old memories.

The Chinese typewriter: a history

Nominally a book that covers the rough century between the invention of the telegraph in the 1840s and that of computing in the 1950s, The Chinese Typewriter is secretly a history of translation and empire, written language and modernity, misguided struggle and brutal intellectual defeat. The Chinese typewriter is 'one of the most important and illustrative domains of Chinese techno-linguistic innovation in the 19th and 20th centuries ... one of the most significant and misunderstood inventions in the history of modern information technology', and 'a historical lens of remarkable clarity through which to examine the social construction of technology, the technological construction of the social, and the fraught relationship between Chinese writing and global modernity'. It was where empires met.

How fast is a PS/2 keyboard?

A few weeks ago, an interesting question cropped up: How fast is a PS/2 keyboard? That is to say, how quickly can it send scan codes (bytes) to the keyboard controller?

One might also ask, does it really matter? Sure enough, it does. As it turns out, the Borland Turbo Pascal 6.0 run-time, and probably a few related versions, handles keyboard input in a rather unorthodox way. The run-time installs its own INT 9/IRQ 1 handler (the keyboard interrupt) which reads port 60h (keyboard data) and then chains to the original INT 9 handler… which reads port 60h again, expecting to read the same value.

That is a completely crazy approach, unless there is a solid guarantee that the keyboard can't send a new byte of data before port 60h is read the second time. The two reads are done more or less back to back, with interrupts disabled, so not much time can elapse between them. But there is still a window during which the keyboard might send further data. So, how quickly can a keyboard do that?
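
For illustration, here is the pattern at the heart of the problem, sketched in C with a simulated port read - the real handlers were 16-bit assembly, and inb_60h, the handler names, and the scan code value are all stand-ins:

```c
#include <stdint.h>
#include <stdio.h>

/* Port 60h returns the most recently received scan code byte, so two
 * back-to-back reads normally see the same value - unless the keyboard
 * delivers a new byte in between. */
static uint8_t fake_port_60h = 0x1E;         /* make code for 'A' */
static uint8_t inb_60h(void) { return fake_port_60h; }

static void original_int9(void)
{
    /* The previous handler reads port 60h again and assumes it sees
     * the same byte the new handler just consumed. */
    printf("original handler saw scan code %02X\n", inb_60h());
}

void turbo_pascal_style_int9(void)
{
    uint8_t scan = inb_60h();    /* the run-time reads the scan code... */
    printf("run-time handler saw scan code %02X\n", scan);
    /* If the keyboard delivered a fresh byte right here, the two
     * handlers would disagree - exactly the race in question. */
    original_int9();             /* ...then chains to the old handler */
}

int main(void)
{
    turbo_pascal_style_int9();
    return 0;
}
```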

I love these questions.

What is the BASIC Engine?

The BASIC Engine is a very low-cost single-board home computer with advanced 2D color graphics and sound capabilities, roughly comparable to late-1980s or early-1990s computers and video game consoles. It can be built at home without special skills or tools and using readily available components for under 10 Euros in parts, or mass-produced for even less.

What a fascinating little device, and a great idea to boot - BASIC remains an excellent language for taking your first steps into programming.

RISC-V’s open-source architecture shakes up chip design

But what's so compelling about RISC-V isn't the technology - it's the economics. The instruction set is open source. Anyone can download it and design a chip based on the architecture without paying a fee. If you wanted to do that with ARM, you'd have to pay its developer, Arm Holdings, a few million dollars for a license. If you wanted to use x86, you'd be out of luck, because Intel licenses its instruction set only to Advanced Micro Devices.

For manufacturers, the open-source approach could lower the risks associated with building custom chips. Already, Nvidia and Western Digital Corp. have decided to use RISC-V in their own internally developed silicon. Western Digital's chief technology officer has said that in 2019 or 2020, the company will unveil a new RISC-V processor for the more than 1 billion cores the storage firm ships each year. Likewise, Nvidia is using RISC-V for a governing microcontroller that it places on the board to manage its massively multicore graphics processors.

This really explains why ARM is so scared of RISC-V. I mean, RISC-V might not make it into high-end smartphones for now, but if RISC-V takes off in the market for microcontrollers and other "invisible" processors, it could be a huge threat to ARM's business model.

Dawn of the microcomputer: the Altair 8800

But Popular Electronics readers were introduced to something in the January 1975 issue that they had never encountered before. Below a heading that read "PROJECT BREAKTHROUGH", the magazine's cover showed a large gray and black box whose front panel bore a complicated array of lights and toggles. This was the Altair 8800, the "world's first minicomputer kit to rival commercial models", available for under $400. Though advertised as a "minicomputer", the Altair would actually be the first commercially successful member of a new class of computers, first known as "microcomputers" and then eventually as PCs. The Altair was small enough and cheap enough that the average family could have one at home. Its appearance in Popular Electronics magazine meant that, as Salsberg wrote in that issue, "the home computer age is here - finally".

You can play with the Altair 8800 in your browser.

Do you really need to properly eject a USB drive?

Pull a USB flash drive out of your Mac without first clicking to eject it, and you'll get a stern, shameful warning: "Disk Not Ejected Properly."

But do you really need to eject a thumb drive the right way?

Probably not. Just wait for it to finish copying your data, give it a few seconds, then yank. To be on the cautious side, be more conservative with external hard drives, especially the old ones that actually spin.

That's not the official procedure, nor the most conservative approach. And in a worst-case scenario, you risk corrupting a file or - though even less likely - the entire storage device.

This is terrible advice for regular users, but I have to admit that I, too, don't really use the safe eject features of operating systems, unless I want to eject right after completing a write operation.
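
For what it's worth, the thing "eject" actually does for you is flush the operating system's write cache, which is why yanking early can lose data a copy dialog already reported as written. A minimal POSIX sketch of the same mechanism made explicit - the function name and path handling are hypothetical:

```c
#include <fcntl.h>
#include <stddef.h>
#include <unistd.h>

/* The OS buffers writes in RAM and flushes them to the device later;
 * fsync() forces that flush, just as a proper eject does. */
int write_and_flush(const char *path, const char *data, size_t len)
{
    int fd = open(path, O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (fd < 0)
        return -1;
    if (write(fd, data, len) < 0) {
        close(fd);
        return -1;
    }
    fsync(fd);   /* block until the device reports the data stored */
    close(fd);
    return 0;
}
```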

Global PC shipments grew 1.4% in Q2 2018, first time in 6 years

The PC market has seen its first growth quarter in six years, according to research firm Gartner. The streak is over: Gartner found PC shipments were up globally in Q2 2018, the first quarter of year-over-year global PC shipment growth since the first quarter of 2012.

Gartner estimates that worldwide PC shipments grew 1.4 percent to 62.1 million units in Q2 2018. The top five vendors were Lenovo, HP, Dell, Apple, and Acer. Lenovo in particular saw big gains (its highest growth rate since the first quarter of 2015), although that's largely due to the inclusion of units from its joint venture with Fujitsu.

The economic crisis is over, and people and companies are buying PCs again.

ARM kills off its anti-RISC-V smear site after own staff revolt

Arm has taken its website attacking rival processor architecture RISC-V offline within days of it going live, after its own staff objected to the underhand tactic.

The site - riscv-basics.com - was created at the end of June, and attempted to smear open-source RISC-V, listing five reasons why Arm cores are a better choice over its competitor's designs. However, the stunt backfired, with folks in the tech industry, and within the company's own ranks, slamming the site as a cheap shot and an attack on open source.

Good on ARM's own employees for speaking up.