Hardware Archive

Acer wants to sell a dual Xeon Predator X system: please no

Acer's leading gaming brand, Predator, is all about maximizing performance. In the modern era, that now extends beyond gaming into content creation, streaming, video editing, and all the sorts of workloads that drive the need for high performance. As we've seen several times over the years, just throwing more cores at the problem isn't the solution: bottlenecks appear elsewhere in the system. Despite this, Acer is preparing a mind-boggling solution.

The Acer Predator X is the new dual-Xeon workstation, with ECC memory and multiple graphics cards, announced today at IFA 2018. The premise of the system is for the multi-taskers who do everything: gaming, content creation, streaming, the lot. With this being one of Acer's flagship products, we expect it to be geared to the hilt: maximum cores, maximum capacity. Therein lies the first rub: if Acer is going all out, this is going to cost something crazy.

This clearly makes zero sense, but at the same time, it's kind of awesome that Acer is doing this. Dual-processor workstations are a bit of an obsession for me, but with dual-processor machines entirely relegated to Xeon systems, they've become quite unaffordable. Impractical as it would be, I would love for regular Intel Core and AMD Zen processors to support dual-processor setups.

The MRISC32: a vector-first CPU design

In essence, it's a 32-bit RISC ISA designed with a holistic view of integer, floating-point, scalar, and vector operations. In addition, there is a hardware implementation of a single-issue, in-order, pipelined CPU. The hardware implementation mostly serves as an aid in the design of the ISA (at the time of writing, the CPU is still incomplete).

As happens with some articles I post here, this one's definitely a bit over my head.

Why use an FPGA instead of a CPU or GPU?

Recently, Intel bought Altera, one of the largest producers of FPGAs. Intel paid a whopping $16.7 billion, making it their largest acquisition ever. In other news, Microsoft is using FPGAs in its data centers, and Amazon is offering them on their cloud services. Previously, these FPGAs were mainly used in electronics engineering, but not so much in software engineering. Are FPGAs about to take off and become serious alternatives to CPUs and GPUs?

FPGAs are used extensively by, for example, the Amiga community to recreate older chipsets.

How Michael Dell saved his company from the brink

So CEO Michael Dell presented shareholders with a $25 billion buyout that would take the company private, giving it space away from the public limelight (and pressure from investors) to rethink and reposition the struggling computer company for the future.

Fast-forward to 2018, and Dell's prospects seem far better. Dell is now worth an estimated $70 billion - nearly triple what the buyout valued it at five years ago - and it has announced a bid to return to the public markets in a $22 billion buyout. It’s an astounding transformation. Dell and his investment partners at Silver Lake transformed the company from a struggling consumer electronics company into an enterprise powerhouse.

It's indeed a pretty amazing turnaround. A few years ago, I would've never seriously considered a Dell. These days, though, their XPS 13 and 15 laptops are some of the best laptops you can get, with Linux editions available as well.

How the shared family computer protected us from our worst selves

Long before phone addiction panic gripped the masses and before screen time became a facet of our wellness and digital detoxes, there was one good and wise piece of technology that served our families. Maybe it was in the family room or in the kitchen. It could have been a Mac or PC. Chances are it had a totally mesmerizing screensaver. It was the shared family desktop.

I can still see the Dell I grew up using as clear as day, like I just connected to NetZero yesterday. It sat in my eldest sister’s room, which was just off the kitchen. Depending on when you peeked into the room, you might have found my dad playing Solitaire, my sister downloading songs from Napster, or me playing Wheel of Fortune or writing my name in Microsoft Paint. The rules for using the family desktop were pretty simple: homework trumped games; Dad trumped all. Like the other shared equipment in our house, its usefulness was focused and direct: it was a tool that the whole family used, and it was our portal to the wild, weird, wonderful internet. As such, we adored it.

This describes my childhood home perfectly, except that our first computer came way earlier than the Napster days - we got it in 1990 or 1991 - and that my brothers and I were way more adept at using the computer than my parents were. Still, this brings back some very old memories.

The Chinese typewriter: a history

Nominally a book that covers the rough century between the invention of the telegraph in the 1840s and that of computing in the 1950s, The Chinese Typewriter is secretly a history of translation and empire, written language and modernity, misguided struggle and brutal intellectual defeat. The Chinese typewriter is 'one of the most important and illustrative domains of Chinese techno-linguistic innovation in the 19th and 20th centuries ... one of the most significant and misunderstood inventions in the history of modern information technology', and 'a historical lens of remarkable clarity through which to examine the social construction of technology, the technological construction of the social, and the fraught relationship between Chinese writing and global modernity'. It was where empires met.

How fast is a PS/2 keyboard?

A few weeks ago, an interesting question cropped up: How fast is a PS/2 keyboard? That is to say, how quickly can it send scan codes (bytes) to the keyboard controller?

One might also ask, does it really matter? Sure enough, it does. As it turns out, the Borland Turbo Pascal 6.0 run-time, and probably a few related versions, handle keyboard input in a rather unorthodox way. The run-time installs its own INT 9/IRQ 1 handler (keyboard interrupt) which reads port 60h (keyboard data) and then chains to the original INT 9 handler… which reads port 60h again, expecting to read the same value.

That is a completely crazy approach, unless there is a solid guarantee that the keyboard can’t send a new byte of data before port 60h is read the second time. The two reads are done more or less back to back, with interrupts disabled, so not much time can elapse between them. But there is still a window during which the keyboard might send further data. So, how quickly can a keyboard do that?
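To make the hazard concrete, here is a minimal C sketch - not real DOS interrupt code, just a simulation, with made-up names like read_port_60h and keyboard_sends_byte standing in for the 8042 data port and the keyboard - of what goes wrong if a fresh byte arrives between the two reads:

    #include <stdio.h>
    #include <stdint.h>

    /* Hypothetical stand-in for the keyboard controller's data port (60h);
       real code would execute an IN instruction inside the INT 9 handler. */
    static uint8_t port_60h = 0x1E;               /* make code for 'A'     */

    static uint8_t read_port_60h(void) { return port_60h; }

    static void keyboard_sends_byte(uint8_t sc) { port_60h = sc; }

    int main(void)
    {
        /* Turbo Pascal's own handler: first read of port 60h. */
        uint8_t first = read_port_60h();

        /* If the keyboard manages to deliver another byte before the
           chained (original) handler runs, the latch now holds a new
           value...                                                    */
        keyboard_sends_byte(0x9E);                /* break code for 'A'    */

        /* ...so the original handler's second read disagrees with the
           first - exactly the corruption described above.             */
        uint8_t second = read_port_60h();

        printf("first read: %02X, second read: %02X (%s)\n",
               first, second,
               first == second ? "consistent" : "mismatch");
        return 0;
    }

On real hardware, the question then becomes exactly the one posed above: how long does the keyboard actually take to clock in that next byte?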

I love these questions.

What is the BASIC Engine?

The BASIC Engine is a very low-cost single-board home computer with advanced 2D color graphics and sound capabilities, roughly comparable to late-1980s or early-1990s computers and video game consoles. It can be built at home without special skills or tools and using readily available components for under 10 Euros in parts, or mass-produced for even less.

What a fascinating little device, and a great idea to boot - BASIC is an excellent language for taking your first steps into programming.

RISC-V’s open-source architecture shakes up chip design

But what's so compelling about RISC-V isn't the technology - it's the economics. The instruction set is open source. Anyone can download it and design a chip based on the architecture without paying a fee. If you wanted to do that with ARM, you'd have to pay its developer, Arm Holdings, a few million dollars for a license. If you wanted to use x86, you'd be out of luck, because Intel licenses its instruction set only to Advanced Micro Devices.

For manufacturers, the open-source approach could lower the risks associated with building custom chips. Already, Nvidia and Western Digital Corp. have decided to use RISC-V in their own internally developed silicon. Western Digital's chief technology officer has said that in 2019 or 2020, the company will unveil a new RISC-V processor for the more than 1 billion cores the storage firm ships each year. Likewise, Nvidia is using RISC-V for a governing microcontroller that it places on the board to manage its massively multicore graphics processors.

This really explains why ARM is so scared of RISC-V. I mean, RISC-V might not make it to high-end smartphones for now, but if RISC-V takes off in the market for microcontrollers and other "invisible" processors, it could be a huge threat to ARM's business model.

Dawn of the microcomputer: the Altair 8800

But Popular Electronics readers were introduced to something in the January 1975 issue that they had never encountered before. Below a heading that read "PROJECT BREAKTHROUGH", the magazine's cover showed a large gray and black box whose front panel bore a complicated array of lights and toggles. This was the Altair 8800, the "world's first minicomputer kit to rival commercial models", available for under $400. Though advertised as a "minicomputer", the Altair would actually be the first commercially successful member of a new class of computers, first known as "microcomputers" and then eventually as PCs. The Altair was small enough and cheap enough that the average family could have one at home. Its appearance in Popular Electronics magazine meant that, as Salsberg wrote in that issue, "the home computer age is here - finally".

You can play with the Altair 8800 in your browser.

Do you really need to properly eject a USB drive?

Pull a USB flash drive out of your Mac without first clicking to eject it, and you'll get a stern, shameful warning: "Disk Not Ejected Properly."

But do you really need to eject a thumb drive the right way?

Probably not. Just wait for it to finish copying your data, give it a few seconds, then yank. To be on the cautious side, be more conservative with external hard drives, especially the old ones that actually spin.

That's not the official procedure, nor the most conservative approach. And in a worst-case scenario, you risk corrupting a file or - even more unlikely - the entire storage device.

This is terrible advice for regular users, but I have to admit that I, too, don't really use the safe eject features of operating systems, unless I want to eject right after completing a write operation.

Global PC shipments grew 1.4% in Q2 2018, first time in 6 years

The PC market has seen its first growth quarter in six years, according to research firm Gartner. The streak is over: Gartner found PC shipments were up globally in Q2 2018, the first quarter of year-over-year global PC shipment growth since the first quarter of 2012.

Gartner estimates that worldwide PC shipments grew 1.4 percent to 62.1 million units in Q2 2018. The top five vendors were Lenovo, HP, Dell, Apple, and Acer. Lenovo in particular saw big gains (its highest growth rate since the first quarter of 2015), although that's due in part to the inclusion of units from its joint venture with Fujitsu.

The economic crisis is over, and people and companies are buying PCs again.

ARM kills off its anti-RISC-V smear site after own staff revolt

Arm has taken offline its website attacking rival processor architecture RISC-V within days of it going live - after its own staff objected to the underhand tactic.

The site - riscv-basics.com - was created at the end of June, and attempted to smear open-source RISC-V, listing five reasons why Arm cores are a better choice over its competitor's designs. However, the stunt backfired, with folks in the tech industry, and within the company's own ranks, slamming the site as a cheap shot and an attack on open source.

Good on ARM's own employees for speaking up.

ARM launches PR attack on RISC-V

Anybody remember Microsoft's "get the facts" campaign? Well, ARM is having its "get the facts" moment, with the British company launching a site to disparage the open source RISC-V architecture.

The instruction set architecture (ISA) is the foundation of all chip or System-on-Chip (SoC) products. It is therefore one of the most fundamental design choices you will make. If you are considering using an open-source ISA, such as RISC-V, it is critical to understand the key factors you should consider as part of your go-to-market strategy.

It seems odd for ARM - riding high as it is - to attack RISC-V like this, when RISC-V barely seems to be making a dent anywhere.

The Jackintosh: a real GEM – remembering the Atari ST

I promised you an Atari story, so you get an Atari story. How about a history of and ode to the Atari ST, the Amiga and Macintosh competitor?

Surviving on its remaining video-game inventory, the new company went to work developing Tramiel's new 16-bit computer. Based on the same Motorola 68000 processor used in the Apple Macintosh, the Atari ST (the ST apparently standing for "sixteen/thirty-two", although some have speculated it stood for "Sam Tramiel" after Jack's son) was designed to be attractive to a wide variety of computer users. Like the Commodore 64, the ST could be plugged into a television for casual video-gaming, but additionally it could use a colour or monochrome monitor - the latter of which featured a higher resolution than the Macintosh, appealing to those in the then-emerging world of desktop publishing. It also came standard with MIDI (Musical Instrument Digital Interface) ports for controlling synthesisers, making it attractive to musicians.

I actually bought an Atari T-shirt last week that I'm wearing right now, which is a tad disingenuous since I've never actually used an Atari, be it a console or an ST. The ST is on my wish list, though, alongside an Amiga 1200 and a C64. I promise I'll earn the right to wear this shirt.

Design case history: the Commodore 64

We've been on a bit of a history trip lately with old computer articles and books, and this one from 1985 certainly fits right in.

In January 1981, a handful of semiconductor engineers at MOS Technology in West Chester, Pa., a subsidiary of Commodore International Ltd., began designing a graphics chip and sound chip to sell to whoever wanted to make "the world's best video game". In January 1982, a home computer incorporating those chips was introduced at the Winter Consumer Electronics Show in Las Vegas, Nev. By using in-house integrated-circuit-fabrication facilities for prototyping, the engineers had cut design time for each chip to less than nine months, and they had designed and built five prototype computers for the show in less than five weeks. What surprised the rest of the home-computer industry the most, however, was the introductory price of the Commodore 64: $595 for a unit incorporating a keyboard, a central processor, the graphics and sound chips, and 64 kilobytes of memory instead of the 16 or 32 that were considered the norm.

A fully decked-out Commodore 64 with all the crucial peripherals - tape drive, disk drive, printer, joysticks, official monitor - is still very high on my wish list.

The DEC 340 Monitor

My big project this year is to get a DEC 340 monitor working. Here is a picture of one of them.

The DEC 340 was a very early and rare computer monitor dating from the mid-'60s, used, of course, on DEC computers - their PDP series. It consists of two cabinets of rack-mounted electronics. The 340 is historic and was used in some early work that pioneered modern computer graphics techniques. It is quite a bit different from the cathode ray tube (CRT) monitors used by the personal computers we were all familiar with a few years ago. In comparison, it is alien technology. All circuits are implemented using discrete components, and there are no integrated circuits anywhere in the design. The discrete components themselves are unusual, dating from the early days of transistor use.

It always amazes me how fast technology has developed over the past few decades.

The world’s fastest supercomputer is back in America

Last week, the US Department of Energy and IBM unveiled Summit, America's latest supercomputer, which is expected to bring the title of the world's most powerful computer back to America from China, which currently holds the mantle with its Sunway TaihuLight supercomputer.

With a peak performance of 200 petaflops, or 200,000 trillion calculations per second, Summit more than doubles the top speed of TaihuLight, which can reach 93 petaflops. Summit is also capable of over 3 billion billion mixed-precision calculations per second, or 3.3 exaops, and has more than 10 petabytes of memory, which has allowed researchers to run the world's first exascale scientific calculation.

The $200 million supercomputer is an IBM AC922 system utilizing 4,608 compute servers containing two 22-core IBM Power9 processors and six Nvidia Tesla V100 graphics processing unit accelerators each. Summit is also (relatively) energy-efficient, drawing just 13 megawatts of power, compared to the 15 megawatts TaihuLight pulls in.
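As a rough sanity check on those numbers (assuming the commonly cited figure of roughly 7.8 teraflops of double-precision peak per Tesla V100): 4,608 nodes × 6 GPUs × ~7.8 teraflops comes to about 215 petaflops from the GPUs alone, which is in the same ballpark as the quoted 200-petaflop peak figure.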

There's something mesmerizing about supercomputers like these. I would love to just walk through this collection of machines.

It’s 2018 and USB Type-C is still a mess

USB Type-C was billed as the solution for all our future cable needs, unifying power and data delivery with display and audio connectivity, and ushering in an age of the one-size-fits-all cable. Unfortunately for those already invested in the USB Type-C ecosystem, which is anyone who has bought a flagship phone in the past couple of years, the standard has probably failed to live up to its promises.

Other than my Nintendo Switch, my back-up phone (a Galaxy S8), and my old Nexus 6P in storage somewhere, I don't use USB-C at all, so I've been able to avoid all of its problems so far. It seems like a real mess.