Hardware Archive

Asus replaced the touchpad with a touchscreen

Unveiled at Computex 2018, the Asus ZenBook Pro is the new pinnacle of Asus' premium laptop range, and it comes with an attention-grabbing new feature: a smartphone-sized touchscreen in the place of the regular touchpad. I got to grips with the two ZenBook Pro models and their so-called ScreenPads here in Taipei, and I was pleasantly surprised by how well implemented and potentially useful this apparent gimmick feature is.

The touchpad seems like such an obvious place to put a secondary display, so I'm glad laptop makers are experimenting with it. That being said, I think I would personally prefer the display itself to not be a full-colour smartphone-like display, but rather a more basic black and white (or grey and white) OLED display that almost visually disappears into the chassis itself. On the Asus ZenBook Pro, the touchpad jumps out at you and demands attention, which I don't particularly like.

Arm Cortex-A76 unveiled: taking aim at the top for 7nm

The Cortex-A76 presents itself as a solid generational improvement for Arm. We've been waiting on a larger CPU microarchitecture for several years now, and while the A76 isn't quite the performance monster needed to compete with Apple's cores, it shows how important it is to have a balanced microarchitecture. This year all eyes were on Samsung and the M3 core, and unfortunately its performance increase came at a great cost in power and efficiency, which ended up making the end product rather uncompetitive. The A76 drives performance up while staying deeply focused on power efficiency at every step of the way, which means we'll get to see the best of both worlds in end products.

Overall, Arm promises a 35% performance improvement, which is a significant generational uplift. The fact that the A76 is targeted at 7nm designs gives the projected products a further boost.

This seems like a solid next step.

“Huawei’s new MateBook X Pro is the best laptop right now”

There are many products that cross my desk for review, and very few of those products surprise me. I've been doing this for long enough that I can generally guess how a device is going to perform or work before it even gets to me.

Huawei's new MateBook X Pro is an exception. Even though the MateBook X Pro has a deep bench of specs and an eye-catching design, Huawei is not exactly an established brand in the laptop world. Prior Huawei laptops weren't great, either: they had poor battery life, not enough power, bad design, frustrating trackpads, and were generally not worth considering.

Fortunately, the MateBook X Pro has completely and thoroughly exceeded my expectations. While it is not a perfect laptop and it has a couple of faults that will stop some from considering it, it is still the best laptop I've used all year. That makes it my new recommendation as the productivity and entertainment laptop to buy right now.

This looks like a great all-rounder.

USB reverse engineering: down the rabbit hole

I tend to dive down rabbit holes a lot, and given the cost of context switching and memory deteriorating over time, sometimes the state I build up in my mind gets lost between the chances I get to dive in. These 'linkdump' posts are an attempt to collate at least some of that state in a way that I can hopefully restore to my brain at a later point.

This time around I was inspired to look into USB reverse engineering, protocol analysis, hardware hacking, and what would be involved in implementing custom drivers for arbitrary hardware. Or put another way: how do I hack all of the USBs?!??

It seems the deeper I went, the more interesting I found the content, and this post grew and grew. Hopefully it will help to shortcut your own journey down this path, and enlighten you to a whole new area of interesting things to hack!

Let's continue this impromptu series on things I barely understand, shall we?
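If you want to poke at this rabbit hole yourself before committing to the full post, user-space enumeration is an easy first step. Here's a minimal sketch using the pyusb library (my choice of tool, not necessarily one the article covers) that lists every attached device and its interfaces:

```python
# Minimal USB enumeration sketch with pyusb (pip install pyusb).
# It prints each attached device's vendor/product IDs and interfaces -
# the starting point for any custom-driver work.
import usb.core

for dev in usb.core.find(find_all=True):
    print(f"Bus {dev.bus:03d} Device {dev.address:03d}: "
          f"ID {dev.idVendor:04x}:{dev.idProduct:04x}")
    # A device exposes configurations, which contain interfaces,
    # which contain endpoints - the things a driver actually talks to.
    for cfg in dev:
        for intf in cfg:
            print(f"  interface {intf.bInterfaceNumber}, "
                  f"class 0x{intf.bInterfaceClass:02x}, "
                  f"{intf.bNumEndpoints} endpoint(s)")
```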

The AD9361: when microchips are more profitable than drugs

When Analog Devices released their SDR transceiver AD9361 in 2013, it was a revolution in digital radio. SDRs existed before, but only now could you have it all: 2 channels each for TX and RX with onboard 12-bit DACs/ADCs, 56 MHz of simultaneous RF bandwidth, local oscillators, mixers, and LNAs, all working in the range from 70 MHz (TX from 47 MHz) to 6000 MHz. Using the AD9361 out of the box, one could implement almost any useful digital radio, with the rare exceptions of UWB and 60 GHz. You only need to add a data source/sink (which is still often an FPGA), plus external filters and a PA if your task requires them.

Finally, I was able to take a look inside and peek at the manufacturing cost of a microelectronic device with such exceptional added value.

This is a little over my head, but I love the pretty pictures.
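Even so, the "almost any useful digital radio" claim is easy to appreciate in code. With Analog Devices' pyadi-iio Python bindings, grabbing IQ samples from an AD9361-based board takes only a handful of lines - a sketch, assuming pyadi-iio and a board reachable at a hypothetical ip:192.168.2.1 address:

```python
# Sketch of basic AD9361 receive using pyadi-iio (pip install pyadi-iio).
# The URI and tuning values below are hypothetical examples.
import adi

sdr = adi.ad9361(uri="ip:192.168.2.1")  # hypothetical board address
sdr.rx_lo = int(100e6)                  # tune the RX local oscillator to 100 MHz
sdr.sample_rate = int(2e6)              # 2 MS/s
sdr.rx_rf_bandwidth = int(2e6)          # match analog bandwidth to the sample rate
sdr.rx_enabled_channels = [0]           # use the first of the two RX channels

samples = sdr.rx()                      # one buffer of complex IQ samples
print(len(samples), samples[:4])
```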

A gorgeous guide to the first wave of personal computers

Photographer James Ball (aka Docubyte) knows what a computer is. He's spent part of his career lovingly photographing the machines of yesteryear, from the giant mainframes of the '50s and '60s to the first wave of personal computers in the late '70s and '80s. When he saw Apple's iPad Pro advertisement that ended with a young girl asking "What's a computer?" as she typed away on her tablet, it provoked him.

"I'm not some old technophobe, and I get the whole post-computing cloud/device blah blah thing," Ball told Motherboard via email. "But I wanted to pick up an old Mac and say 'Hey! Remember this? This is a computer.' The era of crazy shaped beige boxes and clunky clicking keyboards - for me and a lot of other people, that is a computer."

To honor those machines, Ball has created a series of high-resolution animated GIFs celebrating 16 machines from the era of the personal computer's birth. He calls the project 'I Am a Computer: Icons of Beige.'

These are gorgeous.

Inside the 76477 Space Invaders sound effect chip

The 76477 Complex Sound Generation chip (1978) provided sound effects for Space Invaders and many other video games. It was also a popular hobbyist chip, easy to experiment with and available at Radio Shack. I reverse-engineered the chip from die photos and found some interesting digital circuitry inside. Perhaps the most interesting is a shift register based white noise generator, useful for drums, gunshots, explosions and other similar sound effects. The chip also uses a digital mixer to combine the chip's different sound generators. An unusual feature of the chip is that it uses Integrated Injection Logic (I2L), a type of digital logic developed in the 1970s with the goal of high-density, high-speed chips.
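That shift-register noise generator is a classic trick worth a closer look: a linear-feedback shift register (LFSR) shifts every clock and feeds the XOR of a few tap bits back in, producing a bit stream that is periodic in theory but sounds like white noise in practice. Here's a toy sketch of the idea - the tap positions are illustrative, not the 76477's actual configuration, which the article derives from the die photos:

```python
# Toy LFSR white-noise generator in the spirit of the 76477's
# shift-register circuit. Taps are illustrative; maximal-length
# polynomials give the longest (least audibly repetitive) sequences.
def lfsr_noise(n, bits=16, taps=(15, 13, 12, 10), seed=0xACE1):
    state = seed
    out = []
    for _ in range(n):
        fb = 0
        for t in taps:              # XOR the tap bits together...
            fb ^= (state >> t) & 1
        out.append(state & 1)       # ...and emit the low bit as "noise"
        state = (state >> 1) | (fb << (bits - 1))
    return out

print(lfsr_noise(32))  # a pseudo-random 0/1 stream - hiss when played back
```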

A completely silent computer

I've been trying to make my computers quieter for nearly three decades. Custom liquid cooling loops, magnetically-stabilised fluid-dynamic bearings, acoustic dampeners, silicone shock absorbers, you name it. Well, last week I finally managed to build a completely silent computer. Without further ado...

The Streacom DB4 is an amazing case, which I am considering for one of my next computer builds. This article provides great insight into building such a fanless PC, with links to additional articles about the system later in its lifespan.

Commodore 64 to Raspberry Pi 3 conversion

So the objective here was to take a C64 breadbin case and keyboard and put a Raspberry Pi 3 into it; keeping the keyboard and joystick ports working, but also giving me HDMI, USB controller support, and modem emulation. While I still have 2 real Commodore 64s (and an Ultimate64 on the way!), I like using the RPi and Vice to play 64 games.

These mounts do not require you to drill or cut your C64 case! The 3D files are provided under a Creative Commons license, so they are FREE to use, distribute, modify, or even sell.

Just a fun project.

NeXT Computer replica: Raspberry Pi case

This build is a 10cm x 10cm x 10cm replica of the NeXT Computer to house a Raspberry Pi computer. I designed and built this specifically with the aim of having it run some basic server tasks on my home network, such as storing revision control repositories etc.

The necessary files to make your own are available. What a neat-looking case - I'd love a similar one, but slightly larger, so it could house a mini-ITX board for a Ryzen II build.

Widescreen laptops are dumb

But a laptop is more than just a video playback machine. For myself and millions of others, it's the primary tool for earning a living. We use these machines to read, write, remember, create, connect, and communicate. And in most of these other applications, a 16:9 screen of 13 to 15 inches in size just feels like a poor fit.

As long as I can easily open more than one document side by side, any aspect ratio gets my blessing. I don't mind black bars on video, especially since today's screens have pretty good black levels, so they're hardly distracting. Still, I'm glad more and more laptop makers are starting to see the benefit in 3:2-like displays.
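The geometry backs this up: at the same diagonal size, a 3:2 panel simply has more area and more vertical room than a 16:9 one. A quick back-of-envelope, using 13.3 inches as an example diagonal:

```python
# Screen width/height/area for a given diagonal and aspect ratio.
import math

def screen_dims(diagonal, ratio_w, ratio_h):
    h = diagonal / math.hypot(ratio_w / ratio_h, 1)
    w = h * ratio_w / ratio_h
    return w, h, w * h

for rw, rh in [(16, 9), (3, 2)]:
    w, h, area = screen_dims(13.3, rw, rh)
    print(f"{rw}:{rh} -> {w:.1f}\" x {h:.1f}\", {area:.1f} sq in")
# 16:9 -> 11.6" x 6.5", 75.6 sq in
# 3:2  -> 11.1" x 7.4", 81.6 sq in  (~8% more area, ~13% more height)
```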

OLPC’s $100 laptop was going to change the world

It was supposed to be the laptop that saved the world.

In late 2005, tech visionary and MIT Media Lab founder Nicholas Negroponte pulled the cloth cover off a small green computer with a bright yellow crank. The device was the first working prototype for Negroponte's new nonprofit One Laptop Per Child, dubbed "the green machine" or simply "the $100 laptop". And it was like nothing that Negroponte's audience - at either his panel at a UN-sponsored tech summit in Tunis, or around the globe - had ever seen.

The OLPC was all the rage and hype for a few years back then, but it never lived up to that promise. Still, while not nearly the same thing, cheap mobile phones and smartphones have come to play a somewhat similar role.

Cloudflare bets on ARM servers

Cloudflare, which operates a content delivery network it also uses to provide DDoS protection services for websites, is in the middle of a push to vastly expand its global data center network. CDNs are usually made up of small-footprint nodes, but those nodes need to be in many places around the world.

As it expands, the company is making a big bet on ARM, the emerging alternative to Intel’s x86 processor architecture, which has dominated the data center market for decades.

The money quote from Cloudflare's CEO:

"We think we're now at a point where we can go one hundred percent to ARM. In our analysis, we found that even if Intel gave us the chips for free, it would still make sense to switch to ARM, because the power efficiency is so much better."

Intel and AMD ought to be worried about the future. Very worried. If I were them, I'd start work on serious ARM processors - because they're already missing out on mobile, and they're about to start missing out on desktops and servers, too.

HP 9000 and PA-RISC computers story

The HP 9000 Series of computers spanned almost three decades and very diverse platforms of Unix computers. Both RISC and Unix, though each had a longer history, were developed into coherent products during the 1980s, moving from academia via industrial R&D to productization at a time when much computing was still done on mainframes, minicomputers, and time-sharing machines such as the DEC PDP and VAX, the IBM AS/400, and the System/360.

Paul Weissmann tells the story of the development and history of the HP 9000.

Booting Windows NT on a DEC Multia

I stumbled upon an absolute gem of a website over the weekend - Sophie Haskins' Pizza Box Computer. On this site, Haskins details a number of ancient non-x86 workstations. All of the posts on the site are fun and interesting reads, so let's pick one of her machines - a DEC Multia running Windows NT for Alpha -

The Multia was an attempt by Digital to make a lower-cost Alpha workstation for running Windows NT. There were Alpha and Intel Pentium models, and they use a lot of off-the-shelf PC components rather than custom Digital ones (hence its later name, the "Universal Desktop Box"). It's quite tiny - so much so that it has laptop PCMCIA slots for expansion!

The latest post details getting Windows NT up and running on the Multia, and is certainly worth a read - like the rest of the site.

Building a Lightroom PC

If there's one thing that will make even the most powerful computer feel like a 7-year-old rig, it's Adobe Lightroom paired with RAW files from any high-megapixel camera.

In my case, I spent over a year of spare time editing 848GB worth of 11,000+ 42-megapixel RAW photos and 4K videos from my New Zealand trip and making these nine photosets. I quickly realized that my two-year-old iMac was not up to the challenge.

In 2015 I took a stab at solving my photo storage problem with a cloud-backed 12TB Synology NAS. That setup is still running great. Now I just need to keep up with the performance requirements of having the latest camera gear with absurd file sizes.

I decided it was time to upgrade to something a bit more powerful. This time I decided to build a PC and switch to Windows 10 for my heavy computing tasks. Yes, I switched to Windows.

I love articles like this, because there is no one true way to build a computer for any given task; everyone has their own opinions, ideas, and preferences, ensuring no two self-built PCs are ever quite the same. Add in a healthy dose of urban legend and tradition, and you have a great cocktail for endless discussions that never go anywhere.

It's clickbait without actually being clickbait.

Is it time for open processors?

The disclosure of the Meltdown and Spectre vulnerabilities has brought a new level of attention to the security bugs that can lurk at the hardware level. Massive amounts of work have gone into improving the (still poor) security of our software, but all of that is in vain if the hardware gives away the game. The CPUs that we run in our systems are highly proprietary and have been shown to contain unpleasant surprises (the Intel management engine, for example). It is thus natural to wonder whether it is time to make a move to open-source hardware, much like we have done with our software. Such a move may well be possible, and it would certainly offer some benefits, but it would be no panacea.

Given the complexity of modern CPUs and the fierceness of the market in which they are sold, it might be surprising to think that they could be developed in an open manner. But there are serious initiatives working in this area; the idea of an open CPU design is not pure fantasy. A quick look around turns up several efforts; the following list is necessarily incomplete.

Qualcomm Centriq 2400: the world’s first 10nm server processor

Today marks a major milestone in the processor industry - we've launched Qualcomm Centriq 2400, the world's first and only 10nm server processor. While this is the culmination of an intensive five-year journey for the Qualcomm Datacenter Technologies (QDT) team, it also marks the beginning of an era that will see a step function in the economics and energy efficiency of operating a datacenter.

Replacing x86 firmware with Linux and Go

The Intel Management Engine (ME), which is a separate processor and operating system running outside of user control on most x86 systems, has long been of concern to users who are security and privacy conscious. Google and others have been working on ways to eliminate as much of that functionality as possible (while still being able to boot and run the system). Ronald Minnich from Google came to Prague to talk about those efforts at the 2017 Embedded Linux Conference Europe.

The Xerox Alto struts its stuff on its 40th birthday

The Xerox Alto, widely recognized as the first modern personal computer, pioneered just about every basic concept we are familiar with in computers today. These include windows, bit-mapped computer displays, the whole idea of WYSIWYG interfaces, the cut/copy/paste tools in word processing programs, and pop-up menus. Most of this vision of the "office of the future" was first unveiled at a meeting of Xerox executives held on 10 Nov 1977, which was 40 years ago last week.

To celebrate that birthday, the Computer History Museum in Mountain View, Calif., on Friday brought together some of the PARC researchers who worked on the Alto. They put it through its paces in a series of live demos, using an Alto that had been restored to working order over the past eight months.

One of the most important computers ever made.