Hardware Archive

How fast is a PS/2 keyboard?

A few weeks ago, an interesting question cropped up: How fast is a PS/2 keyboard? That is to say, how quickly can it send scan codes (bytes) to the keyboard controller?

One might also ask, does it really matter? Sure enough, it does. As it turns out, the Borland Turbo Pascal 6.0 run-time, and probably a few related versions, handle keyboard input in a rather unorthodox way. The run-time installs its own INT 9/IRQ 1 handler (keyboard interrupt) which reads port 60h (keyboard data) and then chains to the original INT 9 handler… which reads port 60h again, expecting to read the same value.

That is a completely crazy approach, unless there is a solid guarantee that the keyboard can't send a new byte of data before port 60h is read the second time. The two reads are done more or less back to back, with interrupts disabled, so not much time can elapse between them. But there is still some window during which the keyboard might send further data. So, how quickly can a keyboard do that?

I love these questions.
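
For illustration, here is roughly what that double-read pattern looks like - a minimal sketch in Borland-style DOS C, not the actual Turbo Pascal run-time code (which is Pascal and assembly), with the run-time's own buffer handling elided:

    #include <dos.h>   /* Borland DOS C: inportb(), getvect(), setvect() */

    static void interrupt (*old_int9)(void);  /* original INT 9 handler */

    /* The replacement handler reads the scan code from port 60h, then
       chains to the original handler, which reads port 60h AGAIN and
       assumes it sees the same byte. */
    static void interrupt new_int9(void)
    {
        unsigned char scan = inportb(0x60);  /* first read of port 60h */

        /* ... record 'scan' in the run-time's own keyboard buffer ... */

        (*old_int9)();  /* chain: the old handler does the second read
                           of port 60h and sends the EOI to the PIC */
    }

    void install(void)
    {
        old_int9 = getvect(9);   /* save the previous INT 9 vector */
        setvect(9, new_int9);    /* install the replacement handler */
    }

Any byte the keyboard manages to deliver between those two reads of port 60h would desynchronise the two handlers - hence the question of how fast the keyboard can actually be.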

What is the BASIC Engine?

The BASIC Engine is a very low-cost single-board home computer with advanced 2D color graphics and sound capabilities, roughly comparable to late-1980s or early-1990s computers and video game consoles. It can be built at home without special skills or tools and using readily available components for under 10 Euros in parts, or mass-produced for even less.

What a fascinating little device, and a great idea to boot - BASIC is an excellent language for taking your first steps into programming.

RISC-V’s open-source architecture shakes up chip design

But what's so compelling about RISC-V isn't the technology - it's the economics. The instruction set is open source. Anyone can download it and design a chip based on the architecture without paying a fee. If you wanted to do that with ARM, you'd have to pay its developer, Arm Holdings, a few million dollars for a license. If you want to use x86, you're out of luck, because Intel licenses its instruction set only to Advanced Micro Devices.

For manufacturers, the open-source approach could lower the risks associated with building custom chips. Already, Nvidia and Western Digital Corp. have decided to use RISC-V in their own internally developed silicon. Western Digital's chief technology officer has said that in 2019 or 2020, the company will unveil a new RISC-V processor for the more than 1 billion cores the storage firm ships each year. Likewise, Nvidia is using RISC-V for a governing microcontroller that it places on the board to manage its massively multicore graphics processors.

This really explains why ARM is so scared of RISC-V. I mean, RISC-V might not make it into high-end smartphones for now, but if RISC-V takes off in the market for microcontrollers and other "invisible" processors, it could be a huge threat to ARM's business model.

Dawn of the microcomputer: the Altair 8800

But Popular Electronics readers were introduced to something in the January 1975 issue that they had never encountered before. Below a heading that read "PROJECT BREAKTHROUGH", the magazine's cover showed a large gray and black box whose front panel bore a complicated array of lights and toggles. This was the Altair 8800, the "world's first minicomputer kit to rival commercial models", available for under $400. Though advertised as a "minicomputer", the Altair would actually be the first commercially successful member of a new class of computers, first known as "microcomputers" and then eventually as PCs. The Altair was small enough and cheap enough that the average family could have one at home. Its appearance in Popular Electronics magazine meant that, as Salsberg wrote in that issue, "the home computer age is here - finally".

You can play with the Altair 8800 in your browser.

Do you really need to properly eject a USB drive?

Pull a USB flash drive out of your Mac without first clicking to eject it, and you'll get a stern, shameful warning: "Disk Not Ejected Properly."

But do you really need to eject a thumb drive the right way?

Probably not. Just wait for it to finish copying your data, give it a few seconds, then yank. To be on the cautious side, be more conservative with external hard drives, especially the old ones that actually spin.

That's not the official procedure, nor the most conservative approach. And in a worst-case scenario, you risk corrupting a file or - even more unlikely - the entire storage device.

This is terrible advice for regular users, but I have to admit that I, too, don't really use the safe-eject features of operating systems - unless I'm removing a drive right after completing a write operation.
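
The reason the warning exists is write caching: the operating system can report a copy as finished while some of the data is still sitting in RAM, waiting to be written out, and ejecting (or unmounting) forces that flush. Here's a minimal POSIX C sketch of the distinction - the mount point and file name are hypothetical:

    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>

    int main(void)
    {
        const char *msg = "important data\n";
        /* hypothetical mount point for a USB stick */
        int fd = open("/media/usbstick/notes.txt",
                      O_WRONLY | O_CREAT | O_TRUNC, 0644);
        if (fd < 0) { perror("open"); return 1; }

        /* write() returns once the kernel has the data,
           not once the drive does */
        if (write(fd, msg, strlen(msg)) < 0) { perror("write"); return 1; }

        /* fsync() forces this file's cached data and metadata out to the
           device itself; until this (or an unmount/eject) completes,
           yanking the drive can lose or corrupt the file. */
        if (fsync(fd) < 0) { perror("fsync"); return 1; }

        close(fd);
        return 0;
    }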

Global PC shipments grew 1.4% in Q2 2018, first time in 6 years

The PC market has seen its first growth quarter in six years, according to research firm Gartner. The streak is over: Gartner found PC shipments were up globally in Q2 2018, the first quarter of year-over-year global PC shipment growth since the first quarter of 2012.

Gartner estimates that worldwide PC shipments grew 1.4 percent to 62.1 million units in Q2 2018. The top five vendors were Lenovo, HP, Dell, Apple, and Acer. Lenovo in particular saw big gains (its highest growth rate since the first quarter of 2015), although that's largely due to the inclusion of units from its joint venture with Fujitsu.

The economic crisis is over, and people and companies are buying PCs again.

ARM kills off its anti-RISC-V smear site after own staff revolt

Arm has taken offline its website attacking rival processor architecture RISC-V within days of it going live - after its own staff objected to the underhand tactic.

The site - riscv-basics.com - was created at the end of June, and attempted to smear open-source RISC-V, listing five reasons why Arm cores are a better choice over its competitor's designs. However, the stunt backfired, with folks in the tech industry, and within the company's own ranks, slamming the site as a cheap shot and an attack on open source.

Good on ARM's own employees for speaking up.

ARM launches PR attack on RISC-V

Anybody remember Microsoft's "get the facts" campaign? Well, ARM is having its "get the facts" moment, with the British company launching a site to disparage the open source RISC-V architecture.

The instruction set architecture (ISA) is the foundation of all chip or System-on-Chip (SoC) products. It is therefore one of the most fundamental design choices you will make. If you are considering using an open-source ISA, such as RISC-V, it is critical to understand the key factors you should consider as part of your go-to-market strategy.

It seems odd for ARM - riding high as it is - to attack RISC-V like this, when RISC-V seems to barely be making a dent anywhere.

The Jackintosh: a real GEM – remembering the Atari ST

I promised you an Atari story, so you get an Atari story. How about a history of and ode to the Atari ST, the Amiga and Macintosh competitor?

Surviving on its remaining video-game inventory, the new company went to work developing Tramiel's new 16-bit computer. Based on the same Motorola 68000 processor used in the Apple Macintosh, the Atari ST (the ST apparently standing for "sixteen/thirty-two", although some have speculated it stood for "Sam Tramiel" after Jack's son) was designed to be attractive to a wide variety of computer users. Like the Commodore 64, the ST could be plugged into a television for casual video-gaming, but additionally it could use a colour or monochrome monitor - the latter of which featured a higher resolution than the Macintosh, an appeal to those in the then-emerging world of desktop publishing. It also came standard with MIDI (Musical Instrument Digital Interface) ports for controlling synthesisers, making it attractive to musicians.

I actually bought an Atari T-shirt last week that I'm wearing right now, which is a tad disingenuous since I've never actually used an Atari, be it a console or an ST. The ST is on my wish list, though, alongside an Amiga 1200 and a C64. I promise I'll earn the right to wear this shirt.

Design case history: the Commodore 64

We've been on a bit of a history trip lately with old computer articles and books, and this one from 1985 certainly fits right in.

In January 1981, a handful of semiconductor engineers at MOS Technology in West Chester, Pa., a subsidiary of Commodore International Ltd., began designing a graphics chip and sound chip to sell to whoever wanted to make "the world's best video game". In January 1982, a home computer incorporating those chips was introduced at the Winter Consumer Electronics Show in Las Vegas, Nev. By using in-house integrated-circuit-fabrication facilities for prototyping, the engineers had cut design time for each chip to less than nine months, and they had designed and built five prototype computers for the show in less than five weeks. What surprised the rest of the home-computer industry the most, however, was the introductory price of the Commodore 64: $595 for a unit incorporating a keyboard, a central processor, the graphics and sound chips, and 64 kilobytes of memory instead of the 16 or 32 that were considered the norm.

A fully decked-out Commodore 64 with all the crucial peripherals - tape drive, disk drive, printer, joysticks, official monitor - is still very high on my wish list.

The DEC 340 Monitor

My big project this year is to get a DEC 340 monitor working. Here is a picture of one of them.

The DEC 340 was a very early and rare computer monitor dating from the mid '60s, used, of course, on DEC computers - their PDP series. It consisted of two cabinets of rack-mounted electronics. The 340 is historic and was used in some early work that pioneered modern computer graphics techniques. It is quite a bit different from the Cathode Ray Tube (CRT) monitors used by the personal computers we were all familiar with until a few years ago. In comparison, it is alien technology. All circuits are implemented using discrete components, and there are no integrated circuits anywhere in the design. The discrete components themselves are unusual, dating from the early days of transistor use.

It always amazes me how fast technology has developed over the past few decades.

The world’s fastest supercomputer is back in America

Last week, the US Department of Energy and IBM unveiled Summit, America's latest supercomputer, which is expected to bring the title of the world's most powerful computer back to America from China, which currently holds the mantle with its Sunway TaihuLight supercomputer.

With a peak performance of 200 petaflops, or 200,000 trillion calculations per second, Summit more than doubles the top speed of TaihuLight, which can reach 93 petaflops. Summit is also capable of over 3 billion billion mixed-precision calculations per second, or 3.3 exaops, and has more than 10 petabytes of memory, which has allowed researchers to run the world's first exascale scientific calculation.

The $200 million supercomputer is an IBM AC922 system utilizing 4,608 compute servers, each containing two 22-core IBM Power9 processors and six Nvidia Tesla V100 graphics processing unit accelerators. Summit is also (relatively) energy-efficient, drawing just 13 megawatts of power, compared to the 15 megawatts TaihuLight pulls in.

There's something mesmerizing about supercomputers like these. I would love to just walk through this collection of machines.

It’s 2018 and USB Type-C is still a mess

USB Type-C was billed as the solution for all our future cable needs, unifying power and data delivery with display and audio connectivity, and ushering in an age of the one-size-fits-all cable. Unfortunately for those already invested in the USB Type-C ecosystem, which is anyone who has bought a flagship phone in the past couple of years, the standard has probably failed to live up to its promises.

Other than my Nintendo Switch, my back-up phone (a Galaxy S8), and my old Nexus 6P in storage somewhere, I don't use USB-C at all, so I've been able to avoid all of its problems so far. It seems like a real mess.

ARM Holdings history: from Acorn to giant tree

The computer industry is full of noble failures. Big ones. Little ones. Ideas that were 10 years too early. Ideas that were 15 years too early. Ideas that were 30 years too early. And concepts that, while fundamental to the way that our computing culture works today, hadn’t yet reached their full potential. Though certainly successful in its early years, the ARM processor very much fits in the latter category. Today, variants of these processors are in just about everything, from tiny computers, to smartphones, to video game consoles, to television sets, and even some servers. But the company that initially forged the processor is almost forgotten at this point, seemingly lost to history (especially outside of Europe) despite being an early icon of British computing. Tonight's Tedium ponders the story of Acorn Computers, the long-departed company whose best idea is probably in the device you're using to read this.

This introduction is basically clickbait specifically designed for OSNews readers. Well done.

How people used to download games from the radio

An anonymous user sent this one in, and even though it's old - 2014 - I hadn't read it yet, and I don't think it's ever been posted here.

It's a Monday night in Bristol in July 1983. Your parents are downstairs watching Coronation Street while you skulk in your bedroom under the pretence of doing homework. In reality, you're hunched over your cassette recorder, fingers hovering over the buttons in feverish anticipation. A quiver of excitement runs through you as a voice from the radio announces: "and now the moment you've all been waiting for..." There's a satisfying clunk as you press down on play and record simultaneously, and moments later the room is filled with strange metallic squawks and crackles. "SCREEEEEEEEEEE..."

You're listening to the Datarama show on Radio West and partaking in the UK's first attempt to send a computer program over local radio. Joe Tozer, who co-hosted the show, recalls how it all began: "I think it was just one of those 'ping!' moments when you realise that the home computer program is just audio on a cassette, so why not transmit it over air? It just seemed a cool idea."

I have very little experience with using cassettes as a data storage medium, except for that one time, somewhere in the late '80s or early '90s, when a neighbour kid and I loaded Rambo for the C64 from a cassette tape. That's the only time I ever did such a thing, and in hindsight, I'm glad I got to experience this era of computing, even if it was only once.
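
If you're wondering how a program turns into radio-friendly audio in the first place, the general technique is frequency-shift keying: each bit becomes a short burst of tone at one of two frequencies. Below is a minimal C sketch in the spirit of the Kansas City Standard ('0' as 1200 Hz, '1' as 2400 Hz, 300 baud) - an illustration of the idea, not Radio West's actual format; the sample rate, output format, and framing are my assumptions:

    /* Encode bytes from stdin as FSK audio: 8-bit unsigned mono PCM on
       stdout. Try: ./fsk < program.bin | aplay -r 44100 -f U8 */
    #include <stdio.h>
    #include <math.h>

    #ifndef M_PI
    #define M_PI 3.14159265358979323846
    #endif

    #define RATE 44100   /* samples per second (assumption) */
    #define BAUD 300     /* bits per second, Kansas City-style */

    static double phase = 0.0;

    /* Emit one bit as RATE/BAUD samples of tone:
       1200 Hz for '0', 2400 Hz for '1'. */
    static void put_bit(int bit)
    {
        double freq = bit ? 2400.0 : 1200.0;
        int i, n = RATE / BAUD;
        for (i = 0; i < n; i++) {
            phase += 2.0 * M_PI * freq / RATE;
            putchar((int)(127.5 + 127.0 * sin(phase)));
        }
    }

    /* One async frame: start bit, 8 data bits LSB-first, two stop bits. */
    static void put_byte(int byte)
    {
        int i;
        put_bit(0);
        for (i = 0; i < 8; i++)
            put_bit((byte >> i) & 1);
        put_bit(1);
        put_bit(1);
    }

    int main(void)
    {
        int c;
        while ((c = getchar()) != EOF)
            put_byte(c);
        return 0;
    }

At 300 baud and 11 bits per frame, that works out to roughly 27 bytes per second - which is why broadcasting even a small program filled minutes of airtime.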

Amazon introduces PC designs to integrate Alexa into PCs

Alexa for PCs, announced earlier this year, brings the cloud-based voice service to Windows 10 computers. Today, we introduce Alexa for PC solutions from Original Design Manufacturers (ODMs). Customers use PCs every day for business and entertainment. We believe voice is the next major disruption in the PC category, which is an important part of our "Alexa Everywhere" vision.

Four Windows 10 PCs have been added to our portfolio of qualified ODM solutions integrated with Alexa: An all-in-one desktop from Wistron, and convertible notebooks from Compal, Quanta, and Wistron. All of these pre-tested, final-product designs have been built for a far-field Alexa experience, with Intel CPUs, drivers, wake word engine, and microphone arrays.

I find Amazon devices very off-putting. I know Alexa devices are popular in the US, but does anyone outside of the US use Alexa?

Asus replaced the touchpad with a touchscreen

Unveiled at Computex 2018, the Asus ZenBook Pro is the new pinnacle of Asus' premium laptop range, and it comes with an attention-grabbing new feature: a smartphone-sized touchscreen in the place of the regular touchpad. I got to grips with the two ZenBook Pro models and their so-called ScreenPads here in Taipei, and I was pleasantly surprised by how well implemented and potentially useful this apparent gimmick feature is.

The touchpad seems like such an obvious place to put a secondary display, so I'm glad laptop makers are experimenting with it. That being said, I think I would personally prefer the display itself to not be a full-colour smartphone-like display, but rather a more basic black-and-white (or grey-and-white) OLED display that almost visually disappears into the chassis itself. On the Asus ZenBook Pro, the touchpad jumps out at you and demands attention, which I don't particularly like.

Arm Cortex-A76 unveiled: taking aim at the top for 7nm

The Cortex A76 presents itself as a solid generational improvement for Arm. We've been waiting on a larger CPU microarchitecture for several years now, and while the A76 isn't quite a performance monster that can compete with Apple's cores, it shows how important it is to have a balanced microarchitecture. This year all eyes were on Samsung and the M3 core, and unfortunately its performance increase came at a great cost in power and efficiency, which ended up making the end product rather uncompetitive. The A76 drives performance up, but at every step of the way it remains deeply focused on power efficiency, which means we'll get to see the best of both worlds in end products.

In general, Arm promises a 35% performance improvement, which is a significant generational uplift. The fact that the A76 is targeted at 7nm designs is also a boost to the projected products.

This seems like a solid next step.

“Huawei’s new MateBook X Pro is the best laptop right now”

There are many products that cross my desk for review, and very few of those products surprise me. I've been doing this for long enough that I can generally guess how a device is going to perform or work before it even gets to me.

Huawei's new MateBook X Pro is an exception. Even though the MateBook X Pro has a deep bench of specs and an eye-catching design, Huawei is not exactly an established brand in the laptop world. Prior Huawei laptops weren't great, either: they had poor battery life, not enough power, bad design, frustrating trackpads, and were generally not worth considering.

Fortunately, the MateBook X Pro has completely and thoroughly exceeded my expectations. While it is not a perfect laptop and it has a couple of faults that will stop some from considering it, it is still the best laptop I've used all year. That makes it my new recommendation as the productivity and entertainment laptop to buy right now.

This looks like a great all-rounder.