Hardware Archive

A touchpad is not a mouse, or at least not a good one

One of the things about having a pretty nice work laptop with a screen that's large enough to have more than one real window at once is that I actually use it, and I use it with multiple windows, and that means that I need to use the mouse. I like computer mice in general so I don't object to this, but like most modern laptops my Dell XPS 13 doesn't have a mouse; it has a trackpad (or touchpad, take your pick). You can use a modern touchpad as a mouse, but over my time using the XPS 13 I've come to understand (rather viscerally) that a touchpad is not a mouse, and trying to act as if it were is not a good idea. There are some things that a touchpad makes easy and natural that aren't very natural on a mouse, and a fair number of things that are natural on a mouse but don't work very well on a touchpad (at least for me; they might for people who are more experienced with touchpads).

Chris Siebenmann makes some good points regarding touchpads here. Even though touchpads on Windows and Linux have gotten better over the years, they're still not nearly as good as Apple's, and will never beat a mouse. I feel like mouse input on laptops is ripe for serious innovation.

The CADR microprocessor

The CADR microprocessor is a general purpose processor designed for convenient emulation of complex order codes, particularly those involving stacks and pointer manipulation. It is the central processor in the LISP machine project, where it interprets the bit-efficient 16-bit order code produced by the LISP machine compiler. (The terms "LISP machine" and "CADR machine" are sometimes confused. In this document, the CADR machine is a particular design of microprocessor, while the LISP machine is the CADR machine plus the microcode which interprets the LISP machine order code.)

I'll admit I have no idea what anything in this long, technical description means, but I'm pretty sure this is right up many readers' alleys.
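
If, like me, "order code" is new to you: it's simply an older term for an instruction set, and interpreting a stack-oriented one boils down to a fetch-decode-execute loop. Here's a toy sketch in Python of that general idea - my own illustration, nothing to do with actual CADR microcode:

    # Toy stack-machine interpreter: a rough picture of what "interpreting an
    # order code" means, not a model of the real CADR or LISP machine.
    PUSH, ADD, MUL, PRINT, HALT = range(5)

    def run(program):
        stack, pc = [], 0
        while True:
            op = program[pc]; pc += 1
            if op == PUSH:                       # next word is an immediate operand
                stack.append(program[pc]); pc += 1
            elif op == ADD:
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            elif op == MUL:
                b, a = stack.pop(), stack.pop()
                stack.append(a * b)
            elif op == PRINT:
                print(stack[-1])
            elif op == HALT:
                return

    # (2 + 3) * 4 -> prints 20
    run([PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL, PRINT, HALT])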

LG’s groundbreaking roll-up TV is going on sale this year

LG is going several steps further by making the TV go away completely whenever you're not watching. It drops slowly and very steadily into the base and, with the push of a button, will rise back up in 10 seconds or so. It all happens rather quietly, too. You can't see the actual "roll" when the TV is closed in, sadly; a transparent base would've been great for us nerds to see what's happening inside the base as the TV comes in or unfurls, but the white is certainly a little more stylish. Functionally, LG tells me it hasn't made many changes to the way the LG Display prototype worked aside from enhancing the base. I didn't get to ask about durability testing — how many times the OLED TV R has been tested to go up and down, for example — but that's something I'm hoping to get an answer to.

We don't really talk about TVs all that much on OSNews – it's generally a boring industry – but this rollable display technology is just plain cool.

MIPS goes open source

Without question, 2018 was the year RISC-V genuinely began to build momentum among chip architects hungry for open-source instruction sets. That was then.

By 2019, RISC-V won't be the only game in town.

Wave Computing announced Monday that it is putting MIPS on open source, with MIPS Instruction Set Architecture (ISA) and MIPS' latest core R6 available in the first quarter of 2019.

Good news, and it makes me wonder - will we ever see a time when x86 and x86-64 are open source? I am definitely not well-versed enough in these matters to judge just how important the closed-source nature of the x86 ISA really is to Intel and AMD, but it seems like something that will never happen.

LG releases Gram 17 laptop: ultra-thin, 17.3″ display

Due to their size and lack of portability, 17-inch notebooks are not exactly popular among road warriors. Instead this is largely the domain of desktop replacement-class machines, which in turn has caused 17-inch laptops to be built bigger still in order to maximize their performance and emphasize the replacement aspect. Every now and then however we see a 17-inch laptop that still tries to be reasonably portable, and this is the case with LG's latest gram laptop, which hit the market this week.

Equipped with a 17.3-inch screen featuring a 2560×1600 resolution, the LG gram 17 comes in a dark silver Carbon Magnesium alloy chassis that is only 17.8 mm (0.7 inches) thick, which is thinner than most 15-inch notebooks (in fact, this is even thinner than the ASUS ZenBook Pro 15). Meanwhile, the laptop weighs 1.33 kilograms (2.95 pounds), which is in line with many 13-inch mobile PCs. As a result, while the 17-inch gram still has a relatively large footprint, it's still a relatively portable laptop.

I'm genuinely surprised LG decided to put this 17-incher on the market - consider it a sort of spiritual successor to the 17" PowerBook G4, in my view one of the best laptops ever made. It seems like the market has pretty much settled on 12"-13", with a few professional and low-end laptops offering a 15" screen. I hope this LG laptop is at least a modest success, because I'd love for more 17" laptops to make it to market.

Qualcomm announces the details of the Snapdragon 855

Today is the second day of Qualcomm's Snapdragon Technology Summit in Maui, and while yesterday was all about 5G and a teaser for its new chipset, today is all about the Snapdragon 855. The new chipset is built on a 7nm architecture, promising faster speeds, better battery life, and improved connectivity.

But as far as general performance goes, Qualcomm says that its Kryo 485 cores will offer a 45% boost, and the Adreno 640 GPU will show a 20% increase. With the firm's Snapdragon Elite Gaming Platform, gamers will be able to play in HDR with physically based rendering (PBR).

If these numbers hold up - only independent benchmarking will tell - this will go at least some way towards closing the wide gap with Apple's current offerings.

Amazon developed its own ARM core for its own cloud services

Today we are launching EC2 instances powered by Arm-based AWS Graviton Processors. Built around Arm cores and making extensive use of custom-built silicon, the A1 instances are optimized for performance and cost. They are a great fit for scale-out workloads where you can share the load across a group of smaller instances. This includes containerized microservices, web servers, development environments, and caching fleets.

Interesting to see Amazon design its own ARM core specifically for its own product.
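
If you're curious to try one, the A1 instances are requested like any other EC2 instance type. A minimal sketch with Python and boto3 - the AMI ID and key pair name are placeholders you'd swap for your own, and a1.medium is simply the smallest of the announced sizes:

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Request a single Arm-based A1 instance; it needs an arm64 AMI.
    response = ec2.run_instances(
        ImageId="ami-xxxxxxxx",       # placeholder: an arm64 Amazon Linux 2 AMI
        InstanceType="a1.medium",     # Graviton-powered instance family
        MinCount=1,
        MaxCount=1,
        KeyName="my-key-pair",        # placeholder key pair name
    )

    print(response["Instances"][0]["InstanceId"])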

The vacuum tube’s many modern day uses

Among obscure pop culture tidbits and stories about wacky inventions, Tedium has often documented the continued survival of technology long thought of as obsolete. From calculagraphs to COBOL, we love hearing that ancient tech survives in the 21st century and revel in the uses that keep them around. So it was surprising to dig through the Tedium archives looking for something I expected to find, but didn't. Today, we're righting that wrong and diving into the robust and thriving world of a technology that was foundational to the progress humanity made during the 20th century. Today's Tedium is talking vacuum tubes.

Reusing old hardware

Everybody has one. At least one. Collecting dust in a closet somewhere; waiting to be thrown away. It's not a time capsule per se, but if you looked at it now it would probably show you a snapshot of a life you lived not that long ago. It was once a source of pride, entertainment, accomplishment or perhaps comfort. Maybe it was a status symbol. Now you would call it useless, worthless, junk.

We're not talking about the photo album from your dorm room party days, although it might still contain a copy. We're talking about your old PC, laptop, netbook, or computer. That thing you spent hundreds or thousands of dollars on to sit in front of for hours doing whatever it is that you do. Maybe it helped you get a degree, or maybe it was your primary source of income. Doesn't matter now anyway. Your smart-toaster does more MIPS and FLOPS with half the power! There's no value in an old computer, right?

Wrong! If the commoditization of computing hardware and the steady march of Moore's law have done anything to old computers, it has been to breathe new life into them. How, you ask?

Putting old hardware to new uses is one way of recycling - I tend to give away my "old" smartphones, since I buy new ones way too often. Usually a friend's phone has stopped working or a family member needs a new one, so I just give them mine.

High-res graphics on a text-only TRS-80

From the Byte Cellar:

What inspired me to pull the Model 4 down off the shelf were a number of tweets from telnet BBS pals showing the system being put to great use logged into various systems across the web. Some of the screenshots showed the machine rendering ANSI "graphics" onscreen and I looked into it. As I suspected, the stock Model 4 is not capable of taking on a custom character set such as is needed by ANSI emulation, and I discovered the system had been equipped with a graphics board and the ANSI-supporting terminal program, ANSITerm, was rendering "text" to a graphics display; the character set was basically a software font.

And I just had to go there.
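
The "software font" trick described here is conceptually simple: each character is just a small bitmap that gets blitted onto the graphics screen, bypassing the machine's text hardware entirely. A rough Python sketch of the idea - the glyph shapes are made up for illustration, not ANSITerm's actual font data:

    # Render "text" onto a pixel framebuffer using a software font.
    FONT = {
        # 8x8 glyphs, one byte per row, most significant bit = leftmost pixel
        "A": [0x18, 0x24, 0x42, 0x7E, 0x42, 0x42, 0x42, 0x00],
        "!": [0x10, 0x10, 0x10, 0x10, 0x10, 0x00, 0x10, 0x00],
    }

    WIDTH, HEIGHT = 64, 8
    framebuffer = [[0] * WIDTH for _ in range(HEIGHT)]

    def draw_char(ch, col):
        """Blit one 8x8 glyph into the framebuffer at character column col."""
        for y, row in enumerate(FONT[ch]):
            for x in range(8):
                framebuffer[y][col * 8 + x] = (row >> (7 - x)) & 1

    for i, ch in enumerate("A!A"):
        draw_char(ch, i)

    for row in framebuffer:
        print("".join("#" if px else "." for px in row))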

Why do computers use so much energy?

Microsoft is currently running an interesting set of hardware experiments. The company is taking a souped-up shipping container stuffed full of computer servers and submerging it in the ocean. The most recent round is taking place near Scotland's Orkney Islands, and involves a total of 864 standard Microsoft data-center servers. Many people have impugned the rationality of the company that put Seattle on the high-tech map, but seriously - why is Microsoft doing this?

There are several reasons, but one of the most important is that it is far cheaper to keep computer servers cool when they're on the seafloor. This cooling is not a trivial expense. Precise estimates vary, but currently about 5 percent of all energy consumption in the U.S. goes just to running computers - a huge cost to the economy as a whole. Moreover, all that energy used by those computers ultimately gets converted into heat. This results in a second cost: that of keeping the computers from melting.

I use a custom watercooling loop to keep my processor and videocard cool, but aside from size and scale, datacenters struggle with the exact same problem - computers generate a ton of heat, and that heat needs to go somewhere.
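
To put a rough number on the Orkney experiment: assuming a typical draw of around 250 W per server (my assumption - Microsoft hasn't published per-server figures), the back-of-the-envelope arithmetic looks like this:

    # Back-of-the-envelope: heat produced by 864 submerged servers.
    servers = 864
    watts_per_server = 250        # assumed typical draw; not an official figure

    total_kw = servers * watts_per_server / 1000
    print(f"Roughly {total_kw:.0f} kW of electrical power, essentially all of it")
    print("ending up as heat the surrounding seawater has to carry away.")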

The engineering miracle of the Sinclair ZX Spectrum

The Spectrum was not the first Sinclair computer to make it big. It was, however, the first to go massive. In the months prior to launching, 'The Computer Programme' had aired on the BBC, legitimising the home micro computer as the must-have educational item of the 1980s. For Sinclair and the ZX Spectrum the time was right, parents were keen, and the kids were excited. Games would soon be everywhere thanks to all the kids programming their brand new Spectrums.

A major success factor, the one that gave the Spectrum its name, is the computer's capacity to generate a spectrum of colours. The micro is capable of generating 16 colours; 8 low intensity colours and 8 matching bright variants. It's hard to imagine now, but in 1982 these 16 colours were enough to start a home computer revolution. Richard Altwasser, the engineer employed by Sinclair to develop the Spectrum's graphic systems, was setting a new benchmark with some very innovative ideas.
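
Those 16 colours are really 8 colours plus a BRIGHT bit, and they're packed - together with a FLASH bit - into a single attribute byte per 8×8 character cell, which is also why the Spectrum is famous for its "colour clash". A small Python sketch of that attribute encoding:

    # ZX Spectrum attribute byte: bit 7 = FLASH, bit 6 = BRIGHT,
    # bits 5-3 = PAPER (background), bits 2-0 = INK (foreground).
    COLOURS = ["black", "blue", "red", "magenta", "green", "cyan", "yellow", "white"]

    def attribute(ink, paper, bright=False, flash=False):
        return (flash << 7) | (bright << 6) | (paper << 3) | ink

    def describe(attr):
        ink, paper = attr & 0x07, (attr >> 3) & 0x07
        text = f"{COLOURS[ink]} on {COLOURS[paper]}"
        if attr & 0x40:
            text += " (bright)"
        if attr & 0x80:
            text += " (flashing)"
        return text

    # Bright yellow ink on blue paper -> attribute value 0x4E
    attr = attribute(ink=6, paper=1, bright=True)
    print(hex(attr), "->", describe(attr))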

I missed the entire 8-bit home micro revolution - I was simply too young, or not even born yet. It must've been such an exciting time.

Our USB-C dongle hell is almost over

It's almost the end of 2018, but I'm finally able to say that almost all of my day-to-day devices have been replaced with a USB-C option, or can be replaced in the near future.

I bought a fully specced out Dell XPS 13, and it's the first laptop I've ever had that charges over USB-C. Cool and all, but I quickly realized that only the 27W charger it came with actually charges it; other USB-C chargers simply don't work because they're not powerful enough.

I'm not quite sure USB-C is there, yet.
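
For what it's worth, "powerful enough" here is just the voltage/current pair the charger and laptop agree on over USB Power Delivery, multiplied together. A quick sketch of that arithmetic - these profiles are typical examples, not the full PD specification:

    # USB Power Delivery: delivered power is simply voltage x current.
    profiles = [(5, 3.0), (9, 3.0), (15, 3.0), (20, 2.25), (20, 3.25)]

    for volts, amps in profiles:
        print(f"{volts} V x {amps} A = {volts * amps:.0f} W")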

Why the future of data storage is (still) magnetic tape

Studies show that the amount of data being recorded is increasing at 30 to 40 percent per year. At the same time, the capacity of modern hard drives, which are used to store most of this, is increasing at less than half that rate. Fortunately, much of this information doesn’t need to be accessed instantly. And for such things, magnetic tape is the perfect solution.

Seriously? Tape? The very idea may evoke images of reels rotating fitfully next to a bulky mainframe in an old movie like Desk Set or Dr. Strangelove. So, a quick reality check: tape has never gone away!
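
The arithmetic behind that argument is worth spelling out: if data grows at roughly 35% a year while drive capacity grows at less than half that, the gap compounds quickly. A quick sketch - the exact rates are my illustrative picks from the ranges quoted above:

    # Compound growth: data volume vs. hard drive capacity over a decade.
    data_growth, hdd_growth = 0.35, 0.15   # illustrative rates from the quoted ranges
    years = 10

    data_factor = (1 + data_growth) ** years
    hdd_factor = (1 + hdd_growth) ** years
    print(f"Data grows ~{data_factor:.0f}x in {years} years, drive capacity only")
    print(f"~{hdd_factor:.0f}x - a {data_factor / hdd_factor:.0f}x shortfall to absorb elsewhere.")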

Open source RISC-V implemented from scratch in one night

Developed in a magic night on 19 Aug, 2018, between 2am and 8am, the darkriscv is a very experimental implementation of the open-source RISC-V instruction set. Nowadays, after one week of exciting sleepless nights of work (which explains the many typos you will find ahead), the darkriscv has reached a very good quality result, to the point that the "hello world" compiled by the standard riscv-elf-gcc is working fine!
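
To give a sense of why the RV32I base set lends itself to being implemented in a night: every instruction is 32 bits wide with its fields in fixed positions, so even a toy decoder is only a handful of lines. Here's one in Python for illustration - darkriscv itself is written in Verilog, and this sketch is mine, not theirs:

    # Decode the fixed fields of a 32-bit RV32I instruction word.
    OPCODES = {0x33: "OP", 0x13: "OP-IMM", 0x03: "LOAD", 0x23: "STORE",
               0x63: "BRANCH", 0x37: "LUI", 0x17: "AUIPC", 0x6F: "JAL", 0x67: "JALR"}

    def decode(instr):
        return {
            "opcode": OPCODES.get(instr & 0x7F, "?"),
            "rd":     (instr >> 7) & 0x1F,
            "funct3": (instr >> 12) & 0x07,
            "rs1":    (instr >> 15) & 0x1F,
            "rs2":    (instr >> 20) & 0x1F,
            "imm12":  instr >> 20,            # I-type immediate (sign handling omitted)
        }

    # 0x00500093 encodes "addi x1, x0, 5"
    print(decode(0x00500093))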

I feel incompetent.

Acer wants to sell a dual Xeon Predator X system: please no

Acer's leading gaming brand, Predator, is all about maximizing performance, particularly around gaming. In the modern era, that now extends into content creation, streaming, video editing, and all the sorts of things that drive the need for high performance. As we've seen several times over the years, just throwing more cores at the problem isn't the solution: bottlenecks appear elsewhere in the system. Despite this, Acer is preparing a mind-boggling solution.

The Acer Predator X is the new dual-Xeon workstation, with ECC memory and multiple graphics cards, announced today at IFA 2018. The premise of the system is for the multi-taskers that do everything: gaming, content creation, streaming, the lot. With this being one of Acer's flagship products, we expect it to be geared to the hilt: maximum cores, maximum capacity. Therein lies the first rub: if Acer is going all out, this is going to cost something crazy.

This clearly makes zero sense, but at the same time, it's kind of awesome Acer is doing this. Dual-processor workstations are a bit of an obsession for me, but with dual-processor machines entirely relegated to Xeon systems, they've become quite unaffordable. Even though it makes zero sense, I would love for regular Intel Core and AMD Zen processors to support dual processor setups.

The MRISC32: a vector-first CPU design

In essence, it's a 32-bit RISC ISA designed from a holistic view of integer, floating point, scalar and vector operations. In addition, there is a hardware implementation of a single-issue, in-order, pipelined CPU. The hardware implementation mostly serves as an aid in the design of the ISA (at the time of writing the CPU is still incomplete).

As happens with some articles I post here, this one's definitely a bit over my head.
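
For readers in the same boat as me, the "vector" part is the easiest bit to build an intuition for: instead of one instruction operating on a single pair of values, a vector instruction applies the operation across whole vector registers. A plain-Python illustration of the distinction - this is just the concept, not MRISC32 code:

    # Scalar vs. vector: the same addition expressed two ways.
    a = [1, 2, 3, 4, 5, 6, 7, 8]
    b = [10, 20, 30, 40, 50, 60, 70, 80]

    # Scalar style: one add per loop iteration (one instruction per element).
    result_scalar = []
    for x, y in zip(a, b):
        result_scalar.append(x + y)

    # Vector style: conceptually a single "vadd" over whole vector registers,
    # which the hardware can pipeline or execute in parallel.
    def vadd(va, vb):
        return [x + y for x, y in zip(va, vb)]

    result_vector = vadd(a, b)
    assert result_scalar == result_vector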