Hardware Archive

Can MIPS leapfrog RISC-V?

When Wave Computing acquired MIPS, “going open source” was the plan Wave’s CEO Derek Meyer had in mind. But Meyer, a long-time MIPS veteran, couldn’t casually mention his plan then. Wave hardly had the solid infrastructure needed to support a legion of hardware developers interested in joining the MIPS open-source community. Saying “go open source” is easy. Pulling it off has meant a huge shift for MIPS, long accustomed to the traditional IP licensing business. MIPS will compete with and exist alongside RISC-V. The future of truly open source hardware is getting more and more interesting.

Nvidia announces $99 AI computer for developers, makers, and researchers

In recent years, advances in AI have produced algorithms for everything from image recognition to instantaneous translation. But when it comes to applying these advances in the real world, we’re only just getting started. A new product from Nvidia announced today at GTC — a $99 AI computer called the Jetson Nano — should help speed that process. The Nano is the latest in Nvidia’s line of Jetson embedded computing boards, used to provide the brains for robots and other AI-powered devices. Plug one of these into your latest creation, and it’ll be able to handle tasks like object recognition and autonomous navigation without relying on cloud processing power. Fascinating little device that could be a great boon for the maker community.
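
For a sense of what running, say, object recognition entirely on-device looks like, here’s a minimal sketch using a pretrained detection model. This is generic PyTorch/torchvision illustration code, not Nvidia’s JetPack/TensorRT tooling, and frame.jpg is just a placeholder for a captured camera frame.

    # Minimal sketch: local object detection with a pretrained model.
    # Generic PyTorch/torchvision code for illustration only; on a Jetson you
    # would more likely lean on Nvidia's JetPack/TensorRT stack for speed.
    import torch
    import torchvision
    from torchvision.transforms.functional import to_tensor
    from PIL import Image

    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(pretrained=True)
    model.eval()  # inference mode

    img = Image.open("frame.jpg").convert("RGB")  # placeholder camera frame
    with torch.no_grad():
        pred = model([to_tensor(img)])[0]  # dict of 'boxes', 'labels', 'scores'

    for box, label, score in zip(pred["boxes"], pred["labels"], pred["scores"]):
        if score > 0.8:  # keep only confident detections
            print(int(label), [round(v, 1) for v in box.tolist()], float(score))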

Kryofluxing PC floppies

Last year I finally bought a Kryoflux, unfortunately in the middle of moving house. Now I’m finally able to use it beyond verifying that it’s not completely broken. After imaging a few dozen floppies, I can say one thing: the Kryoflux is surprisingly difficult to use with PC 5¼″ disks. There is a distinct impression that the Kryoflux was designed to deal primarily with Amiga and C64 floppies, and although PC floppy formats present absolutely no difficulty for the Kryoflux hardware as such, using the software for archiving standard PC 5¼″ media is very far from simple. Let’s start with the easy part. Imaging 3½″ media is relatively simple because PC 3½″ drives are straightforward (well, let’s omit the special Japanese 1.6M media). 3½″ drives always rotate at 300 RPM and usually handle media density automatically based on the floppy itself. But if everything were easy, life wouldn’t be very interesting. Preserving the data on these ancient floppies is crucial, and it’s great to see various types of specialised hardware exist just for this purpose.
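
To make the 5¼″ headache a little more concrete, here’s a tiny back-of-envelope sketch. These are standard nominal PC floppy parameters, shown purely for illustration and not taken from the article: how many bit cells a track holds depends on both spindle speed and data rate, and 5¼″ drives mix 300 and 360 RPM with several data rates depending on the media and the drive.

    # Back-of-envelope: bit cells per track for common PC floppy setups.
    # Nominal textbook values, used here only to illustrate why 5.25" media
    # are fiddlier than 3.5" media; not figures from the article above.
    def bitcells_per_track(rpm, data_rate_bps):
        seconds_per_revolution = 60.0 / rpm
        return data_rate_bps * seconds_per_revolution

    print(bitcells_per_track(300, 250_000))  # 5.25" 360K disk in a 360K drive:  50,000
    print(bitcells_per_track(360, 300_000))  # same 360K disk in a 1.2M drive:   50,000
    print(bitcells_per_track(360, 500_000))  # 5.25" 1.2M HD format:            ~83,333
    print(bitcells_per_track(300, 500_000))  # 3.5" 1.44M HD, for comparison:   100,000

The point is that the same physical 5¼″ disk can legitimately be read under more than one RPM/data-rate combination, which is exactly the sort of ambiguity archiving software has to cope with.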

SweRV: an annotated deep dive

To satisfy the true geeks, Western Digital organized a SweRV Deep Dive at the Bay Area RISC-V Meetup. The meetup was well organized (free food!) and attended by roughly 100 people. A Webex recording of this meetup is currently still available here. (The first 53 minutes are empty; the meat of the presentation starts at the 53:30 mark.) Zvonimir Bandic, Senior Director of the Next Generation Platform Technologies Department at Western Digital, gave an excellent presentation: well paced, little marketing fluff, and with sufficient technical detail to pique my interest in diving deeper into the specifics of the core. I highly recommend watching the whole thing. There was also a second presentation about instruction tracing which I won’t talk about in this post. In this blog post, I’ll go through the presentation and add some extra details that I noted down at the meetup or gathered while going through the SweRV source code on GitHub and the RISC-V SweRV EH1 Programmer’s Reference. This goes way beyond my comfort level.

Thunderbolt 3 becomes USB4, as Intel’s interconnect goes royalty-free

Ars Technica reports: Fulfilling its 2017 promise to make Thunderbolt 3 royalty-free, Intel has given the specification for its high-speed interconnect to the USB Implementers Forum (USB-IF), the industry group that develops the USB specification. The USB-IF has taken the spec and will use it to form the basis of USB4, the next iteration of USB following USB 3.2. Yes, it’s called USB4, which will exist alongside USB 3.2 Gen 1, USB 3.2 Gen 2, and USB 3.2 Gen 2×2. I don’t even know what to say.

USB 3.2 is going to make the current USB branding even worse

USB 3.2, which doubles the maximum speed of a USB connection to 20Gb/s, is likely to materialize in systems later this year. In preparation for this, the USB-IF—the industry group that collectively develops the various USB specifications—has announced the branding and naming that the new revision is going to use, and… It’s awful. I won’t spoil it for you. It’s really, really bad.

The last POWER1 on Mars is dead

The Opportunity rover, also known as the Mars Exploration Rover B (or MER-1), was finally declared at end of mission today, after 5,352 Mars solar days, when NASA was unable to re-establish contact. It had apparently been knocked offline by a dust storm and was unable to restart, either due to power loss or some other catastrophic failure. Originally intended for a 90 Mars solar day mission, its mission lasted almost 60 times longer than anticipated, and it traveled nearly 30 miles on the surface in total. Spirit, or MER-2, its sister unit, had previously reached end of mission in 2010. And why would we report that here? Because Opportunity and Spirit were both in fact powered by the POWER1, or more accurately a 20MHz BAE RAD6000, a radiation-hardened version of the original IBM RISC Single Chip CPU and the indirect ancestor of the PowerPC 601. There are a lot of POWER chips in space, both the original RAD6000 and its successor, the RAD750, a radiation-hardened version of the PowerPC G3. What an awesome little tidbit of information about these Mars rovers, which I’m assuming everybody holds in high regard as excellent examples of human ingenuity and engineering.

Building a RISC-V PC

While it’s clear that the most significant opportunities for RISC-V will be in democratising custom silicon for accelerating specific tasks and enabling new applications — and it’s already driving a renaissance in novel computer architectures, for example in IoT and edge processing — one question that people cannot help but ask is: so when can I have a RISC-V PC? The answer is: right now. The result is a RISC-V-powered system that can be used as a desktop computer, and thanks to the efforts of Atish Patra at Western Digital, installing Fedora Linux is a breeze. This is obviously not exactly commodity hardware, but it does show that the ingredients are there, and the combination provides a powerful development platform for anyone who might want to prototype a RISC-V PC — or indeed a vast array of other applications which stand to benefit from the open ISA. This has me very excited. Over the last few decades, virtually all competitors to x86 slowly died out – SPARC, PowerPC, MIPS, etc. – which turned desktop computing hardware into a rather boring affair. Recently we’ve been seeing more and more ARM desktop boards, and now it seems RISC-V is starting to dabble in this area too. Great news.

A touchpad is not a mouse, or at least not a good one

One of the things about having a pretty nice work laptop with a screen that’s large enough for more than one real window at once is that I actually use it, and I use it with multiple windows, and that means that I need to use the mouse. I like computer mice in general, so I don’t object to this, but like most modern laptops, my Dell XPS 13 doesn’t have a mouse; it has a trackpad (or touchpad, take your pick). You can use a modern touchpad as a mouse, but over my time using the XPS 13 I’ve come to understand (rather viscerally) that a touchpad is not a mouse, and trying to act as if it were is not a good idea. There are some things that a touchpad makes easy and natural that aren’t very natural on a mouse, and a fair number of things that are natural on a mouse but don’t work very well on a touchpad (at least for me; they might for people who are more experienced with touchpads). Chris Siebenmann makes some good points regarding touchpads here. Despite the fact that touchpads on Windows and Linux have gotten better over the years, they’re still not nearly as good as Apple’s, and they will never beat a mouse. I feel like mouse input on laptops is ripe for serious innovation.

The CADR microprocessor

The CADR microprocessor is a general purpose processor designed for convenient emulation of complex order codes, particularly those involving stacks and pointer manipulation. It is the central processor in the LISP machine project, where it interprets the bit-efficient 16-bit order code produced by the LISP machine compiler. (The terms “LISP machine” and “CADR machine” are sometimes confused. In this document, the CADR machine is a particular design of microprocessor, while the LISP machine is the CADR machine plus the microcode which interprets the LISP machine order code.) I’ll admit I have no idea what anything in this long, technical description means, but I’m pretty sure this is right up many readers’ alleys.

LG’s groundbreaking roll-up TV is going on sale this year

LG is going several steps further by making the TV go away completely whenever you’re not watching. It drops slowly and very steadily into the base and, with the push of a button, will rise back up in 10 seconds or so. It all happens rather quietly, too. Sadly, you can’t see the actual “roll” when the TV is closed up; a transparent base would’ve been great for us nerds to see what’s happening inside as the TV rolls in or unfurls, but the white is certainly a little more stylish. Functionally, LG tells me it hasn’t made many changes to the way the LG Display prototype worked, aside from enhancing the base. I didn’t get to ask about durability testing — how many times the OLED TV R has been tested to go up and down, for example — but that’s something I’m hoping to get an answer to. We don’t really talk about TVs all that much on OSNews – it’s generally a boring industry – but this rollable display technology is just plain cool.

MIPS goes open source

Without question, 2018 was the year RISC-V genuinely began to build momentum among chip architects hungry for open-source instruction sets. That was then.

By 2019, RISC-V won't be the only game in town.

Wave Computing announced Monday that it is open-sourcing MIPS, with the MIPS Instruction Set Architecture (ISA) and MIPS' latest R6 core available in the first quarter of 2019.

Good news, and it makes me wonder - will we ever see a time when x86 and x86-64 are open source? I am definitely not well-versed enough in these matters to judge just how important the proprietary nature of the x86 ISA really is to Intel and AMD, but it seems like something that will never happen.

LG Releases Gram 17 laptop: ultra-thin, 17.3″ display

Due to their size and lack of portability, 17-inch notebooks are not exactly popular among road warriors. Instead this is largely the domain of desktop replacement-class machines, which in turn has caused 17-inch laptops to be built bigger still in order to maximize their performance and emphasize the replacement aspect. Every now and then however we see a 17-inch laptop that still tries to be reasonably portable, and this is the case with LG's latest gram laptop, which hit the market this week.

Equipped with a 17.3-inch screen featuring a 2560×1600 resolution, the LG gram 17 comes in a dark silver Carbon Magnesium alloy chassis that is only 17.8 mm (0.7 inches) thick, which is thinner than most 15-inch notebooks (in fact, this is even thinner than the ASUS ZenBook Pro 15). Meanwhile, the laptop weighs 1.33 kilograms (2.95 pounds), which is in line with many 13-inch mobile PCs. As a result, while the 17-inch gram still has a relatively large footprint, it's still a relatively portable laptop.

I'm genuinely surprised LG decided to put this 17-incher on the market - consider it a sort of spiritual successor to the 17" PowerBook G4, in my view one of the best laptops ever made. It seems like the market has pretty much settled on 12"-13", with a few professional and low-end laptops offering a 15" screen. I hope this LG laptop is at least a modest success, because I'd love for more 17" laptops to make it to market.

Qualcomm announces the details of the Snapdragon 855

Today is the second day of Qualcomm's Snapdragon Technology Summit in Maui, and while yesterday was all about 5G and a teaser for its new chipset, today is all about the Snapdragon 855. The new chipset is built on a 7nm process, promising faster speeds, better battery life, and improved connectivity.

But as far as general performance goes, Qualcomm says that its Kryo 485 CPU cores will offer a 45% performance boost, and the Adreno 640 GPU will show a 20% increase. With the firm's Snapdragon Elite Gaming Platform, gamers will be able to play in HDR with physically based rendering (PBR).

If these numbers hold up - only independent benchmarking will tell - this will go some way toward closing the wide gap with Apple's current offerings.

Amazon developed its own ARM core for its own cloud services

Today we are launching EC2 instances powered by Arm-based AWS Graviton Processors. Built around Arm cores and making extensive use of custom-built silicon, the A1 instances are optimized for performance and cost. They are a great fit for scale-out workloads where you can share the load across a group of smaller instances. This includes containerized microservices, web servers, development environments, and caching fleets.

Interesting to see Amazon design its own ARM-based processor specifically for its own cloud services.

The vacuum tube’s many modern day uses

Among obscure pop culture tidbits and stories about wacky inventions, Tedium has often documented the continued survival of technology long thought of as obsolete. From calculagraphs to COBOL, we love hearing that ancient tech survives in the 21st century, and we revel in the uses that keep it around. So it was surprising to dig through the Tedium archives looking for something I expected to find, but didn't. Today, we're righting that wrong and diving into the robust and thriving world of a technology that was foundational to the progress humanity made during the 20th century. Today's Tedium is talking vacuum tubes.

Reusing old hardware

Everybody has one. At least one. Collecting dust in a closet somewhere, waiting to be thrown away. It's not a time capsule per se, but if you looked at it now it would probably show you a snapshot of a life you lived not that long ago. It was once a source of pride, entertainment, accomplishment, or perhaps comfort. Maybe it was a status symbol. Now you would call it useless, worthless, junk.

We're not talking about the photo album from your dorm room party days, although it might still contain a copy. We're talking about your old PC, laptop, netbook, or other computer. That thing you spent hundreds or thousands of dollars on to sit in front of for hours doing whatever it is that you do. Maybe it helped you get a degree, or maybe it was your primary source of income. Doesn't matter now anyway. Your smart toaster does more MIPS and FLOPS with half the power! There's no value in an old computer, right?

Wrong! If the commoditization of computing hardware and the steady march of Moore's law have done anything to old computers, it has been to breathe new life into them. How, you ask?

Putting old hardware to new uses is one way of recycling - I tend to give away my "old" smartphones, since I buy new ones way too often. Often a friend's phone has stopped working or a family member needs a new one, so I just give them mine.

High-res graphics on a text-only TRS-80

From the Byte Cellar:

What inspired me to pull the Model 4 down off the shelf were a number of tweets from telnet BBS pals showing the system being put to great use logged into various systems across the web. Some of the screenshots showed the machine rendering ANSI "graphics" onscreen and I looked into it. As I suspected, the stock Model 4 is not capable of taking on a custom character set such as is needed by ANSI emulation, and I discovered the system had been equipped with a graphics board and the ANSI-supporting terminal program, ANSITerm, was rendering "text" to a graphics display; the character set was basically a software font.

And I just had to go there.

Why do computers use so much energy?

Microsoft is currently running an interesting set of hardware experiments. The company is taking a souped-up shipping container stuffed full of computer servers and submerging it in the ocean. The most recent round is taking place near Scotland's Orkney Islands, and involves a total of 864 standard Microsoft data-center servers. Many people have impugned the rationality of the company that put Seattle on the high-tech map, but seriously - why is Microsoft doing this?

There are several reasons, but one of the most important is that it is far cheaper to keep computer servers cool when they're on the seafloor. This cooling is not a trivial expense. Precise estimates vary, but currently about 5 percent of all energy consumption in the U.S. goes just to running computers - a huge cost to the economy as a whole. Moreover, all that energy used by those computers ultimately gets converted into heat. This results in a second cost: that of keeping the computers from melting.

I use a custom watercooling loop to keep my processor and video card cool, but aside from size and scale, datacenters struggle with the exact same problem - computers generate a ton of heat, and that heat needs to go somewhere.
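
To put that heat in perspective, here's a quick back-of-envelope calculation. The 864-server figure comes from the article above; the per-server power draw is just an assumed round number for illustration, not a Microsoft figure.

    # Back-of-envelope: heat output of the submerged data-center pod.
    # 864 servers is from the article; 250 W average draw per server is an
    # assumption for illustration only.
    servers = 864
    watts_per_server = 250  # assumed average draw

    total_kw = servers * watts_per_server / 1000
    print(f"~{total_kw:.0f} kW drawn, essentially all of it ending up as heat "
          f"that the surrounding seawater has to carry away.")
    # -> ~216 kW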