Hardware Archive
Gadgets are getting too thin, again. These past few weeks saw some of the latest victims of the seemingly unending drive towards making our devices as thin as possible, no matter the consequences. Samsung’s Galaxy S22 and S22 Plus – what will undoubtedly be some of the most popular Android phones of the year – are thinner than last year’s models and held back by disappointing battery life. The new Dell XPS 15 is “exceptionally thin and light” but barely lasts four hours on a charge and runs nearly as hot as the sun. And the OnePlus 10 Pro is a flagship smartphone that can somehow be snapped in half with your bare hands. It seems that despite over a decade of chasing the thinnest, lightest phones and computers around to the detriment of battery life, cooling, and durability, companies still haven’t learned their lesson. I’d gladly take a few extra millimeters in exchange for better heat dissipation, less fan noise, and longer battery life – and I doubt most consumers would choose thinness over those qualities either.
We’re fast approaching the 40th birthday of the Sinclair Spectrum in 2022, and to keep myself occupied during COVID lockdowns I decided it would be a lot of fun to go back and revisit the computer that started it all for me. I set about coding and building the infrastructure for a Spectrum-based community project (website at tnfs.markround.com) incorporating my current-day tools and knowledge, hence the title of this series of posts. The enterprise grew into a curious mix of old and new: container-based pipelines with Ruby server-side components, all interacting with Spectrum BASIC and Z80 assembly code, running on real 1980s hardware with a TCP/IP connection. If you’ve ever wondered how to unit-test Sinclair BASIC programs in GitOps pipelines running on Kubernetes clusters, this is the set of articles for you. I love it when people push these old machines to their limits with modern knowledge.
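Out of curiosity, here is roughly what driving one of those BASIC unit tests from a pipeline might look like – a minimal Python sketch, where the zxbasic-run command, its flags, and hello.bas are all hypothetical stand-ins for whatever headless emulator tooling the project actually uses:

    # Hypothetical CI-side unit test for a Sinclair BASIC program: run it
    # under a headless emulator and assert on the captured screen output.
    import subprocess
    import unittest

    class TestHelloBas(unittest.TestCase):
        def test_prints_greeting(self):
            result = subprocess.run(
                ["zxbasic-run", "--headless", "hello.bas"],  # hypothetical tool
                capture_output=True, text=True, timeout=30,
            )
            self.assertEqual(result.returncode, 0)
            self.assertIn("HELLO, SPECCY", result.stdout)

    if __name__ == "__main__":
        unittest.main()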
Late last year, we reviewed Slimbook’s KDE Slimbook, a special version of the Spanish Linux OEM’s 15″ laptop made in collaboration with the KDE project. I found it to be an excellent laptop that left little to be desired for anyone in the market for a machine of that size: tons of power, unobtrusive fans, a great design, and a fair price tag. That being said, I personally prefer smaller laptops. The KDE Slimbook’s 15 inches is just a bit too wide for me to be comfortable, and I’d much rather have something in the 13-14 inch range. Luckily, Slimbook has an offering in this segment too: the Slimbook Executive. I’ve been using and testing one for the last few months, and I can confidently say the KDE Slimbook was not a fluke.

Slimbook is running a special deal just for OSNews readers! When ordering your Slimbook Executive, use the promo code executive-laptop-osnews for a €150 discount! Note: OSNews does not receive any percentage of sales using this promo code (or of sales not using it, for that matter).

The Slimbook Executive is a 14″ ultrabook weighing in at a mere 1kg. Like the KDE Slimbook, it is made from magnesium, which I find much more pleasant to handle than aluminium – it’s not as cold and harsh to the touch, and it’s lighter too, which makes sense for an ultraportable laptop like this one.

Instead of AMD, the Executive is powered by Intel’s Core i7-1165G7, with 4 cores and 8 threads, paired with Intel’s Iris Xe integrated graphics. It has two RAM slots for a maximum total of 64GB of RAM; my review unit was configured with 16GB, which is more than enough for a modern Linux distribution on such a portable machine. Despite being relatively small, the laptop has ample room for storage: it comes with two M.2 slots, one at PCIe ×4 and one at PCIe ×2, for an out-of-factory configurable total of 4TB.

Unlike some of the competition from larger, more established OEMs, there’s no shortage of ports here. It has two USB-A 3.2 Gen1 ports, one USB-C 3.2 Gen2 port (with video-out through DisplayPort 1.4), one Thunderbolt 4 port (also with video-out through DisplayPort 1.4a, as well as charging support at 90+W), an SD card reader, a full-size HDMI port, and the usual Kensington lock, barrel plug, and headphone jack.

The keyboard is more of a standard affair than the fancy, unique design found on the KDE Slimbook. This time around, it’s a regular chiclet-style keyboard set in the magnesium frame, entirely familiar to anyone who has used an ultrabook in the past five to ten years. It’s excellently boring and familiar, just as you want a keyboard to be. It’s of course also backlit, and luckily does not have the readability issues some of the keys on the KDE Slimbook had. The touchpad feels great and supports multitouch gestures, but it is of the common diving-board design, meaning clicking gets progressively harder the higher up you go. I really wish Apple’s force touch trackpads made their way to other manufacturers too, since it feels nicer to have the same click response no matter where you click. The trackpad is huge, but not as over-the-top as Apple’s recent ones.

The design of the laptop itself is very generic – unlike the KDE Slimbook, there are no flourishes here that set it apart from the rest of the competition (aside from the Slimbook logo, of course).
I don’t think that’s necessarily a bad thing – this segment of the market is very mature, and the general design popularised by the MacBook Air is popular for a reason. Crazy, unique designs make sense on a gaming laptop, but on a small ultrabook, I prefer to keep things simple.

The display is the real star of the show. It’s a 14″ panel with a resolution of 2880×1800 (Slimbook calls it 3K) and a refresh rate of 90Hz. Between 4K at 60Hz and 1080p at 144Hz, I think this is an excellent middle ground that avoids the pixelated look of 1080p at 14″ while still offering a decently smooth refresh rate. I definitely prefer this display over the 4K 60Hz panel on my Dell XPS 13, which is saying a lot, since that panel was one of the very best you could get at the time.

There is one issue with the display I need to talk about. As soon as you install a kernel newer than roughly 5.11, you will see major screen flickering and corruption. After talking to Slimbook, it turns out this is caused by panel self-refresh (PSR), a power-saving feature in Intel’s graphics driver that is known to cause issues in some cases. The solution is to disable the feature through GRUB2 (add i915.enable_psr=0 to the kernel parameters). It’s important to note that you only have to apply this fix if you install a Linux distribution yourself; the preinstalled Slimbook OS – a slightly modified version of Ubuntu – did not experience this problem, and I’m sure that if you select any of the other preinstalled Linux distributions during the order process, Slimbook will make sure the issue is handled before shipping. Slimbook has also told me they are beta testing a BIOS update that will fix the problem at the BIOS level, so once that update is released and installed, the issue should disappear entirely.

Battery life is exactly as you’d expect – I’m getting about 8 hours with office-type work, video watching, and some browsing. Using Slimbook’s own applications for managing battery and processor states, you get decent control over the balance between performance and battery life, but a Debian-based distribution is required to make their installation as easy as possible, since otherwise you’ll have to install them by hand.
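For reference, the PSR workaround described above amounts to a one-line configuration change – a sketch assuming GRUB2 on an Ubuntu-style system (on distributions without the Debian family’s update-grub helper, regenerate the config with grub2-mkconfig instead):

    # /etc/default/grub – append the i915 parameter to the kernel command line
    GRUB_CMDLINE_LINUX_DEFAULT="quiet splash i915.enable_psr=0"

    # then regenerate the GRUB configuration and reboot
    sudo update-grub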
Nvidia Corp. is quietly preparing to abandon its purchase of Arm Ltd. from SoftBank Group Corp. after making little to no progress in winning approval for the $40 billion chip deal, according to people familiar with the matter. Nvidia has told partners that it doesn’t expect the transaction to close, according to one person, who asked not to be identified because the discussions are private. SoftBank, meanwhile, is stepping up preparations for an Arm initial public offering as an alternative to the Nvidia takeover, another person said. Look, Nvidia is obviously far from perfect, but the alternatives seem far, far worse. Would you want Arm to end up at Google, Apple, Microsoft, Amazon, or one of the big Chinese players? I’m simply afraid that, a few years down the line, an independent Arm will end up in far worse arms than Nvidia’s.
Steve Jobs’s NeXT computer company made a keyboard in 1988. With no prior electronics experience, I tried to get it to work over USB. To do so, I had to go way deeper than I ever expected – all the way back over 100 years to broadcast radio standards from the 1920s. I learned tons and tons, and had a lot of fun. The things people do for the perfect keyboard.
In retrospect, it might be a bit tough to put a circle around what constituted a workstation. Is a PERQ a workstation? Probably. Xerox Alto and Star? Definitely. Symbolics Lisp machines? Not sure. Probably? The real success stories came out of Apollo, Sun, HP, IBM, NeXT, DEC, and Silicon Graphics. For a time it was a hot market, especially in what was then known as technical computing: research, manufacturing, CAD, graphics, simulations. If you had a job where you were issued a Sun or an Apollo (back in the day) or an SGI, you were elevated. You were no longer some pleb coding in BASIC on a C64 or a tie-wearing IBM clone user. You had entered a rarefied sphere with limitless power at your fingertips. An Amiga was a grubby kid’s toy by comparison, and the IBM PC was slow to move to graphical applications. The workstation manufacturers had fancy graphics, 32-bit processors, and scarily huge margins. The designs of the boxes could be wild: the SGI Indy didn’t look like anything Bob from accounting had on his desk, and you couldn’t buy anything like it at K-Mart. UNIX workstations from the ’90s and early 2000s are definitely my favourite genre of computers. My personal white whale is the SGI Tezro, the last hurrah of SGI before they went all-in on Intel, closely followed by Sun’s Ultra 45, its last SPARC workstation. These machines are only getting more expensive by the month, and people are charging insane amounts of money for what are, effectively, useless, dead-end machines. That’s why I ordered all the parts for building my own dual-Xeon workstation.
PCI Express technology has served as the de facto interconnect of choice for nearly two decades. The PCIe 6.0 specification doubles the bandwidth and power efficiency of the PCIe 5.0 specification (32 GT/s), while providing low latency and reduced bandwidth overhead. We’re barely seeing the rollout of PCIe 5.0 begin, and we’re already moving ahead. Also, who knew the standards organisation for PCIe is headquartered in Beaverton, Oregon, of all places. Although, to be fair, any city that understands and caters to the beautiful, thrilling, and honest sport of curling is a great city. And I’m not joking here – curling is exquisite, and quite probably the noblest of sports.
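To put the doubling in perspective, here is a back-of-the-envelope calculation of raw per-direction bandwidth for a ×16 link across generations – a rough Python sketch that ignores encoding and protocol overhead, which shaves a few percent off in practice:

    # Raw per-lane transfer rates in GT/s for each PCIe generation.
    # One transfer carries one bit per lane, so GT/s ~ Gbit/s per direction.
    rates_gts = {1: 2.5, 2: 5.0, 3: 8.0, 4: 16.0, 5: 32.0, 6: 64.0}
    lanes = 16

    for gen, gts in rates_gts.items():
        gbytes = gts * lanes / 8  # bits to bytes
        print(f"PCIe {gen}.0 x{lanes}: ~{gbytes:.0f} GB/s per direction")

    # PCIe 6.0 x16 works out to ~128 GB/s per direction,
    # double PCIe 5.0's ~64 GB/s.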
It’s not unusual to hear that a particular military technology has found its way into other applications, which then revolutionized our lives. From the imaging sensors that were refined to fly on spy satellites to advanced aerodynamics used on every modern jetliner, many of these ideas initially sounded like bad science fiction. So did this one. I had never heard of this.
So, AMD, Intel, and Nvidia all decided to announce their latest products on the same day yesterday. Let’s start with Intel, who announced the laptop version of their latest generation of processors, and if the performance claims hold up, they’re some damn good chips – but as always, we’ll have to await proper benchmarks. These laptop chips use Intel’s new hybrid processor architecture, which combines larger, faster performance cores with smaller, more efficient cores (P-cores and E-cores, respectively). How many P-cores and E-cores you get depends on the processor you’re buying, and you’ll need an operating system that supports Intel’s “Thread Director” technology to get the most performance out of the chips. Windows 11 supports it now, Linux support is in the works, and Windows 10 doesn’t have it and won’t be getting it. AMD, not wanting to be outdone, introduced its Ryzen 6000 series of mobile processors, which finally move its integrated graphics to RDNA 2, and are the first to include Microsoft’s Pluton security chip. Yesterday AMD disclosed that they would be launching the new Ryzen 6000 Mobile series today – updated cores, better graphics, more features, all in a single monolithic package a little over 200 mm2. There will be 10 new processors, ranging from the traditional portable 15 W and 28 W hardware, up to 35 W and 45 W plus for the high-end gaming machines. AMD is expecting 200+ premium systems in the market with Ryzen Mobile in 2022. Finally, we have Nvidia, with the smallest announcement of the day: new high- and low-end mobile GPUs.
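As an aside: on a Linux kernel recent enough to understand the hybrid design, you can inspect the P-core/E-core split yourself. A small sketch, assuming the cpu_core/cpu_atom sysfs nodes that the kernel’s hybrid CPU support exposes on this class of hardware:

    # List which logical CPUs are P-cores vs E-cores on hybrid Intel chips.
    # Assumes /sys/devices/cpu_core and /sys/devices/cpu_atom exist, which
    # requires a kernel with hybrid (Alder Lake) support.
    from pathlib import Path

    for name, label in [("cpu_core", "P-cores"), ("cpu_atom", "E-cores")]:
        node = Path("/sys/devices") / name / "cpus"
        if node.exists():
            print(f"{label}: CPUs {node.read_text().strip()}")
        else:
            print(f"{label}: not reported (non-hybrid CPU or older kernel)")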
The evolution to USB-C connectors just after the release of the USB 3.1 standard promised simplicity. Instead of host device Type-A and peripheral Type-B, Mini-B, Micro-B, and others, a single connector works for both ends of a connection and carries both power and data. Power can flow either way with the same cable: a computer charging a battery or phone; a battery charging a computer. It’s also reversible across its long axis, so it’s impossible to insert it in the wrong orientation. USB-C was supposed to be the last cable you would ever need. It hasn’t worked out that way. Better names for standards, mandatory logos on cables. That’s all we needed from the USB-IF. This has been bungled so hard they couldn’t have messed it up more if they tried.
Repairability of electronics is a hot topic whenever hardware gets discussed, and Dell has produced a concept laptop to explore the idea. On Tuesday, Dell announced a new design concept for a laptop that’s long-lived, easy to take apart and fix, and takes a smaller toll on the climate. It’s a collection of ideas that could go a long way toward making the tech giant’s products more sustainable — depending on whether, and how, Dell decides to implement them. Called “Concept Luna,” the proof-of-concept laptop dreamed up by Dell’s design team has a number of unusual features that are intended to make repair and maintenance easy. No screwdrivers or glue solvents are needed to pry loose a broken keyboard or peel off a cracked screen; both components simply pop free after a pair of keystones holding them in place are removed. The entire system contains far fewer screws than a typical Dell laptop, reducing the time needed to replace components. And you’ll never have to worry about replacing a broken fan, because there isn’t one: a shrunken-down motherboard placed in the top cover allows the laptop to passively cool itself. As good as this sounds, there is a red flag. Dell told The Verge that Concept Luna’s board “doesn’t have any more soldered on or integrated components than a typical laptop we sell today”. That’s right: dreams of user-replaceable RAM, CPU, and storage are probably going to remain dreams, and consumers are going to be stuck with however the machine was provisioned at build time. Like concept cars, this probably isn’t going to go into production, but the ideas could find their way into future products.
If you’ve followed the display, graphics card, or games console market at all recently, you will surely have heard about HDMI 2.1. It’s the new connection interface standard widely being adopted on new graphics cards, displays, games consoles, and other devices, allowing support for improved bandwidths, resolutions, refresh rates, and features. It’s one of the hot topics at the moment when it comes to buying a new device, and promoted heavily by manufacturers, often as one of the leading items in their spec. In this article we want to look at what the “HDMI 2.1” term really means, and address a worrying early sign in the market of things to come. We’ve delved into what is required for this certification and what it means for you as a consumer if you ever want to buy something labelled with HDMI 2.1. Don’t make any assumptions about what that label will give you; sadly, it isn’t nearly that simple. Oh good. More weird cable and port specifications to worry about.
So you want to play Adventure, but don’t know how to turn on the PDP-11? These instructions are for booting our dual rack machine from its RL01 drives, although booting the single cabinet machine from the RK05 is very similar. Detailed instructions for booting a PDP-11, including lots and lots of photos.
The other day I asked myself a seemingly trivial question: What was the first ATAPI CD-ROM drive and when was it available? Given that ATAPI was a major technology which instantly obsoleted all proprietary CD-ROM interfaces and made SCSI much less desirable, one might expect that there would have been some press releases touting the advantages of the new technology, articles describing the whys and wherefores, but… nope. There is nothing. And so begins a deep dive into the origins of ATAPI, through examining early drivers and their code.
The Soviet-made 1801VM2 CPU (a binary-compatible implementation of the PDP-11 instruction set and QBUS interface) was developed in 1982. The 1801VM2 is a further development of the earlier 1801VM1, doubling the original’s 5MHz clock speed. From a design standpoint, however, this CPU is a completely independent development. There’s a wealth of interesting computer technology in the former USSR, and it’s great to see more of it make its way online.
To capture a composite video signal and display it on my computer’s output, I need to use an upscaler that converts to an HDMI signal, then an HDMI capture device which in turn communicates with my PC over USB. Then, I can overlay my stupid face over it and send it to Twitch or something. But what if it were 1984? Of course, Twitch wouldn’t exist, nor would HDMI. So what’s the next best thing? Ah, the MSX. Most people focus on how popular it was in Japan, but they rarely mention that, because of the involvement of the Dutch company Philips, the MSX was also remarkably popular in my country of origin, the Netherlands. Some of my earliest computer memories took place on an MSX at a friend’s place. The particular model of MSX in this article, however, is something entirely different from the kinds of MSX machines I ran into as a kid. This thing has a considerable number of tricks up its sleeves, and now I just know I’ll be spending considerable time on eBay.
Some time ago, I thought it would be useful to understand exactly what the difference is between CD-ROMs recorded in the old High Sierra format and those using the ISO 9660 standard. This was in part spurred by the fact that I have a number of CD-ROMs/images that use the High Sierra format (Microsoft Programmer’s Library, some IBM Developer Connection issues, OS/2 Warp 4, and more) that both macOS and Windows 10 refuse to mount. The other part of my motivation was the usual insatiable curiosity. I had no idea this format even existed.
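The difference is visible right at the start of the volume descriptor area: both formats put their first descriptor at sector 16, but ISO 9660 carries the magic string “CD001” at byte offset 1, while High Sierra puts “CDROM” at byte offset 9, after an 8-byte logical block number field. A minimal sniffing sketch in Python, assuming a raw 2048-byte-per-sector image (the filename is hypothetical):

    # Distinguish High Sierra from ISO 9660 by peeking at sector 16,
    # where both formats place their first volume descriptor.
    SECTOR_SIZE = 2048

    def cdrom_format(path):
        with open(path, "rb") as f:
            f.seek(16 * SECTOR_SIZE)
            sector = f.read(SECTOR_SIZE)
        if sector[1:6] == b"CD001":
            return "ISO 9660"
        if sector[9:14] == b"CDROM":  # preceded by an 8-byte LBN field
            return "High Sierra"
        return "unknown"

    print(cdrom_format("warp4.iso"))  # hypothetical image filename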
The 6502 was the CPU in my first computer (an Apple II plus), as well as many other popular home computers of the late 1970s and 80s. It lived on well into the 1990s in game consoles and chess computers, mostly in its updated “65C02” CMOS version. Here’s a re-implementation of the 65C02 in an FPGA, in a pin-compatible format that lets you upgrade those old computers and games to 100 MHz clock rate! Interesting project.
This Atari 1040ST is still in use after 36 years! Frans Bos bought it in 1985 to run his camp site (Camping Böhmerwald), and over the years he wrote his own software to manage the site, handle reservations, and register guests. He really likes the speed of the machine compared to newer computers. And for six months every year, the machine is on day and night.
Tracking quantum computing has been a bit confusing in that there are multiple approaches to it. Most of the effort goes toward what are called gate-based computers, which allow you to perform logical operations on individual qubits. These are well understood theoretically and can perform a variety of calculations. It’s possible to make gate-based systems out of a variety of qubits, including photons, ions, and electronic devices called transmons, and companies have grown up around each of these hardware options. But there’s a separate form of computing called quantum annealing that also involves manipulating collections of interconnected qubits. Annealing hasn’t been worked out as thoroughly in theory, but it appears to be well matched to a class of optimization problems. And when it comes to annealing hardware, there’s only a single company: D-Wave. Now, things are about to get more confusing still. On Tuesday, D-Wave released its roadmap for upcoming processors and software for its quantum annealers. But D-Wave is also announcing that it’s going to be developing its own gate-based hardware, which it will offer in parallel with the quantum annealer. We talked with company CEO Alan Baratz to understand all the announcements. I think I understood some of those words because I, too, watch Space Time.
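For the curious, the optimization problems annealers target are typically phrased as QUBOs: minimise a quadratic function over binary variables. A toy brute-force sketch just to illustrate the shape of the problem – the Q matrix here is made up for the example, and a real annealer explores such energy landscapes physically rather than by enumeration:

    # Brute-force a tiny QUBO: minimise sum over (i, j) of Q[i, j] * x[i] * x[j]
    # for binary vectors x. Annealers tackle (far larger) problems of this shape.
    from itertools import product

    Q = {  # (i, j): weight; diagonal entries act as linear biases
        (0, 0): -1.0, (1, 1): -1.0, (2, 2): -1.0,
        (0, 1): 2.0, (1, 2): 2.0,  # couplings penalise adjacent 1s
    }

    def energy(x):
        return sum(w * x[i] * x[j] for (i, j), w in Q.items())

    best = min(product((0, 1), repeat=3), key=energy)
    print(best, energy(best))  # (1, 0, 1) with energy -2.0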